# Cloudflare Durable Objects PDS Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Build a minimal AT Protocol PDS on Cloudflare Durable Objects with zero dependencies.

**Architecture:** Each user gets their own Durable Object with SQLite storage. A router Worker maps DIDs to Objects. All crypto uses the Web Crypto API (P-256 for signing, SHA-256 for hashing).

**Tech Stack:** Cloudflare Workers, Durable Objects, SQLite, Web Crypto API, no npm dependencies.

---
## Task 1: Project Setup

**Files:**
- Create: `package.json`
- Create: `wrangler.toml`
- Create: `src/pds.js`

**Step 1: Initialize package.json**

```json
{
  "name": "cloudflare-pds",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "dev": "wrangler dev",
    "deploy": "wrangler deploy",
    "test": "node test/run.js"
  }
}
```

**Step 2: Create wrangler.toml**

```toml
name = "atproto-pds"
main = "src/pds.js"
compatibility_date = "2024-01-01"

[[durable_objects.bindings]]
name = "PDS"
class_name = "PersonalDataServer"

[[migrations]]
tag = "v1"
new_sqlite_classes = ["PersonalDataServer"]
```

**Step 3: Create minimal src/pds.js skeleton**

```javascript
export class PersonalDataServer {
  constructor(state, env) {
    this.state = state
    this.sql = state.storage.sql
  }

  async fetch(request) {
    return new Response('pds running', { status: 200 })
  }
}

export default {
  async fetch(request, env) {
    const url = new URL(request.url)
    const did = url.searchParams.get('did')

    if (!did) {
      return new Response('missing did param', { status: 400 })
    }

    const id = env.PDS.idFromName(did)
    const pds = env.PDS.get(id)
    return pds.fetch(request)
  }
}
```

**Step 4: Verify it runs**

Run: `npx wrangler dev`
Test: `curl "http://localhost:8787/?did=did:plc:test"`
Expected: `pds running`

**Step 5: Commit**

```bash
git init
git add -A
git commit -m "feat: initial project setup with Durable Object skeleton"
```

---

## Task 2: CBOR Encoding

**Files:**
- Modify: `src/pds.js`

Implement minimal deterministic CBOR encoding. Only the types AT Protocol uses: maps, arrays, strings, bytes, integers, null, and booleans.

**Step 1: Add CBOR encoding function**

Add to top of `src/pds.js`:

```javascript
// === CBOR ENCODING ===
// Minimal deterministic CBOR (RFC 8949) - sorted keys, minimal integers

function cborEncode(value) {
  const parts = []

  function encode(val) {
    if (val === null) {
      parts.push(0xf6) // null
    } else if (val === true) {
      parts.push(0xf5) // true
    } else if (val === false) {
      parts.push(0xf4) // false
    } else if (typeof val === 'number') {
      encodeInteger(val)
    } else if (typeof val === 'string') {
      const bytes = new TextEncoder().encode(val)
      encodeHead(3, bytes.length) // major type 3 = text string
      parts.push(...bytes)
    } else if (val instanceof Uint8Array) {
      encodeHead(2, val.length) // major type 2 = byte string
      parts.push(...val)
    } else if (Array.isArray(val)) {
      encodeHead(4, val.length) // major type 4 = array
      for (const item of val) encode(item)
    } else if (typeof val === 'object') {
      // DAG-CBOR deterministic key order: shorter keys first, then bytewise
      const keys = Object.keys(val).sort((a, b) =>
        a.length - b.length || (a < b ? -1 : a > b ? 1 : 0)
      )
      encodeHead(5, keys.length) // major type 5 = map
      for (const key of keys) {
        encode(key)
        encode(val[key])
      }
    }
  }

  function encodeHead(majorType, length) {
    const mt = majorType << 5
    if (length < 24) {
      parts.push(mt | length)
    } else if (length < 256) {
      parts.push(mt | 24, length)
    } else if (length < 65536) {
      parts.push(mt | 25, length >> 8, length & 0xff)
    } else if (length < 4294967296) {
      parts.push(mt | 26, (length >> 24) & 0xff, (length >> 16) & 0xff, (length >> 8) & 0xff, length & 0xff)
    }
  }

  function encodeInteger(n) {
    if (n >= 0) {
      encodeHead(0, n) // major type 0 = unsigned int
    } else {
      encodeHead(1, -n - 1) // major type 1 = negative int
    }
  }

  encode(value)
  return new Uint8Array(parts)
}
```

**Step 2: Add simple test endpoint**

Modify the fetch handler temporarily:

```javascript
async fetch(request) {
  const url = new URL(request.url)
  if (url.pathname === '/test/cbor') {
    const encoded = cborEncode({ hello: 'world', num: 42 })
    return new Response(encoded, {
      headers: { 'content-type': 'application/cbor' }
    })
  }
  return new Response('pds running', { status: 200 })
}
```

**Step 3: Verify CBOR output**

Run: `npx wrangler dev`
Test: `curl "http://localhost:8787/test/cbor?did=did:plc:test" | xxd`
Expected: Valid CBOR bytes (`a2 63 6e 75 6d 18 2a 65 68 65 6c 6c 6f 65 77 6f 72 6c 64`; `num` sorts before `hello` because DAG-CBOR orders map keys by length first)
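The head-encoding rules can also be cross-checked offline. This is a standalone Node sketch (no dependencies assumed) that hand-encodes `{a: 1, b: 42}`; the keys are the same length, so the result is identical under any map key ordering:

```javascript
// Hand-build the CBOR for { a: 1, b: 42 } from the head rules:
// major type in the top 3 bits, short lengths inline, 24..255 behind 0x18.
function head(major, len) {
  if (len < 24) return [(major << 5) | len]
  if (len < 256) return [(major << 5) | 24, len]
  return [(major << 5) | 25, len >> 8, len & 0xff]
}

const enc = [
  ...head(5, 2),        // map with 2 entries
  ...head(3, 1), 0x61,  // text key "a"
  ...head(0, 1),        // unsigned int 1 (fits inline)
  ...head(3, 1), 0x62,  // text key "b"
  ...head(0, 42)        // unsigned int 42 -> 0x18 0x2a (needs the one-byte head)
]
console.log(Buffer.from(enc).toString('hex')) // a26161016162182a
```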

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add deterministic CBOR encoding"
```

---

## Task 3: CID Generation

**Files:**
- Modify: `src/pds.js`

Generate CIDs (Content Identifiers) using SHA-256 + multiformat encoding.

**Step 1: Add CID utilities**

Add after CBOR section:

```javascript
// === CID GENERATION ===
// dag-cbor (0x71) + sha-256 (0x12) + 32 bytes

async function createCid(bytes) {
  const hash = await crypto.subtle.digest('SHA-256', bytes)
  const hashBytes = new Uint8Array(hash)

  // CIDv1: version(1) + codec(dag-cbor=0x71) + multihash(sha256)
  // Multihash: hash-type(0x12) + length(0x20=32) + digest
  const cid = new Uint8Array(2 + 2 + 32)
  cid[0] = 0x01 // CIDv1
  cid[1] = 0x71 // dag-cbor codec
  cid[2] = 0x12 // sha-256
  cid[3] = 0x20 // 32 bytes
  cid.set(hashBytes, 4)

  return cid
}

function cidToString(cid) {
  // base32lower encoding for CIDv1
  return 'b' + base32Encode(cid)
}

function base32Encode(bytes) {
  const alphabet = 'abcdefghijklmnopqrstuvwxyz234567'
  let result = ''
  let bits = 0
  let value = 0

  for (const byte of bytes) {
    value = (value << 8) | byte
    bits += 8
    while (bits >= 5) {
      bits -= 5
      result += alphabet[(value >> bits) & 31]
    }
  }

  if (bits > 0) {
    result += alphabet[(value << (5 - bits)) & 31]
  }

  return result
}
```

**Step 2: Add test endpoint**

```javascript
if (url.pathname === '/test/cid') {
  const data = cborEncode({ test: 'data' })
  const cid = await createCid(data)
  return Response.json({ cid: cidToString(cid) })
}
```

**Step 3: Verify CID generation**

Run: `npx wrangler dev`
Test: `curl "http://localhost:8787/test/cid?did=did:plc:test"`
Expected: JSON with a 59-char CID string starting with `bafyrei`

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add CID generation with SHA-256"
```

---

## Task 4: TID Generation

**Files:**
- Modify: `src/pds.js`

Generate TIDs (Timestamp IDs) for record keys and revisions.

**Step 1: Add TID utilities**

Add after CID section:

```javascript
// === TID GENERATION ===
// Timestamp-based IDs: base32-sortable encoded microseconds + clock ID

const TID_CHARS = '234567abcdefghijklmnopqrstuvwxyz'
let lastTimestamp = 0
let clockId = Math.floor(Math.random() * 1024)

function createTid() {
  let timestamp = Date.now() * 1000 // microseconds

  // Ensure monotonic
  if (timestamp <= lastTimestamp) {
    timestamp = lastTimestamp + 1
  }
  lastTimestamp = timestamp

  // 13 chars: 11 for the timestamp (55 bits of space; JS integers give ~53 usable), 2 for the clock ID
  let tid = ''

  // Encode timestamp (high bits first for sortability)
  let ts = timestamp
  for (let i = 0; i < 11; i++) {
    tid = TID_CHARS[ts & 31] + tid
    ts = Math.floor(ts / 32)
  }

  // Append clock ID (2 chars)
  tid += TID_CHARS[(clockId >> 5) & 31]
  tid += TID_CHARS[clockId & 31]

  return tid
}
```

**Step 2: Add test endpoint**

```javascript
if (url.pathname === '/test/tid') {
  const tids = [createTid(), createTid(), createTid()]
  return Response.json({ tids })
}
```

**Step 3: Verify TIDs are monotonic**

Run: `npx wrangler dev`
Test: `curl "http://localhost:8787/test/tid?did=did:plc:test"`
Expected: Three 13-char TIDs, each lexicographically greater than the previous
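Since TIDs are just base32-sortable microseconds, they can be decoded back into a timestamp, which makes the monotonicity easy to eyeball. A standalone round-trip sketch (the `encodeTid`/`decodeTidMicros` helpers here are illustrative, not part of the plan's code):

```javascript
const TID_CHARS = '234567abcdefghijklmnopqrstuvwxyz'

// Same scheme as createTid, but taking the timestamp and clock ID as inputs
function encodeTid(micros, clockId) {
  let tid = ''
  let ts = micros
  for (let i = 0; i < 11; i++) {
    tid = TID_CHARS[ts & 31] + tid // low 5 bits, prepended so high bits come first
    ts = Math.floor(ts / 32)
  }
  return tid + TID_CHARS[(clockId >> 5) & 31] + TID_CHARS[clockId & 31]
}

// Invert the first 11 chars back into microseconds (the clock ID is ignored)
function decodeTidMicros(tid) {
  let micros = 0
  for (const ch of tid.slice(0, 11)) {
    micros = micros * 32 + TID_CHARS.indexOf(ch)
  }
  return micros
}

const micros = Date.UTC(2026, 0, 4) * 1000
const tid = encodeTid(micros, 123)
console.log(tid.length, decodeTidMicros(tid) === micros) // 13 true
```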

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add TID generation for record keys"
```

---

## Task 5: SQLite Schema

**Files:**
- Modify: `src/pds.js`

Initialize the database schema when the Durable Object starts.

**Step 1: Add schema initialization**

Modify the constructor:

```javascript
export class PersonalDataServer {
  constructor(state, env) {
    this.state = state
    this.sql = state.storage.sql
    this.env = env

    // Initialize schema
    this.sql.exec(`
      CREATE TABLE IF NOT EXISTS blocks (
        cid TEXT PRIMARY KEY,
        data BLOB NOT NULL
      );

      CREATE TABLE IF NOT EXISTS records (
        uri TEXT PRIMARY KEY,
        cid TEXT NOT NULL,
        collection TEXT NOT NULL,
        rkey TEXT NOT NULL,
        value BLOB NOT NULL
      );

      CREATE TABLE IF NOT EXISTS commits (
        seq INTEGER PRIMARY KEY AUTOINCREMENT,
        cid TEXT NOT NULL,
        rev TEXT NOT NULL,
        prev TEXT
      );

      CREATE TABLE IF NOT EXISTS seq_events (
        seq INTEGER PRIMARY KEY AUTOINCREMENT,
        did TEXT NOT NULL,
        commit_cid TEXT NOT NULL,
        evt BLOB NOT NULL
      );

      CREATE INDEX IF NOT EXISTS idx_records_collection ON records(collection, rkey);
    `)
  }
  // ... rest of class
}
```

**Step 2: Add test endpoint to verify schema**

```javascript
if (url.pathname === '/test/schema') {
  // Exclude SQLite internals like sqlite_sequence (created lazily by AUTOINCREMENT)
  const tables = this.sql.exec(`
    SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%' ORDER BY name
  `).toArray()
  return Response.json({ tables: tables.map(t => t.name) })
}
```

**Step 3: Verify schema creates**

Run: `npx wrangler dev`
Test: `curl "http://localhost:8787/test/schema?did=did:plc:test"`
Expected: `{"tables":["blocks","commits","records","seq_events"]}`

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add SQLite schema for PDS storage"
```

---

## Task 6: P-256 Signing

**Files:**
- Modify: `src/pds.js`

Add P-256 ECDSA signing using the Web Crypto API.

**Step 1: Add signing utilities**

Add after TID section:

```javascript
// === P-256 SIGNING ===
// Web Crypto ECDSA with P-256 curve

async function importPrivateKey(privateKeyBytes) {
  // PKCS#8 wrapper for a raw 32-byte P-256 private key
  const pkcs8Prefix = new Uint8Array([
    0x30, 0x41, 0x02, 0x01, 0x00, 0x30, 0x13, 0x06, 0x07, 0x2a, 0x86, 0x48,
    0xce, 0x3d, 0x02, 0x01, 0x06, 0x08, 0x2a, 0x86, 0x48, 0xce, 0x3d, 0x03,
    0x01, 0x07, 0x04, 0x27, 0x30, 0x25, 0x02, 0x01, 0x01, 0x04, 0x20
  ])

  const pkcs8 = new Uint8Array(pkcs8Prefix.length + 32)
  pkcs8.set(pkcs8Prefix)
  pkcs8.set(privateKeyBytes, pkcs8Prefix.length)

  return crypto.subtle.importKey(
    'pkcs8',
    pkcs8,
    { name: 'ECDSA', namedCurve: 'P-256' },
    false,
    ['sign']
  )
}

async function sign(privateKey, data) {
  const signature = await crypto.subtle.sign(
    { name: 'ECDSA', hash: 'SHA-256' },
    privateKey,
    data
  )
  return new Uint8Array(signature)
}

async function generateKeyPair() {
  const keyPair = await crypto.subtle.generateKey(
    { name: 'ECDSA', namedCurve: 'P-256' },
    true,
    ['sign', 'verify']
  )

  // Export private key as raw bytes
  const privateJwk = await crypto.subtle.exportKey('jwk', keyPair.privateKey)
  const privateBytes = base64UrlDecode(privateJwk.d)

  // Export public key as compressed point
  const publicRaw = await crypto.subtle.exportKey('raw', keyPair.publicKey)
  const publicBytes = new Uint8Array(publicRaw)
  const compressed = compressPublicKey(publicBytes)

  return { privateKey: privateBytes, publicKey: compressed }
}

function compressPublicKey(uncompressed) {
  // uncompressed is 65 bytes: 0x04 + x(32) + y(32)
  // compressed is 33 bytes: prefix(02 or 03) + x(32)
  const x = uncompressed.slice(1, 33)
  const y = uncompressed.slice(33, 65)
  const prefix = (y[31] & 1) === 0 ? 0x02 : 0x03
  const compressed = new Uint8Array(33)
  compressed[0] = prefix
  compressed.set(x, 1)
  return compressed
}

function base64UrlDecode(str) {
  const base64 = str.replace(/-/g, '+').replace(/_/g, '/')
  const binary = atob(base64)
  const bytes = new Uint8Array(binary.length)
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i)
  }
  return bytes
}

function bytesToHex(bytes) {
  return Array.from(bytes).map(b => b.toString(16).padStart(2, '0')).join('')
}

function hexToBytes(hex) {
  const bytes = new Uint8Array(hex.length / 2)
  for (let i = 0; i < hex.length; i += 2) {
    bytes[i / 2] = parseInt(hex.substr(i, 2), 16)
  }
  return bytes
}
```

**Step 2: Add test endpoint**

```javascript
if (url.pathname === '/test/sign') {
  const kp = await generateKeyPair()
  const data = new TextEncoder().encode('test message')
  const key = await importPrivateKey(kp.privateKey)
  const sig = await sign(key, data)
  return Response.json({
    publicKey: bytesToHex(kp.publicKey),
    signature: bytesToHex(sig)
  })
}
```

**Step 3: Verify signing works**

Run: `npx wrangler dev`
Test: `curl "http://localhost:8787/test/sign?did=did:plc:test"`
Expected: JSON with a 66-char public key hex and a 128-char signature hex
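The compression rule can be verified against a well-known point. The P-256 base point G has fixed coordinates published in the SEC 2 standard, and its y-coordinate ends in `0xf5` (odd), so the compressed form must be `0x03` followed by x. A standalone Node check using the same `compressPublicKey` logic:

```javascript
function compressPublicKey(uncompressed) {
  // uncompressed is 65 bytes: 0x04 + x(32) + y(32)
  const x = uncompressed.slice(1, 33)
  const y = uncompressed.slice(33, 65)
  const prefix = (y[31] & 1) === 0 ? 0x02 : 0x03 // parity of y selects the prefix
  const compressed = new Uint8Array(33)
  compressed[0] = prefix
  compressed.set(x, 1)
  return compressed
}

// SEC 2 base point coordinates for P-256
const Gx = '6b17d1f2e12c4247f8bce6e563a440f277037d812deb33a0f4a13945d898c296'
const Gy = '4fe342e2fe1a7f9b8ee7eb4a7c0f9e162bce33576b315ececbb6406837bf51f5'

const uncompressed = Buffer.concat([
  Buffer.from([0x04]),
  Buffer.from(Gx, 'hex'),
  Buffer.from(Gy, 'hex')
])
// Gy is odd, so the expected output is '03' followed by Gx
console.log(Buffer.from(compressPublicKey(uncompressed)).toString('hex'))
```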

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add P-256 ECDSA signing via Web Crypto"
```

---

## Task 7: Identity Storage

**Files:**
- Modify: `src/pds.js`

Store the DID and signing key in Durable Object storage.

**Step 1: Add identity methods to class**

Add to PersonalDataServer class:

```javascript
async initIdentity(did, privateKeyHex) {
  await this.state.storage.put('did', did)
  await this.state.storage.put('privateKey', privateKeyHex)
}

async getDid() {
  if (!this._did) {
    this._did = await this.state.storage.get('did')
  }
  return this._did
}

async getSigningKey() {
  const hex = await this.state.storage.get('privateKey')
  if (!hex) return null
  return importPrivateKey(hexToBytes(hex))
}
```

**Step 2: Add init endpoint**

```javascript
if (url.pathname === '/init') {
  const body = await request.json()
  if (!body.did || !body.privateKey) {
    return Response.json({ error: 'missing did or privateKey' }, { status: 400 })
  }
  await this.initIdentity(body.did, body.privateKey)
  return Response.json({ ok: true, did: body.did })
}
```

**Step 3: Add status endpoint**

```javascript
if (url.pathname === '/status') {
  const did = await this.getDid()
  return Response.json({
    initialized: !!did,
    did: did || null
  })
}
```

**Step 4: Verify identity storage**

Run: `npx wrangler dev`

```bash
# Check uninitialized
curl "http://localhost:8787/status?did=did:plc:test"
# Expected: {"initialized":false,"did":null}

# Initialize
curl -X POST "http://localhost:8787/init?did=did:plc:test" \
  -H "Content-Type: application/json" \
  -d '{"did":"did:plc:test","privateKey":"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef"}'
# Expected: {"ok":true,"did":"did:plc:test"}

# Check initialized
curl "http://localhost:8787/status?did=did:plc:test"
# Expected: {"initialized":true,"did":"did:plc:test"}
```

**Step 5: Commit**

```bash
git add src/pds.js
git commit -m "feat: add identity storage and init endpoint"
```

---

## Task 8: MST (Merkle Search Tree)

**Files:**
- Modify: `src/pds.js`

Implement a simple MST that rebuilds on each write.

**Step 1: Add MST utilities**

Add after signing section:

```javascript
// === MERKLE SEARCH TREE ===
// Simple rebuild-on-write implementation

async function sha256(data) {
  const hash = await crypto.subtle.digest('SHA-256', data)
  return new Uint8Array(hash)
}

function getKeyDepth(key) {
  // Real MSTs derive depth from leading zero bits of the key's SHA-256 hash;
  // as a synchronous simplification, this counts leading zeros of the raw key bytes
  const keyBytes = new TextEncoder().encode(key)
  let zeros = 0
  for (const byte of keyBytes) {
    if (byte === 0) zeros += 8
    else {
      for (let i = 7; i >= 0; i--) {
        if ((byte >> i) & 1) break
        zeros++
      }
      break
    }
  }
  return Math.floor(zeros / 4)
}

class MST {
  constructor(sql) {
    this.sql = sql
  }

  async computeRoot() {
    const records = this.sql.exec(`
      SELECT collection, rkey, cid FROM records ORDER BY collection, rkey
    `).toArray()

    if (records.length === 0) {
      return null
    }

    const entries = records.map(r => ({
      key: `${r.collection}/${r.rkey}`,
      cid: r.cid
    }))

    return this.buildTree(entries, 0)
  }

  async buildTree(entries, depth) {
    if (entries.length === 0) return null

    const node = { l: null, e: [] }
    let leftEntries = []

    for (const entry of entries) {
      const keyDepth = getKeyDepth(entry.key)

      if (keyDepth > depth) {
        leftEntries.push(entry)
      } else {
        // Store accumulated left entries
        if (leftEntries.length > 0) {
          const leftCid = await this.buildTree(leftEntries, depth + 1)
          if (node.e.length === 0) {
            node.l = leftCid
          } else {
            node.e[node.e.length - 1].t = leftCid
          }
          leftEntries = []
        }
        node.e.push({ k: entry.key, v: entry.cid, t: null })
      }
    }

    // Handle remaining left entries
    if (leftEntries.length > 0) {
      const leftCid = await this.buildTree(leftEntries, depth + 1)
      if (node.e.length > 0) {
        node.e[node.e.length - 1].t = leftCid
      } else {
        node.l = leftCid
      }
    }

    // Encode and store node
    const nodeBytes = cborEncode(node)
    const nodeCid = await createCid(nodeBytes)
    const cidStr = cidToString(nodeCid)

    this.sql.exec(
      `INSERT OR REPLACE INTO blocks (cid, data) VALUES (?, ?)`,
      cidStr,
      nodeBytes
    )

    return cidStr
  }
}
```

**Step 2: Add MST test endpoint**

```javascript
if (url.pathname === '/test/mst') {
  // Insert some test records
  this.sql.exec(`INSERT OR REPLACE INTO records VALUES (?, ?, ?, ?, ?)`,
    'at://did:plc:test/app.bsky.feed.post/abc', 'cid1', 'app.bsky.feed.post', 'abc', new Uint8Array([1]))
  this.sql.exec(`INSERT OR REPLACE INTO records VALUES (?, ?, ?, ?, ?)`,
    'at://did:plc:test/app.bsky.feed.post/def', 'cid2', 'app.bsky.feed.post', 'def', new Uint8Array([2]))

  const mst = new MST(this.sql)
  const root = await mst.computeRoot()
  return Response.json({ root })
}
```

**Step 3: Verify MST builds**

Run: `npx wrangler dev`
Test: `curl "http://localhost:8787/test/mst?did=did:plc:test"`
Expected: JSON with root CID string

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add Merkle Search Tree implementation"
```

---

## Task 9: createRecord Endpoint

**Files:**
- Modify: `src/pds.js`

Implement the core write path.

**Step 1: Add createRecord method**

Add to PersonalDataServer class:

```javascript
async createRecord(collection, record, rkey = null) {
  const did = await this.getDid()
  if (!did) throw new Error('PDS not initialized')

  rkey = rkey || createTid()
  const uri = `at://${did}/${collection}/${rkey}`

  // Encode and hash record
  const recordBytes = cborEncode(record)
  const recordCid = await createCid(recordBytes)
  const recordCidStr = cidToString(recordCid)

  // Store block
  this.sql.exec(
    `INSERT OR REPLACE INTO blocks (cid, data) VALUES (?, ?)`,
    recordCidStr, recordBytes
  )

  // Store record index
  this.sql.exec(
    `INSERT OR REPLACE INTO records (uri, cid, collection, rkey, value) VALUES (?, ?, ?, ?, ?)`,
    uri, recordCidStr, collection, rkey, recordBytes
  )

  // Rebuild MST
  const mst = new MST(this.sql)
  const dataRoot = await mst.computeRoot()

  // Get previous commit (toArray()[0] rather than one(): one() throws when the
  // table is empty, which is the case for the first commit)
  const prevCommit = this.sql.exec(
    `SELECT cid, rev FROM commits ORDER BY seq DESC LIMIT 1`
  ).toArray()[0]

  // Create commit
  const rev = createTid()
  const commit = {
    did,
    version: 3,
    data: dataRoot,
    rev,
    prev: prevCommit?.cid || null
  }

  // Sign commit
  const commitBytes = cborEncode(commit)
  const signingKey = await this.getSigningKey()
  const sig = await sign(signingKey, commitBytes)

  const signedCommit = { ...commit, sig }
  const signedBytes = cborEncode(signedCommit)
  const commitCid = await createCid(signedBytes)
  const commitCidStr = cidToString(commitCid)

  // Store commit block
  this.sql.exec(
    `INSERT OR REPLACE INTO blocks (cid, data) VALUES (?, ?)`,
    commitCidStr, signedBytes
  )

  // Store commit reference
  this.sql.exec(
    `INSERT INTO commits (cid, rev, prev) VALUES (?, ?, ?)`,
    commitCidStr, rev, prevCommit?.cid || null
  )

  // Sequence event
  const evt = cborEncode({
    ops: [{ action: 'create', path: `${collection}/${rkey}`, cid: recordCidStr }]
  })
  this.sql.exec(
    `INSERT INTO seq_events (did, commit_cid, evt) VALUES (?, ?, ?)`,
    did, commitCidStr, evt
  )

  return { uri, cid: recordCidStr, commit: commitCidStr }
}
```

**Step 2: Add XRPC endpoint**

```javascript
if (url.pathname === '/xrpc/com.atproto.repo.createRecord') {
  if (request.method !== 'POST') {
    return Response.json({ error: 'method not allowed' }, { status: 405 })
  }

  const body = await request.json()
  if (!body.collection || !body.record) {
    return Response.json({ error: 'missing collection or record' }, { status: 400 })
  }

  try {
    const result = await this.createRecord(body.collection, body.record, body.rkey)
    return Response.json(result)
  } catch (err) {
    return Response.json({ error: err.message }, { status: 500 })
  }
}
```

**Step 3: Verify createRecord works**

Run: `npx wrangler dev`

```bash
# First initialize
curl -X POST "http://localhost:8787/init?did=did:plc:test123" \
  -H "Content-Type: application/json" \
  -d '{"did":"did:plc:test123","privateKey":"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef"}'

# Create a post
curl -X POST "http://localhost:8787/xrpc/com.atproto.repo.createRecord?did=did:plc:test123" \
  -H "Content-Type: application/json" \
  -d '{"collection":"app.bsky.feed.post","record":{"text":"Hello world!","createdAt":"2026-01-04T00:00:00Z"}}'
```

Expected: JSON with uri, cid, and commit fields

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add createRecord endpoint"
```

---

## Task 10: getRecord Endpoint

**Files:**
- Modify: `src/pds.js`

**Step 1: Add XRPC endpoint**

```javascript
if (url.pathname === '/xrpc/com.atproto.repo.getRecord') {
  const collection = url.searchParams.get('collection')
  const rkey = url.searchParams.get('rkey')

  if (!collection || !rkey) {
    return Response.json({ error: 'missing collection or rkey' }, { status: 400 })
  }

  const did = await this.getDid()
  const uri = `at://${did}/${collection}/${rkey}`

  // toArray()[0] rather than one(): one() throws on zero rows, so the 404
  // branch below would never run
  const row = this.sql.exec(
    `SELECT cid, value FROM records WHERE uri = ?`, uri
  ).toArray()[0]

  if (!row) {
    return Response.json({ error: 'record not found' }, { status: 404 })
  }

  // Decode CBOR for response (minimal decoder); BLOBs come back as ArrayBuffer
  const value = cborDecode(new Uint8Array(row.value))

  return Response.json({ uri, cid: row.cid, value })
}
```

**Step 2: Add minimal CBOR decoder**

Add after cborEncode:

```javascript
function cborDecode(bytes) {
  let offset = 0

  function read() {
    const initial = bytes[offset++]
    const major = initial >> 5
    const info = initial & 0x1f

    let length = info
    if (info === 24) length = bytes[offset++]
    else if (info === 25) { length = (bytes[offset++] << 8) | bytes[offset++] }
    else if (info === 26) {
      length = (bytes[offset++] << 24) | (bytes[offset++] << 16) | (bytes[offset++] << 8) | bytes[offset++]
    }

    switch (major) {
      case 0: return length // unsigned int
      case 1: return -1 - length // negative int
      case 2: { // byte string
        const data = bytes.slice(offset, offset + length)
        offset += length
        return data
      }
      case 3: { // text string
        const data = new TextDecoder().decode(bytes.slice(offset, offset + length))
        offset += length
        return data
      }
      case 4: { // array
        const arr = []
        for (let i = 0; i < length; i++) arr.push(read())
        return arr
      }
      case 5: { // map
        const obj = {}
        for (let i = 0; i < length; i++) {
          const key = read()
          obj[key] = read()
        }
        return obj
      }
      case 7: { // special
        if (info === 20) return false
        if (info === 21) return true
        if (info === 22) return null
        return undefined
      }
    }
  }

  return read()
}
```

**Step 3: Verify getRecord works**

Run: `npx wrangler dev`

```bash
# Create a record first, then get it
curl "http://localhost:8787/xrpc/com.atproto.repo.getRecord?did=did:plc:test123&collection=app.bsky.feed.post&rkey=<rkey_from_create>"
```

Expected: JSON with uri, cid, and value (the original record)
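The decoder can also be exercised outside the Worker against known bytes. A standalone Node sketch decoding the CBOR for `{ hello: 'world', num: 42 }` from hex (the decoder does not care about map key order, so either ordering works):

```javascript
// Same minimal decoder as the plan's cborDecode, run on fixed input
function cborDecode(bytes) {
  let offset = 0
  function read() {
    const initial = bytes[offset++]
    const major = initial >> 5
    const info = initial & 0x1f
    let length = info
    if (info === 24) length = bytes[offset++]
    else if (info === 25) length = (bytes[offset++] << 8) | bytes[offset++]
    else if (info === 26) {
      length = (bytes[offset++] << 24) | (bytes[offset++] << 16) | (bytes[offset++] << 8) | bytes[offset++]
    }
    switch (major) {
      case 0: return length
      case 1: return -1 - length
      case 2: { const d = bytes.slice(offset, offset + length); offset += length; return d }
      case 3: { const s = new TextDecoder().decode(bytes.slice(offset, offset + length)); offset += length; return s }
      case 4: { const a = []; for (let i = 0; i < length; i++) a.push(read()); return a }
      case 5: { const o = {}; for (let i = 0; i < length; i++) { const k = read(); o[k] = read() } return o }
      case 7: return info === 20 ? false : info === 21 ? true : info === 22 ? null : undefined
    }
  }
  return read()
}

const bytes = Uint8Array.from(Buffer.from('a26568656c6c6f65776f726c64636e756d182a', 'hex'))
console.log(cborDecode(bytes)) // { hello: 'world', num: 42 }
```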

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add getRecord endpoint with CBOR decoder"
```

---

## Task 11: CAR File Builder

**Files:**
- Modify: `src/pds.js`

Build CAR (Content Addressable aRchive) files for repo export.

**Step 1: Add CAR builder**

Add after MST section:

```javascript
// === CAR FILE BUILDER ===

function varint(n) {
  const bytes = []
  while (n >= 0x80) {
    bytes.push((n & 0x7f) | 0x80)
    n >>>= 7
  }
  bytes.push(n)
  return new Uint8Array(bytes)
}

function cidToBytes(cidStr) {
  // Decode base32lower CID string to bytes
  if (!cidStr.startsWith('b')) throw new Error('expected base32lower CID')
  return base32Decode(cidStr.slice(1))
}

function base32Decode(str) {
  const alphabet = 'abcdefghijklmnopqrstuvwxyz234567'
  let bits = 0
  let value = 0
  const output = []

  for (const char of str) {
    const idx = alphabet.indexOf(char)
    if (idx === -1) continue
    value = (value << 5) | idx
    bits += 5
    if (bits >= 8) {
      bits -= 8
      output.push((value >> bits) & 0xff)
    }
  }

  return new Uint8Array(output)
}

function buildCarFile(rootCid, blocks) {
  const parts = []

  // Header: { version: 1, roots: [rootCid] }
  // Note: a spec-compliant dag-cbor header wraps root CIDs in CBOR tag 42 with
  // a 0x00 identity prefix; this simplified sketch stores the raw CID bytes
  const rootCidBytes = cidToBytes(rootCid)
  const header = cborEncode({ version: 1, roots: [rootCidBytes] })
  parts.push(varint(header.length))
  parts.push(header)

  // Blocks: varint(len) + cid + data
  for (const block of blocks) {
    const cidBytes = cidToBytes(block.cid)
    const data = new Uint8Array(block.data) // SQL returns BLOBs as ArrayBuffer
    const blockLen = cidBytes.length + data.length
    parts.push(varint(blockLen))
    parts.push(cidBytes)
    parts.push(data)
  }

  // Concatenate all parts
  const totalLen = parts.reduce((sum, p) => sum + p.length, 0)
  const car = new Uint8Array(totalLen)
  let offset = 0
  for (const part of parts) {
    car.set(part, offset)
    offset += part.length
  }

  return car
}
```
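The varint length prefixes are the only non-CBOR framing in a CAR file. A standalone round trip over a few boundary values confirms the encoder (the `readVarint` decoder here is illustrative, not part of the plan's code):

```javascript
// Same unsigned LEB128-style varint as the plan's CAR builder
function varint(n) {
  const bytes = []
  while (n >= 0x80) {
    bytes.push((n & 0x7f) | 0x80) // low 7 bits, continuation bit set
    n >>>= 7
  }
  bytes.push(n)
  return new Uint8Array(bytes)
}

// Inverse: accumulate 7-bit groups until a byte without the continuation bit
function readVarint(bytes, offset = 0) {
  let result = 0, shift = 0
  while (true) {
    const b = bytes[offset++]
    result |= (b & 0x7f) << shift
    if ((b & 0x80) === 0) return { value: result, next: offset }
    shift += 7
  }
}

for (const n of [0, 1, 127, 128, 300, 65536, 1000000]) {
  const { value } = readVarint(varint(n))
  if (value !== n) throw new Error(`round trip failed for ${n}`)
}
console.log('varint round trip ok')
```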

**Step 2: Commit**

```bash
git add src/pds.js
git commit -m "feat: add CAR file builder"
```

---

## Task 12: getRepo Endpoint

**Files:**
- Modify: `src/pds.js`

**Step 1: Add XRPC endpoint**

```javascript
if (url.pathname === '/xrpc/com.atproto.sync.getRepo') {
  // toArray()[0] rather than one(): one() throws when there are no commits yet
  const commit = this.sql.exec(
    `SELECT cid FROM commits ORDER BY seq DESC LIMIT 1`
  ).toArray()[0]

  if (!commit) {
    return Response.json({ error: 'repo not found' }, { status: 404 })
  }

  const blocks = this.sql.exec(`SELECT cid, data FROM blocks`).toArray()
  const car = buildCarFile(commit.cid, blocks)

  return new Response(car, {
    headers: { 'content-type': 'application/vnd.ipld.car' }
  })
}
```

**Step 2: Verify getRepo works**

Run: `npx wrangler dev`

```bash
curl "http://localhost:8787/xrpc/com.atproto.sync.getRepo?did=did:plc:test123" -o repo.car
xxd repo.car | head -20
```

Expected: Binary CAR file starting with a varint-prefixed CBOR header

**Step 3: Commit**

```bash
git add src/pds.js
git commit -m "feat: add getRepo endpoint returning CAR file"
```

---

## Task 13: subscribeRepos WebSocket

**Files:**
- Modify: `src/pds.js`

**Step 1: Add WebSocket endpoint**

```javascript
if (url.pathname === '/xrpc/com.atproto.sync.subscribeRepos') {
  const upgradeHeader = request.headers.get('Upgrade')
  if (upgradeHeader !== 'websocket') {
    return new Response('expected websocket', { status: 426 })
  }

  const { 0: client, 1: server } = new WebSocketPair()
  this.state.acceptWebSocket(server)

  // Send backlog if cursor provided
  const cursor = url.searchParams.get('cursor')
  if (cursor) {
    const events = this.sql.exec(
      `SELECT * FROM seq_events WHERE seq > ? ORDER BY seq`,
      parseInt(cursor)
    ).toArray()

    for (const evt of events) {
      server.send(this.formatEvent(evt))
    }
  }

  return new Response(null, { status: 101, webSocket: client })
}
```

**Step 2: Add event formatting and WebSocket handlers**

Add to PersonalDataServer class:

```javascript
formatEvent(evt) {
  // AT Protocol frame format: CBOR header + CBOR body, concatenated
  const header = cborEncode({ op: 1, t: '#commit' })
  const body = cborEncode({
    seq: evt.seq,
    rebase: false,
    tooBig: false,
    repo: evt.did,
    commit: cidToBytes(evt.commit_cid),
    rev: createTid(), // Simplified - a real impl would store and reuse the commit's rev
    since: null,
    blocks: new Uint8Array(0), // Simplified - a real impl includes a CAR slice
    ops: cborDecode(new Uint8Array(evt.evt)).ops, // BLOB comes back as ArrayBuffer
    blobs: [],
    time: new Date().toISOString()
  })

  // Concatenate header + body
  const frame = new Uint8Array(header.length + body.length)
  frame.set(header)
  frame.set(body, header.length)
  return frame
}

async webSocketMessage(ws, message) {
  // Handle ping
  if (message === 'ping') ws.send('pong')
}

async webSocketClose(ws, code, reason) {
  // Durable Object will hibernate when no connections remain
}

broadcastEvent(evt) {
  const frame = this.formatEvent(evt)
  for (const ws of this.state.getWebSockets()) {
    try {
      ws.send(frame)
    } catch (e) {
      // Client disconnected
    }
  }
}
```

**Step 3: Update createRecord to broadcast**

Add at the end of the createRecord method, before the return:

```javascript
// Broadcast to subscribers (toArray()[0]: one() throws on zero rows)
const evtRow = this.sql.exec(
  `SELECT * FROM seq_events ORDER BY seq DESC LIMIT 1`
).toArray()[0]
if (evtRow) {
  this.broadcastEvent(evtRow)
}
```

**Step 4: Verify WebSocket works**

Run: `npx wrangler dev`

Use websocat or similar:

```bash
websocat "ws://localhost:8787/xrpc/com.atproto.sync.subscribeRepos?did=did:plc:test123"
```

In another terminal, create a record; you should see bytes appear on the WebSocket connection.

**Step 5: Commit**

```bash
git add src/pds.js
git commit -m "feat: add subscribeRepos WebSocket endpoint"
```

---

## Task 14: Clean Up Test Endpoints

**Files:**
- Modify: `src/pds.js`

**Step 1: Remove test endpoints**

Remove all `/test/*` endpoint handlers from the fetch method. Keep only:
- `/init`
- `/status`
- `/xrpc/com.atproto.repo.createRecord`
- `/xrpc/com.atproto.repo.getRecord`
- `/xrpc/com.atproto.sync.getRepo`
- `/xrpc/com.atproto.sync.subscribeRepos`

**Step 2: Add proper 404 handler**

```javascript
return Response.json({ error: 'not found' }, { status: 404 })
```

**Step 3: Commit**

```bash
git add src/pds.js
git commit -m "chore: remove test endpoints, clean up routing"
```

---

## Task 15: Deploy and Test

**Step 1: Deploy to Cloudflare**

```bash
npx wrangler deploy
```

**Step 2: Initialize with a real DID**

Generate a P-256 keypair and create a did:plc (or use an existing one).

```bash
# Example initialization
curl -X POST "https://atproto-pds.<your-subdomain>.workers.dev/init?did=did:plc:yourActualDid" \
  -H "Content-Type: application/json" \
  -d '{"did":"did:plc:yourActualDid","privateKey":"your64CharHexPrivateKey"}'
```

**Step 3: Create a test post**

```bash
curl -X POST "https://atproto-pds.<your-subdomain>.workers.dev/xrpc/com.atproto.repo.createRecord?did=did:plc:yourActualDid" \
  -H "Content-Type: application/json" \
  -d '{"collection":"app.bsky.feed.post","record":{"$type":"app.bsky.feed.post","text":"Hello from Cloudflare PDS!","createdAt":"2026-01-04T12:00:00.000Z"}}'
```

**Step 4: Verify repo is accessible**

```bash
curl "https://atproto-pds.<your-subdomain>.workers.dev/xrpc/com.atproto.sync.getRepo?did=did:plc:yourActualDid" -o test.car
```

**Step 5: Commit deployment config if needed**

```bash
git add -A
git commit -m "chore: ready for deployment"
```

---

## Summary

**Total Lines:** ~400 in a single file
**Dependencies:** Zero
**Endpoints:** 4 XRPC + 2 internal

**What works:**
- Create records with proper CIDs
- MST for repo structure
- P-256 signed commits
- CAR file export for relays
- WebSocket streaming for real-time sync

**What's next (future tasks):**
- Incremental MST updates
- OAuth/JWT authentication
- Blob storage (R2)
- Handle resolution
- DID:PLC registration helper