# Blob Support Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Add blob (image/video) upload, storage, and retrieval to the PDS using Cloudflare R2.

**Architecture:** Blobs are stored in an R2 bucket keyed by `{did}/{cid}`. Metadata is tracked in SQLite tables (`blob`, `record_blob`) within each Durable Object. Orphaned blobs are cleaned up via a Durable Object alarm. MIME types are sniffed from magic bytes so the stored content type cannot be spoofed by the `Content-Type` header.

**Tech Stack:** Cloudflare R2, Durable Object SQLite, Web Crypto API (SHA-256 for CID generation)

---

## Task 1: Add R2 Bucket Binding

**Files:**
- Modify: `wrangler.toml`

**Step 1: Add R2 binding to wrangler.toml**

Add after the existing migrations section:

```toml
[[r2_buckets]]
binding = "BLOBS"
bucket_name = "pds-blobs"
```

**Step 2: Create R2 bucket (if not exists)**

Run: `npx wrangler r2 bucket create pds-blobs`

**Step 3: Commit**

```bash
git add wrangler.toml
git commit -m "feat: add R2 bucket binding for blob storage"
```

---

## Task 2: Add Blob Database Schema

**Files:**
- Modify: `src/pds.js:1162-1190` (constructor schema initialization)

**Step 1: Add blob and record_blob tables**

In the `PersonalDataServer` constructor, after the existing `CREATE TABLE` statements (around line 1186), add:

```javascript
  CREATE TABLE IF NOT EXISTS blob (
    cid TEXT PRIMARY KEY,
    mimeType TEXT NOT NULL,
    size INTEGER NOT NULL,
    createdAt TEXT NOT NULL
  );

  CREATE TABLE IF NOT EXISTS record_blob (
    blobCid TEXT NOT NULL,
    recordUri TEXT NOT NULL,
    PRIMARY KEY (blobCid, recordUri)
  );
```

**Step 2: Verify schema creation**

Deploy; the `CREATE TABLE IF NOT EXISTS` statements run the next time each Durable Object is constructed, so the new tables appear on first request:

```bash
npx wrangler deploy
```

**Step 3: Commit**

```bash
git add src/pds.js
git commit -m "feat: add blob and record_blob tables to schema"
```

---

## Task 3: Implement MIME Type Sniffing

**Files:**
- Modify: `src/pds.js` (add after error helper, around line 30)
- Test: `test/pds.test.js`

**Step 1: Write the failing test**

Add to `test/pds.test.js`:

```javascript
import {
  // ... existing imports ...
  sniffMimeType,
} from '../src/pds.js';

describe('MIME Type Sniffing', () => {
  test('detects JPEG', () => {
    const bytes = new Uint8Array([0xFF, 0xD8, 0xFF, 0xE0, 0x00, 0x10]);
    assert.strictEqual(sniffMimeType(bytes), 'image/jpeg');
  });

  test('detects PNG', () => {
    const bytes = new Uint8Array([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A]);
    assert.strictEqual(sniffMimeType(bytes), 'image/png');
  });

  test('detects GIF', () => {
    const bytes = new Uint8Array([0x47, 0x49, 0x46, 0x38, 0x39, 0x61]);
    assert.strictEqual(sniffMimeType(bytes), 'image/gif');
  });

  test('detects WebP', () => {
    const bytes = new Uint8Array([
      0x52, 0x49, 0x46, 0x46, // RIFF
      0x00, 0x00, 0x00, 0x00, // size (ignored)
      0x57, 0x45, 0x42, 0x50, // WEBP
    ]);
    assert.strictEqual(sniffMimeType(bytes), 'image/webp');
  });

  test('detects MP4', () => {
    const bytes = new Uint8Array([
      0x00, 0x00, 0x00, 0x18, // size
      0x66, 0x74, 0x79, 0x70, // ftyp
    ]);
    assert.strictEqual(sniffMimeType(bytes), 'video/mp4');
  });

  test('returns null for unknown', () => {
    const bytes = new Uint8Array([0x00, 0x01, 0x02, 0x03]);
    assert.strictEqual(sniffMimeType(bytes), null);
  });
});
```

**Step 2: Run test to verify it fails**

Run: `npm test`
Expected: FAIL at import time, because `src/pds.js` does not yet export `sniffMimeType`

**Step 3: Write minimal implementation**

Add to `src/pds.js` after the error helper (around line 30):

```javascript
// === MIME TYPE SNIFFING ===
// Detect file type from magic bytes (first 12 bytes)

/**
 * Sniff MIME type from file magic bytes
 * @param {Uint8Array|ArrayBuffer} bytes - File bytes (only first 12 needed)
 * @returns {string|null} Detected MIME type or null if unknown
 */
export function sniffMimeType(bytes) {
  const arr = new Uint8Array(bytes.slice(0, 12));

  // JPEG: FF D8 FF
  if (arr[0] === 0xff && arr[1] === 0xd8 && arr[2] === 0xff) {
    return 'image/jpeg';
  }

  // PNG: 89 50 4E 47 0D 0A 1A 0A
  if (
    arr[0] === 0x89 &&
    arr[1] === 0x50 &&
    arr[2] === 0x4e &&
    arr[3] === 0x47 &&
    arr[4] === 0x0d &&
    arr[5] === 0x0a &&
    arr[6] === 0x1a &&
    arr[7] === 0x0a
  ) {
    return 'image/png';
  }

  // GIF: 47 49 46 38 (GIF8)
  if (
    arr[0] === 0x47 &&
    arr[1] === 0x49 &&
    arr[2] === 0x46 &&
    arr[3] === 0x38
  ) {
    return 'image/gif';
  }

  // WebP: RIFF....WEBP
  if (
    arr[0] === 0x52 &&
    arr[1] === 0x49 &&
    arr[2] === 0x46 &&
    arr[3] === 0x46 &&
    arr[8] === 0x57 &&
    arr[9] === 0x45 &&
    arr[10] === 0x42 &&
    arr[11] === 0x50
  ) {
    return 'image/webp';
  }

  // MP4/MOV: ....ftyp at byte 4
  if (
    arr[4] === 0x66 &&
    arr[5] === 0x74 &&
    arr[6] === 0x79 &&
    arr[7] === 0x70
  ) {
    return 'video/mp4';
  }

  return null;
}
```

**Step 4: Run test to verify it passes**

Run: `npm test`
Expected: PASS

**Step 5: Commit**

```bash
git add src/pds.js test/pds.test.js
git commit -m "feat: add MIME type sniffing from magic bytes"
```

---

## Task 4: Implement Blob Ref Detection

**Files:**
- Modify: `src/pds.js` (add after sniffMimeType)
- Test: `test/pds.test.js`

**Step 1: Write the failing test**

Add to `test/pds.test.js`:

```javascript
import {
  // ... existing imports ...
  findBlobRefs,
} from '../src/pds.js';

describe('Blob Ref Detection', () => {
  test('finds blob ref in simple object', () => {
    const record = {
      $type: 'app.bsky.feed.post',
      text: 'Hello',
      embed: {
        $type: 'app.bsky.embed.images',
        images: [
          {
            image: {
              $type: 'blob',
              ref: { $link: 'bafkreiabc123' },
              mimeType: 'image/jpeg',
              size: 1234,
            },
            alt: 'test image',
          },
        ],
      },
    };
    const refs = findBlobRefs(record);
    assert.deepStrictEqual(refs, ['bafkreiabc123']);
  });

  test('finds multiple blob refs', () => {
    const record = {
      images: [
        { image: { $type: 'blob', ref: { $link: 'cid1' }, mimeType: 'image/png', size: 100 } },
        { image: { $type: 'blob', ref: { $link: 'cid2' }, mimeType: 'image/png', size: 200 } },
      ],
    };
    const refs = findBlobRefs(record);
    assert.deepStrictEqual(refs, ['cid1', 'cid2']);
  });

  test('returns empty array when no blobs', () => {
    const record = { text: 'Hello world', count: 42 };
    const refs = findBlobRefs(record);
    assert.deepStrictEqual(refs, []);
  });

  test('handles null and primitives', () => {
    assert.deepStrictEqual(findBlobRefs(null), []);
    assert.deepStrictEqual(findBlobRefs('string'), []);
    assert.deepStrictEqual(findBlobRefs(42), []);
  });
});
```

**Step 2: Run test to verify it fails**

Run: `npm test`
Expected: FAIL at import time, because `src/pds.js` does not yet export `findBlobRefs`

**Step 3: Write minimal implementation**

Add to `src/pds.js` after sniffMimeType:

```javascript
// === BLOB REF DETECTION ===
// Recursively find blob references in records

/**
 * Find all blob CID references in a record
 * @param {*} obj - Record value to scan
 * @param {string[]} refs - Accumulator array (internal)
 * @returns {string[]} Array of blob CID strings
 */
export function findBlobRefs(obj, refs = []) {
  if (!obj || typeof obj !== 'object') {
    return refs;
  }

  // Check if this object is a blob ref
  if (obj.$type === 'blob' && obj.ref?.$link) {
    refs.push(obj.ref.$link);
  }

  // Recurse into arrays and objects
  if (Array.isArray(obj)) {
    for (const item of obj) {
      findBlobRefs(item, refs);
    }
  } else {
    for (const value of Object.values(obj)) {
      findBlobRefs(value, refs);
    }
  }

  return refs;
}
```

**Step 4: Run test to verify it passes**

Run: `npm test`
Expected: PASS

**Step 5: Commit**

```bash
git add src/pds.js test/pds.test.js
git commit -m "feat: add blob ref detection for records"
```

---

## Task 5: Implement uploadBlob Endpoint

**Files:**
- Modify: `src/pds.js` (add route and handler)

**Step 1: Add route to pdsRoutes**

In the `pdsRoutes` object (around line 1055), add:

```javascript
  '/xrpc/com.atproto.repo.uploadBlob': {
    method: 'POST',
    handler: (pds, req, _url) => pds.handleUploadBlob(req),
  },
```

**Step 2: Add handler method to PersonalDataServer class**

Add the method to the class (after the existing handlers):

```javascript
  async handleUploadBlob(request) {
    // Require auth
    const authResult = await this.requireAuth(request);
    if (authResult instanceof Response) return authResult;

    const did = await this.getDid();
    if (!did) {
      return errorResponse('InvalidRequest', 'PDS not initialized', 400);
    }

    // Read body as ArrayBuffer
    const bodyBytes = await request.arrayBuffer();
    const size = bodyBytes.byteLength;

    // Check size limit (50MB)
    const MAX_BLOB_SIZE = 50 * 1024 * 1024;
    if (size > MAX_BLOB_SIZE) {
      return errorResponse(
        'BlobTooLarge',
        `Blob size ${size} exceeds maximum ${MAX_BLOB_SIZE}`,
        400,
      );
    }

    // Sniff MIME type, fall back to Content-Type header
    const contentType = request.headers.get('Content-Type') || 'application/octet-stream';
    const sniffed = sniffMimeType(bodyBytes);
    const mimeType = sniffed || contentType;

    // Compute CID (reuse existing createCid)
    const cid = await createCid(new Uint8Array(bodyBytes));
    const cidStr = cidToString(cid);

    // Check if blob already exists
    const existing = this.sql
      .exec('SELECT cid FROM blob WHERE cid = ?', cidStr)
      .toArray();

    if (existing.length === 0) {
      // Upload to R2
      const r2Key = `${did}/${cidStr}`;
      await this.env.BLOBS.put(r2Key, bodyBytes, {
        httpMetadata: { contentType: mimeType },
      });

      // Insert metadata
      const createdAt = new Date().toISOString();
      this.sql.exec(
        'INSERT INTO blob (cid, mimeType, size, createdAt) VALUES (?, ?, ?, ?)',
        cidStr,
        mimeType,
        size,
        createdAt,
      );
    }

    // Return BlobRef
    return Response.json({
      blob: {
        $type: 'blob',
        ref: { $link: cidStr },
        mimeType,
        size,
      },
    });
  }
```

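The handler above reuses the codebase's existing `createCid`/`cidToString` helpers. For orientation only, here is a hypothetical standalone sketch of the format those helpers are assumed to produce — a CIDv1 with the `raw` codec and a SHA-256 multihash, multibase base32-encoded. `createRawCid` and `base32Encode` are illustrative names, not functions from the codebase:

```javascript
// Sketch (assumption): CIDv1 bytes = 0x01 (version) + 0x55 (raw codec)
// + 0x12 0x20 (sha2-256 multihash code, 32-byte length) + digest,
// then multibase base32 lowercase with prefix 'b'.
const BASE32 = 'abcdefghijklmnopqrstuvwxyz234567';

// RFC 4648 base32, lowercase, no padding
function base32Encode(bytes) {
  let bits = 0;
  let value = 0;
  let out = '';
  for (const byte of bytes) {
    value = (value << 8) | byte;
    bits += 8;
    while (bits >= 5) {
      out += BASE32[(value >>> (bits - 5)) & 31];
      bits -= 5;
    }
  }
  if (bits > 0) out += BASE32[(value << (5 - bits)) & 31];
  return out;
}

async function createRawCid(bytes) {
  // crypto.subtle is global in Workers; in Node the same API lives on node:crypto's webcrypto
  const subtle = globalThis.crypto?.subtle ?? (await import('node:crypto')).webcrypto.subtle;
  const digest = new Uint8Array(await subtle.digest('SHA-256', bytes));
  const cidBytes = new Uint8Array([0x01, 0x55, 0x12, 0x20, ...digest]);
  return 'b' + base32Encode(cidBytes);
}
```

With this format, every raw SHA-256 CID string starts with the familiar `bafkrei` prefix; if the project's `createCid` produces something else (e.g. dag-cbor codec), adjust the codec byte accordingly.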
**Step 3: Verify deployment**

Run: `npx wrangler deploy`

**Step 4: Test manually with curl**

```bash
curl -X POST \
  -H "Authorization: Bearer <access-token>" \
  -H "Content-Type: image/png" \
  --data-binary @test-image.png \
  https://your-pds.workers.dev/xrpc/com.atproto.repo.uploadBlob
```

Expected: a JSON response of the form `{"blob": {"$type": "blob", "ref": {"$link": "bafkrei..."}, "mimeType": "image/png", "size": ...}}`

**Step 5: Commit**

```bash
git add src/pds.js
git commit -m "feat: implement uploadBlob endpoint with R2 storage"
```

---

## Task 6: Implement getBlob Endpoint

**Files:**
- Modify: `src/pds.js` (add route and handler)

**Step 1: Add route to pdsRoutes**

```javascript
  '/xrpc/com.atproto.sync.getBlob': {
    handler: (pds, _req, url) => pds.handleGetBlob(url),
  },
```

**Step 2: Add handler method**

```javascript
  async handleGetBlob(url) {
    const did = url.searchParams.get('did');
    const cid = url.searchParams.get('cid');

    if (!did || !cid) {
      return errorResponse('InvalidRequest', 'missing did or cid parameter', 400);
    }

    // Verify DID matches this DO
    const myDid = await this.getDid();
    if (did !== myDid) {
      return errorResponse('InvalidRequest', 'DID does not match this repo', 400);
    }

    // Look up blob metadata
    const rows = this.sql
      .exec('SELECT mimeType, size FROM blob WHERE cid = ?', cid)
      .toArray();

    if (rows.length === 0) {
      return errorResponse('BlobNotFound', 'blob not found', 404);
    }

    const { mimeType, size } = rows[0];

    // Fetch from R2
    const r2Key = `${did}/${cid}`;
    const object = await this.env.BLOBS.get(r2Key);

    if (!object) {
      return errorResponse('BlobNotFound', 'blob not found in storage', 404);
    }

    // Return blob with security headers
    return new Response(object.body, {
      headers: {
        'Content-Type': mimeType,
        'Content-Length': String(size),
        'X-Content-Type-Options': 'nosniff',
        'Content-Security-Policy': "default-src 'none'; sandbox",
        'Cache-Control': 'public, max-age=31536000, immutable',
      },
    });
  }
```

**Step 3: Deploy and test**

Run: `npx wrangler deploy`

Test:
```bash
curl "https://your-pds.workers.dev/xrpc/com.atproto.sync.getBlob?did=did:plc:xxx&cid=bafkrei..."
```

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: implement getBlob endpoint"
```

---

## Task 7: Implement listBlobs Endpoint

**Files:**
- Modify: `src/pds.js` (add route and handler)

**Step 1: Add route to pdsRoutes**

```javascript
  '/xrpc/com.atproto.sync.listBlobs': {
    handler: (pds, _req, url) => pds.handleListBlobs(url),
  },
```

**Step 2: Add handler method**

```javascript
  async handleListBlobs(url) {
    const did = url.searchParams.get('did');
    const cursor = url.searchParams.get('cursor');
    const limit = Math.min(Number(url.searchParams.get('limit')) || 500, 1000);

    if (!did) {
      return errorResponse('InvalidRequest', 'missing did parameter', 400);
    }

    // Verify DID matches this DO
    const myDid = await this.getDid();
    if (did !== myDid) {
      return errorResponse('InvalidRequest', 'DID does not match this repo', 400);
    }

    // Query blobs with pagination
    let query = 'SELECT cid, createdAt FROM blob';
    const params = [];

    if (cursor) {
      query += ' WHERE createdAt > ?';
      params.push(cursor);
    }

    query += ' ORDER BY createdAt ASC LIMIT ?';
    params.push(limit + 1); // Fetch one extra to detect if there's more

    const rows = this.sql.exec(query, ...params).toArray();

    // Determine if there's a next page
    let nextCursor = null;
    if (rows.length > limit) {
      rows.pop(); // Remove the extra row
      nextCursor = rows[rows.length - 1].createdAt;
    }

    return Response.json({
      cids: rows.map((r) => r.cid),
      cursor: nextCursor,
    });
  }
```

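The cursor logic above uses the common "fetch limit + 1" trick: the extra row's presence is what tells us to emit a cursor. A hypothetical pure-function version of the same logic, with an in-memory array standing in for the SQL result set:

```javascript
// In-memory illustration of handleListBlobs' pagination (illustrative only).
// `rows` plays the role of the blob table, already sorted ascending by createdAt.
function paginate(rows, limit, cursor = null) {
  const filtered = cursor ? rows.filter((r) => r.createdAt > cursor) : rows;
  const page = filtered.slice(0, limit + 1); // fetch one extra row
  let nextCursor = null;
  if (page.length > limit) {
    page.pop(); // the extra row's presence proves another page exists
    nextCursor = page[page.length - 1].createdAt;
  }
  return { cids: page.map((r) => r.cid), cursor: nextCursor };
}

const rows = [
  { cid: 'c1', createdAt: '2026-01-01T00:00:00Z' },
  { cid: 'c2', createdAt: '2026-01-02T00:00:00Z' },
  { cid: 'c3', createdAt: '2026-01-03T00:00:00Z' },
];
const page1 = paginate(rows, 2);               // cids: ['c1', 'c2'], cursor set
const page2 = paginate(rows, 2, page1.cursor); // cids: ['c3'], cursor: null
```

One caveat: cursoring on `createdAt` alone assumes timestamps are unique. If two blobs share a `createdAt` that falls on a page boundary, one can be skipped; adding `cid` as a tiebreaker in the ORDER BY and cursor closes that gap.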
**Step 3: Deploy and test**

Run: `npx wrangler deploy`

Test:
```bash
curl "https://your-pds.workers.dev/xrpc/com.atproto.sync.listBlobs?did=did:plc:xxx"
```

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: implement listBlobs endpoint"
```

---

## Task 8: Integrate Blob Association with createRecord

**Files:**
- Modify: `src/pds.js:1253` (createRecord method)

**Step 1: Add blob association after record storage**

In the `createRecord` method, after storing the record in the `records` table (around line 1280), add:

```javascript
  // Associate blobs with this record
  const blobRefs = findBlobRefs(record);
  for (const blobCid of blobRefs) {
    // Verify blob exists
    const blobExists = this.sql
      .exec('SELECT cid FROM blob WHERE cid = ?', blobCid)
      .toArray();

    if (blobExists.length === 0) {
      throw new Error(`BlobNotFound: ${blobCid}`);
    }

    // Create association
    this.sql.exec(
      'INSERT OR IGNORE INTO record_blob (blobCid, recordUri) VALUES (?, ?)',
      blobCid,
      uri,
    );
  }
```

**Step 2: Deploy and test**

Test by uploading a blob, then creating a post that references it:

```bash
# Upload blob
BLOB=$(curl -X POST -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: image/png" --data-binary @test.png \
  https://your-pds.workers.dev/xrpc/com.atproto.repo.uploadBlob)

echo "$BLOB"   # copy the CID from blob.ref.$link
# or, if jq is installed: CID=$(echo "$BLOB" | jq -r '.blob.ref."$link"')

# Create post with image
curl -X POST -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  https://your-pds.workers.dev/xrpc/com.atproto.repo.createRecord \
  -d '{
    "repo": "did:plc:xxx",
    "collection": "app.bsky.feed.post",
    "record": {
      "$type": "app.bsky.feed.post",
      "text": "Hello with image!",
      "createdAt": "2026-01-06T12:00:00.000Z",
      "embed": {
        "$type": "app.bsky.embed.images",
        "images": [{
          "image": {
            "$type": "blob",
            "ref": {"$link": "<cid-from-upload>"},
            "mimeType": "image/png",
            "size": 1234
          },
          "alt": "test"
        }]
      }
    }
  }'
```

**Step 3: Commit**

```bash
git add src/pds.js
git commit -m "feat: associate blobs with records on createRecord"
```

---

## Task 9: Implement Blob Cleanup on deleteRecord

**Files:**
- Modify: `src/pds.js:1391` (deleteRecord method)

**Step 1: Add blob cleanup after record deletion**

In the `deleteRecord` method, after deleting the record from the `records` table, add:

```javascript
  // Get blobs associated with this record
  const associatedBlobs = this.sql
    .exec('SELECT blobCid FROM record_blob WHERE recordUri = ?', uri)
    .toArray();

  // Remove associations for this record
  this.sql.exec('DELETE FROM record_blob WHERE recordUri = ?', uri);

  // Check each blob for orphan status and delete if unreferenced
  const did = await this.getDid();
  for (const { blobCid } of associatedBlobs) {
    const stillReferenced = this.sql
      .exec('SELECT 1 FROM record_blob WHERE blobCid = ? LIMIT 1', blobCid)
      .toArray();

    if (stillReferenced.length === 0) {
      // Blob is orphaned, delete from R2 and database
      await this.env.BLOBS.delete(`${did}/${blobCid}`);
      this.sql.exec('DELETE FROM blob WHERE cid = ?', blobCid);
    }
  }
```

**Step 2: Deploy and test**

Test by creating a post with an image, then deleting it:

```bash
# Delete the post
curl -X POST -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  https://your-pds.workers.dev/xrpc/com.atproto.repo.deleteRecord \
  -d '{
    "repo": "did:plc:xxx",
    "collection": "app.bsky.feed.post",
    "rkey": "<rkey>"
  }'

# Verify the blob is gone
curl "https://your-pds.workers.dev/xrpc/com.atproto.sync.listBlobs?did=did:plc:xxx"
```

**Step 3: Commit**

```bash
git add src/pds.js
git commit -m "feat: cleanup orphaned blobs on record deletion"
```

---

## Task 10: Implement Orphan Cleanup Alarm

**Files:**
- Modify: `src/pds.js` (add alarm handler and scheduling)

**Step 1: Add alarm scheduling in initIdentity**

In the `initIdentity` method (or after successful init), add:

```javascript
  // Schedule blob cleanup alarm (runs daily)
  const currentAlarm = await this.state.storage.getAlarm();
  if (!currentAlarm) {
    await this.state.storage.setAlarm(Date.now() + 24 * 60 * 60 * 1000);
  }
```

**Step 2: Add alarm handler to PersonalDataServer class**

```javascript
  async alarm() {
    await this.cleanupOrphanedBlobs();
    // Reschedule for next day
    await this.state.storage.setAlarm(Date.now() + 24 * 60 * 60 * 1000);
  }

  async cleanupOrphanedBlobs() {
    const did = await this.getDid();
    if (!did) return;

    // Find orphans: blobs not in record_blob, older than 24h
    const cutoff = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString();

    const orphans = this.sql
      .exec(
        `SELECT b.cid FROM blob b
         LEFT JOIN record_blob rb ON b.cid = rb.blobCid
         WHERE rb.blobCid IS NULL AND b.createdAt < ?`,
        cutoff,
      )
      .toArray();

    for (const { cid } of orphans) {
      await this.env.BLOBS.delete(`${did}/${cid}`);
      this.sql.exec('DELETE FROM blob WHERE cid = ?', cid);
    }

    if (orphans.length > 0) {
      console.log(`Cleaned up ${orphans.length} orphaned blobs`);
    }
  }
```

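The LEFT JOIN selects exactly the blobs with no surviving `record_blob` row that also predate the cutoff; the age check gives in-flight uploads time to be attached to a record before they become eligible for deletion. The same rule, sketched as a hypothetical pure function over in-memory rows:

```javascript
// In-memory illustration of the orphan query in cleanupOrphanedBlobs (illustrative only):
// a blob is an orphan if no record references it AND it is older than the cutoff.
function findOrphans(blobs, recordBlobs, cutoff) {
  const referenced = new Set(recordBlobs.map((rb) => rb.blobCid));
  return blobs
    .filter((b) => !referenced.has(b.cid) && b.createdAt < cutoff)
    .map((b) => b.cid);
}

const blobs = [
  { cid: 'old-unused', createdAt: '2026-01-01T00:00:00Z' },
  { cid: 'old-used', createdAt: '2026-01-01T00:00:00Z' },
  { cid: 'new-unused', createdAt: '2026-01-08T00:00:00Z' }, // fresh upload, not yet attached
];
const recordBlobs = [{ blobCid: 'old-used', recordUri: 'at://did:plc:xxx/app.bsky.feed.post/1' }];

const orphans = findOrphans(blobs, recordBlobs, '2026-01-07T00:00:00Z'); // ['old-unused']
```

Note that comparing ISO-8601 strings lexicographically is equivalent to comparing the timestamps, which is why both the SQL and this sketch can use plain `<` on `createdAt`.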
**Step 3: Deploy**

Run: `npx wrangler deploy`

**Step 4: Commit**

```bash
git add src/pds.js
git commit -m "feat: add DO alarm for orphaned blob cleanup"
```

---

## Task 11: Update README

**Files:**
- Modify: `README.md`

**Step 1: Update feature checklist**

Change:
```markdown
- [ ] Blob storage (uploadBlob, getBlob, listBlobs)
```

To:
```markdown
- [x] Blob storage (uploadBlob, getBlob, listBlobs)
```

**Step 2: Add blob configuration section**

Add under configuration:

```markdown
### Blob Storage

Blobs (images, videos) are stored in Cloudflare R2:

1. Create an R2 bucket: `npx wrangler r2 bucket create pds-blobs`
2. The binding is already configured in `wrangler.toml`

- Supported formats: JPEG, PNG, GIF, WebP, MP4
- Max size: 50MB
- Orphaned blobs are automatically cleaned up after 24 hours.
```

**Step 3: Commit**

```bash
git add README.md
git commit -m "docs: update README with blob storage feature"
```

---

## Summary

| Task | Description | Files Modified |
|------|-------------|----------------|
| 1 | Add R2 bucket binding | `wrangler.toml` |
| 2 | Add blob database schema | `src/pds.js` |
| 3 | Implement MIME sniffing | `src/pds.js`, `test/pds.test.js` |
| 4 | Implement blob ref detection | `src/pds.js`, `test/pds.test.js` |
| 5 | Implement uploadBlob endpoint | `src/pds.js` |
| 6 | Implement getBlob endpoint | `src/pds.js` |
| 7 | Implement listBlobs endpoint | `src/pds.js` |
| 8 | Integrate blob association | `src/pds.js` |
| 9 | Clean up blobs on delete | `src/pds.js` |
| 10 | Add orphan cleanup alarm | `src/pds.js` |
| 11 | Update README | `README.md` |

**Estimated additions:** ~250 lines to `src/pds.js`, ~60 lines to `test/pds.test.js`