# Malachite

Import your Last.fm and Spotify listening history to the AT Protocol network using the `fm.teal.alpha.feed.play` lexicon.

**Repository:** [atproto-lastfm-importer](https://github.com/ewanc26/atproto-lastfm-importer)
[Also available on Tangled](https://tangled.org/@did:plc:ofrbh253gwicbkc5nktqepol/atproto-lastfm-importer)

## ⚠️ Important: Rate Limits

**CRITICAL**: Bluesky's AppView applies rate limits at the PDS level. Exceeding 10K records per day can rate limit your **ENTIRE PDS**, affecting all users on your instance.

This importer automatically protects your PDS by:
- Limiting imports to **7,500 records per day** (75% of the 10K limit, leaving a safety margin)
- Calculating optimal batch sizes and delays
- Automatically waiting for rate limit resets when limits are hit
- Providing clear progress tracking and time estimates

For more details, see the [Bluesky Rate Limits Documentation](https://docs.bsky.app/blog/rate-limits-pds-v3).
19
20## What’s with the name?
21
22It used to be called `atproto-lastfm-importer` — generic as fuck. That name told you what it did and nothing about why it mattered, and it sounded like a disposable weekend script. So I renamed it.
23
24At the moment, the repository is still called `atproto-lastfm-importer` to avoid link rot. That will probably change to `malachite` later once things settle.
25
26**Malachite** is a greenish-blue copper mineral associated with preservation and transformation. That’s exactly what this tool does: it preserves your scrobbles and transforms them into proper `fm.teal.alpha.feed.play` records on the AT Protocol. The colour match isn’t an accident — malachite sits squarely in the teal/green range, a deliberate nod to the `teal` lexicon it publishes to.
27
## Quick Start

**Note:** You must build the project first, then run with arguments.

```bash
# Install dependencies and build
pnpm install
pnpm build

# Show help
pnpm start -- --help

# Run with command line arguments
pnpm start -- -i lastfm.csv -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -y

# Alternative: run directly with node (no -- needed)
node dist/index.js -i lastfm.csv -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -y
```

## Features

### Import Capabilities
- ✅ **Last.fm Import**: Full support for Last.fm CSV exports with MusicBrainz IDs
- ✅ **Spotify Import**: Import Extended Streaming History JSON files
- ✅ **Combined Import**: Merge Last.fm and Spotify exports with intelligent deduplication
- ✅ **Re-Sync Mode**: Import only new scrobbles without creating duplicates
- ✅ **Duplicate Removal**: Clean up accidentally imported duplicate records

### Performance & Safety
- ✅ **Automatic Duplicate Prevention**: Checks Teal and skips records that already exist (no duplicates!)
- ✅ **Input Deduplication**: Removes duplicate entries within the source file before submission
- ✅ **Batch Operations**: Uses `com.atproto.repo.applyWrites` for efficient batch publishing (up to 200 records per call)
- ✅ **Automatic Rate Limiting**: Daily limits protect your PDS, with automatic waits when hourly/daily limits are reached
- ✅ **Multi-Day Imports**: Large imports automatically span multiple days
- ✅ **Resume Support**: Safe to stop (Ctrl+C) and restart; continues from where it left off
- ✅ **Graceful Cancellation**: Press Ctrl+C to stop after the current batch completes

### User Experience
- ✅ **Structured Logging**: Color-coded output with debug/verbose modes
- ✅ **Progress Tracking**: Real-time progress with time estimates
- ✅ **Dry Run Mode**: Preview records without publishing
- ✅ **Interactive Mode**: Simple prompts guide you through the process
- ✅ **Command Line Mode**: Full automation support for scripting

### Technical Features
- ✅ **TID-based Record Keys**: Timestamp-based identifiers for chronological ordering
- ✅ **Identity Resolution**: Resolves ATProto handles/DIDs using Slingshot
- ✅ **PDS Auto-Discovery**: Automatically connects to your personal PDS
- ✅ **MusicBrainz Support**: Preserves MusicBrainz IDs when available (Last.fm)
- ✅ **Chronological Ordering**: Processes oldest first (or newest with `-r` flag)
- ✅ **Error Handling**: Continues on errors with detailed reporting

## Usage Examples

### Combined Import (Last.fm + Spotify)

Merge your Last.fm and Spotify listening history into a single, deduplicated import:

```bash
# Preview the merged import
pnpm start -- -i lastfm.csv --spotify-input spotify-export/ -m combined --dry-run

# Perform the combined import
pnpm start -- -i lastfm.csv --spotify-input spotify-export/ -m combined -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -y
```

**What combined mode does:**
1. Parses both Last.fm CSV and Spotify JSON exports
2. Normalizes track names and artist names for comparison
3. Identifies duplicate plays (same track within 5 minutes)
4. Chooses the best version of each play (prefers Last.fm with MusicBrainz IDs)
5. Merges into a single chronological timeline
6. Shows detailed statistics about the merge

**Example output:**
```
📊 Merge Statistics
═══════════════════════════════════════════
Last.fm records:     15,234
Spotify records:      8,567
Total before merge:  23,801

Duplicates removed:   3,421
Last.fm unique:      11,813
Spotify unique:       5,146

Final merged total:  16,959

Date range:
  First: 2015-03-15 10:23:45
  Last:  2025-01-07 14:32:11
═══════════════════════════════════════════
```

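The duplicate matching described above can be sketched as follows. This is an illustrative TypeScript outline, not the importer's actual internals: the interface, function names, and the normalization details are assumptions; only the 5-minute window and the "prefer Last.fm with MusicBrainz IDs" rule come from the description above.

```typescript
// Sketch: two plays count as duplicates when normalized track and artist
// match and the timestamps fall within a 5-minute window.
interface Play {
  trackName: string;
  artistName: string;
  playedTime: string; // ISO 8601
  hasMbIds: boolean;  // true for Last.fm entries carrying MusicBrainz IDs
}

const FIVE_MINUTES_MS = 5 * 60 * 1000;

function normalize(s: string): string {
  return s.trim().toLowerCase();
}

function isSamePlay(a: Play, b: Play): boolean {
  return (
    normalize(a.trackName) === normalize(b.trackName) &&
    normalize(a.artistName) === normalize(b.artistName) &&
    Math.abs(Date.parse(a.playedTime) - Date.parse(b.playedTime)) <= FIVE_MINUTES_MS
  );
}

// Prefer the Last.fm version when it carries MusicBrainz IDs
function pickBest(a: Play, b: Play): Play {
  return a.hasMbIds ? a : b;
}

const lastfm: Play = { trackName: "Don't Give Up", artistName: "Chicane", playedTime: "2021-09-09T10:34:08Z", hasMbIds: true };
const spotify: Play = { trackName: "don't give up ", artistName: "chicane", playedTime: "2021-09-09T10:36:30Z", hasMbIds: false };

console.log(isSamePlay(lastfm, spotify)); // true: same track, 2m 22s apart
console.log(pickBest(lastfm, spotify).hasMbIds); // true
```
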
### Re-Sync Mode

Sync your Last.fm export with Teal without creating duplicates:

```bash
# Preview what will be synced
pnpm start -- -i lastfm.csv -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -m sync --dry-run

# Perform the sync
pnpm start -- -i lastfm.csv -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -m sync -y
```

**Perfect for:**
- Re-running imports with updated Last.fm exports
- Recovering from interrupted imports
- Adding recent scrobbles without duplicating old ones

**Note:** Sync mode requires authentication even in dry-run mode to fetch existing records.

### Remove Duplicates

Clean up accidentally imported duplicate records:

```bash
# Preview duplicates (dry run)
pnpm start -- -m deduplicate -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx --dry-run

# Remove duplicates (keeps first occurrence)
pnpm start -- -m deduplicate -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx
```

### Import from Spotify

```bash
# Import a single Spotify JSON file
pnpm start -- -i Streaming_History_Audio_2021-2023_0.json -m spotify -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -y

# Import a directory with multiple Spotify files (recommended)
pnpm start -- -i '/path/to/Spotify Extended Streaming History' -m spotify -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -y
```

### Import from Last.fm

```bash
# Standard Last.fm import
pnpm start -- -i lastfm.csv -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -y

# Preview without publishing
pnpm start -- -i lastfm.csv --dry-run

# Process newest tracks first
pnpm start -- -i lastfm.csv -h alice.bsky.social -r -y

# Verbose debug output
pnpm start -- -i lastfm.csv --dry-run -v

# Quiet mode (only warnings and errors)
pnpm start -- -i lastfm.csv -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -q -y
```

### Advanced Options

```bash
# Custom batch settings (advanced users only)
pnpm start -- -i lastfm.csv -h alice.bsky.social -b 20 -d 3000

# Full automation with all flags
pnpm start -- -i lastfm.csv -h alice.bsky.social -p xxxx-xxxx-xxxx-xxxx -y -q
```

## Command Line Options

**Note:** When importing data (not in deduplicate mode), you must provide `--input`, `--handle`, and `--password`. The `--yes` flag skips confirmation prompts for automation.

### Required Options

| Option | Short | Description | Example |
|--------|-------|-------------|---------|
| `--input <path>` | `-i` | Path to Last.fm CSV or Spotify JSON file/directory | `-i lastfm.csv` |
| `--handle <handle>` | `-h` | ATProto handle or DID | `-h alice.bsky.social` |
| `--password <pass>` | `-p` | ATProto app password | `-p xxxx-xxxx-xxxx-xxxx` |

### Import Mode

| Option | Short | Description | Default |
|--------|-------|-------------|---------|
| `--mode <mode>` | `-m` | Import mode | `lastfm` |

**Available modes:**
- `lastfm` - Import Last.fm export only
- `spotify` - Import Spotify export only
- `combined` - Merge Last.fm + Spotify exports
- `sync` - Skip existing records (sync mode)
- `deduplicate` - Remove duplicate records

### Additional Options

| Option | Short | Description | Default |
|--------|-------|-------------|---------|
| `--spotify-input <path>` | | Path to Spotify export (for combined mode) | - |
| `--reverse` | `-r` | Process newest first | `false` |
| `--yes` | `-y` | Skip confirmation prompts | `false` |
| `--dry-run` | | Preview without importing | `false` |
| `--verbose` | `-v` | Enable debug logging | `false` |
| `--quiet` | `-q` | Suppress non-essential output | `false` |
| `--batch-size <num>` | `-b` | Records per batch (1-200) | Auto-calculated |
| `--batch-delay <ms>` | `-d` | Delay between batches in ms | `500` (min) |
| `--help` | | Show help message | - |

### Legacy Flags (Backwards Compatible)

These old flags still work but are deprecated:
- `--file` → Use `--input`
- `--identifier` → Use `--handle`
- `--spotify-file` → Use `--spotify-input`
- `--reverse-chronological` → Use `--reverse`
- `--spotify` → Use `--mode spotify`
- `--combined` → Use `--mode combined`
- `--sync` → Use `--mode sync`
- `--remove-duplicates` → Use `--mode deduplicate`

## Getting Your Data

### Last.fm Export

1. Visit the [Last.fm Export Tool](https://lastfm.ghan.nl/export/)
2. Request your data export in CSV format
3. Download the CSV file when ready
4. Use the CSV file path with this importer

### Spotify Export

1. Go to [Spotify Privacy Settings](https://www.spotify.com/account/privacy/)
2. Scroll to "Download your data" and request your data
3. Select "Extended streaming history" (delivery can take up to 30 days)
4. When ready, download and extract the ZIP file
5. Use either:
   - A single JSON file: `Streaming_History_Audio_2021-2023_0.json`
   - The entire extracted directory (recommended)

**Note:** The importer automatically:
- Reads all `Streaming_History_Audio_*.json` files in a directory
- Filters out podcasts, audiobooks, and non-music content
- Combines all music tracks into a single import

## Data Format

Each scrobble becomes an `fm.teal.alpha.feed.play` record with:

### Required Fields
- **trackName**: The name of the track
- **artists**: Array of artist objects (requires `artistName`; `artistMbId` optional, Last.fm only)
- **playedTime**: ISO 8601 timestamp of when you listened
- **submissionClientAgent**: Identifies this importer (`malachite/v0.6.2`)
- **musicServiceBaseDomain**: Set to `last.fm` or `spotify.com`

### Optional Fields
- **releaseName**: Album/release name
- **releaseMbId**: MusicBrainz release ID (Last.fm only)
- **recordingMbId**: MusicBrainz recording/track ID (Last.fm only)
- **originUrl**: Link to the track on Last.fm or Spotify

### Example Records

**Last.fm Record:**
```json
{
  "$type": "fm.teal.alpha.feed.play",
  "trackName": "Paint My Masterpiece",
  "artists": [
    {
      "artistName": "Cjbeards",
      "artistMbId": "c8d4f4bf-1b82-4d4d-9d73-05909faaff89"
    }
  ],
  "releaseName": "Masquerade",
  "releaseMbId": "fdb2397b-78d5-4019-8fad-656d286e4d33",
  "recordingMbId": "3a390ad3-fe56-45f2-a073-bebc45d6bde1",
  "playedTime": "2025-11-13T23:49:36Z",
  "originUrl": "https://www.last.fm/music/Cjbeards/_/Paint+My+Masterpiece",
  "submissionClientAgent": "malachite/v0.6.2",
  "musicServiceBaseDomain": "last.fm"
}
```

**Spotify Record:**
```json
{
  "$type": "fm.teal.alpha.feed.play",
  "trackName": "Don't Give Up",
  "artists": [
    {
      "artistName": "Chicane"
    }
  ],
  "releaseName": "Twenty",
  "playedTime": "2021-09-09T10:34:08Z",
  "originUrl": "https://open.spotify.com/track/3gZqDJkMZipOYCRjlHWgOV",
  "submissionClientAgent": "malachite/v0.6.2",
  "musicServiceBaseDomain": "spotify.com"
}
```

## How It Works

### Processing Flow

1. **Parses input file(s)**:
   - Last.fm: CSV using the `csv-parse` library
   - Spotify: JSON files (single or multiple in directory)
2. **Filters data**:
   - Spotify: Automatically removes podcasts, audiobooks, and non-music content
3. **Converts to schema**: Maps to `fm.teal.alpha.feed.play` format
4. **Deduplicates input**: Removes duplicate entries from the source data (keeps first occurrence)
5. **Checks Teal**: Fetches existing records and skips any that are already imported (prevents duplicates)
6. **Sorts records**: Chronologically (oldest first) or reverse with the `-r` flag
7. **Generates TID-based keys**: From `playedTime` for chronological ordering
8. **Validates fields**: Ensures required fields are present
9. **Publishes in batches**: Uses `com.atproto.repo.applyWrites` (up to 200 records per call)
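
Step 7 above can be sketched in TypeScript. This follows the published AT Protocol TID format (a 64-bit value of 53 bits of microseconds since the UNIX epoch plus 10 bits of clock ID, encoded as 13 base32-sortable characters); the importer's real `tid.ts` may differ in details, and the function name here is illustrative.

```typescript
// Sketch: derive a sortable TID record key from a playedTime timestamp.
// base32-sortable alphabet used by AT Protocol TIDs (ASCII-ordered).
const B32_SORTABLE = "234567abcdefghijklmnopqrstuvwxyz";

function tidFromPlayedTime(playedTime: string, clockId = 0): string {
  const micros = BigInt(Date.parse(playedTime)) * 1000n; // ms -> µs
  let value = (micros << 10n) | BigInt(clockId & 0x3ff); // 53-bit time + 10-bit clock ID
  let out = "";
  for (let i = 0; i < 13; i++) {
    out = B32_SORTABLE[Number(value & 31n)] + out; // 5 bits per character
    value >>= 5n;
  }
  return out;
}

// Later timestamps sort lexicographically after earlier ones, which is what
// gives the records their chronological ordering in the repo.
const a = tidFromPlayedTime("2021-09-09T10:34:08Z");
const b = tidFromPlayedTime("2025-11-13T23:49:36Z");
console.log(a < b); // true
```
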

### Automatic Duplicate Prevention

The importer has **two layers of duplicate prevention** to ensure you never import the same record twice:

#### Step 1: Input File Deduplication

Removes duplicates within your source file(s):

**How duplicates are identified:**
- Same track name (case-insensitive)
- Same artist name (case-insensitive)
- Same timestamp (exact match)

**What happens:**
- First occurrence is kept
- Subsequent duplicates are removed
- Shows message: "No duplicates found in input data" or "Removed X duplicate(s)"
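
The identification rules above amount to a composite key. A minimal sketch, with hypothetical names (the importer's own code may be structured differently):

```typescript
// Sketch: keep only the first occurrence of each (track, artist, timestamp)
// key, with track and artist compared case-insensitively.
interface Entry { trackName: string; artistName: string; playedTime: string }

function dedupeKey(e: Entry): string {
  return `${e.trackName.toLowerCase()}|${e.artistName.toLowerCase()}|${e.playedTime}`;
}

function dedupeInput(entries: Entry[]): Entry[] {
  const seen = new Set<string>();
  const kept: Entry[] = [];
  for (const e of entries) {
    const key = dedupeKey(e);
    if (!seen.has(key)) { seen.add(key); kept.push(e); } // first occurrence wins
  }
  return kept;
}

const rows: Entry[] = [
  { trackName: "Paint My Masterpiece", artistName: "Cjbeards", playedTime: "2025-11-13T23:49:36Z" },
  { trackName: "PAINT MY MASTERPIECE", artistName: "cjbeards", playedTime: "2025-11-13T23:49:36Z" },
];
console.log(dedupeInput(rows).length); // 1
```
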

#### Step 2: Teal Comparison (Automatic & Adaptive)

The importer **automatically checks your existing Teal records** and skips any that are already imported.

**What happens:**
- Fetches all existing records from your Teal feed with **adaptive batch sizing**
- Starts with small batches (25 records) and automatically adjusts based on network performance
- Increases batch size (up to 100) when the network is fast
- Decreases batch size (down to 10) when the network is slow
- Shows real-time progress with fetch rate (records/second) and current batch size
- Compares against your input file
- Only imports records that don't already exist
- Shows: "Found X record(s) already in Teal (skipping)"
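
The adaptive sizing can be sketched as a simple feedback loop. The 25/100/10 bounds come from the list above; the timing thresholds and growth factors here are invented for illustration, not the importer's tuned values:

```typescript
// Sketch: grow the fetch batch when responses are fast, shrink when slow.
const MIN_BATCH = 10;
const MAX_BATCH = 100;
const START_BATCH = 25;

function nextBatchSize(current: number, lastFetchMs: number): number {
  if (lastFetchMs < 500) {
    // Fast response: grow the batch by 50%, capped at MAX_BATCH
    return Math.min(MAX_BATCH, Math.ceil(current * 1.5));
  }
  if (lastFetchMs > 2000) {
    // Slow response: halve the batch, floored at MIN_BATCH
    return Math.max(MIN_BATCH, Math.floor(current / 2));
  }
  return current; // acceptable latency: keep the current size
}

let size = START_BATCH;
size = nextBatchSize(size, 300);  // fast  -> 38
size = nextBatchSize(size, 300);  // fast  -> 57
size = nextBatchSize(size, 3000); // slow  -> 28
console.log(size); // 28
```
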

**Example output:**
```
✓ Loaded 10,234 records
ℹ No duplicates found in input data

=== Checking Existing Records ===
ℹ Fetching records from Teal to avoid duplicates...
→ Fetched 1,000 records (125 rec/s, batch: 37, 8.0s)...
📈 Network good: batch size 37 → 55
→ Fetched 2,000 records (140 rec/s, batch: 82, 14.3s)...
📈 Network good: batch size 82 → 100
→ Fetched 3,000 records (155 rec/s, batch: 100, 19.4s)...
...
✓ Found 9,500 existing records in 61.3s (avg 155 rec/s)

=== Identifying New Records ===
ℹ Total: 10,234 records
ℹ Existing: 9,100 already in Teal
ℹ New: 1,134 to import
```

**This means:**
- ✅ Safe to re-run imports with updated exports
- ✅ Won't create duplicates if you run the import twice
- ✅ Only pays for API calls on new records
- ✅ Works automatically, no special mode needed
- ✅ Adapts to your network speed: faster on good connections, stable on slow ones
- ✅ Batch size shown in debug mode (`-v`) for transparency

**Note:**
- This duplicate prevention happens automatically for all imports (default behavior)
- **Credentials required**: Even `--dry-run` needs `--handle` and `--password` to check Teal
- **Sync mode** (`-m sync`): Now primarily shows detailed statistics about what is being synced
- **Deduplicate mode** (`-m deduplicate`): Removes duplicates from already-imported Teal records (cleanup tool)

### Rate Limiting Algorithm

1. Calculates the safe daily limit (75% of 10K = 7,500 records/day by default)
2. Determines how many days your import needs
3. Calculates the optimal batch size and delay to spread records evenly
4. Enforces a minimum delay between batches
5. Shows a clear schedule before starting
6. Logs waiting periods, with durations, when rate limits are hit
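
The core of steps 1–2 is simple arithmetic. A minimal sketch, assuming the 7,500/day safe limit and the 200-record `applyWrites` cap stated elsewhere in this README (the `planImport` name and return shape are illustrative, and the real delay calculation is omitted):

```typescript
// Sketch: derive the multi-day schedule for an import.
const DAILY_LIMIT = 7500;   // 75% of the 10K AppView limit
const MAX_BATCH_SIZE = 200; // com.atproto.repo.applyWrites maximum

function planImport(totalRecords: number) {
  const days = Math.ceil(totalRecords / DAILY_LIMIT);       // how many days needed
  const batchSize = Math.min(MAX_BATCH_SIZE, totalRecords); // cap at PDS maximum
  const totalBatches = Math.ceil(totalRecords / batchSize);
  return { days, batchSize, totalBatches };
}

console.log(planImport(20000)); // { days: 3, batchSize: 200, totalBatches: 100 }
```

For 20,000 records this gives the same 3-day span shown in the multi-day example below the algorithm description.
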

**Example rate-limit wait logging:**
```
ℹ Rate limit (hourly), waiting 23m 45s for reset
ℹ Rate limit (daily), waiting 1h 12m 30s for reset
```

### Multi-Day Imports

For imports exceeding the daily limit, the importer automatically:
1. **Calculates a schedule**: Splits your import across multiple days
2. **Shows the plan**: Displays which records will be imported each day
3. **Processes Day 1**: Imports the first batch of records
4. **Waits for reset**: When limits are reached, waits for the hourly/daily reset
5. **Repeats**: Continues until all records are imported

**Example output for a 20,000 record import:**
```
=== Batch Configuration ===
ℹ Using auto-calculated batch size: 200 records
ℹ Batch delay: 11520ms

=== Import Configuration ===
ℹ Total records: 20,000
ℹ Batch size: 200 records
ℹ Batch delay: 11520ms
ℹ Duration: 3 days (7,500 records/day limit)
⚠️ Large import will span multiple days with automatic rate-limit waits

=== Publishing Records ===
→ Processed batch 1-200 (0.0s, 173.2 rec/s, 2m 0s remaining)
→ Processed batch 201-400 (2.0s, 100.0 rec/s, 1m 58s remaining)
...
ℹ Rate limit (hourly), waiting 45m 0s for reset
→ Resuming after rate limit reset
→ Processed batch 7801-8000 (45m 0s, 2.9 rec/s, 4h 35m 50s remaining)
```

**Important notes:**
- You can safely stop (Ctrl+C) and restart
- Progress is preserved: the import continues where it left off
- Each day's progress is clearly displayed
- Time estimates account for multi-day duration

## Logging and Output

The importer uses color-coded output for clarity:

- **Green (✓)**: Success messages
- **Cyan (→)**: Progress updates
- **Yellow (⚠️)**: Warnings
- **Red (✗)**: Errors
- **Bold Red (🛑)**: Fatal errors
- **Blue (ℹ)**: Informational messages (including rate-limit waits)

### Rate-Limit Wait Messages

When the importer hits a rate limit and needs to wait, it logs a clear message:

```
ℹ Rate limit (hourly), waiting 23m 45s for reset
ℹ Rate limit (daily), waiting 1h 12m 30s for reset
```

These messages use `formatDuration()` for human-readable duration display (e.g., `23m 45s`, `1h 12m 30s`). The wait reason indicates which limit was hit:
- **hourly**: Hourly rate limit reached
- **daily**: Daily rate limit reached
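
A `formatDuration` helper matching those messages could look like this. This is a sketch of the behaviour implied by the examples above, not necessarily the importer's exact implementation:

```typescript
// Sketch: convert a millisecond count into "23m 45s" / "1h 12m 30s" style.
function formatDuration(ms: number): string {
  const totalSeconds = Math.floor(ms / 1000);
  const hours = Math.floor(totalSeconds / 3600);
  const minutes = Math.floor((totalSeconds % 3600) / 60);
  const seconds = totalSeconds % 60;
  const parts: string[] = [];
  if (hours > 0) parts.push(`${hours}h`);
  if (minutes > 0 || hours > 0) parts.push(`${minutes}m`); // keep "1h 0m 30s" coherent
  parts.push(`${seconds}s`);
  return parts.join(" ");
}

console.log(formatDuration(1_425_000)); // "23m 45s"
console.log(formatDuration(4_350_000)); // "1h 12m 30s"
```
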

### Verbosity Levels

**Default Mode**: Standard operational messages
```bash
pnpm start -- -i lastfm.csv -h alice.bsky.social -p pass
```

**Verbose Mode** (`-v`): Detailed debug information including batch timing and API calls
```bash
pnpm start -- -i lastfm.csv -h alice.bsky.social -p pass -v
```

**Quiet Mode** (`-q`): Only warnings and errors
```bash
pnpm start -- -i lastfm.csv -h alice.bsky.social -p pass -q
```

## Error Handling

The importer is designed to be resilient:

- **Network errors**: Failed records are logged but don't stop the import
- **Invalid data**: Skipped with error messages
- **Authentication issues**: Clear error messages with suggested fixes
- **Rate limit hits**: Automatic adjustment with logged backoff duration and retry logic
- **Ctrl+C handling**: Gracefully stops after the current batch

## Troubleshooting

### Authentication Issues

**"Handle not found"**
- Verify your ATProto handle is correct (e.g., `alice.bsky.social`)
- Ensure you're using a valid DID or handle

**"Invalid credentials"**
- Use an **app password**, not your main account password
- Generate app passwords in your account settings

### Performance Issues

**"Rate limit exceeded"**
- The importer handles this automatically by waiting for the reset
- Progress messages show the wait duration when rate limits are hit
- Consider reducing the batch size with the `-b` flag

**Import seems stuck**
- Check progress messages; large imports take time
- Rate-limit waits may occur between days
- You can safely stop (Ctrl+C) and resume later
- Use the `--verbose` flag to see detailed progress

### Connection Issues

**"Connection refused"**
- Check your internet connection
- Verify your PDS is accessible
- Some PDSs may have firewall rules

### Output Control

**Too much output**
- Use the `--quiet` flag to suppress non-essential messages
- Only warnings and errors will be shown

**Need more details**
- Use the `--verbose` flag to see debug-level information
- Shows batch timing, API calls, and detailed progress

## Development

```bash
# Type checking
pnpm run type-check

# Build
pnpm run build

# Development mode (rebuild + run)
pnpm run dev

# Run tests
pnpm run test

# Clean build artifacts
pnpm run clean
```

## Project Structure

```
atproto-lastfm-importer/
├── src/
│   ├── lib/
│   │   ├── auth.ts          # Authentication & identity resolution
│   │   ├── cli.ts           # Command line interface & argument parsing
│   │   ├── csv.ts           # CSV parsing & record conversion
│   │   ├── publisher.ts     # Batch publishing with rate limiting
│   │   ├── spotify.ts       # Spotify JSON parsing
│   │   ├── merge.ts         # Combined import deduplication
│   │   └── sync.ts          # Re-sync mode & duplicate detection
│   ├── utils/
│   │   ├── logger.ts        # Structured logging system
│   │   ├── helpers.ts       # Utility functions (timing, formatting)
│   │   ├── input.ts         # User input handling (prompts, passwords)
│   │   ├── rate-limiter.ts  # Rate limiting calculations
│   │   ├── killswitch.ts    # Graceful shutdown handling
│   │   ├── tid.ts           # TID generation from timestamps
│   │   └── ui.ts            # UI elements (spinners, progress bars)
│   ├── config.ts            # Configuration constants
│   └── types.ts             # TypeScript type definitions
├── lexicons/                # fm.teal.alpha lexicon definitions
│   └── fm.teal.alpha/
│       └── feed/
│           └── play.json    # Play record schema
├── package.json
├── tsconfig.json
└── README.md
```

## Technical Details

### Authentication
- Uses the Slingshot resolver to discover your PDS from your handle/DID
- Requires an ATProto app password (not your main password)
- Automatically configures the agent for your personal PDS

### Batch Publishing
- Uses `com.atproto.repo.applyWrites` for efficiency (up to 20x faster than individual calls)
- Batches up to 200 records per API call (PDS maximum)
- Automatically adjusts batch size based on total record count
- Enforces minimum delays between batches for rate limit safety
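
Shaping records into `applyWrites` batches can be sketched as follows. The chunking logic and write shape follow the `com.atproto.repo.applyWrites` lexicon (`#create` writes with `collection`, `rkey`, `value`); the `chunk`/`toWrites` helpers and the sample rkeys are illustrative, and the actual network call is only shown as a comment:

```typescript
// Sketch: chunk converted records into applyWrites batches of up to 200.
interface Write {
  $type: "com.atproto.repo.applyWrites#create";
  collection: string;
  rkey: string;
  value: Record<string, unknown>;
}

function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

function toWrites(records: { rkey: string; value: Record<string, unknown> }[]): Write[] {
  return records.map((r) => ({
    $type: "com.atproto.repo.applyWrites#create",
    collection: "fm.teal.alpha.feed.play",
    rkey: r.rkey,
    value: r.value,
  }));
}

const records = Array.from({ length: 450 }, (_, i) => ({ rkey: `tid${i}`, value: { trackName: `t${i}` } }));
const batches = chunk(toWrites(records), 200);
console.log(batches.map((b) => b.length)); // [ 200, 200, 50 ]

// Each batch would then be published roughly like:
// await agent.com.atproto.repo.applyWrites({ repo: did, writes: batch });
```
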

### Data Mapping

**Last.fm:**
- Direct mapping from CSV columns
- Converts Unix timestamps to ISO 8601
- Preserves MusicBrainz IDs when present
- Generates URLs from artist/track names
- Wraps artists in array format with optional MBID

**Spotify:**
- Extracts data from JSON fields
- Timestamps are already in ISO 8601 format (`ts` field)
- Generates URLs from `spotify_track_uri`
- Automatically filters non-music content
- Extracts artist and album from metadata fields
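
The Spotify mapping can be sketched like this. The input field names (`ts`, `master_metadata_track_name`, `master_metadata_album_artist_name`, `master_metadata_album_album_name`, `spotify_track_uri`) are the ones Spotify's Extended Streaming History export uses; the `toPlayRecord` helper itself is illustrative, not the importer's actual code:

```typescript
// Sketch: map one Spotify export entry to an fm.teal.alpha.feed.play record.
interface SpotifyEntry {
  ts: string;                                     // already ISO 8601
  master_metadata_track_name: string | null;
  master_metadata_album_artist_name: string | null;
  master_metadata_album_album_name: string | null;
  spotify_track_uri: string | null;               // e.g. "spotify:track:<id>"
}

function toPlayRecord(e: SpotifyEntry) {
  if (!e.master_metadata_track_name || !e.master_metadata_album_artist_name) {
    return null; // podcasts/audiobooks lack track metadata: filtered out
  }
  const trackId = e.spotify_track_uri?.split(":").pop();
  return {
    $type: "fm.teal.alpha.feed.play",
    trackName: e.master_metadata_track_name,
    artists: [{ artistName: e.master_metadata_album_artist_name }],
    releaseName: e.master_metadata_album_album_name ?? undefined,
    playedTime: e.ts,
    originUrl: trackId ? `https://open.spotify.com/track/${trackId}` : undefined,
    submissionClientAgent: "malachite/v0.6.2",
    musicServiceBaseDomain: "spotify.com",
  };
}

const entry: SpotifyEntry = {
  ts: "2021-09-09T10:34:08Z",
  master_metadata_track_name: "Don't Give Up",
  master_metadata_album_artist_name: "Chicane",
  master_metadata_album_album_name: "Twenty",
  spotify_track_uri: "spotify:track:3gZqDJkMZipOYCRjlHWgOV",
};
console.log(toPlayRecord(entry)?.originUrl); // https://open.spotify.com/track/3gZqDJkMZipOYCRjlHWgOV
```
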

### Lexicon Reference

This importer follows the official `fm.teal.alpha` lexicon defined in `/lexicons/fm.teal.alpha/feed/play.json`.

The lexicon defines required and optional field types, string length constraints, array formats, timestamp formatting, and URL validation.

## Contributing

Contributions are welcome! Please:
1. Fork the repository
2. Create a feature branch
3. Make your changes with tests
4. Submit a pull request

## License

AGPL-3.0-only - see the LICENCE file for details

## Credits

- Uses [@atproto/api](https://www.npmjs.com/package/@atproto/api) for ATProto interactions
- CSV parsing via [csv-parse](https://www.npmjs.com/package/csv-parse)
- Identity resolution via [Slingshot](https://slingshot.danner.cloud)
- Follows the `fm.teal.alpha` lexicon standard
- Colored output via [chalk](https://www.npmjs.com/package/chalk)
- Progress indicators via [ora](https://www.npmjs.com/package/ora) and [cli-progress](https://www.npmjs.com/package/cli-progress)

---

**Note**: This tool is for personal use. Respect the terms of service and rate limits when importing your data.