Auto-indexing service and GraphQL API for AT Protocol Records

joins

+12307 -437
.DS_Store

This is a binary file and will not be displayed.

+34
docs/README.md
````diff
@@
 
 - **Records**: AT Protocol records automatically mapped to GraphQL types
 - **Queries**: Fetch records with filtering, sorting, and pagination
+- **Joins**: Traverse relationships between records (forward and reverse)
 - **Mutations**: Create, update, and delete records
 - **Blobs**: Upload and reference binary data (images, files)
@@
 }
 ```
 
+### Query with Joins
+
+```graphql
+query {
+  appBskyFeedPost(first: 10) {
+    edges {
+      node {
+        uri
+        text
+        # Forward join: get the parent post
+        replyToResolved {
+          ... on AppBskyFeedPost {
+            uri
+            text
+          }
+        }
+        # Reverse join: get the first 20 likes (paginated connection)
+        appBskyFeedLikeViaSubject(first: 20) {
+          totalCount # Total likes
+          edges {
+            node {
+              uri
+              createdAt
+            }
+          }
+        }
+      }
+    }
+  }
+}
+```
+
 ## Documentation
 
 - [Queries](./queries.md) - Fetching records with filters and sorting
 - [Mutations](./mutations.md) - Creating, updating, and deleting records
+- [Joins](./joins.md) - Forward and reverse joins between records
 - [Variables](./variables.md) - Using GraphQL variables
 - [Blobs](./blobs.md) - Working with binary data
````
+786
docs/joins.md
````markdown
# Joins

QuickSlice automatically generates **forward joins**, **reverse joins**, and **DID joins** based on AT Protocol lexicon schemas, allowing you to traverse relationships between records.

## Overview

- **Forward Joins**: Follow references from one record to another (e.g., post → parent post)
  - Returns: Single object or `Record` union
  - Naming: `{fieldName}Resolved`

- **Reverse Joins**: Discover records that reference a given record (e.g., post → all likes on that post)
  - Returns: **Paginated Connection** with sorting, filtering, and pagination
  - Naming: `{SourceType}Via{FieldName}`

- **DID Joins**: Find records that share the same author (DID)
  - Returns: Single object (unique DID) or **Paginated Connection** (non-unique DID)
  - Naming: `{CollectionName}ByDid`

- **Union Types**: Forward joins return a `Record` union, allowing type-specific field access via inline fragments

## Forward Joins

Forward joins are generated for fields that reference other records via:

- `at-uri` format strings
- `strongRef` objects

### Basic Forward Join

When a field references another record, QuickSlice creates a `*Resolved` field:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        replyTo           # The at-uri string
        replyToResolved { # The resolved record
          uri
        }
      }
    }
  }
}
```

### Union Types & Inline Fragments

Forward join fields return a `Record` union type because the referenced record could be any type. Use inline fragments to access type-specific fields:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        replyToResolved {
          # Access fields based on the actual type
          ... on AppBskyFeedPost {
            uri
            text
            createdAt
          }
          ... on AppBskyFeedLike {
            uri
            subject
            createdAt
          }
        }
      }
    }
  }
}
```

### StrongRef Forward Joins

StrongRef fields (containing `uri` and `cid`) are resolved automatically:

```graphql
query {
  appBskyActorProfile {
    edges {
      node {
        displayName
        pinnedPost {
          uri # Original strongRef uri
          cid # Original strongRef cid
        }
        pinnedPostResolved {
          ... on AppBskyFeedPost {
            uri
            text
            likeCount
          }
        }
      }
    }
  }
}
```

## Reverse Joins

Reverse joins are automatically discovered by analyzing all lexicons. They allow you to find all records that reference a given record. **Reverse joins return paginated connections** with support for sorting, filtering, and cursor-based pagination.

### Basic Reverse Join

Reverse join fields are named `{SourceType}Via{FieldName}` and return a Connection type:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Find all likes that reference this post via their 'subject' field
        appBskyFeedLikeViaSubject(first: 20) {
          totalCount # Total number of likes
          edges {
            node {
              uri
              createdAt
            }
            cursor
          }
          pageInfo {
            hasNextPage
            hasPreviousPage
            startCursor
            endCursor
          }
        }
      }
    }
  }
}
```

### Multiple Reverse Joins

A record type can have multiple reverse join fields. You can request different page sizes for each:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Get first 10 replies
        appBskyFeedPostViaReplyTo(first: 10) {
          totalCount
          edges {
            node {
              uri
              text
            }
          }
        }
        # Get first 20 likes
        appBskyFeedLikeViaSubject(first: 20) {
          totalCount
          edges {
            node {
              uri
              createdAt
            }
          }
        }
        # Get first 20 reposts
        appBskyFeedRepostViaSubject(first: 20) {
          totalCount
          edges {
            node {
              uri
              createdAt
            }
          }
        }
      }
    }
  }
}
```

### Reverse Joins with StrongRef

Reverse joins work with strongRef fields too. You can also use sorting and filtering:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Find all profiles that pinned this post
        appBskyActorProfileViaPinnedPost(
          sortBy: [{ field: "indexedAt", direction: DESC }]
        ) {
          totalCount
          edges {
            node {
              uri
              displayName
            }
          }
        }
      }
    }
  }
}
```

### Sorting Reverse Joins

You can sort reverse join results by any field in the joined collection:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        # Get the most recent likes first
        appBskyFeedLikeViaSubject(
          first: 10
          sortBy: [{ field: "createdAt", direction: DESC }]
        ) {
          edges {
            node {
              uri
              createdAt
            }
          }
        }
      }
    }
  }
}
```

### Filtering Reverse Joins

Use `where` filters to narrow down nested join results:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Only get likes from a specific user
        appBskyFeedLikeViaSubject(
          where: { did: { eq: "did:plc:abc123" } }
        ) {
          totalCount # Likes from this specific user
          edges {
            node {
              uri
              createdAt
            }
          }
        }
      }
    }
  }
}
```

## DID Joins

DID joins allow you to traverse relationships between records that share the same author (DID). These are automatically generated for all collection pairs and are named `{CollectionName}ByDid`.

### Two Types of DID Joins

#### 1. Unique DID Joins (literal:self key)

Collections with a `literal:self` key (like profiles) have only one record per DID. These return a **single nullable object** (no pagination needed):

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Get the author's profile (single object, not paginated)
        appBskyActorProfileByDid {
          uri
          displayName
          bio
        }
      }
    }
  }
}
```

#### 2. Non-Unique DID Joins

Most collections can have multiple records per DID. These return **paginated connections** with full support for sorting, filtering, and pagination:

```graphql
query {
  appBskyActorProfile {
    edges {
      node {
        displayName
        # Get all posts by this user (paginated)
        appBskyFeedPostByDid(
          first: 10
          sortBy: [{ field: "indexedAt", direction: DESC }]
        ) {
          totalCount # Total posts by this user
          edges {
            node {
              uri
              text
              indexedAt
            }
          }
          pageInfo {
            hasNextPage
            endCursor
          }
        }
      }
    }
  }
}
```

### DID Join with Filtering

Combine DID joins with filters to find specific records:

```graphql
query {
  appBskyActorProfile(where: { did: { eq: "did:plc:abc123" } }) {
    edges {
      node {
        displayName
        # Get only posts containing "gleam"
        appBskyFeedPostByDid(
          where: { text: { contains: "gleam" } }
          sortBy: [{ field: "indexedAt", direction: DESC }]
        ) {
          totalCount # Posts mentioning "gleam"
          edges {
            node {
              text
              indexedAt
            }
          }
        }
      }
    }
  }
}
```

### Cross-Collection DID Queries

DID joins work across all collection pairs, enabling powerful cross-collection queries:

```graphql
query {
  appBskyActorProfile {
    edges {
      node {
        displayName
        # All their posts
        appBskyFeedPostByDid(first: 10) {
          totalCount
          edges {
            node {
              text
            }
          }
        }
        # All their likes
        appBskyFeedLikeByDid(first: 10) {
          totalCount
          edges {
            node {
              subject
            }
          }
        }
        # All their reposts
        appBskyFeedRepostByDid(first: 10) {
          totalCount
          edges {
            node {
              subject
            }
          }
        }
      }
    }
  }
}
```

### DID Join Arguments

Non-unique DID joins support all standard connection arguments:

| Argument | Type | Description |
|----------|------|-------------|
| `first` | `Int` | Number of records to return (forward pagination) |
| `after` | `String` | Cursor for forward pagination |
| `last` | `Int` | Number of records to return (backward pagination) |
| `before` | `String` | Cursor for backward pagination |
| `sortBy` | `[SortFieldInput!]` | Sort by any field in the collection |
| `where` | `WhereInput` | Filter nested records |

## Complete Example

Combining forward joins, reverse joins, and DID joins to build a rich thread view:

```graphql
query GetThread($postUri: String!) {
  appBskyFeedPost(where: { uri: { eq: $postUri } }) {
    edges {
      node {
        uri
        text
        createdAt

        # DID join: get the author's profile
        appBskyActorProfileByDid {
          displayName
          bio
        }

        # Forward join: get the parent post
        replyToResolved {
          ... on AppBskyFeedPost {
            uri
            text
            createdAt
          }
        }

        # Reverse join: get first 10 replies
        appBskyFeedPostViaReplyTo(
          first: 10
          sortBy: [{ field: "createdAt", direction: ASC }]
        ) {
          totalCount # Total replies
          edges {
            node {
              uri
              text
              createdAt
            }
          }
          pageInfo {
            hasNextPage
          }
        }

        # Reverse join: get first 20 likes
        appBskyFeedLikeViaSubject(first: 20) {
          totalCount # Like count
          edges {
            node {
              uri
              createdAt
            }
          }
        }

        # Reverse join: get reposts
        appBskyFeedRepostViaSubject(first: 20) {
          totalCount # Repost count
          edges {
            node {
              uri
              createdAt
            }
          }
        }
      }
    }
  }
}
```

## DataLoader Batching

All joins use DataLoader for efficient batching:

```graphql
# This query batches all replyToResolved lookups into a single database query
query {
  appBskyFeedPost(first: 100) {
    edges {
      node {
        uri
        text
        replyToResolved {
          ... on AppBskyFeedPost {
            uri
            text
          }
        }
      }
    }
  }
}
```

**How it works:**

1. Fetches 100 posts
2. Collects all unique `replyTo` URIs
3. Batches them into a single SQL query: `WHERE uri IN (...)`
4. Returns resolved records efficiently

## Performance Tips

### 1. Only Request What You Need

```graphql
# Good: only request specific fields
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        appBskyFeedLikeViaSubject(first: 20) {
          totalCount # Get the count without fetching all records
          edges {
            node {
              uri # Only need the URI
            }
          }
        }
      }
    }
  }
}
```

### 2. Use totalCount for Metrics

Get engagement counts efficiently without fetching all records:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Just get counts, no records
        likes: appBskyFeedLikeViaSubject(first: 0) {
          totalCount # Like count
        }
        reposts: appBskyFeedRepostViaSubject(first: 0) {
          totalCount # Repost count
        }
        replies: appBskyFeedPostViaReplyTo(first: 0) {
          totalCount # Reply count
        }
      }
    }
  }
}
```

### 3. Use Pagination on Nested Joins

Nested joins are paginated by default. Always specify `first` or `last` for optimal performance:

```graphql
query {
  appBskyFeedPost(first: 10) {
    edges {
      node {
        uri
        text
        # Limit nested join results
        appBskyFeedLikeViaSubject(first: 20) {
          totalCount # Total likes
          edges {
            node {
              uri
            }
          }
          pageInfo {
            hasNextPage # Know if there are more
          }
        }
      }
    }
  }
}
```

### 4. Avoid Deep Nesting

```graphql
# Avoid: deeply nested joins can be expensive
query {
  appBskyFeedPost {
    edges {
      node {
        replyToResolved {
          ... on AppBskyFeedPost {
            replyToResolved {
              ... on AppBskyFeedPost {
                replyToResolved {
                  # Too deep!
                  ... on AppBskyFeedPost {
                    uri
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```

## Type Resolution

The `Record` union uses a type resolver that examines the `collection` field:

| Collection | GraphQL Type |
|------------|--------------|
| `app.bsky.feed.post` | `AppBskyFeedPost` |
| `app.bsky.feed.like` | `AppBskyFeedLike` |
| `app.bsky.actor.profile` | `AppBskyActorProfile` |

This allows inline fragments to work correctly:

```graphql
{
  appBskyFeedPost {
    edges {
      node {
        replyToResolved {
          # Runtime type is determined by the collection field
          ... on AppBskyFeedPost { text }
          ... on AppBskyFeedLike { subject }
        }
      }
    }
  }
}
```

## Schema Introspection

Discover available joins using introspection:

```graphql
query {
  __type(name: "AppBskyFeedPost") {
    fields {
      name
      type {
        name
        kind
      }
    }
  }
}
```

Look for fields:

- ending in `Resolved` (forward joins)
- containing `Via` (reverse joins)
- ending in `ByDid` (DID joins)

## Common Patterns

### Thread Navigation

```graphql
# Get a post and its parent
query ($uri: String!) {
  appBskyFeedPost(where: { uri: { eq: $uri } }) {
    edges {
      node {
        uri
        text
        replyToResolved {
          ... on AppBskyFeedPost {
            uri
            text
          }
        }
      }
    }
  }
}
```

### Engagement Metrics

Use `totalCount` to get efficient engagement counts without fetching all records:

```graphql
# Get counts efficiently
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Get like count
        likes: appBskyFeedLikeViaSubject(first: 0) {
          totalCount
        }
        # Get repost count
        reposts: appBskyFeedRepostViaSubject(first: 0) {
          totalCount
        }
        # Get reply count
        replies: appBskyFeedPostViaReplyTo(first: 0) {
          totalCount
        }
      }
    }
  }
}
```

Or fetch recent engagement with pagination:

```graphql
query {
  appBskyFeedPost {
    edges {
      node {
        uri
        text
        # Get the 10 most recent likes
        likes: appBskyFeedLikeViaSubject(
          first: 10
          sortBy: [{ field: "createdAt", direction: DESC }]
        ) {
          totalCount # Total like count
          edges {
            node {
              did # Who liked it
              createdAt
            }
          }
        }
      }
    }
  }
}
```

### User's Pinned Content

```graphql
query ($did: String!) {
  appBskyActorProfile(where: { did: { eq: $did } }) {
    edges {
      node {
        displayName
        pinnedPostResolved {
          ... on AppBskyFeedPost {
            uri
            text
            createdAt
          }
        }
      }
    }
  }
}
```
````
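The DataLoader batching that docs/joins.md describes (collect every `replyTo` URI from a page of posts, dedupe, run one `WHERE uri IN (...)` query, then fan the rows back out) can be sketched in Python. This is an illustrative sketch only: the function names and the in-memory "database" are assumptions, not QuickSlice's actual implementation.

```python
# Minimal DataLoader-style batcher: collect keys, resolve once, fan out.
# Names and the fake in-memory "DB" are illustrative assumptions.

def batch_load(uris, fetch_many):
    """Resolve many at-uris with a single batched lookup."""
    unique = list(dict.fromkeys(uris))     # dedupe while keeping order
    rows = fetch_many(unique)              # one query: WHERE uri IN (...)
    by_uri = {row["uri"]: row for row in rows}
    return [by_uri.get(u) for u in uris]   # fan results back out, None if missing


# Fake storage standing in for the SQL table.
DB = {
    "at://did:plc:a/app.bsky.feed.post/1": {
        "uri": "at://did:plc:a/app.bsky.feed.post/1", "text": "root"},
    "at://did:plc:b/app.bsky.feed.post/2": {
        "uri": "at://did:plc:b/app.bsky.feed.post/2", "text": "reply"},
}

calls = []

def fetch_many(uris):
    calls.append(uris)  # record how many queries actually ran
    return [DB[u] for u in uris if u in DB]

reply_tos = [
    "at://did:plc:a/app.bsky.feed.post/1",
    "at://did:plc:a/app.bsky.feed.post/1",   # duplicate: deduped before the query
    "at://did:plc:b/app.bsky.feed.post/2",
    "at://did:plc:c/app.bsky.feed.post/404", # missing record: resolves to None
]

resolved = batch_load(reply_tos, fetch_many)
print(len(calls))                                        # 1 -- a single batched query
print([r["text"] if r else None for r in resolved])      # ['root', 'root', 'reply', None]
```

The key property is that N `replyToResolved` fields cost one storage round trip instead of N, regardless of duplicates.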
example/.DS_Store

This is a binary file and will not be displayed.

+7
graphql/birdie_snapshots/execute_mixed_aliased_and_non_aliased_fields.accepted
```
---
version: 1.4.1
title: Execute mixed aliased and non-aliased fields
file: ./test/executor_test.gleam
test_name: execute_mixed_aliased_fields_test
---
Response(Object([#("greeting", String("world")), #("number", Int(42))]), [])
```
+7
graphql/birdie_snapshots/execute_multiple_fields_with_aliases.accepted
```
---
version: 1.4.1
title: Execute multiple fields with aliases
file: ./test/executor_test.gleam
test_name: execute_multiple_fields_with_aliases_test
---
Response(Object([#("greeting", String("world")), #("num", Int(42))]), [])
```
+7
graphql/birdie_snapshots/execute_union_list_with_inline_fragments.accepted
```
---
version: 1.4.1
title: Execute union list with inline fragments
file: ./test/executor_test.gleam
test_name: execute_union_list_with_inline_fragments_test
---
Response(Object([#("searchAll", List([Object([#("title", String("First Post"))]), Object([#("text", String("Great article!"))]), Object([#("title", String("Second Post"))])]))]), [])
```
+7
graphql/birdie_snapshots/execute_union_with_inline_fragment.accepted
```
---
version: 1.4.1
title: Execute union with inline fragment
file: ./test/executor_test.gleam
test_name: execute_union_with_inline_fragment_test
---
Response(Object([#("search", Object([#("title", String("GraphQL is awesome")), #("content", String("Learn all about GraphQL..."))]))]), [])
```
+168 -24
graphql/src/graphql/executor.gleam
```diff
@@
 import gleam/dict.{type Dict}
 import gleam/list
 import gleam/option.{None, Some}
+import gleam/set.{type Set}
 import graphql/introspection
 import graphql/parser
 import graphql/schema
@@
 /// GraphQL Response
 pub type Response {
   Response(data: value.Value, errors: List(GraphQLError))
+}
+
+/// Get the response key for a field (alias if present, otherwise field name)
+fn response_key(field_name: String, alias: option.Option(String)) -> String {
+  case alias {
+    option.Some(alias_name) -> alias_name
+    option.None -> field_name
+  }
 }
 
 /// Execute a GraphQL query
@@
       }
     }
   }
-  parser.Field(name, _alias, arguments, nested_selections) -> {
+  parser.Field(name, alias, arguments, nested_selections) -> {
     // Convert arguments to dict (with variable resolution from context)
     let args_dict = arguments_to_dict(arguments, ctx)
 
+    // Determine the response key (use alias if provided, otherwise field name)
+    let key = response_key(name, alias)
+
     // Handle introspection meta-fields
     case name {
       "__typename" -> {
         let type_name = schema.type_name(parent_type)
-        Ok(#("__typename", value.String(type_name), []))
+        Ok(#(key, value.String(type_name), []))
       }
       "__schema" -> {
         let schema_value = introspection.schema_introspection(graphql_schema)
         // Handle nested selections on __schema
         case nested_selections {
-          [] -> Ok(#("__schema", schema_value, []))
+          [] -> Ok(#(key, schema_value, []))
           _ -> {
             let selection_set = parser.SelectionSet(nested_selections)
             // We don't have an actual type for __Schema, so we'll handle it specially
···
                 ctx,
                 fragments,
                 ["__schema", ..path],
+                set.new(),
               )
             {
               Ok(#(nested_data, nested_errors)) ->
-                Ok(#("__schema", nested_data, nested_errors))
+                Ok(#(key, nested_data, nested_errors))
               Error(err) -> {
                 let error = GraphQLError(err, ["__schema", ..path])
-                Ok(#("__schema", value.Null, [error]))
+                Ok(#(key, value.Null, [error]))
               }
             }
           }
         }
       }
+      "__type" -> {
+        // Extract the "name" argument
+        case dict.get(args_dict, "name") {
+          Ok(value.String(type_name)) -> {
+            // Look up the type in the schema
+            case introspection.type_by_name_introspection(graphql_schema, type_name) {
+              option.Some(type_value) -> {
+                // Handle nested selections on __type
+                case nested_selections {
+                  [] -> Ok(#(key, type_value, []))
+                  _ -> {
+                    let selection_set = parser.SelectionSet(nested_selections)
+                    case
+                      execute_introspection_selection_set(
+                        selection_set,
+                        type_value,
+                        graphql_schema,
+                        ctx,
+                        fragments,
+                        ["__type", ..path],
+                        set.new(),
+                      )
+                    {
+                      Ok(#(nested_data, nested_errors)) ->
+                        Ok(#(key, nested_data, nested_errors))
+                      Error(err) -> {
+                        let error = GraphQLError(err, ["__type", ..path])
+                        Ok(#(key, value.Null, [error]))
+                      }
+                    }
+                  }
+                }
+              }
+              option.None -> {
+                // Type not found, return null (per GraphQL spec)
+                Ok(#(key, value.Null, []))
+              }
+            }
+          }
+          Ok(_) -> {
+            let error = GraphQLError("__type argument 'name' must be a String", path)
+            Ok(#(key, value.Null, [error]))
+          }
+          Error(_) -> {
+            let error = GraphQLError("__type requires a 'name' argument", path)
+            Ok(#(key, value.Null, [error]))
+          }
+        }
+      }
       _ -> {
···
         case schema.get_field(parent_type, name) {
           None -> {
             let error = GraphQLError("Field '" <> name <> "' not found", path)
-            Ok(#(name, value.Null, [error]))
+            Ok(#(key, value.Null, [error]))
           }
           Some(field) -> {
             // Get the field's type for nested selections
···
             case schema.resolve_field(field, field_ctx) {
               Error(err) -> {
                 let error = GraphQLError(err, [name, ..path])
-                Ok(#(name, value.Null, [error]))
+                Ok(#(key, value.Null, [error]))
               }
               Ok(field_value) -> {
                 // If there are nested selections, recurse
                 case nested_selections {
-                  [] -> Ok(#(name, field_value, []))
+                  [] -> Ok(#(key, field_value, []))
                   _ -> {
                     // Need to resolve nested fields
                     case field_value {
                       value.Object(_) -> {
-                        // Execute nested selections using the field's type, not parent type
+                        // Check if field_type_def is a union type
+                        // If so, resolve it to the concrete type first
+                        let type_to_use = case schema.is_union(field_type_def) {
+                          True -> {
+                            // Create context with the field value for type resolution
+                            let resolve_ctx =
+                              schema.context(option.Some(field_value))
+                            case
+                              schema.resolve_union_type(field_type_def, resolve_ctx)
+                            {
+                              Ok(concrete_type) -> concrete_type
+                              // Fall back to the union type if resolution fails
+                              Error(_) -> field_type_def
+                            }
+                          }
+                          False -> field_type_def
+                        }
+
+                        // Execute nested selections using the resolved type
                         // Create new context with this object's data
                         let object_ctx = schema.context(option.Some(field_value))
                         let selection_set =
···
                         case
                           execute_selection_set(
                             selection_set,
-                            field_type_def,
+                            type_to_use,
                             graphql_schema,
                             object_ctx,
                             fragments,
···
                           )
                         {
                           Ok(#(nested_data, nested_errors)) ->
-                            Ok(#(name, nested_data, nested_errors))
+                            Ok(#(key, nested_data, nested_errors))
                           Error(err) -> {
                             let error = GraphQLError(err, [name, ..path])
-                            Ok(#(name, value.Null, [error]))
+                            Ok(#(key, value.Null, [error]))
                           }
                         }
                       }
···
                           parser.SelectionSet(nested_selections)
                         let results =
                           list.map(items, fn(item) {
+                            // Check if inner_type is a union and resolve it
+                            let item_type = case schema.is_union(inner_type) {
+                              True -> {
+                                // Create context with the item value for type resolution
+                                let resolve_ctx = schema.context(option.Some(item))
+                                case schema.resolve_union_type(inner_type, resolve_ctx) {
+                                  Ok(concrete_type) -> concrete_type
+                                  // Fall back to the union type if resolution fails
+                                  Error(_) -> inner_type
+                                }
+                              }
+                              False -> inner_type
+                            }
+
                             // Create context with this item's data
                             let item_ctx = schema.context(option.Some(item))
                             execute_selection_set(
                               selection_set,
-                              inner_type,
+                              item_type,
                               graphql_schema,
                               item_ctx,
                               fragments,
···
                           })
 
-                        Ok(#(name, value.List(processed_items), all_errors))
+                        Ok(#(key, value.List(processed_items), all_errors))
                       }
-                      _ -> Ok(#(name, field_value, []))
+                      _ -> Ok(#(key, field_value, []))
                     }
                   }
                 }
···
   ctx: schema.Context,
   fragments: Dict(String, parser.Operation),
   path: List(String),
+  visited_types: Set(String),
 ) -> Result(#(value.Value, List(GraphQLError)), String) {
   case selection_set {
     parser.SelectionSet(selections) -> {
···
             ctx,
             fragments,
             path,
+            visited_types,
           )
         })
···
         Ok(#(value.Null, []))
       }
       value.Object(fields) -> {
-        // For each selection, find the corresponding field in the object
-        let results =
+        // CYCLE DETECTION: extract the type name from the object to detect circular references
+        let type_name = case list.key_find(fields, "name") {
+          Ok(value.String(name)) -> option.Some(name)
+          _ -> option.None
+        }
+
+        // Check whether we've already visited this type to prevent infinite loops
+        let is_cycle = case type_name {
+          option.Some(name) -> set.contains(visited_types, name)
+          option.None -> False
+        }
+
+        // If we detected a cycle, return a minimal object to break the loop
+        case is_cycle {
+          True -> {
+            // Return just the type name and kind to break the cycle
+            let minimal_fields = case type_name {
+              option.Some(name) -> {
+                let kind_value = case list.key_find(fields, "kind") {
+                  Ok(kind) -> kind
+                  Error(_) -> value.Null
+                }
+                [#("name", value.String(name)), #("kind", kind_value)]
+              }
+              option.None -> []
+            }
+            Ok(#(value.Object(minimal_fields), []))
+          }
+          False -> {
+            // Add the current type to the visited set before recursing
+            let new_visited = case type_name {
+              option.Some(name) -> set.insert(visited_types, name)
+              option.None -> visited_types
+            }
+
+            // For each selection, find the corresponding field in the object
+            let results =
               list.map(selections, fn(selection) {
                 case selection {
                   parser.FragmentSpread(name) -> {
                     // Look up the fragment definition
                     case dict.get(fragments, name) {
-                      Error(_) -> Error(Nil)
-                      // Fragment not found, skip it
+                      Error(_) -> {
+                        // Fragment not found - return an error
+                        let error =
+                          GraphQLError("Fragment '" <> name <> "' not found", path)
+                        Ok(#(
+                          "__FRAGMENT_ERROR",
+                          value.String("Fragment not found: " <> name),
+                          [error],
+                        ))
+                      }
                       Ok(parser.FragmentDefinition(
                         _fname,
                         _type_condition,
                         fragment_selection_set,
                       )) -> {
                         // For introspection, we don't check type conditions - just execute the fragment
+                        // IMPORTANT: use visited_types (not new_visited) because we're selecting from
+                        // the SAME object, not recursing into it. The current object was already added
+                        // to new_visited, but the fragment is just selecting different fields.
                         case
                           execute_introspection_selection_set(
                             fragment_selection_set,
···
                             ctx,
                             fragments,
                             path,
+                            visited_types,
                           )
                         {
                           Ok(#(value.Object(fragment_fields), errs)) ->
···
                             ctx,
                             fragments,
                             path,
+                            new_visited,
                           )
                         {
                           Ok(#(value.Object(fragment_fields), errs)) ->
···
                       Error(_err) -> Error(Nil)
                     }
                   }
-                  parser.Field(name, _alias, _arguments, nested_selections) -> {
+                  parser.Field(name, alias, _arguments, nested_selections) -> {
+                    // Determine the response key (use alias if provided, otherwise field name)
+                    let key = response_key(name, alias)
+
                     // Find the field in the object
                     case list.key_find(fields, name) {
                       Ok(field_value) -> {
                         // Handle nested selections
                         case nested_selections {
-                          [] -> Ok(#(name, field_value, []))
+                          [] -> Ok(#(key, field_value, []))
                           _ -> {
                             let selection_set =
                               parser.SelectionSet(nested_selections)
···
                               ctx,
                               fragments,
                               [name, ..path],
+                              new_visited,
                             )
                             {
                               Ok(#(nested_data, nested_errors)) ->
-                                Ok(#(name, nested_data, nested_errors))
+                                Ok(#(key, nested_data, nested_errors))
                               Error(err) -> {
                                 let error = GraphQLError(err, [name, ..path])
-                                Ok(#(name, value.Null, [error]))
+                                Ok(#(key, value.Null, [error]))
                               }
                             }
                           }
                         }
                       }
                       Error(_) -> {
                         let error =
                           GraphQLError("Field '" <> name <> "' not found", path)
-                        Ok(#(name, value.Null, [error]))
+                        Ok(#(key, value.Null, [error]))
                       }
                     }
                   }
···
               })
 
             Ok(#(value.Object(data), errors))
+          }
+        }
       }
       _ ->
         Error(
```
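The cycle detection added to executor.gleam threads a set of visited type names through the recursion and, when it meets a type already on the path, returns a minimal `{name, kind}` stub instead of expanding it again. The idea can be sketched in Python; the toy type graph and function names here are illustrative assumptions, not the executor's actual code.

```python
# Sketch of visited-set cycle detection for a recursive type expansion.
# The circular type graph (Post.author -> User.posts -> Post ...) is a
# made-up example; the real executor tracks introspection type names.

TYPES = {
    "Post": {"kind": "OBJECT", "fields": {"author": "User"}},
    "User": {"kind": "OBJECT", "fields": {"posts": "Post"}},
}

def expand(type_name, visited=frozenset()):
    t = TYPES[type_name]
    if type_name in visited:
        # Cycle detected: break the loop with just name and kind.
        return {"name": type_name, "kind": t["kind"]}
    # Add the current type to the visited set before recursing.
    new_visited = visited | {type_name}
    return {
        "name": type_name,
        "kind": t["kind"],
        "fields": {f: expand(ref, new_visited) for f, ref in t["fields"].items()},
    }

result = expand("Post")
# Post -> User -> Post stops at the second Post with a minimal stub.
stub = result["fields"]["author"]["fields"]["posts"]
print(stub)  # {'name': 'Post', 'kind': 'OBJECT'}
```

Because the visited set travels down one path of the recursion (an immutable set per call, not shared mutable state), sibling branches can still expand the same type independently, which mirrors passing `visited_types` as a function argument in the Gleam code.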
+150 -28
graphql/src/graphql/introspection.gleam
··· 2 2 /// 3 3 /// Implements the GraphQL introspection system per the GraphQL spec. 4 4 /// Provides __schema, __type, and __typename meta-fields. 5 + import gleam/dict 5 6 import gleam/list 6 7 import gleam/option 8 + import gleam/result 7 9 import graphql/schema 8 10 import graphql/value 9 11 ··· 30 32 ]) 31 33 } 32 34 35 + /// Build introspection value for __type(name: "TypeName") 36 + /// Returns Some(type_introspection) if the type is found, None otherwise 37 + pub fn type_by_name_introspection( 38 + graphql_schema: schema.Schema, 39 + type_name: String, 40 + ) -> option.Option(value.Value) { 41 + let all_types = get_all_schema_types(graphql_schema) 42 + 43 + // Find the type with the matching name 44 + let found_type = 45 + list.find(all_types, fn(t) { schema.type_name(t) == type_name }) 46 + 47 + case found_type { 48 + Ok(t) -> option.Some(type_introspection(t)) 49 + Error(_) -> option.None 50 + } 51 + } 52 + 33 53 /// Get all types from the schema as schema.Type values 34 54 /// Useful for testing and documentation generation 35 55 pub fn get_all_schema_types(graphql_schema: schema.Schema) -> List(schema.Type) { ··· 46 66 option.None -> mut_collected_types 47 67 } 48 68 49 - // Deduplicate by type name 50 - let type_names = list.map(all_collected_types, schema.type_name) 51 - let unique_types = 52 - list.zip(type_names, all_collected_types) 53 - |> list.unique 54 - |> list.map(fn(pair) { pair.1 }) 69 + // Deduplicate by type name, preferring types with more fields 70 + // This ensures we get the "most complete" version of each type 71 + let unique_types = deduplicate_types_by_name(all_collected_types) 55 72 56 73 // Add any built-in scalars that aren't already in the list 57 74 let all_built_ins = [ ··· 80 97 list.map(all_types, type_introspection) 81 98 } 82 99 100 + /// Deduplicate types by name, keeping the version with the most fields 101 + /// This ensures we get the "most complete" version of each type when 102 + /// multiple versions exist (e.g., 
from different passes in schema building) 103 + fn deduplicate_types_by_name( 104 + types: List(schema.Type), 105 + ) -> List(schema.Type) { 106 + // Group types by name 107 + types 108 + |> list.group(schema.type_name) 109 + |> dict.to_list 110 + |> list.map(fn(pair) { 111 + let #(_name, type_list) = pair 112 + // For each group, find the type with the most content 113 + type_list 114 + |> list.reduce(fn(best, current) { 115 + // Count content: fields for object types, enum values for enums, etc. 116 + let best_content_count = get_type_content_count(best) 117 + let current_content_count = get_type_content_count(current) 118 + 119 + // Prefer the type with more content 120 + case current_content_count > best_content_count { 121 + True -> current 122 + False -> best 123 + } 124 + }) 125 + |> result.unwrap( 126 + list.first(type_list) 127 + |> result.unwrap(schema.string_type()), 128 + ) 129 + }) 130 + } 131 + 132 + /// Get the "content count" for a type (fields, enum values, input fields, etc.) 
133 + /// This helps us pick the most complete version of a type during deduplication
134 + fn get_type_content_count(t: schema.Type) -> Int {
135 + // For object types, count fields
136 + let field_count = list.length(schema.get_fields(t))
137 +
138 + // For enum types, count enum values
139 + let enum_value_count = list.length(schema.get_enum_values(t))
140 +
141 + // For input object types, count input fields
142 + let input_field_count = list.length(schema.get_input_fields(t))
143 +
144 + // Return the maximum (types will only have one of these be non-zero)
145 + [field_count, enum_value_count, input_field_count]
146 + |> list.reduce(fn(a, b) {
147 + case a > b {
148 + True -> a
149 + False -> b
150 + }
151 + })
152 + |> result.unwrap(0)
153 + }
154 +
83 155 /// Collect all types referenced in a type (recursively)
156 + /// Note: We collect ALL instances of each type (even duplicates by name)
157 + /// because we want to find the "most complete" version during deduplication
84 158 fn collect_types_from_type(
85 159 t: schema.Type,
86 160 acc: List(schema.Type),
87 161 ) -> List(schema.Type) {
88 - case
89 - list.any(acc, fn(existing) {
90 - schema.type_name(existing) == schema.type_name(t)
91 - })
162 + // Always add this type - we'll deduplicate later by choosing the version with the most fields
163 + let new_acc = [t, ..acc]
164 +
165 + // To prevent infinite recursion, check if we've already traversed this exact type instance
166 + // We use a simple heuristic: if this type name appears multiple times AND this specific
167 + // instance has the same or less content than what we've seen, skip traversing its children
168 + let should_traverse_children = case
169 + schema.is_object(t) || schema.is_enum(t) || schema.is_union(t)
92 170 {
93 - True -> acc
94 - // Already collected this type
95 - False -> {
96 - let new_acc = [t, ..acc]
171 + True -> {
172 + let current_content_count = get_type_content_count(t)
173 + let existing_with_same_name =
174 +
list.filter(acc, fn(existing) { 175 + schema.type_name(existing) == schema.type_name(t) 176 + }) 177 + let max_existing_content = 178 + existing_with_same_name 179 + |> list.map(get_type_content_count) 180 + |> list.reduce(fn(a, b) { 181 + case a > b { 182 + True -> a 183 + False -> b 184 + } 185 + }) 186 + |> result.unwrap(0) 187 + 188 + // Only traverse if this instance has more content than we've seen before 189 + current_content_count > max_existing_content 190 + } 191 + False -> True 192 + } 193 + 194 + case should_traverse_children { 195 + False -> new_acc 196 + True -> { 97 197 98 198 // Recursively collect types from fields if this is an object type 99 199 case schema.is_object(t) { ··· 112 212 }) 113 213 } 114 214 False -> { 115 - // Check if it's an InputObjectType 116 - let input_fields = schema.get_input_fields(t) 117 - case list.is_empty(input_fields) { 118 - False -> { 119 - // This is an InputObjectType, collect types from its fields 120 - list.fold(input_fields, new_acc, fn(acc2, input_field) { 121 - let field_type = schema.input_field_type(input_field) 122 - collect_types_from_type_deep(field_type, acc2) 215 + // Check if it's a union type 216 + case schema.is_union(t) { 217 + True -> { 218 + // Collect types from union's possible_types 219 + let possible_types = schema.get_possible_types(t) 220 + list.fold(possible_types, new_acc, fn(acc2, union_type) { 221 + collect_types_from_type_deep(union_type, acc2) 123 222 }) 124 223 } 125 - True -> { 126 - // Check if it's a wrapping type (List or NonNull) 127 - case schema.inner_type(t) { 128 - option.Some(inner) -> collect_types_from_type_deep(inner, new_acc) 129 - option.None -> new_acc 224 + False -> { 225 + // Check if it's an InputObjectType 226 + let input_fields = schema.get_input_fields(t) 227 + case list.is_empty(input_fields) { 228 + False -> { 229 + // This is an InputObjectType, collect types from its fields 230 + list.fold(input_fields, new_acc, fn(acc2, input_field) { 231 + let field_type = 
schema.input_field_type(input_field) 232 + collect_types_from_type_deep(field_type, acc2) 233 + }) 234 + } 235 + True -> { 236 + // Check if it's a wrapping type (List or NonNull) 237 + case schema.inner_type(t) { 238 + option.Some(inner) -> 239 + collect_types_from_type_deep(inner, new_acc) 240 + option.None -> new_acc 241 + } 242 + } 130 243 } 131 244 } 132 245 } ··· 177 290 _ -> value.Null 178 291 } 179 292 293 + // Determine possibleTypes for UNION types 294 + let possible_types = case kind { 295 + "UNION" -> { 296 + let types = schema.get_possible_types(t) 297 + value.List(list.map(types, type_ref)) 298 + } 299 + _ -> value.Null 300 + } 301 + 180 302 // Handle wrapping types (LIST/NON_NULL) differently 181 303 let name = case kind { 182 304 "LIST" -> value.Null ··· 195 317 #("description", description), 196 318 #("fields", fields), 197 319 #("interfaces", value.List([])), 198 - #("possibleTypes", value.Null), 320 + #("possibleTypes", possible_types), 199 321 #("enumValues", enum_values), 200 322 #("inputFields", input_fields), 201 323 #("ofType", of_type),
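The "most complete version wins" deduplication rule above is compact enough to model outside Gleam. A hypothetical Python sketch (dict stand-ins for `schema.Type`, not the library's actual API):

```python
def dedupe_by_name(types):
    """For each type name, keep the candidate with the most fields.

    Models the deduplicate_types_by_name rule from the diff above;
    `types` is a hypothetical list of {"name": str, "fields": list}
    dicts, not the real Gleam schema.Type values.
    """
    best = {}
    for t in types:
        name = t["name"]
        if name not in best or len(t["fields"]) > len(best[name]["fields"]):
            best[name] = t
    return list(best.values())


# Two versions of "Post" collected from different schema-building passes:
# the version carrying more fields wins.
collected = [
    {"name": "Post", "fields": ["uri"]},
    {"name": "Post", "fields": ["uri", "text", "createdAt"]},
    {"name": "Like", "fields": ["subject"]},
]
deduped = dedupe_by_name(collected)
```

The same tie-breaking idea extends to enums and input objects by counting enum values or input fields instead, as `get_type_content_count` does.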
+19 -11
graphql/src/graphql/parser.gleam
··· 212 212 case parse_selection_set(tokens) { 213 213 Ok(#(selections, remaining)) -> { 214 214 let op = Query(selections) 215 - // Don't continue parsing if we have operations already - single anonymous query 216 - case acc { 217 - [] -> Ok(#(list.reverse([op]), remaining)) 218 - _ -> parse_operations(remaining, [op, ..acc]) 219 - } 215 + // Continue parsing to see if there are more operations (e.g., fragment definitions) 216 + parse_operations(remaining, [op, ..acc]) 220 217 } 221 218 Error(err) -> Error(err) 222 219 } ··· 281 278 parse_selections(rest, [spread, ..acc]) 282 279 } 283 280 284 - // Field 281 + // Field with alias: "alias: fieldName" 282 + [lexer.Name(alias), lexer.Colon, lexer.Name(field_name), ..rest] -> { 283 + case parse_field_with_alias(field_name, Some(alias), rest) { 284 + Ok(#(field, remaining)) -> { 285 + parse_selections(remaining, [field, ..acc]) 286 + } 287 + Error(err) -> Error(err) 288 + } 289 + } 290 + 291 + // Field without alias 285 292 [lexer.Name(name), ..rest] -> { 286 - case parse_field(name, rest) { 293 + case parse_field_with_alias(name, None, rest) { 287 294 Ok(#(field, remaining)) -> { 288 295 parse_selections(remaining, [field, ..acc]) 289 296 } ··· 297 304 } 298 305 } 299 306 300 - /// Parse a field with optional arguments and nested selections 301 - fn parse_field( 307 + /// Parse a field with optional alias, arguments and nested selections 308 + fn parse_field_with_alias( 302 309 name: String, 310 + alias: Option(String), 303 311 tokens: List(lexer.Token), 304 312 ) -> Result(#(Selection, List(lexer.Token)), ParseError) { 305 313 // Parse arguments if present ··· 319 327 [lexer.BraceOpen, ..] 
-> { 320 328 case parse_nested_selections(after_args) { 321 329 Ok(#(nested, remaining)) -> 322 - Ok(#(Field(name, None, arguments, nested), remaining)) 330 + Ok(#(Field(name, alias, arguments, nested), remaining)) 323 331 Error(err) -> Error(err) 324 332 } 325 333 } 326 - _ -> Ok(#(Field(name, None, arguments, []), after_args)) 334 + _ -> Ok(#(Field(name, alias, arguments, []), after_args)) 327 335 } 328 336 } 329 337
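With the alias branch in place, the parser accepts the standard GraphQL `responseKey: fieldName` form, producing `Field(name, Some(alias), ...)` nodes. For example:

```graphql
{
  greeting: hello   # served by the "hello" resolver, returned under "greeting"
  num: number       # served by the "number" resolver, returned under "num"
}
```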
+62
graphql/src/graphql/schema.gleam
··· 49 49 ObjectType(name: String, description: String, fields: List(Field)) 50 50 InputObjectType(name: String, description: String, fields: List(InputField)) 51 51 EnumType(name: String, description: String, values: List(EnumValue)) 52 + UnionType( 53 + name: String, 54 + description: String, 55 + possible_types: List(Type), 56 + type_resolver: fn(Context) -> Result(String, String), 57 + ) 52 58 ListType(inner_type: Type) 53 59 NonNullType(inner_type: Type) 54 60 } ··· 140 146 InputObjectType(name, description, fields) 141 147 } 142 148 149 + pub fn union_type( 150 + name: String, 151 + description: String, 152 + possible_types: List(Type), 153 + type_resolver: fn(Context) -> Result(String, String), 154 + ) -> Type { 155 + UnionType(name, description, possible_types, type_resolver) 156 + } 157 + 143 158 pub fn list_type(inner_type: Type) -> Type { 144 159 ListType(inner_type) 145 160 } ··· 205 220 ObjectType(name, _, _) -> name 206 221 InputObjectType(name, _, _) -> name 207 222 EnumType(name, _, _) -> name 223 + UnionType(name, _, _, _) -> name 208 224 ListType(inner) -> "[" <> type_name(inner) <> "]" 209 225 NonNullType(inner) -> type_name(inner) <> "!" 
210 226 } ··· 405 421 } 406 422 } 407 423 424 + /// Check if type is a union 425 + pub fn is_union(t: Type) -> Bool { 426 + case t { 427 + UnionType(_, _, _, _) -> True 428 + _ -> False 429 + } 430 + } 431 + 432 + /// Get the possible types from a union 433 + pub fn get_possible_types(t: Type) -> List(Type) { 434 + case t { 435 + UnionType(_, _, possible_types, _) -> possible_types 436 + _ -> [] 437 + } 438 + } 439 + 440 + /// Resolve a union type to its concrete type using the type resolver 441 + pub fn resolve_union_type(t: Type, ctx: Context) -> Result(Type, String) { 442 + case t { 443 + UnionType(_, _, possible_types, type_resolver) -> { 444 + // Call the type resolver to get the concrete type name 445 + case type_resolver(ctx) { 446 + Ok(resolved_type_name) -> { 447 + // Find the concrete type in possible_types 448 + case 449 + list.find(possible_types, fn(pt) { 450 + type_name(pt) == resolved_type_name 451 + }) 452 + { 453 + Ok(concrete_type) -> Ok(concrete_type) 454 + Error(_) -> 455 + Error( 456 + "Type resolver returned '" 457 + <> resolved_type_name 458 + <> "' which is not a possible type of this union", 459 + ) 460 + } 461 + } 462 + Error(err) -> Error(err) 463 + } 464 + } 465 + _ -> Error("Cannot resolve non-union type") 466 + } 467 + } 468 + 408 469 /// Get the inner type from a wrapping type (List or NonNull) 409 470 pub fn inner_type(t: Type) -> option.Option(Type) { 410 471 case t { ··· 421 482 ObjectType(_, _, _) -> "OBJECT" 422 483 InputObjectType(_, _, _) -> "INPUT_OBJECT" 423 484 EnumType(_, _, _) -> "ENUM" 485 + UnionType(_, _, _, _) -> "UNION" 424 486 ListType(_) -> "LIST" 425 487 NonNullType(_) -> "NON_NULL" 426 488 }
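`resolve_union_type` is a name-based dispatch: the resolver produces a type name, which must match one of the union's members. A hypothetical Python model of that flow (dict stand-ins, not the Gleam API):

```python
def resolve_union(possible_types, resolve_type, ctx):
    """Return the union member whose name the resolver picked.

    `possible_types` is a hypothetical list of {"name": str} dicts;
    raising ValueError models the Error branch taken when the resolver
    names a type that is not a member of the union.
    """
    name = resolve_type(ctx)
    for t in possible_types:
        if t["name"] == name:
            return t
    raise ValueError(
        "Type resolver returned '" + name
        + "' which is not a possible type of this union"
    )


members = [{"name": "Post"}, {"name": "Comment"}]
resolved = resolve_union(members, lambda _ctx: "Post", None)
```

Keeping the resolver as a plain `Context -> Result(String, String)` function lets callers decide how to discriminate (e.g. by inspecting a `__typename` field on the data, as the executor tests do).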
+23
graphql/src/graphql/sdl.gleam
··· 31 31 "INPUT_OBJECT" -> print_input_object(type_, indent_level, inline) 32 32 "OBJECT" -> print_object(type_, indent_level, inline) 33 33 "ENUM" -> print_enum(type_, indent_level, inline) 34 + "UNION" -> print_union(type_, indent_level, inline) 34 35 "SCALAR" -> print_scalar(type_, indent_level, inline) 35 36 "LIST" -> { 36 37 case schema.inner_type(type_) { ··· 62 63 } 63 64 64 65 desc_block <> indent <> "scalar " <> schema.type_name(type_) 66 + } 67 + } 68 + } 69 + 70 + fn print_union(type_: schema.Type, indent_level: Int, inline: Bool) -> String { 71 + case inline { 72 + True -> schema.type_name(type_) 73 + False -> { 74 + let type_name = schema.type_name(type_) 75 + let indent = string.repeat(" ", indent_level * 2) 76 + let description = schema.type_description(type_) 77 + let desc_block = case description { 78 + "" -> "" 79 + _ -> indent <> format_description(description) <> "\n" 80 + } 81 + 82 + let possible_types = schema.get_possible_types(type_) 83 + let type_names = 84 + list.map(possible_types, fn(t) { schema.type_name(t) }) 85 + |> string.join(" | ") 86 + 87 + desc_block <> indent <> "union " <> type_name <> " = " <> type_names 65 88 } 66 89 } 67 90 }
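For a union like the `SearchResult` type used in the tests, `print_union` emits SDL of roughly this shape (exact description quoting depends on `format_description`):

```graphql
"A search result"
union SearchResult = Post | Comment
```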
+291
graphql/test/executor_test.gleam
··· 579 579 content: format_response(response), 580 580 ) 581 581 } 582 + 583 + // Union type execution tests 584 + pub fn execute_union_with_inline_fragment_test() { 585 + // Create object types that will be part of the union 586 + let post_type = 587 + schema.object_type("Post", "A blog post", [ 588 + schema.field("title", schema.string_type(), "Post title", fn(ctx) { 589 + case ctx.data { 590 + option.Some(value.Object(fields)) -> { 591 + case list.key_find(fields, "title") { 592 + Ok(title_val) -> Ok(title_val) 593 + Error(_) -> Ok(value.Null) 594 + } 595 + } 596 + _ -> Ok(value.Null) 597 + } 598 + }), 599 + schema.field("content", schema.string_type(), "Post content", fn(ctx) { 600 + case ctx.data { 601 + option.Some(value.Object(fields)) -> { 602 + case list.key_find(fields, "content") { 603 + Ok(content_val) -> Ok(content_val) 604 + Error(_) -> Ok(value.Null) 605 + } 606 + } 607 + _ -> Ok(value.Null) 608 + } 609 + }), 610 + ]) 611 + 612 + let comment_type = 613 + schema.object_type("Comment", "A comment", [ 614 + schema.field("text", schema.string_type(), "Comment text", fn(ctx) { 615 + case ctx.data { 616 + option.Some(value.Object(fields)) -> { 617 + case list.key_find(fields, "text") { 618 + Ok(text_val) -> Ok(text_val) 619 + Error(_) -> Ok(value.Null) 620 + } 621 + } 622 + _ -> Ok(value.Null) 623 + } 624 + }), 625 + ]) 626 + 627 + // Type resolver that examines the __typename field 628 + let type_resolver = fn(ctx: schema.Context) -> Result(String, String) { 629 + case ctx.data { 630 + option.Some(value.Object(fields)) -> { 631 + case list.key_find(fields, "__typename") { 632 + Ok(value.String(type_name)) -> Ok(type_name) 633 + _ -> Error("No __typename field found") 634 + } 635 + } 636 + _ -> Error("No data") 637 + } 638 + } 639 + 640 + // Create union type 641 + let search_result_union = 642 + schema.union_type( 643 + "SearchResult", 644 + "A search result", 645 + [post_type, comment_type], 646 + type_resolver, 647 + ) 648 + 649 + // Create query type 
with a field returning the union 650 + let query_type = 651 + schema.object_type("Query", "Root query type", [ 652 + schema.field("search", search_result_union, "Search for content", fn(_ctx) { 653 + // Return a Post 654 + Ok( 655 + value.Object([ 656 + #("__typename", value.String("Post")), 657 + #("title", value.String("GraphQL is awesome")), 658 + #("content", value.String("Learn all about GraphQL...")), 659 + ]), 660 + ) 661 + }), 662 + ]) 663 + 664 + let test_schema = schema.schema(query_type, None) 665 + 666 + // Query with inline fragment 667 + let query = 668 + " 669 + { 670 + search { 671 + ... on Post { 672 + title 673 + content 674 + } 675 + ... on Comment { 676 + text 677 + } 678 + } 679 + } 680 + " 681 + 682 + let result = executor.execute(query, test_schema, schema.context(None)) 683 + 684 + let response = case result { 685 + Ok(r) -> r 686 + Error(_) -> panic as "Execution failed" 687 + } 688 + 689 + birdie.snap( 690 + title: "Execute union with inline fragment", 691 + content: format_response(response), 692 + ) 693 + } 694 + 695 + pub fn execute_union_list_with_inline_fragments_test() { 696 + // Create object types 697 + let post_type = 698 + schema.object_type("Post", "A blog post", [ 699 + schema.field("title", schema.string_type(), "Post title", fn(ctx) { 700 + case ctx.data { 701 + option.Some(value.Object(fields)) -> { 702 + case list.key_find(fields, "title") { 703 + Ok(title_val) -> Ok(title_val) 704 + Error(_) -> Ok(value.Null) 705 + } 706 + } 707 + _ -> Ok(value.Null) 708 + } 709 + }), 710 + ]) 711 + 712 + let comment_type = 713 + schema.object_type("Comment", "A comment", [ 714 + schema.field("text", schema.string_type(), "Comment text", fn(ctx) { 715 + case ctx.data { 716 + option.Some(value.Object(fields)) -> { 717 + case list.key_find(fields, "text") { 718 + Ok(text_val) -> Ok(text_val) 719 + Error(_) -> Ok(value.Null) 720 + } 721 + } 722 + _ -> Ok(value.Null) 723 + } 724 + }), 725 + ]) 726 + 727 + // Type resolver 728 + let 
type_resolver = fn(ctx: schema.Context) -> Result(String, String) { 729 + case ctx.data { 730 + option.Some(value.Object(fields)) -> { 731 + case list.key_find(fields, "__typename") { 732 + Ok(value.String(type_name)) -> Ok(type_name) 733 + _ -> Error("No __typename field found") 734 + } 735 + } 736 + _ -> Error("No data") 737 + } 738 + } 739 + 740 + // Create union type 741 + let search_result_union = 742 + schema.union_type( 743 + "SearchResult", 744 + "A search result", 745 + [post_type, comment_type], 746 + type_resolver, 747 + ) 748 + 749 + // Create query type with a list of unions 750 + let query_type = 751 + schema.object_type("Query", "Root query type", [ 752 + schema.field( 753 + "searchAll", 754 + schema.list_type(search_result_union), 755 + "Search for all content", 756 + fn(_ctx) { 757 + // Return a list with mixed types 758 + Ok( 759 + value.List([ 760 + value.Object([ 761 + #("__typename", value.String("Post")), 762 + #("title", value.String("First Post")), 763 + ]), 764 + value.Object([ 765 + #("__typename", value.String("Comment")), 766 + #("text", value.String("Great article!")), 767 + ]), 768 + value.Object([ 769 + #("__typename", value.String("Post")), 770 + #("title", value.String("Second Post")), 771 + ]), 772 + ]), 773 + ) 774 + }, 775 + ), 776 + ]) 777 + 778 + let test_schema = schema.schema(query_type, None) 779 + 780 + // Query with inline fragments on list items 781 + let query = 782 + " 783 + { 784 + searchAll { 785 + ... on Post { 786 + title 787 + } 788 + ... 
on Comment { 789 + text 790 + } 791 + } 792 + } 793 + " 794 + 795 + let result = executor.execute(query, test_schema, schema.context(None)) 796 + 797 + let response = case result { 798 + Ok(r) -> r 799 + Error(_) -> panic as "Execution failed" 800 + } 801 + 802 + birdie.snap( 803 + title: "Execute union list with inline fragments", 804 + content: format_response(response), 805 + ) 806 + } 807 + 808 + // Test field aliases 809 + pub fn execute_field_with_alias_test() { 810 + let schema = test_schema() 811 + let query = "{ greeting: hello }" 812 + 813 + let result = executor.execute(query, schema, schema.context(None)) 814 + 815 + let response = case result { 816 + Ok(r) -> r 817 + Error(_) -> panic as "Execution failed" 818 + } 819 + 820 + // Response should contain "greeting" as the key, not "hello" 821 + case response.data { 822 + value.Object(fields) -> { 823 + case list.key_find(fields, "greeting") { 824 + Ok(_) -> should.be_true(True) 825 + Error(_) -> { 826 + // Check if it incorrectly used "hello" instead 827 + case list.key_find(fields, "hello") { 828 + Ok(_) -> panic as "Alias not applied - used 'hello' instead of 'greeting'" 829 + Error(_) -> panic as "Neither 'greeting' nor 'hello' found in response" 830 + } 831 + } 832 + } 833 + } 834 + _ -> panic as "Expected object response" 835 + } 836 + } 837 + 838 + // Test multiple aliases 839 + pub fn execute_multiple_fields_with_aliases_test() { 840 + let schema = test_schema() 841 + let query = "{ greeting: hello num: number }" 842 + 843 + let result = executor.execute(query, schema, schema.context(None)) 844 + 845 + let response = case result { 846 + Ok(r) -> r 847 + Error(_) -> panic as "Execution failed" 848 + } 849 + 850 + birdie.snap( 851 + title: "Execute multiple fields with aliases", 852 + content: format_response(response), 853 + ) 854 + } 855 + 856 + // Test mixed aliased and non-aliased fields 857 + pub fn execute_mixed_aliased_fields_test() { 858 + let schema = test_schema() 859 + let query = "{ 
greeting: hello number }" 860 + 861 + let result = executor.execute(query, schema, schema.context(None)) 862 + 863 + let response = case result { 864 + Ok(r) -> r 865 + Error(_) -> panic as "Execution failed" 866 + } 867 + 868 + birdie.snap( 869 + title: "Execute mixed aliased and non-aliased fields", 870 + content: format_response(response), 871 + ) 872 + }
+362
graphql/test/introspection_test.gleam
··· 309 309 } 310 310 |> should.be_true 311 311 } 312 + 313 + /// Test: Basic __type query 314 + /// Verifies that __type(name: "TypeName") returns the correct type 315 + pub fn type_basic_query_test() { 316 + let schema = test_schema() 317 + let query = "{ __type(name: \"Query\") { name kind } }" 318 + 319 + let result = executor.execute(query, schema, schema.context(None)) 320 + 321 + should.be_ok(result) 322 + |> fn(response) { 323 + case response { 324 + executor.Response(data: value.Object(fields), errors: []) -> { 325 + case list.key_find(fields, "__type") { 326 + Ok(value.Object(type_fields)) -> { 327 + // Check name and kind 328 + let has_correct_name = case list.key_find(type_fields, "name") { 329 + Ok(value.String("Query")) -> True 330 + _ -> False 331 + } 332 + let has_correct_kind = case list.key_find(type_fields, "kind") { 333 + Ok(value.String("OBJECT")) -> True 334 + _ -> False 335 + } 336 + has_correct_name && has_correct_kind 337 + } 338 + _ -> False 339 + } 340 + } 341 + _ -> False 342 + } 343 + } 344 + |> should.be_true 345 + } 346 + 347 + /// Test: __type query with nested fields 348 + /// Verifies that nested selections work correctly on __type 349 + pub fn type_nested_fields_test() { 350 + let schema = test_schema() 351 + let query = "{ __type(name: \"Query\") { name kind fields { name type { name kind } } } }" 352 + 353 + let result = executor.execute(query, schema, schema.context(None)) 354 + 355 + should.be_ok(result) 356 + |> fn(response) { 357 + case response { 358 + executor.Response(data: value.Object(fields), errors: []) -> { 359 + case list.key_find(fields, "__type") { 360 + Ok(value.Object(type_fields)) -> { 361 + // Check that fields exists and is a list 362 + case list.key_find(type_fields, "fields") { 363 + Ok(value.List(field_list)) -> { 364 + // Should have 2 fields (hello and number) 365 + list.length(field_list) == 2 366 + && list.all(field_list, fn(field_val) { 367 + case field_val { 368 + value.Object(field_fields) -> { 369 
+ let has_name = case list.key_find(field_fields, "name") { 370 + Ok(value.String(_)) -> True 371 + _ -> False 372 + } 373 + let has_type = case list.key_find(field_fields, "type") { 374 + Ok(value.Object(_)) -> True 375 + _ -> False 376 + } 377 + has_name && has_type 378 + } 379 + _ -> False 380 + } 381 + }) 382 + } 383 + _ -> False 384 + } 385 + } 386 + _ -> False 387 + } 388 + } 389 + _ -> False 390 + } 391 + } 392 + |> should.be_true 393 + } 394 + 395 + /// Test: __type query for scalar types 396 + /// Verifies that __type works for built-in scalar types 397 + pub fn type_scalar_query_test() { 398 + let schema = test_schema() 399 + let query = "{ __type(name: \"String\") { name kind } }" 400 + 401 + let result = executor.execute(query, schema, schema.context(None)) 402 + 403 + should.be_ok(result) 404 + |> fn(response) { 405 + case response { 406 + executor.Response(data: value.Object(fields), errors: []) -> { 407 + case list.key_find(fields, "__type") { 408 + Ok(value.Object(type_fields)) -> { 409 + // Check name and kind 410 + let has_correct_name = case list.key_find(type_fields, "name") { 411 + Ok(value.String("String")) -> True 412 + _ -> False 413 + } 414 + let has_correct_kind = case list.key_find(type_fields, "kind") { 415 + Ok(value.String("SCALAR")) -> True 416 + _ -> False 417 + } 418 + has_correct_name && has_correct_kind 419 + } 420 + _ -> False 421 + } 422 + } 423 + _ -> False 424 + } 425 + } 426 + |> should.be_true 427 + } 428 + 429 + /// Test: __type query for non-existent type 430 + /// Verifies that __type returns null for types that don't exist 431 + pub fn type_not_found_test() { 432 + let schema = test_schema() 433 + let query = "{ __type(name: \"NonExistentType\") { name kind } }" 434 + 435 + let result = executor.execute(query, schema, schema.context(None)) 436 + 437 + should.be_ok(result) 438 + |> fn(response) { 439 + case response { 440 + executor.Response(data: value.Object(fields), errors: []) -> { 441 + case list.key_find(fields, 
"__type") { 442 + Ok(value.Null) -> True 443 + _ -> False 444 + } 445 + } 446 + _ -> False 447 + } 448 + } 449 + |> should.be_true 450 + } 451 + 452 + /// Test: __type query without name argument 453 + /// Verifies that __type returns an error when name argument is missing 454 + pub fn type_missing_argument_test() { 455 + let schema = test_schema() 456 + let query = "{ __type { name kind } }" 457 + 458 + let result = executor.execute(query, schema, schema.context(None)) 459 + 460 + should.be_ok(result) 461 + |> fn(response) { 462 + case response { 463 + executor.Response(data: value.Object(fields), errors: errors) -> { 464 + // Should have __type field as null 465 + let has_null_type = case list.key_find(fields, "__type") { 466 + Ok(value.Null) -> True 467 + _ -> False 468 + } 469 + // Should have an error 470 + let has_error = errors != [] 471 + has_null_type && has_error 472 + } 473 + _ -> False 474 + } 475 + } 476 + |> should.be_true 477 + } 478 + 479 + /// Test: Combined __type and __schema query 480 + /// Verifies that __type and __schema can be queried together 481 + pub fn type_and_schema_combined_test() { 482 + let schema = test_schema() 483 + let query = "{ __schema { queryType { name } } __type(name: \"String\") { name kind } }" 484 + 485 + let result = executor.execute(query, schema, schema.context(None)) 486 + 487 + should.be_ok(result) 488 + |> fn(response) { 489 + case response { 490 + executor.Response(data: value.Object(fields), errors: []) -> { 491 + let has_schema = case list.key_find(fields, "__schema") { 492 + Ok(value.Object(_)) -> True 493 + _ -> False 494 + } 495 + let has_type = case list.key_find(fields, "__type") { 496 + Ok(value.Object(_)) -> True 497 + _ -> False 498 + } 499 + has_schema && has_type 500 + } 501 + _ -> False 502 + } 503 + } 504 + |> should.be_true 505 + } 506 + 507 + /// Test: Deep introspection queries complete without hanging 508 + /// This test verifies that the cycle detection prevents infinite loops 509 + /// by 
successfully completing a deeply nested introspection query 510 + pub fn deep_introspection_test() { 511 + let schema = test_schema() 512 + 513 + // Query with deep nesting including ofType chains 514 + // Without cycle detection, this could cause infinite loops 515 + let query = 516 + "{ __schema { types { name kind fields { name type { name kind ofType { name kind ofType { name } } } } } } }" 517 + 518 + let result = executor.execute(query, schema, schema.context(None)) 519 + 520 + // The key test: should complete without hanging 521 + should.be_ok(result) 522 + |> fn(response) { 523 + case response { 524 + executor.Response(data: value.Object(fields), errors: _errors) -> { 525 + // Should have __schema field with types 526 + case list.key_find(fields, "__schema") { 527 + Ok(value.Object(schema_fields)) -> { 528 + case list.key_find(schema_fields, "types") { 529 + Ok(value.List(types)) -> types != [] 530 + _ -> False 531 + } 532 + } 533 + _ -> False 534 + } 535 + } 536 + _ -> False 537 + } 538 + } 539 + |> should.be_true 540 + } 541 + 542 + /// Test: Fragment spreads work in introspection queries 543 + /// Verifies that fragment spreads like those used by GraphiQL work correctly 544 + pub fn introspection_fragment_spread_test() { 545 + // Create a schema with an ENUM type 546 + let sort_enum = 547 + schema.enum_type("SortDirection", "Sort direction", [ 548 + schema.enum_value("ASC", "Ascending"), 549 + schema.enum_value("DESC", "Descending"), 550 + ]) 551 + 552 + let query_type = 553 + schema.object_type("Query", "Root query", [ 554 + schema.field("items", schema.list_type(schema.string_type()), "", fn(_) { 555 + Ok(value.List([value.String("a"), value.String("b")])) 556 + }), 557 + schema.field("sort", sort_enum, "", fn(_) { Ok(value.String("ASC")) }), 558 + ]) 559 + 560 + let test_schema = schema.schema(query_type, None) 561 + 562 + // Use a fragment spread like GraphiQL does 563 + let query = 564 + " 565 + query IntrospectionQuery { 566 + __schema { 567 + 
types { 568 + ...FullType 569 + } 570 + } 571 + } 572 + 573 + fragment FullType on __Type { 574 + kind 575 + name 576 + enumValues(includeDeprecated: true) { 577 + name 578 + description 579 + } 580 + } 581 + " 582 + 583 + let result = executor.execute(query, test_schema, schema.context(None)) 584 + 585 + should.be_ok(result) 586 + |> fn(response) { 587 + case response { 588 + executor.Response(data: value.Object(fields), errors: _) -> { 589 + case list.key_find(fields, "__schema") { 590 + Ok(value.Object(schema_fields)) -> { 591 + case list.key_find(schema_fields, "types") { 592 + Ok(value.List(types)) -> { 593 + // Find the SortDirection enum 594 + let enum_type = 595 + list.find(types, fn(t) { 596 + case t { 597 + value.Object(type_fields) -> { 598 + case list.key_find(type_fields, "name") { 599 + Ok(value.String("SortDirection")) -> True 600 + _ -> False 601 + } 602 + } 603 + _ -> False 604 + } 605 + }) 606 + 607 + case enum_type { 608 + Ok(value.Object(type_fields)) -> { 609 + // Should have kind field from fragment 610 + let has_kind = case list.key_find(type_fields, "kind") { 611 + Ok(value.String("ENUM")) -> True 612 + _ -> False 613 + } 614 + 615 + // Should have enumValues field from fragment 616 + let has_enum_values = case 617 + list.key_find(type_fields, "enumValues") 618 + { 619 + Ok(value.List(values)) -> list.length(values) == 2 620 + _ -> False 621 + } 622 + 623 + has_kind && has_enum_values 624 + } 625 + _ -> False 626 + } 627 + } 628 + _ -> False 629 + } 630 + } 631 + _ -> False 632 + } 633 + } 634 + _ -> False 635 + } 636 + } 637 + |> should.be_true 638 + } 639 + 640 + /// Test: Simple fragment on __type 641 + pub fn simple_type_fragment_test() { 642 + let schema = test_schema() 643 + 644 + let query = "{ __type(name: \"Query\") { ...TypeFrag } } fragment TypeFrag on __Type { name kind }" 645 + 646 + let result = executor.execute(query, schema, schema.context(None)) 647 + 648 + should.be_ok(result) 649 + |> fn(response) { 650 + case response { 
651 + executor.Response(data: value.Object(fields), errors: _) -> { 652 + case list.key_find(fields, "__type") { 653 + Ok(value.Object(type_fields)) -> { 654 + // Check if we got an error about fragment not found 655 + case list.key_find(type_fields, "__FRAGMENT_ERROR") { 656 + Ok(value.String(msg)) -> { 657 + // Fragment wasn't found 658 + panic as msg 659 + } 660 + _ -> { 661 + // No error, check if we have actual fields 662 + type_fields != [] 663 + } 664 + } 665 + } 666 + _ -> False 667 + } 668 + } 669 + _ -> False 670 + } 671 + } 672 + |> should.be_true 673 + }
+36
graphql/test/parser_test.gleam
··· 2 2 /// 3 3 /// GraphQL spec Section 2 - Language 4 4 /// Parse tokens into Abstract Syntax Tree 5 + import gleam/list 5 6 import gleam/option.{None} 6 7 import gleeunit/should 7 8 import graphql/parser ··· 246 247 ])), 247 248 ]) -> True 248 249 _ -> False 250 + } 251 + } 252 + |> should.be_true 253 + } 254 + 255 + pub fn parse_fragment_single_line_test() { 256 + // The multiline version works - let's try it 257 + " 258 + { __type(name: \"Query\") { ...TypeFrag } } 259 + fragment TypeFrag on __Type { name kind } 260 + " 261 + |> parser.parse 262 + |> should.be_ok 263 + |> fn(doc) { 264 + case doc { 265 + parser.Document(operations) -> list.length(operations) == 2 266 + } 267 + } 268 + |> should.be_true 269 + } 270 + 271 + pub fn parse_fragment_truly_single_line_test() { 272 + // This is the problematic single-line version 273 + "{ __type(name: \"Query\") { ...TypeFrag } } fragment TypeFrag on __Type { name kind }" 274 + |> parser.parse 275 + |> should.be_ok 276 + |> fn(doc) { 277 + case doc { 278 + parser.Document(operations) -> { 279 + // If we only got 1 operation, the parser stopped after the query 280 + case operations { 281 + [parser.Query(_)] -> panic as "Only got Query - fragment was not parsed" 282 + _ -> list.length(operations) == 2 283 + } 284 + } 249 285 } 250 286 } 251 287 |> should.be_true
+131
graphql/test/schema_test.gleam
··· 82 82 83 83 should.be_true(True) 84 84 } 85 + 86 + // Union type tests 87 + pub fn create_union_type_test() { 88 + let post_type = 89 + schema.object_type("Post", "A blog post", [ 90 + schema.field("title", schema.string_type(), "Post title", fn(_ctx) { 91 + Ok(value.String("Hello")) 92 + }), 93 + ]) 94 + 95 + let comment_type = 96 + schema.object_type("Comment", "A comment", [ 97 + schema.field("text", schema.string_type(), "Comment text", fn(_ctx) { 98 + Ok(value.String("Nice post")) 99 + }), 100 + ]) 101 + 102 + let type_resolver = fn(_ctx: schema.Context) -> Result(String, String) { 103 + Ok("Post") 104 + } 105 + 106 + let union_type = 107 + schema.union_type( 108 + "SearchResult", 109 + "A search result", 110 + [post_type, comment_type], 111 + type_resolver, 112 + ) 113 + 114 + should.equal(schema.type_name(union_type), "SearchResult") 115 + should.be_true(schema.is_union(union_type)) 116 + } 117 + 118 + pub fn union_possible_types_test() { 119 + let post_type = 120 + schema.object_type("Post", "A blog post", [ 121 + schema.field("title", schema.string_type(), "Post title", fn(_ctx) { 122 + Ok(value.String("Hello")) 123 + }), 124 + ]) 125 + 126 + let comment_type = 127 + schema.object_type("Comment", "A comment", [ 128 + schema.field("text", schema.string_type(), "Comment text", fn(_ctx) { 129 + Ok(value.String("Nice post")) 130 + }), 131 + ]) 132 + 133 + let type_resolver = fn(_ctx: schema.Context) -> Result(String, String) { 134 + Ok("Post") 135 + } 136 + 137 + let union_type = 138 + schema.union_type( 139 + "SearchResult", 140 + "A search result", 141 + [post_type, comment_type], 142 + type_resolver, 143 + ) 144 + 145 + let possible_types = schema.get_possible_types(union_type) 146 + should.equal(possible_types, [post_type, comment_type]) 147 + } 148 + 149 + pub fn resolve_union_type_test() { 150 + let post_type = 151 + schema.object_type("Post", "A blog post", [ 152 + schema.field("title", schema.string_type(), "Post title", fn(_ctx) { 153 + 
Ok(value.String("Hello")) 154 + }), 155 + ]) 156 + 157 + let comment_type = 158 + schema.object_type("Comment", "A comment", [ 159 + schema.field("text", schema.string_type(), "Comment text", fn(_ctx) { 160 + Ok(value.String("Nice post")) 161 + }), 162 + ]) 163 + 164 + // Type resolver that examines the __typename field in the data 165 + let type_resolver = fn(ctx: schema.Context) -> Result(String, String) { 166 + case ctx.data { 167 + None -> Error("No data") 168 + option.Some(value.Object(fields)) -> { 169 + case fields { 170 + [#("__typename", value.String(type_name)), ..] -> Ok(type_name) 171 + _ -> Error("No __typename field") 172 + } 173 + } 174 + _ -> Error("Data is not an object") 175 + } 176 + } 177 + 178 + let union_type = 179 + schema.union_type( 180 + "SearchResult", 181 + "A search result", 182 + [post_type, comment_type], 183 + type_resolver, 184 + ) 185 + 186 + // Create context with data that has __typename 187 + let data = 188 + value.Object([#("__typename", value.String("Post")), #("title", value.String("Test"))]) 189 + let ctx = schema.context(option.Some(data)) 190 + let result = schema.resolve_union_type(union_type, ctx) 191 + 192 + case result { 193 + Ok(resolved_type) -> 194 + should.equal(schema.type_name(resolved_type), "Post") 195 + Error(_) -> should.be_true(False) 196 + } 197 + } 198 + 199 + pub fn union_type_kind_test() { 200 + let post_type = 201 + schema.object_type("Post", "A blog post", [ 202 + schema.field("title", schema.string_type(), "Post title", fn(_ctx) { 203 + Ok(value.String("Hello")) 204 + }), 205 + ]) 206 + 207 + let type_resolver = fn(_ctx: schema.Context) -> Result(String, String) { 208 + Ok("Post") 209 + } 210 + 211 + let union_type = 212 + schema.union_type("SearchResult", "A search result", [post_type], type_resolver) 213 + 214 + should.equal(schema.type_kind(union_type), "UNION") 215 + }
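The union tests above all follow the same pattern: a union type carries its possible member types plus a type resolver that inspects the runtime data's `__typename` to pick the concrete type. As a language-neutral sketch of that dispatch (Python here, not the project's Gleam; the dict shape for types is an assumption of this example):

```python
# Sketch of the union-resolution pattern exercised by resolve_union_type_test:
# given the union's possible member types and the runtime data, dispatch on
# the data's __typename field. Types are modeled as plain dicts for brevity.

def resolve_union_type(possible_types, data):
    """Pick the concrete member type for a union value, or raise."""
    type_name = (data or {}).get("__typename")
    if type_name is None:
        raise ValueError("No __typename field")
    for t in possible_types:
        if t["name"] == type_name:
            return t
    raise ValueError(type_name + " is not a possible type of this union")
```

The Gleam version threads the data through a `schema.Context` instead of passing it directly, but the lookup logic is the same.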
+99 -99
lexicon_graphql/birdie_snapshots/all_types_generated_by_db_schema_builder_including_connection,_edge,_page_info,_sort_field_enum,_where_input,_etc.accepted
··· 4 4 file: ./test/sorting_test.gleam 5 5 test_name: db_schema_all_types_snapshot_test 6 6 --- 7 + scalar Boolean 8 + 7 9 """Result of a delete mutation""" 8 10 type DeleteResult { 9 11 """URI of deleted record""" 10 12 uri: String 11 13 } 12 14 13 - """Input type for XyzStatusphereStatusInput""" 14 - input XyzStatusphereStatusInput { 15 - """Input field for text""" 16 - text: String 17 - """Input field for createdAt""" 18 - createdAt: String 19 - } 15 + scalar Int 20 16 21 17 """Root mutation type""" 22 18 type Mutation { ··· 28 24 deleteXyzStatusphereStatus: DeleteResult 29 25 } 30 26 31 - """Filter operators for XyzStatusphereStatus fields""" 32 - input XyzStatusphereStatusFieldCondition { 33 - """Exact match (equals)""" 34 - eq: String 35 - """Match any value in the list""" 36 - in: [String!] 37 - """Case-insensitive substring match (string fields only)""" 38 - contains: String 39 - """Greater than""" 40 - gt: String 41 - """Greater than or equal to""" 42 - gte: String 43 - """Less than""" 44 - lt: String 45 - """Less than or equal to""" 46 - lte: String 47 - } 48 - 49 - """Filter conditions for XyzStatusphereStatus with nested AND/OR support""" 50 - input XyzStatusphereStatusWhereInput { 51 - """Filter by uri""" 52 - uri: XyzStatusphereStatusFieldCondition 53 - """Filter by cid""" 54 - cid: XyzStatusphereStatusFieldCondition 55 - """Filter by did""" 56 - did: XyzStatusphereStatusFieldCondition 57 - """Filter by collection""" 58 - collection: XyzStatusphereStatusFieldCondition 59 - """Filter by indexedAt""" 60 - indexedAt: XyzStatusphereStatusFieldCondition 61 - """Filter by actorHandle""" 62 - actorHandle: XyzStatusphereStatusFieldCondition 63 - """Filter by text""" 64 - text: XyzStatusphereStatusFieldCondition 65 - """Filter by createdAt""" 66 - createdAt: XyzStatusphereStatusFieldCondition 67 - """All conditions must match (AND logic)""" 68 - and: [XyzStatusphereStatusWhereInput!] 
69 - """Any condition must match (OR logic)""" 70 - or: [XyzStatusphereStatusWhereInput!] 71 - } 72 - 73 - """Sort direction for query results""" 74 - enum SortDirection { 75 - """Ascending order""" 76 - ASC 77 - """Descending order""" 78 - DESC 79 - } 80 - 81 - """Available sort fields for XyzStatusphereStatus""" 82 - enum XyzStatusphereStatusSortField { 83 - """Sort by uri""" 84 - uri 85 - """Sort by cid""" 86 - cid 87 - """Sort by did""" 88 - did 89 - """Sort by collection""" 90 - collection 91 - """Sort by indexedAt""" 92 - indexedAt 93 - """Sort by text""" 94 - text 95 - """Sort by createdAt""" 96 - createdAt 97 - } 98 - 99 - """Specifies a field to sort by and its direction""" 100 - input SortFieldInput { 101 - """Field to sort by""" 102 - field: XyzStatusphereStatusSortField! 103 - """Sort direction (ASC or DESC)""" 104 - direction: SortDirection! 105 - } 106 - 107 - scalar Int 108 - 109 - scalar Boolean 110 - 111 27 """Information about pagination in a connection""" 112 28 type PageInfo { 113 29 """When paginating forwards, are there more items?""" ··· 120 36 endCursor: String 121 37 } 122 38 39 + """Root query type""" 40 + type Query { 41 + """Query xyz.statusphere.status with cursor pagination and sorting""" 42 + xyzStatusphereStatus: XyzStatusphereStatusConnection 43 + } 44 + 45 + """Sort direction for query results""" 46 + enum SortDirection { 47 + """Ascending order""" 48 + ASC 49 + """Descending order""" 50 + DESC 51 + } 52 + 123 53 scalar String 124 54 125 55 """Record type: xyz.statusphere.status""" ··· 142 72 createdAt: String 143 73 } 144 74 75 + """A connection to a list of items for XyzStatusphereStatus""" 76 + type XyzStatusphereStatusConnection { 77 + """A list of edges""" 78 + edges: [XyzStatusphereStatusEdge!]! 79 + """Information to aid in pagination""" 80 + pageInfo: PageInfo! 
81 + """Total number of items in the connection""" 82 + totalCount: Int 83 + } 84 + 145 85 """An edge in a connection for XyzStatusphereStatus""" 146 86 type XyzStatusphereStatusEdge { 147 87 """The item at the end of the edge""" ··· 150 90 cursor: String! 151 91 } 152 92 153 - """A connection to a list of items for XyzStatusphereStatus""" 154 - type XyzStatusphereStatusConnection { 155 - """A list of edges""" 156 - edges: [XyzStatusphereStatusEdge!]! 157 - """Information to aid in pagination""" 158 - pageInfo: PageInfo! 159 - """Total number of items in the connection""" 160 - totalCount: Int 93 + """Filter operators for XyzStatusphereStatus fields""" 94 + input XyzStatusphereStatusFieldCondition { 95 + """Exact match (equals)""" 96 + eq: String 97 + """Match any value in the list""" 98 + in: [String!] 99 + """Case-insensitive substring match (string fields only)""" 100 + contains: String 101 + """Greater than""" 102 + gt: String 103 + """Greater than or equal to""" 104 + gte: String 105 + """Less than""" 106 + lt: String 107 + """Less than or equal to""" 108 + lte: String 109 + } 110 + 111 + """Input type for XyzStatusphereStatusInput""" 112 + input XyzStatusphereStatusInput { 113 + """Input field for text""" 114 + text: String 115 + """Input field for createdAt""" 116 + createdAt: String 117 + } 118 + 119 + """Available sort fields for XyzStatusphereStatus""" 120 + enum XyzStatusphereStatusSortField { 121 + """Sort by uri""" 122 + uri 123 + """Sort by cid""" 124 + cid 125 + """Sort by did""" 126 + did 127 + """Sort by collection""" 128 + collection 129 + """Sort by indexedAt""" 130 + indexedAt 131 + """Sort by text""" 132 + text 133 + """Sort by createdAt""" 134 + createdAt 135 + } 136 + 137 + """Specifies a field to sort by and its direction for XyzStatusphereStatus""" 138 + input XyzStatusphereStatusSortFieldInput { 139 + """Field to sort by""" 140 + field: XyzStatusphereStatusSortField! 141 + """Sort direction (ASC or DESC)""" 142 + direction: SortDirection! 
161 143 } 162 144 163 - """Root query type""" 164 - type Query { 165 - """Query xyz.statusphere.status with cursor pagination and sorting""" 166 - xyzStatusphereStatus: XyzStatusphereStatusConnection 145 + """Filter conditions for XyzStatusphereStatus with nested AND/OR support""" 146 + input XyzStatusphereStatusWhereInput { 147 + """Filter by uri""" 148 + uri: XyzStatusphereStatusFieldCondition 149 + """Filter by cid""" 150 + cid: XyzStatusphereStatusFieldCondition 151 + """Filter by did""" 152 + did: XyzStatusphereStatusFieldCondition 153 + """Filter by collection""" 154 + collection: XyzStatusphereStatusFieldCondition 155 + """Filter by indexedAt""" 156 + indexedAt: XyzStatusphereStatusFieldCondition 157 + """Filter by actorHandle""" 158 + actorHandle: XyzStatusphereStatusFieldCondition 159 + """Filter by text""" 160 + text: XyzStatusphereStatusFieldCondition 161 + """Filter by createdAt""" 162 + createdAt: XyzStatusphereStatusFieldCondition 163 + """All conditions must match (AND logic)""" 164 + and: [XyzStatusphereStatusWhereInput!] 165 + """Any condition must match (OR logic)""" 166 + or: [XyzStatusphereStatusWhereInput!] 167 167 } 168 168 169 169 scalar Float
+12 -12
lexicon_graphql/birdie_snapshots/all_types_generated_for_simple_status_record.accepted
··· 10 10 uri: String 11 11 } 12 12 13 - """Input type for XyzStatusphereStatusInput""" 14 - input XyzStatusphereStatusInput { 15 - """Input field for text""" 16 - text: String 17 - """Input field for createdAt""" 18 - createdAt: String! 19 - } 20 - 21 13 """Root mutation type""" 22 14 type Mutation { 23 15 """Create a new xyz.statusphere.status record""" ··· 26 18 updateXyzStatusphereStatus: XyzStatusphereStatus 27 19 """Delete a xyz.statusphere.status record""" 28 20 deleteXyzStatusphereStatus: DeleteResult 21 + } 22 + 23 + """Root query type""" 24 + type Query { 25 + """Query xyz.statusphere.status""" 26 + xyzStatusphereStatus: [XyzStatusphereStatus] 29 27 } 30 28 31 29 scalar String ··· 46 44 createdAt: String 47 45 } 48 46 49 - """Root query type""" 50 - type Query { 51 - """Query xyz.statusphere.status""" 52 - xyzStatusphereStatus: [XyzStatusphereStatus] 47 + """Input type for XyzStatusphereStatusInput""" 48 + input XyzStatusphereStatusInput { 49 + """Input field for text""" 50 + text: String 51 + """Input field for createdAt""" 52 + createdAt: String! 53 53 } 54 54 55 55 scalar Int
+31
lexicon_graphql/birdie_snapshots/sort_field_enum_with_mixed_types_only_includes_primitives.accepted
··· 1 + --- 2 + version: 1.4.1 3 + title: SortField enum with mixed types - only includes primitives 4 + file: ./test/sorting_test.gleam 5 + test_name: sort_enum_with_mixed_field_types_snapshot_test 6 + --- 7 + """Available sort fields for AppBskyTestRecord""" 8 + enum AppBskyTestRecordSortField { 9 + """Sort by uri""" 10 + uri 11 + """Sort by cid""" 12 + cid 13 + """Sort by did""" 14 + did 15 + """Sort by collection""" 16 + collection 17 + """Sort by indexedAt""" 18 + indexedAt 19 + """Sort by stringField""" 20 + stringField 21 + """Sort by intField""" 22 + intField 23 + """Sort by boolField""" 24 + boolField 25 + """Sort by numberField""" 26 + numberField 27 + """Sort by datetimeField""" 28 + datetimeField 29 + """Sort by uriField""" 30 + uriField 31 + }
+37
lexicon_graphql/birdie_snapshots/where_input_with_mixed_types_only_includes_primitives.accepted
··· 1 + --- 2 + version: 1.4.1 3 + title: WhereInput with mixed types - only includes primitives 4 + file: ./test/where_schema_test.gleam 5 + test_name: where_input_with_mixed_field_types_snapshot_test 6 + --- 7 + """Filter conditions for AppBskyTestRecord with nested AND/OR support""" 8 + input AppBskyTestRecordWhereInput { 9 + """Filter by uri""" 10 + uri: AppBskyTestRecordFieldCondition 11 + """Filter by cid""" 12 + cid: AppBskyTestRecordFieldCondition 13 + """Filter by did""" 14 + did: AppBskyTestRecordFieldCondition 15 + """Filter by collection""" 16 + collection: AppBskyTestRecordFieldCondition 17 + """Filter by indexedAt""" 18 + indexedAt: AppBskyTestRecordFieldCondition 19 + """Filter by actorHandle""" 20 + actorHandle: AppBskyTestRecordFieldCondition 21 + """Filter by stringField""" 22 + stringField: AppBskyTestRecordFieldCondition 23 + """Filter by intField""" 24 + intField: AppBskyTestRecordFieldCondition 25 + """Filter by boolField""" 26 + boolField: AppBskyTestRecordFieldCondition 27 + """Filter by numberField""" 28 + numberField: AppBskyTestRecordFieldCondition 29 + """Filter by datetimeField""" 30 + datetimeField: AppBskyTestRecordFieldCondition 31 + """Filter by uriField""" 32 + uriField: AppBskyTestRecordFieldCondition 33 + """All conditions must match (AND logic)""" 34 + and: [AppBskyTestRecordWhereInput!] 35 + """Any condition must match (OR logic)""" 36 + or: [AppBskyTestRecordWhereInput!] 37 + }
+6
lexicon_graphql/src/dataloader_ffi.erl
··· 1 + -module(dataloader_ffi). 2 + -export([identity/1]). 3 + 4 + %% Identity function - returns value unchanged 5 + %% In Erlang, everything is already "dynamic", so this just passes through 6 + identity(Value) -> Value.
+156
lexicon_graphql/src/lexicon_graphql/collection_meta.gleam
··· 1 + /// Collection Metadata Extraction 2 + /// 3 + /// Extracts metadata from lexicons to identify fields that can be used for joins. 4 + /// This enables dynamic forward and reverse join field generation. 5 + import gleam/list 6 + import gleam/option.{type Option, None, Some} 7 + import lexicon_graphql/nsid 8 + import lexicon_graphql/types 9 + 10 + /// Metadata about a collection extracted from its lexicon 11 + pub type CollectionMeta { 12 + CollectionMeta( 13 + /// The NSID of the collection (e.g., "app.bsky.feed.post") 14 + nsid: String, 15 + /// The GraphQL type name (e.g., "AppBskyFeedPost") 16 + type_name: String, 17 + /// Record key type: "tid", "literal:self", or "any" 18 + key_type: String, 19 + /// Whether this collection has only one record per DID (e.g., profiles) 20 + has_unique_did: Bool, 21 + /// Fields that can be used for forward joins (following references) 22 + forward_join_fields: List(ForwardJoinField), 23 + /// Fields that can be used for reverse joins (fields with at-uri format) 24 + reverse_join_fields: List(String), 25 + ) 26 + } 27 + 28 + /// A field that can be used for forward joins 29 + pub type ForwardJoinField { 30 + /// A strongRef field - references a specific version of a record (URI + CID) 31 + StrongRefField(name: String) 32 + /// An at-uri field - references the latest version of a record (URI only) 33 + AtUriField(name: String) 34 + } 35 + 36 + /// Extract metadata from a lexicon 37 + pub fn extract_metadata(lexicon: types.Lexicon) -> CollectionMeta { 38 + let type_name = nsid.to_type_name(lexicon.id) 39 + 40 + case lexicon.defs.main { 41 + Some(main_def) -> { 42 + // Extract key type from lexicon (default to "tid" if not specified) 43 + let key_type = case main_def.key { 44 + Some(k) -> k 45 + None -> "tid" 46 + } 47 + 48 + // Determine if this collection has only one record per DID 49 + // Collections with key="literal:self" have a unique record per DID (e.g., profiles) 50 + let has_unique_did = key_type == "literal:self" 
51 + 52 + let #(forward_fields, reverse_fields) = 53 + scan_properties(main_def.properties) 54 + 55 + CollectionMeta( 56 + nsid: lexicon.id, 57 + type_name: type_name, 58 + key_type: key_type, 59 + has_unique_did: has_unique_did, 60 + forward_join_fields: forward_fields, 61 + reverse_join_fields: reverse_fields, 62 + ) 63 + } 64 + None -> { 65 + // No main definition, return empty metadata 66 + CollectionMeta( 67 + nsid: lexicon.id, 68 + type_name: type_name, 69 + key_type: "tid", 70 + has_unique_did: False, 71 + forward_join_fields: [], 72 + reverse_join_fields: [], 73 + ) 74 + } 75 + } 76 + } 77 + 78 + /// Scan properties to identify forward and reverse join fields 79 + fn scan_properties( 80 + properties: List(#(String, types.Property)), 81 + ) -> #(List(ForwardJoinField), List(String)) { 82 + list.fold(properties, #([], []), fn(acc, prop) { 83 + let #(forward_fields, reverse_fields) = acc 84 + let #(name, property) = prop 85 + 86 + // Check if this is a forward join field 87 + let new_forward = case is_forward_join_field(property) { 88 + Some(field_type) -> [field_type(name), ..forward_fields] 89 + None -> forward_fields 90 + } 91 + 92 + // Check if this is a reverse join field (at-uri format) 93 + let new_reverse = case is_reverse_join_field(property) { 94 + True -> [name, ..reverse_fields] 95 + False -> reverse_fields 96 + } 97 + 98 + #(new_forward, new_reverse) 99 + }) 100 + } 101 + 102 + /// Check if a property is a forward join field 103 + /// Returns Some(constructor) if it is, None otherwise 104 + fn is_forward_join_field( 105 + property: types.Property, 106 + ) -> Option(fn(String) -> ForwardJoinField) { 107 + // Case 1: strongRef field 108 + case property.type_, property.ref { 109 + "ref", Some(ref_target) if ref_target == "com.atproto.repo.strongRef" -> 110 + Some(StrongRefField) 111 + _, _ -> 112 + // Case 2: at-uri string field 113 + case property.type_, property.format { 114 + "string", Some(fmt) if fmt == "at-uri" -> Some(AtUriField) 115 + _, _ -> 
None 116 + } 117 + } 118 + } 119 + 120 + /// Check if a property is a reverse join field (has at-uri format) 121 + fn is_reverse_join_field(property: types.Property) -> Bool { 122 + case property.format { 123 + Some(fmt) if fmt == "at-uri" -> True 124 + _ -> False 125 + } 126 + } 127 + 128 + /// Get all forward join field names from metadata 129 + pub fn get_forward_join_field_names(meta: CollectionMeta) -> List(String) { 130 + list.map(meta.forward_join_fields, fn(field) { 131 + case field { 132 + StrongRefField(name) -> name 133 + AtUriField(name) -> name 134 + } 135 + }) 136 + } 137 + 138 + /// Check if a field is a strongRef field 139 + pub fn is_strong_ref_field(meta: CollectionMeta, field_name: String) -> Bool { 140 + list.any(meta.forward_join_fields, fn(field) { 141 + case field { 142 + StrongRefField(name) if name == field_name -> True 143 + _ -> False 144 + } 145 + }) 146 + } 147 + 148 + /// Check if a field is an at-uri field 149 + pub fn is_at_uri_field(meta: CollectionMeta, field_name: String) -> Bool { 150 + list.any(meta.forward_join_fields, fn(field) { 151 + case field { 152 + AtUriField(name) if name == field_name -> True 153 + _ -> False 154 + } 155 + }) 156 + }
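The classification rules in `is_forward_join_field` boil down to two cases: a `ref` property targeting `com.atproto.repo.strongRef` (URI plus CID), or a `string` property with the `at-uri` format (URI only). A minimal sketch of the same decision table, with lexicon properties modeled as plain dicts (an assumption of this example, not the project's `types.Property` record):

```python
# Sketch of collection_meta's forward-join classification: decide whether a
# lexicon property is a strongRef reference, a bare at-uri reference, or
# neither. Properties are plain dicts here for illustration.

def classify_forward_join(prop: dict):
    """Return 'strong_ref', 'at_uri', or None for a lexicon property."""
    # Case 1: a ref to com.atproto.repo.strongRef (pins a specific version)
    if prop.get("type") == "ref" and prop.get("ref") == "com.atproto.repo.strongRef":
        return "strong_ref"
    # Case 2: a string field in at-uri format (follows the latest version)
    if prop.get("type") == "string" and prop.get("format") == "at-uri":
        return "at_uri"
    return None
```

Note the asymmetry with reverse joins: `is_reverse_join_field` keys only on the `at-uri` format, so a strongRef property qualifies as a forward join but not, by itself, as a reverse-join field.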
+13 -5
lexicon_graphql/src/lexicon_graphql/connection.gleam
··· 54 54 } 55 55 56 56 /// SortFieldInput type with a custom field enum 57 - pub fn sort_field_input_type_with_enum(field_enum: schema.Type) -> schema.Type { 57 + /// Creates a unique input type per collection (e.g., "SocialGrainGalleryItemSortFieldInput") 58 + pub fn sort_field_input_type_with_enum( 59 + type_name: String, 60 + field_enum: schema.Type, 61 + ) -> schema.Type { 62 + let input_type_name = type_name <> "SortFieldInput" 63 + 58 64 schema.input_object_type( 59 - "SortFieldInput", 60 - "Specifies a field to sort by and its direction", 65 + input_type_name, 66 + "Specifies a field to sort by and its direction for " <> type_name, 61 67 [ 62 68 schema.input_field( 63 69 "field", ··· 167 173 168 174 /// Connection arguments with sortBy using a custom field enum and where filtering 169 175 pub fn lexicon_connection_args_with_field_enum_and_where( 176 + type_name: String, 170 177 field_enum: schema.Type, 171 178 where_input_type: schema.Type, 172 179 ) -> List(schema.Argument) { ··· 177 184 schema.argument( 178 185 "sortBy", 179 186 schema.list_type(schema.non_null( 180 - sort_field_input_type_with_enum(field_enum), 187 + sort_field_input_type_with_enum(type_name, field_enum), 181 188 )), 182 189 "Sort order for the connection", 183 190 None, ··· 194 201 195 202 /// Connection arguments with sortBy using a custom field enum (backward compatibility) 196 203 pub fn lexicon_connection_args_with_field_enum( 204 + type_name: String, 197 205 field_enum: schema.Type, 198 206 ) -> List(schema.Argument) { 199 207 list.flatten([ ··· 203 211 schema.argument( 204 212 "sortBy", 205 213 schema.list_type(schema.non_null( 206 - sort_field_input_type_with_enum(field_enum), 214 + sort_field_input_type_with_enum(type_name, field_enum), 207 215 )), 208 216 "Sort order for the connection", 209 217 None,
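The `connection.gleam` change above replaces the single shared `SortFieldInput` name with a per-collection name, so that two collections' sort inputs no longer collide on one GraphQL type name. The derivation is a simple concatenation, sketched here in Python for illustration:

```python
# Sketch of the naming scheme introduced in sort_field_input_type_with_enum:
# each collection gets its own input type name derived from its GraphQL
# type name, avoiding duplicate registration of a shared "SortFieldInput".

def sort_field_input_name(type_name: str) -> str:
    return type_name + "SortFieldInput"
```

This matches the snapshot change, where `SortFieldInput` becomes `XyzStatusphereStatusSortFieldInput`.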
+266
lexicon_graphql/src/lexicon_graphql/dataloader.gleam
··· 1 + /// DataLoader for batching database queries 2 + /// 3 + /// This module provides batch query functions for join operations to prevent N+1 queries. 4 + /// It works with the existing RecordFetcher pattern to batch URI lookups. 5 + import gleam/dict.{type Dict} 6 + import gleam/dynamic.{type Dynamic} 7 + import gleam/list 8 + import gleam/option.{type Option, None, Some} 9 + import gleam/result 10 + import gleam/string 11 + import graphql/value 12 + import lexicon_graphql/collection_meta 13 + import lexicon_graphql/uri_extractor 14 + import lexicon_graphql/where_input.{type WhereClause} 15 + 16 + /// Result of a batch query: maps URIs to their records 17 + pub type BatchResult = 18 + Dict(String, List(value.Value)) 19 + 20 + /// Batch query function type - takes a list of URIs and field constraints, 21 + /// returns records grouped by the URI they match 22 + pub type BatchFetcher = 23 + fn(List(String), String, Option(String)) -> 24 + Result(BatchResult, String) 25 + 26 + /// Pagination parameters for join queries 27 + /// Re-exported from db_schema_builder to avoid circular dependency 28 + pub type PaginationParams { 29 + PaginationParams( 30 + first: Option(Int), 31 + after: Option(String), 32 + last: Option(Int), 33 + before: Option(String), 34 + sort_by: Option(List(#(String, String))), 35 + where: Option(WhereClause), 36 + ) 37 + } 38 + 39 + /// Result of a paginated batch query 40 + pub type PaginatedBatchResult { 41 + PaginatedBatchResult( 42 + /// Records with their cursors 43 + edges: List(#(value.Value, String)), 44 + /// Whether there are more records after this page 45 + has_next_page: Bool, 46 + /// Whether there are more records before this page 47 + has_previous_page: Bool, 48 + /// Total count of records (if available) 49 + total_count: Option(Int), 50 + ) 51 + } 52 + 53 + /// Paginated batch query function type - takes a parent key, collection, field, 54 + /// and pagination params, returns paginated records 55 + pub type 
PaginatedBatchFetcher = 56 + fn(String, String, Option(String), PaginationParams) -> 57 + Result(PaginatedBatchResult, String) 58 + 59 + /// Key for forward join batching - identifies a unique batch request 60 + pub type ForwardJoinKey { 61 + ForwardJoinKey( 62 + /// Target collection to fetch 63 + target_collection: String, 64 + /// URIs to fetch 65 + uris: List(String), 66 + ) 67 + } 68 + 69 + /// Key for reverse join batching - identifies records that reference parent URIs 70 + pub type ReverseJoinKey { 71 + ReverseJoinKey( 72 + /// Collection to search in 73 + collection: String, 74 + /// Field name that contains the reference (e.g., "subject", "reply") 75 + reference_field: String, 76 + /// Parent URIs to find references to 77 + parent_uris: List(String), 78 + ) 79 + } 80 + 81 + /// Extract the collection name from an AT URI 82 + /// Format: at://did:plc:abc123/app.bsky.feed.post/rkey 83 + /// Returns: app.bsky.feed.post 84 + pub fn uri_to_collection(uri: String) -> Option(String) { 85 + case string.split(uri, "/") { 86 + ["at:", "", _did, collection, _rkey] -> Some(collection) 87 + _ -> None 88 + } 89 + } 90 + 91 + /// Batch fetch records by URI (forward joins) 92 + /// 93 + /// Given a list of URIs, fetches all target records in a single query. 94 + /// Returns a Dict mapping each URI to its record (if found). 
95 + pub fn batch_fetch_by_uri( 96 + uris: List(String), 97 + fetcher: BatchFetcher, 98 + ) -> Result(Dict(String, value.Value), String) { 99 + // Group URIs by collection for optimal batching 100 + let grouped = group_uris_by_collection(uris) 101 + 102 + // Fetch each collection's records in a batch 103 + list.try_fold(grouped, dict.new(), fn(acc, group) { 104 + let #(collection, collection_uris) = group 105 + 106 + case fetcher(collection_uris, collection, None) { 107 + Ok(batch_result) -> { 108 + // Merge results into accumulator 109 + // For forward joins, we expect single records per URI 110 + let merged = 111 + dict.fold(batch_result, acc, fn(result_acc, uri, records) { 112 + case records { 113 + [first, ..] -> dict.insert(result_acc, uri, first) 114 + [] -> result_acc 115 + } 116 + }) 117 + Ok(merged) 118 + } 119 + Error(e) -> Error(e) 120 + } 121 + }) 122 + } 123 + 124 + /// Batch fetch records by reverse join (records that reference parent URIs) 125 + /// 126 + /// Given parent URIs and a field name, finds all records whose field references those URIs. 127 + /// Returns a Dict mapping each parent URI to the list of records that reference it. 128 + pub fn batch_fetch_by_reverse_join( 129 + parent_uris: List(String), 130 + collection: String, 131 + reference_field: String, 132 + fetcher: BatchFetcher, 133 + ) -> Result(Dict(String, List(value.Value)), String) { 134 + // Fetch all records that reference any of the parent URIs 135 + fetcher(parent_uris, collection, Some(reference_field)) 136 + } 137 + 138 + /// Batch fetch records by DID (records that share the same DID) 139 + /// 140 + /// Given a list of DIDs and a target collection, fetches all records in that collection 141 + /// that belong to those DIDs. 142 + /// Returns a Dict mapping each DID to the list of records (or single record for unique collections). 
143 + pub fn batch_fetch_by_did( 144 + dids: List(String), 145 + target_collection: String, 146 + fetcher: BatchFetcher, 147 + ) -> Result(Dict(String, List(value.Value)), String) { 148 + // Use the fetcher to get records by DID 149 + // The fetcher will need to be updated to handle DID-based queries 150 + // For now, we pass the DIDs as URIs and use None for the field 151 + // The actual database layer will interpret this correctly 152 + fetcher(dids, target_collection, None) 153 + } 154 + 155 + /// Batch fetch records by reverse join with pagination 156 + /// 157 + /// Given a parent URI, field name, and pagination params, finds all records whose field 158 + /// references that URI, with cursor-based pagination. 159 + pub fn batch_fetch_by_reverse_join_paginated( 160 + parent_uri: String, 161 + collection: String, 162 + reference_field: String, 163 + pagination: PaginationParams, 164 + fetcher: PaginatedBatchFetcher, 165 + ) -> Result(PaginatedBatchResult, String) { 166 + // Fetch paginated records that reference the parent URI 167 + fetcher(parent_uri, collection, Some(reference_field), pagination) 168 + } 169 + 170 + /// Batch fetch records by DID with pagination 171 + /// 172 + /// Given a DID, target collection, and pagination params, fetches all records in that collection 173 + /// that belong to that DID, with cursor-based pagination. 
174 + pub fn batch_fetch_by_did_paginated( 175 + did: String, 176 + target_collection: String, 177 + pagination: PaginationParams, 178 + fetcher: PaginatedBatchFetcher, 179 + ) -> Result(PaginatedBatchResult, String) { 180 + // Fetch paginated records by DID 181 + fetcher(did, target_collection, None, pagination) 182 + } 183 + 184 + /// Group URIs by their collection for batching 185 + fn group_uris_by_collection(uris: List(String)) -> List(#(String, List(String))) { 186 + // Group URIs by collection 187 + let grouped = 188 + list.fold(uris, dict.new(), fn(acc, uri) { 189 + case uri_to_collection(uri) { 190 + Some(collection) -> { 191 + let existing = dict.get(acc, collection) |> result.unwrap([]) 192 + dict.insert(acc, collection, [uri, ..existing]) 193 + } 194 + None -> acc 195 + } 196 + }) 197 + 198 + dict.to_list(grouped) 199 + } 200 + 201 + /// Extract URIs from a list of records based on field metadata 202 + /// 203 + /// This is used to collect URIs from parent records that need to be resolved. 204 + /// For example, extracting all "subject" URIs from a list of Like records. 
205 + pub fn extract_uris_from_records( 206 + records: List(value.Value), 207 + field_name: String, 208 + _meta: collection_meta.CollectionMeta, 209 + ) -> List(String) { 210 + list.filter_map(records, fn(record) { 211 + // Extract the field value from the record 212 + case extract_field_value(record, field_name) { 213 + Some(field_value) -> { 214 + // Use uri_extractor to get the URI (handles both strongRef and at-uri) 215 + case uri_extractor.extract_uri(field_value) { 216 + Some(uri) -> Ok(uri) 217 + None -> Error(Nil) 218 + } 219 + } 220 + None -> Error(Nil) 221 + } 222 + }) 223 + } 224 + 225 + /// Extract a field value from a GraphQL Value 226 + fn extract_field_value(value: value.Value, field_name: String) -> Option(Dynamic) { 227 + case value { 228 + value.Object(fields) -> { 229 + // fields is a List(#(String, value.Value)), find the matching field 230 + list.find(fields, fn(pair) { pair.0 == field_name }) 231 + |> result.map(fn(pair) { value_to_dynamic(pair.1) }) 232 + |> option.from_result 233 + } 234 + _ -> None 235 + } 236 + } 237 + 238 + /// Convert a GraphQL Value to Dynamic for uri_extractor 239 + /// Properly extracts the underlying value from GraphQL Value types 240 + fn value_to_dynamic(v: value.Value) -> Dynamic { 241 + case v { 242 + value.String(s) -> unsafe_coerce_to_dynamic(s) 243 + value.Int(i) -> unsafe_coerce_to_dynamic(i) 244 + value.Float(f) -> unsafe_coerce_to_dynamic(f) 245 + value.Boolean(b) -> unsafe_coerce_to_dynamic(b) 246 + value.Null -> unsafe_coerce_to_dynamic(None) 247 + value.Object(fields) -> { 248 + // Convert object fields to a format uri_extractor can work with 249 + // Create a dict-like structure for the object 250 + let field_map = 251 + list.fold(fields, dict.new(), fn(acc, field) { 252 + let #(key, val) = field 253 + dict.insert(acc, key, value_to_dynamic(val)) 254 + }) 255 + unsafe_coerce_to_dynamic(field_map) 256 + } 257 + value.List(items) -> { 258 + let converted = list.map(items, value_to_dynamic) 259 + 
unsafe_coerce_to_dynamic(converted) 260 + } 261 + value.Enum(name) -> unsafe_coerce_to_dynamic(name) 262 + } 263 + } 264 + 265 + @external(erlang, "dataloader_ffi", "identity") 266 + fn unsafe_coerce_to_dynamic(value: a) -> Dynamic
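The core batching idea in `dataloader.gleam` is: parse the collection NSID out of each AT URI, then group URIs by collection so every collection is fetched in one query rather than one query per record (the N+1 problem). A self-contained sketch of `uri_to_collection` and `group_uris_by_collection` in Python (mirroring the Gleam logic, not a drop-in replacement):

```python
# Sketch of dataloader's URI handling: an AT URI has the shape
#   at://did:plc:abc123/app.bsky.feed.post/rkey
# so splitting on "/" yields exactly five parts, with the collection NSID
# in position 3. Malformed URIs are silently dropped, as in the Gleam code.

from collections import defaultdict

def uri_to_collection(uri: str):
    """Extract the collection NSID from an AT URI, or None if malformed."""
    parts = uri.split("/")
    if len(parts) == 5 and parts[0] == "at:" and parts[1] == "":
        return parts[3]
    return None

def group_uris_by_collection(uris):
    """Group URIs by collection so each collection needs only one batch query."""
    grouped = defaultdict(list)
    for uri in uris:
        collection = uri_to_collection(uri)
        if collection is not None:
            grouped[collection].append(uri)
    return dict(grouped)
```

With the URIs grouped, `batch_fetch_by_uri` can issue one query per target collection and merge the results into a single URI-to-record map, which is what keeps forward-join resolution at O(collections) queries instead of O(records).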
+1069 -53
lexicon_graphql/src/lexicon_graphql/db_schema_builder.gleam
··· 2 2 /// 3 3 /// Builds GraphQL schemas from AT Protocol lexicon definitions with database-backed resolvers. 4 4 /// This extends the base schema_builder with actual data resolution. 5 + import gleam/dict.{type Dict} 6 + import gleam/dynamic.{type Dynamic} 5 7 import gleam/int 6 8 import gleam/list 7 9 import gleam/option 8 10 import gleam/result 11 + import gleam/string 9 12 import graphql/connection 10 13 import graphql/schema 11 14 import graphql/value 15 + import lexicon_graphql/collection_meta 12 16 import lexicon_graphql/connection as lexicon_connection 17 + import lexicon_graphql/dataloader 18 + import lexicon_graphql/lexicon_registry 13 19 import lexicon_graphql/mutation_builder 14 20 import lexicon_graphql/nsid 21 + import lexicon_graphql/object_type_builder 15 22 import lexicon_graphql/type_mapper 16 23 import lexicon_graphql/types 24 + import lexicon_graphql/uri_extractor 17 25 import lexicon_graphql/where_input 18 26 27 + /// Represents a reverse join relationship discovered from lexicon analysis 28 + type ReverseJoinRelationship { 29 + ReverseJoinRelationship( 30 + /// Collection that has the reference field (e.g., "app.bsky.feed.like") 31 + source_collection: String, 32 + /// Type name of the source collection (e.g., "AppBskyFeedLike") 33 + source_type_name: String, 34 + /// Field name in source that references target (e.g., "subject") 35 + source_field: String, 36 + ) 37 + } 38 + 19 39 /// Record type metadata with database resolver info 20 40 type RecordType { 21 41 RecordType( ··· 23 43 type_name: String, 24 44 field_name: String, 25 45 fields: List(schema.Field), 26 - ) 27 - } 28 - 29 - /// Pagination parameters for connection queries 30 - pub type PaginationParams { 31 - PaginationParams( 32 - first: option.Option(Int), 33 - after: option.Option(String), 34 - last: option.Option(Int), 35 - before: option.Option(String), 36 - sort_by: option.Option(List(#(String, String))), 37 - where: option.Option(where_input.WhereClause), 46 + /// Original 
lexicon properties for this record (used for filtering sortable fields) 47 + properties: List(#(String, types.Property)), 48 + /// Metadata extracted from lexicon for join field generation 49 + meta: collection_meta.CollectionMeta, 50 + /// Reverse join relationships where this type is the target 51 + reverse_joins: List(ReverseJoinRelationship), 38 52 ) 39 53 } 40 54 ··· 42 56 /// Takes a collection NSID and pagination params, returns Connection data 43 57 /// Returns: (records_with_cursors, end_cursor, has_next_page, has_previous_page, total_count) 44 58 pub type RecordFetcher = 45 - fn(String, PaginationParams) -> 59 + fn(String, dataloader.PaginationParams) -> 46 60 Result( 47 61 #( 48 62 List(#(value.Value, String)), ··· 56 70 57 71 /// Build a GraphQL schema from lexicons with database-backed resolvers 58 72 /// 59 - /// The fetcher parameter should be a function that queries the database for records 73 + /// The fetcher parameter should be a function that queries the database for records with pagination 74 + /// The batch_fetcher parameter is used for join operations (forward and reverse joins) 60 75 /// The mutation resolver factories are optional - if None, mutations will return stub errors 61 76 pub fn build_schema_with_fetcher( 62 77 lexicons: List(types.Lexicon), 63 78 fetcher: RecordFetcher, 79 + batch_fetcher: option.Option(dataloader.BatchFetcher), 80 + paginated_batch_fetcher: option.Option(dataloader.PaginatedBatchFetcher), 64 81 create_factory: option.Option(mutation_builder.ResolverFactory), 65 82 update_factory: option.Option(mutation_builder.ResolverFactory), 66 83 delete_factory: option.Option(mutation_builder.ResolverFactory), ··· 69 86 case lexicons { 70 87 [] -> Error("Cannot build schema from empty lexicon list") 71 88 _ -> { 72 - // Extract record types from lexicons 73 - let record_types = extract_record_types(lexicons) 89 + // Extract record types and object types from lexicons with batch fetcher for joins 90 + let #(record_types, 
object_types) = 91 + extract_record_types_and_object_types( 92 + lexicons, 93 + batch_fetcher, 94 + paginated_batch_fetcher, 95 + ) 74 96 75 - // Build the query type with fields for each record 76 - let query_type = build_query_type(record_types, fetcher) 97 + // Build the query type with fields for each record using shared object types 98 + let query_type = build_query_type(record_types, object_types, fetcher) 77 99 78 100 // Build the mutation type with provided resolver factories 101 + // Pass the complete object types so mutations use the same types as queries 79 102 let mutation_type = 80 103 mutation_builder.build_mutation_type( 81 104 lexicons, 105 + object_types, 82 106 create_factory, 83 107 update_factory, 84 108 delete_factory, ··· 91 115 } 92 116 } 93 117 94 - /// Extract record types from lexicon definitions 95 - fn extract_record_types(lexicons: List(types.Lexicon)) -> List(RecordType) { 96 - lexicons 97 - |> list.filter_map(parse_lexicon) 118 + /// Extract record types and object types from lexicon definitions 119 + /// 120 + /// NEW 3-PASS ARCHITECTURE: 121 + /// Pass 0: Build object types from lexicon defs (e.g., aspectRatio) 122 + /// Pass 1: Extract metadata and build basic types (base fields only) 123 + /// Pass 2: Build complete RecordTypes with ALL join fields 124 + /// Pass 3: Rebuild ONLY Connection fields using final types 125 + fn extract_record_types_and_object_types( 126 + lexicons: List(types.Lexicon), 127 + batch_fetcher: option.Option(dataloader.BatchFetcher), 128 + paginated_batch_fetcher: option.Option(dataloader.PaginatedBatchFetcher), 129 + ) -> #(List(RecordType), dict.Dict(String, schema.Type)) { 130 + // ============================================================================= 131 + // PASS 0: Build object types from lexicon defs 132 + // ============================================================================= 133 + 134 + // Create a registry from all lexicons 135 + let registry = 
lexicon_registry.from_lexicons(lexicons) 136 + 137 + // Build all object types from defs (e.g., social.grain.defs#aspectRatio) 138 + let ref_object_types = object_type_builder.build_all_object_types(registry) 139 + 140 + // ============================================================================= 141 + // PASS 1: Extract metadata and build basic types 142 + // ============================================================================= 143 + 144 + // Extract metadata from all lexicons 145 + let metadata_list = 146 + list.filter_map(lexicons, fn(lex) { 147 + case lex { 148 + types.Lexicon(id, types.Defs(option.Some(types.RecordDef("record", _, _)), _)) -> { 149 + let meta = collection_meta.extract_metadata(lex) 150 + Ok(#(id, meta)) 151 + } 152 + _ -> Error(Nil) 153 + } 154 + }) 155 + 156 + // Build reverse join map: target_nsid -> List(ReverseJoinRelationship) 157 + let reverse_join_map = build_reverse_join_map(metadata_list) 158 + 159 + // Build DID join map: source_nsid -> List(#(target_nsid, target_meta)) 160 + let did_join_map = build_did_join_map(metadata_list) 161 + 162 + // Parse lexicons to create basic RecordTypes (base fields only, no joins yet) 163 + let basic_record_types_without_forward_joins = 164 + lexicons 165 + |> list.filter_map(fn(lex) { 166 + parse_lexicon_with_reverse_joins(lex, [], batch_fetcher, ref_object_types) 167 + }) 168 + 169 + // Build basic object types WITHOUT forward joins (needed as generic type for forward joins) 170 + let basic_object_types_without_forward_joins = 171 + list.fold(basic_record_types_without_forward_joins, dict.new(), fn(acc, record_type) { 172 + let object_type = 173 + schema.object_type( 174 + record_type.type_name, 175 + "Record type: " <> record_type.nsid, 176 + record_type.fields, 177 + ) 178 + dict.insert(acc, record_type.nsid, object_type) 179 + }) 180 + 181 + // Build Record union for forward joins to reference 182 + let basic_possible_types = dict.values(basic_object_types_without_forward_joins) 183 + 
let basic_record_union = build_record_union(basic_possible_types) 184 + let basic_object_types_with_generic = 185 + dict.insert(basic_object_types_without_forward_joins, "_generic_record", basic_record_union) 186 + 187 + // Now add forward join fields to basic types 188 + // This ensures Connection types built in Pass 2 reference types with forward joins 189 + let basic_record_types = 190 + list.map(basic_record_types_without_forward_joins, fn(record_type) { 191 + let forward_join_fields = 192 + build_forward_join_fields_with_types( 193 + record_type.meta, 194 + batch_fetcher, 195 + basic_object_types_with_generic, 196 + ) 197 + 198 + let all_fields = 199 + list.flatten([record_type.fields, forward_join_fields]) 200 + 201 + RecordType(..record_type, fields: all_fields) 202 + }) 203 + 204 + // Build basic object types WITH forward joins 205 + // These are the types that Pass 2 Connections will reference 206 + let basic_object_types_without_generic = 207 + list.fold(basic_record_types, dict.new(), fn(acc, record_type) { 208 + let object_type = 209 + schema.object_type( 210 + record_type.type_name, 211 + "Record type: " <> record_type.nsid, 212 + record_type.fields, 213 + ) 214 + dict.insert(acc, record_type.nsid, object_type) 215 + }) 216 + 217 + // Rebuild Record union with complete basic types 218 + let basic_possible_types_complete = dict.values(basic_object_types_without_generic) 219 + let basic_record_union_complete = build_record_union(basic_possible_types_complete) 220 + let basic_object_types = 221 + dict.insert(basic_object_types_without_generic, "_generic_record", basic_record_union_complete) 222 + 223 + // Build sort field enums dict BEFORE Pass 2 - create each enum once and reuse 224 + let sort_field_enums: dict.Dict(String, schema.Type) = 225 + list.fold(basic_record_types, dict.new(), fn(acc, rt: RecordType) { 226 + let enum = build_sort_field_enum(rt) 227 + dict.insert(acc, rt.type_name, enum) 228 + }) 229 + 230 + // 
============================================================================= 231 + // PASS 2: Build complete RecordTypes with ALL join fields 232 + // ============================================================================= 233 + 234 + let complete_record_types = 235 + list.map(basic_record_types, fn(record_type) { 236 + let reverse_joins = 237 + dict.get(reverse_join_map, record_type.nsid) |> result.unwrap([]) 238 + let did_join_targets = 239 + dict.get(did_join_map, record_type.nsid) |> result.unwrap([]) 240 + 241 + // Build ALL join fields using basic object types 242 + // These Connection types will reference basic types initially 243 + let forward_join_fields = 244 + build_forward_join_fields_with_types( 245 + record_type.meta, 246 + batch_fetcher, 247 + basic_object_types, 248 + ) 249 + 250 + let reverse_join_fields = 251 + build_reverse_join_fields_with_types( 252 + reverse_joins, 253 + paginated_batch_fetcher, 254 + basic_object_types, 255 + basic_record_types, 256 + sort_field_enums, 257 + ) 258 + 259 + let did_join_fields = 260 + build_did_join_fields_with_types( 261 + did_join_targets, 262 + batch_fetcher, 263 + paginated_batch_fetcher, 264 + basic_object_types, 265 + basic_record_types, 266 + sort_field_enums, 267 + ) 268 + 269 + // Combine all fields (base + forward + reverse + DID joins) 270 + let all_fields = 271 + list.flatten([ 272 + record_type.fields, 273 + forward_join_fields, 274 + reverse_join_fields, 275 + did_join_fields, 276 + ]) 277 + 278 + RecordType(..record_type, fields: all_fields, reverse_joins: reverse_joins) 279 + }) 280 + 281 + // Build complete object types from complete RecordTypes 282 + let complete_object_types_without_generic: dict.Dict(String, schema.Type) = 283 + list.fold(complete_record_types, dict.new(), fn(acc, rt: RecordType) { 284 + let object_type = 285 + schema.object_type( 286 + rt.type_name, 287 + "Record type: " <> rt.nsid, 288 + rt.fields, 289 + ) 290 + dict.insert(acc, rt.nsid, object_type) 291 + }) 292 + 
293 + // Build Record union with complete object types 294 + let complete_possible_types = dict.values(complete_object_types_without_generic) 295 + let complete_record_union = build_record_union(complete_possible_types) 296 + let complete_object_types = 297 + dict.insert( 298 + complete_object_types_without_generic, 299 + "_generic_record", 300 + complete_record_union, 301 + ) 302 + 303 + // ============================================================================= 304 + // PASS 3: Rebuild join fields using COMPLETE object types 305 + // ============================================================================= 306 + // This fixes the issue where DID/reverse join fields reference incomplete types. 307 + // Even though basic_object_types have forward joins now, they don't have reverse/did joins. 308 + // So we need to rebuild all Connections to reference complete_object_types. 309 + 310 + let final_record_types = 311 + list.map(basic_record_types, fn(record_type) { 312 + // Rebuild ALL join fields with complete types 313 + let reverse_joins = 314 + dict.get(reverse_join_map, record_type.nsid) |> result.unwrap([]) 315 + let did_join_targets = 316 + dict.get(did_join_map, record_type.nsid) |> result.unwrap([]) 317 + 318 + let forward_join_fields = 319 + build_forward_join_fields_with_types( 320 + record_type.meta, 321 + batch_fetcher, 322 + complete_object_types, 323 + ) 324 + 325 + let reverse_join_fields = 326 + build_reverse_join_fields_with_types( 327 + reverse_joins, 328 + paginated_batch_fetcher, 329 + complete_object_types, 330 + complete_record_types, 331 + sort_field_enums, 332 + ) 333 + 334 + let did_join_fields = 335 + build_did_join_fields_with_types( 336 + did_join_targets, 337 + batch_fetcher, 338 + paginated_batch_fetcher, 339 + complete_object_types, 340 + complete_record_types, 341 + sort_field_enums, 342 + ) 343 + 344 + // Combine all fields 345 + let all_fields = 346 + list.flatten([ 347 + record_type.fields, 348 + forward_join_fields, 349 + 
reverse_join_fields, 350 + did_join_fields, 351 + ]) 352 + 353 + RecordType(..record_type, fields: all_fields, reverse_joins: reverse_joins) 354 + }) 355 + 356 + // Rebuild final object types with all fields 357 + let final_object_types_without_generic = 358 + list.fold(final_record_types, dict.new(), fn(acc, record_type) { 359 + let object_type = 360 + schema.object_type( 361 + record_type.type_name, 362 + "Record type: " <> record_type.nsid, 363 + record_type.fields, 364 + ) 365 + 366 + dict.insert(acc, record_type.nsid, object_type) 367 + }) 368 + 369 + // Rebuild Record union with final object types 370 + let final_possible_types = dict.values(final_object_types_without_generic) 371 + let final_record_union = build_record_union(final_possible_types) 372 + let final_object_types = 373 + dict.insert( 374 + final_object_types_without_generic, 375 + "_generic_record", 376 + final_record_union, 377 + ) 378 + 379 + // Merge ref_object_types (from lexicon defs) into final_object_types 380 + // This makes object types like "social.grain.defs#aspectRatio" available for ref resolution 381 + let final_object_types_with_refs = 382 + dict.fold(ref_object_types, final_object_types, fn(acc, ref, obj_type) { 383 + dict.insert(acc, ref, obj_type) 384 + }) 385 + 386 + #(final_record_types, final_object_types_with_refs) 98 387 } 99 388 100 - /// Parse a single lexicon into a RecordType 101 - fn parse_lexicon(lexicon: types.Lexicon) -> Result(RecordType, Nil) { 389 + /// Build a map of reverse join relationships from metadata 390 + /// Returns: Dict(target_nsid, List(ReverseJoinRelationship)) 391 + fn build_reverse_join_map( 392 + metadata_list: List(#(String, collection_meta.CollectionMeta)), 393 + ) -> Dict(String, List(ReverseJoinRelationship)) { 394 + // For each collection with forward join fields, create reverse join entries 395 + let result = 396 + list.fold(metadata_list, dict.new(), fn(acc, meta_pair) { 397 + let #(source_nsid, source_meta) = meta_pair 398 + 399 + // For 
each forward join field in the source collection 400 + list.fold(source_meta.forward_join_fields, acc, fn(map_acc, join_field) { 401 + let field_name = case join_field { 402 + collection_meta.StrongRefField(name) -> name 403 + collection_meta.AtUriField(name) -> name 404 + } 405 + 406 + let relationship = 407 + ReverseJoinRelationship( 408 + source_collection: source_nsid, 409 + source_type_name: source_meta.type_name, 410 + source_field: field_name, 411 + ) 412 + 413 + // Since at-uri and strongRef can reference ANY collection, 414 + // we add this relationship to ALL other collections as potential targets 415 + // This is a conservative approach - in production you might want to be more selective 416 + list.fold(metadata_list, map_acc, fn(target_acc, target_pair) { 417 + let #(target_nsid, _target_meta) = target_pair 418 + 419 + // Don't create self-referencing reverse joins (for now) 420 + case target_nsid == source_nsid { 421 + True -> target_acc 422 + False -> { 423 + let existing = 424 + dict.get(target_acc, target_nsid) |> result.unwrap([]) 425 + dict.insert(target_acc, target_nsid, [relationship, ..existing]) 426 + } 427 + } 428 + }) 429 + }) 430 + }) 431 + 432 + result 433 + } 434 + 435 + /// Build a map of DID-based join relationships from metadata 436 + /// Every collection can join to every other collection by DID 437 + /// Returns: Dict(source_nsid, List(#(target_nsid, target_meta))) 438 + fn build_did_join_map( 439 + metadata_list: List(#(String, collection_meta.CollectionMeta)), 440 + ) -> Dict(String, List(#(String, collection_meta.CollectionMeta))) { 441 + // For each collection, create DID join entries to all other collections 442 + list.fold(metadata_list, dict.new(), fn(acc, source_pair) { 443 + let #(source_nsid, _source_meta) = source_pair 444 + 445 + // Build list of target collections (all collections except self) 446 + let targets = 447 + list.filter_map(metadata_list, fn(target_pair) { 448 + let #(target_nsid, target_meta) = target_pair 
449 + case target_nsid == source_nsid { 450 + True -> Error(Nil) 451 + // Don't join to self 452 + False -> Ok(#(target_nsid, target_meta)) 453 + } 454 + }) 455 + 456 + dict.insert(acc, source_nsid, targets) 457 + }) 458 + } 459 + 460 + /// Parse a single lexicon into a RecordType with all fields including reverse joins 461 + fn parse_lexicon_with_reverse_joins( 462 + lexicon: types.Lexicon, 463 + reverse_joins: List(ReverseJoinRelationship), 464 + batch_fetcher: option.Option(dataloader.BatchFetcher), 465 + ref_object_types: Dict(String, schema.Type), 466 + ) -> Result(RecordType, Nil) { 102 467 case lexicon { 103 - types.Lexicon(id, types.Defs(types.RecordDef("record", properties))) -> { 468 + types.Lexicon(id, types.Defs(option.Some(types.RecordDef("record", _, properties)), _)) -> { 104 469 let type_name = nsid.to_type_name(id) 105 470 let field_name = nsid.to_field_name(id) 106 - let fields = build_fields(properties) 471 + 472 + // Extract metadata for join field generation 473 + let meta = collection_meta.extract_metadata(lexicon) 474 + 475 + // Build regular and forward join fields 476 + let base_fields = 477 + build_fields_with_meta(properties, meta, batch_fetcher, ref_object_types) 478 + 479 + // Note: Reverse join fields are NOT built here - they will be added later 480 + // once object types exist (see build_reverse_join_fields_with_types) 481 + 482 + // Use only base fields for initial schema building 483 + let all_fields = base_fields 107 484 108 485 Ok(RecordType( 109 486 nsid: id, 110 487 type_name: type_name, 111 488 field_name: field_name, 112 - fields: fields, 489 + fields: all_fields, 490 + properties: properties, 491 + meta: meta, 492 + reverse_joins: reverse_joins, 113 493 )) 114 494 } 115 495 _ -> Error(Nil) 116 496 } 117 497 } 118 498 499 + /// Build GraphQL fields from lexicon properties (WITHOUT forward or reverse joins) 500 + /// Join fields are added later once object types exist 501 + fn build_fields_with_meta( 502 + properties: 
List(#(String, types.Property)), 503 + _meta: collection_meta.CollectionMeta, 504 + _batch_fetcher: option.Option(dataloader.BatchFetcher), 505 + ref_object_types: Dict(String, schema.Type), 506 + ) -> List(schema.Field) { 507 + let regular_fields = build_fields(properties, ref_object_types) 508 + // Note: Forward join fields are NOT built here - they need object types first 509 + 510 + regular_fields 511 + } 512 + 513 + /// Build forward join fields from collection metadata with proper types 514 + /// These fields resolve to the referenced records using the DataLoader 515 + fn build_forward_join_fields_with_types( 516 + meta: collection_meta.CollectionMeta, 517 + batch_fetcher: option.Option(dataloader.BatchFetcher), 518 + object_types: dict.Dict(String, schema.Type), 519 + ) -> List(schema.Field) { 520 + // Get the generic record type 521 + let generic_type = case dict.get(object_types, "_generic_record") { 522 + Ok(t) -> t 523 + Error(_) -> schema.string_type() 524 + // Fallback 525 + } 526 + 527 + list.map(meta.forward_join_fields, fn(join_field) { 528 + let field_name = case join_field { 529 + collection_meta.StrongRefField(name) -> name 530 + collection_meta.AtUriField(name) -> name 531 + } 532 + 533 + schema.field( 534 + field_name <> "Resolved", 535 + generic_type, 536 + "Forward join to referenced record", 537 + fn(ctx) { 538 + // Extract the field value from the parent record 539 + case get_nested_field_from_context_dynamic(ctx, "value", field_name) { 540 + Ok(field_value) -> { 541 + // Extract URI using uri_extractor 542 + case uri_extractor.extract_uri(field_value) { 543 + option.Some(uri) -> { 544 + // Use batch fetcher if available to resolve the record 545 + case batch_fetcher { 546 + option.Some(fetcher) -> { 547 + case dataloader.batch_fetch_by_uri([uri], fetcher) { 548 + Ok(results) -> { 549 + case dict.get(results, uri) { 550 + Ok(record) -> Ok(record) 551 + Error(_) -> Ok(value.Null) 552 + } 553 + } 554 + Error(_) -> Ok(value.Null) 555 + } 556 + 
} 557 + option.None -> { 558 + // No batch fetcher - return URI as string 559 + Ok(value.String(uri)) 560 + } 561 + } 562 + } 563 + option.None -> Ok(value.Null) 564 + } 565 + } 566 + Error(_) -> Ok(value.Null) 567 + } 568 + }, 569 + ) 570 + }) 571 + } 572 + 573 + /// Build reverse join fields with proper object type references 574 + /// These fields return lists of records that reference the current record 575 + fn build_reverse_join_fields_with_types( 576 + reverse_joins: List(ReverseJoinRelationship), 577 + paginated_batch_fetcher: option.Option(dataloader.PaginatedBatchFetcher), 578 + object_types: dict.Dict(String, schema.Type), 579 + record_types: List(RecordType), 580 + sort_field_enums: dict.Dict(String, schema.Type), 581 + ) -> List(schema.Field) { 582 + list.map(reverse_joins, fn(relationship) { 583 + // Generate field name: <sourceTypeName>Via<FieldName> 584 + // Example: appBskyFeedLikeViaSubject (camelCase) 585 + let field_name = 586 + lowercase_first(relationship.source_type_name) 587 + <> "Via" 588 + <> capitalize_first(relationship.source_field) 589 + 590 + // Get the source object type from the dict 591 + let source_object_type = case 592 + dict.get(object_types, relationship.source_collection) 593 + { 594 + Ok(obj_type) -> obj_type 595 + Error(_) -> schema.string_type() 596 + // Fallback to string type if not found 597 + } 598 + 599 + // Create Connection types for this join 600 + let edge_type = 601 + connection.edge_type(relationship.source_type_name, source_object_type) 602 + let connection_type = 603 + connection.connection_type(relationship.source_type_name, edge_type) 604 + 605 + // Find the RecordType for the source collection to build sortBy and where args 606 + let source_record_type = 607 + list.find(record_types, fn(rt) { 608 + rt.nsid == relationship.source_collection 609 + }) 610 + 611 + // Build connection args with sortBy and where support 612 + let connection_args = case source_record_type { 613 + Ok(record_type) -> { 614 + // Look 
up pre-built sort field enum from dict 615 + let sort_field_enum = case 616 + dict.get(sort_field_enums, record_type.type_name) 617 + { 618 + Ok(enum) -> enum 619 + Error(_) -> build_sort_field_enum(record_type) 620 + // Fallback: build if not found (shouldn't happen) 621 + } 622 + let where_input_type = build_where_input_type(record_type) 623 + 624 + lexicon_connection.lexicon_connection_args_with_field_enum_and_where( 625 + record_type.type_name, 626 + sort_field_enum, 627 + where_input_type, 628 + ) 629 + } 630 + Error(_) -> { 631 + // Fallback to basic pagination args if record type not found 632 + list.flatten([ 633 + connection.forward_pagination_args(), 634 + connection.backward_pagination_args(), 635 + ]) 636 + } 637 + } 638 + 639 + schema.field_with_args( 640 + field_name, 641 + connection_type, 642 + "Reverse join: records in " 643 + <> relationship.source_collection 644 + <> " that reference this record via " 645 + <> relationship.source_field, 646 + connection_args, 647 + fn(ctx) { 648 + // Extract the current record's URI 649 + case get_field_from_context(ctx, "uri") { 650 + Ok(parent_uri) -> { 651 + case paginated_batch_fetcher { 652 + option.Some(fetcher) -> { 653 + // Extract pagination params from context (includes sortBy and where) 654 + let pagination_params = extract_pagination_params(ctx) 655 + 656 + // Use paginated DataLoader to fetch records that reference this URI 657 + case 658 + dataloader.batch_fetch_by_reverse_join_paginated( 659 + parent_uri, 660 + relationship.source_collection, 661 + relationship.source_field, 662 + pagination_params, 663 + fetcher, 664 + ) 665 + { 666 + Ok(batch_result) -> { 667 + // Build edges from records with their cursors 668 + let edges = 669 + list.map(batch_result.edges, fn(edge_tuple) { 670 + let #(record_value, record_cursor) = edge_tuple 671 + connection.Edge( 672 + node: record_value, 673 + cursor: record_cursor, 674 + ) 675 + }) 676 + 677 + // Build PageInfo 678 + let page_info = 679 + 
connection.PageInfo( 680 + has_next_page: batch_result.has_next_page, 681 + has_previous_page: batch_result.has_previous_page, 682 + start_cursor: case list.first(edges) { 683 + Ok(edge) -> option.Some(edge.cursor) 684 + Error(_) -> option.None 685 + }, 686 + end_cursor: case list.last(edges) { 687 + Ok(edge) -> option.Some(edge.cursor) 688 + Error(_) -> option.None 689 + }, 690 + ) 691 + 692 + // Build Connection 693 + let conn = 694 + connection.Connection( 695 + edges: edges, 696 + page_info: page_info, 697 + total_count: batch_result.total_count, 698 + ) 699 + 700 + Ok(connection.connection_to_value(conn)) 701 + } 702 + Error(_) -> { 703 + // Return empty connection on error 704 + Ok(empty_connection_value()) 705 + } 706 + } 707 + } 708 + option.None -> { 709 + // No paginated batch fetcher - return empty connection 710 + Ok(empty_connection_value()) 711 + } 712 + } 713 + } 714 + Error(_) -> { 715 + // Can't get parent URI - return empty connection 716 + Ok(empty_connection_value()) 717 + } 718 + } 719 + }, 720 + ) 721 + }) 722 + } 723 + 724 + /// Build DID join fields from metadata with proper types 725 + /// These fields allow joining from any record to related records that share the same DID 726 + fn build_did_join_fields_with_types( 727 + did_join_targets: List(#(String, collection_meta.CollectionMeta)), 728 + batch_fetcher: option.Option(dataloader.BatchFetcher), 729 + paginated_batch_fetcher: option.Option(dataloader.PaginatedBatchFetcher), 730 + object_types: dict.Dict(String, schema.Type), 731 + record_types: List(RecordType), 732 + sort_field_enums: dict.Dict(String, schema.Type), 733 + ) -> List(schema.Field) { 734 + list.map(did_join_targets, fn(target_pair) { 735 + let #(target_nsid, target_meta) = target_pair 736 + 737 + // Generate field name: <targetTypeName>ByDid 738 + // Example: appBskyActorProfileByDid (camelCase with first letter lowercase) 739 + let field_name = lowercase_first(target_meta.type_name) <> "ByDid" 740 + 741 + // Get the target 
object type from the dict 742 + let target_object_type = case dict.get(object_types, target_nsid) { 743 + Ok(obj_type) -> obj_type 744 + Error(_) -> schema.string_type() 745 + // Fallback 746 + } 747 + 748 + // Determine field type and resolver based on cardinality 749 + // If has_unique_did is true, return single nullable object (no pagination) 750 + // Otherwise, return connection (with pagination) 751 + case target_meta.has_unique_did { 752 + True -> { 753 + // Unique DID join - returns single nullable object (e.g., profile) 754 + schema.field( 755 + field_name, 756 + target_object_type, 757 + "DID join: record in " 758 + <> target_nsid 759 + <> " that shares the same DID as this record", 760 + fn(ctx) { 761 + // Extract the DID from the current record's URI 762 + case get_field_from_context(ctx, "uri") { 763 + Ok(uri_value) -> { 764 + case extract_did_from_uri(uri_value) { 765 + option.Some(did) -> { 766 + case batch_fetcher { 767 + option.Some(fetcher) -> { 768 + // Use DataLoader to fetch records by DID 769 + case 770 + dataloader.batch_fetch_by_did( 771 + [did], 772 + target_nsid, 773 + fetcher, 774 + ) 775 + { 776 + Ok(results) -> { 777 + case dict.get(results, did) { 778 + Ok(records) -> { 779 + // Return single nullable object 780 + case records { 781 + [first, ..] 
-> Ok(first) 782 + [] -> Ok(value.Null) 783 + } 784 + } 785 + Error(_) -> Ok(value.Null) 786 + } 787 + } 788 + Error(_) -> Ok(value.Null) 789 + } 790 + } 791 + option.None -> Ok(value.Null) 792 + } 793 + } 794 + option.None -> Ok(value.Null) 795 + } 796 + } 797 + Error(_) -> Ok(value.Null) 798 + } 799 + }, 800 + ) 801 + } 802 + False -> { 803 + // Non-unique DID join - returns connection (e.g., posts) 804 + // Create Connection types for this join 805 + let edge_type = 806 + connection.edge_type(target_meta.type_name, target_object_type) 807 + let connection_type = 808 + connection.connection_type(target_meta.type_name, edge_type) 809 + 810 + // Find the RecordType for the target collection to build sortBy and where args 811 + let target_record_type = 812 + list.find(record_types, fn(rt) { rt.nsid == target_nsid }) 813 + 814 + // Build connection args with sortBy and where support 815 + let connection_args = case target_record_type { 816 + Ok(record_type) -> { 817 + // Look up pre-built sort field enum from dict 818 + let sort_field_enum = case 819 + dict.get(sort_field_enums, record_type.type_name) 820 + { 821 + Ok(enum) -> enum 822 + Error(_) -> build_sort_field_enum(record_type) 823 + // Fallback: build if not found (shouldn't happen) 824 + } 825 + let where_input_type = build_where_input_type(record_type) 826 + 827 + lexicon_connection.lexicon_connection_args_with_field_enum_and_where( 828 + record_type.type_name, 829 + sort_field_enum, 830 + where_input_type, 831 + ) 832 + } 833 + Error(_) -> { 834 + // Fallback to basic pagination args if record type not found 835 + list.flatten([ 836 + connection.forward_pagination_args(), 837 + connection.backward_pagination_args(), 838 + ]) 839 + } 840 + } 841 + 842 + schema.field_with_args( 843 + field_name, 844 + connection_type, 845 + "DID join: records in " 846 + <> target_nsid 847 + <> " that share the same DID as this record", 848 + connection_args, 849 + fn(ctx) { 850 + // Extract the DID from the current record's 
URI 851 + case get_field_from_context(ctx, "uri") { 852 + Ok(uri_value) -> { 853 + case extract_did_from_uri(uri_value) { 854 + option.Some(did) -> { 855 + case paginated_batch_fetcher { 856 + option.Some(fetcher) -> { 857 + // Extract pagination params from context (includes sortBy and where) 858 + let pagination_params = extract_pagination_params(ctx) 859 + 860 + // Use paginated DataLoader to fetch records by DID 861 + case 862 + dataloader.batch_fetch_by_did_paginated( 863 + did, 864 + target_nsid, 865 + pagination_params, 866 + fetcher, 867 + ) 868 + { 869 + Ok(batch_result) -> { 870 + // Build edges from records with their cursors 871 + let edges = 872 + list.map(batch_result.edges, fn(edge_tuple) { 873 + let #(record_value, record_cursor) = edge_tuple 874 + connection.Edge( 875 + node: record_value, 876 + cursor: record_cursor, 877 + ) 878 + }) 879 + 880 + // Build PageInfo 881 + let page_info = 882 + connection.PageInfo( 883 + has_next_page: batch_result.has_next_page, 884 + has_previous_page: batch_result.has_previous_page, 885 + start_cursor: case list.first(edges) { 886 + Ok(edge) -> option.Some(edge.cursor) 887 + Error(_) -> option.None 888 + }, 889 + end_cursor: case list.last(edges) { 890 + Ok(edge) -> option.Some(edge.cursor) 891 + Error(_) -> option.None 892 + }, 893 + ) 894 + 895 + // Build Connection 896 + let conn = 897 + connection.Connection( 898 + edges: edges, 899 + page_info: page_info, 900 + total_count: batch_result.total_count, 901 + ) 902 + 903 + Ok(connection.connection_to_value(conn)) 904 + } 905 + Error(_) -> { 906 + // Return empty connection on error 907 + Ok(empty_connection_value()) 908 + } 909 + } 910 + } 911 + option.None -> { 912 + // No paginated batch fetcher - return empty connection 913 + Ok(empty_connection_value()) 914 + } 915 + } 916 + } 917 + option.None -> { 918 + // Can't extract DID - return empty connection 919 + Ok(empty_connection_value()) 920 + } 921 + } 922 + } 923 + Error(_) -> { 924 + // Can't get URI - return 
empty connection 925 + Ok(empty_connection_value()) 926 + } 927 + } 928 + }, 929 + ) 930 + } 931 + } 932 + }) 933 + } 934 + 935 + /// Extract DID from an AT Protocol URI 936 + /// Format: at://did:plc:abc123/collection.nsid/rkey 937 + /// Returns: Some("did:plc:abc123") 938 + fn extract_did_from_uri(uri: String) -> option.Option(String) { 939 + case string.starts_with(uri, "at://") { 940 + True -> { 941 + let without_prefix = string.drop_start(uri, 5) 942 + // Remove "at://" 943 + case string.split(without_prefix, "/") { 944 + [did, ..] -> option.Some(did) 945 + _ -> option.None 946 + } 947 + } 948 + False -> option.None 949 + } 950 + } 951 + 952 + /// Capitalize the first letter of a string 953 + fn capitalize_first(s: String) -> String { 954 + case string.pop_grapheme(s) { 955 + Ok(#(first, rest)) -> string.uppercase(first) <> rest 956 + Error(_) -> "" 957 + } 958 + } 959 + 960 + /// Lowercase the first letter of a string (for camelCase field names) 961 + fn lowercase_first(s: String) -> String { 962 + case string.pop_grapheme(s) { 963 + Ok(#(first, rest)) -> string.lowercase(first) <> rest 964 + Error(_) -> "" 965 + } 966 + } 967 + 968 + /// Create an empty connection value with no edges 969 + /// Helper function to reduce code duplication across error handling paths 970 + /// Returns a GraphQL Value representing an empty connection 971 + fn empty_connection_value() -> value.Value { 972 + // Create empty connection structure directly as a Value 973 + // This avoids the need for type parameters while maintaining the connection structure 974 + value.Object([ 975 + #("edges", value.List([])), 976 + #( 977 + "pageInfo", 978 + value.Object([ 979 + #("hasNextPage", value.Boolean(False)), 980 + #("hasPreviousPage", value.Boolean(False)), 981 + #("startCursor", value.Null), 982 + #("endCursor", value.Null), 983 + ]), 984 + ), 985 + #("totalCount", value.Null), 986 + ]) 987 + } 988 + 989 + /// Build a Record union that can represent any record 990 + /// This is used for 
forward joins which can point to any collection 991 + fn build_record_union(possible_types: List(schema.Type)) -> schema.Type { 992 + // Type resolver examines the "collection" field to determine the concrete type 993 + let type_resolver = fn(ctx: schema.Context) -> Result(String, String) { 994 + case get_field_from_context(ctx, "collection") { 995 + Ok(collection_nsid) -> { 996 + // Convert NSID to type name (e.g., "app.bsky.feed.post" -> "AppBskyFeedPost") 997 + Ok(nsid.to_type_name(collection_nsid)) 998 + } 999 + Error(_) -> 1000 + Error("Could not determine record type: missing 'collection' field") 1001 + } 1002 + } 1003 + 1004 + schema.union_type( 1005 + "Record", 1006 + "A record from any collection", 1007 + possible_types, 1008 + type_resolver, 1009 + ) 1010 + } 1011 + 119 1012 /// Build GraphQL fields from lexicon properties 120 1013 fn build_fields( 121 1014 properties: List(#(String, types.Property)), 1015 + ref_object_types: Dict(String, schema.Type), 122 1016 ) -> List(schema.Field) { 123 1017 // Add standard AT Proto fields 124 1018 let standard_fields = [ ··· 179 1073 // Build fields from lexicon properties 180 1074 let lexicon_fields = 181 1075 list.map(properties, fn(prop) { 182 - let #(name, types.Property(type_, _required)) = prop 183 - let graphql_type = type_mapper.map_type(type_) 1076 + let #(name, types.Property(type_, _required, format, ref)) = prop 1077 + // Use map_type_with_registry to resolve refs to object types 1078 + let graphql_type = 1079 + type_mapper.map_type_with_registry(type_, format, ref, ref_object_types) 184 1080 185 1081 schema.field(name, graphql_type, "Field from lexicon", fn(ctx) { 186 1082 // Special handling for blob fields ··· 194 1090 } 195 1091 _ -> { 196 1092 // Try to extract field from the value object in context 197 - case get_nested_field_from_context(ctx, "value", name) { 198 - Ok(val) -> Ok(value.String(val)) 1093 + // Use the type-safe version that preserves Int, Float, Boolean types 1094 + case 
get_nested_field_value_from_context(ctx, "value", name) { 1095 + Ok(val) -> Ok(val) 199 1096 Error(_) -> Ok(value.Null) 200 1097 } 201 1098 } ··· 207 1104 list.append(standard_fields, lexicon_fields) 208 1105 } 209 1106 1107 + /// Check if a lexicon property is sortable based on its type 1108 + /// Sortable types: string, integer, boolean, number (primitive types) 1109 + /// Non-sortable types: blob, ref (complex types) 1110 + fn is_sortable_property(property: types.Property) -> Bool { 1111 + case property.type_, property.format { 1112 + // Primitive types are sortable 1113 + "string", _ -> True 1114 + "integer", _ -> True 1115 + "boolean", _ -> True 1116 + "number", _ -> True 1117 + // Blob and ref types are not sortable 1118 + "blob", _ -> False 1119 + "ref", _ -> False 1120 + // Default to non-sortable for unknown types 1121 + _, _ -> False 1122 + } 1123 + } 1124 + 1125 + /// Get all sortable field names for sort enums 1126 + /// Returns only database-sortable fields (excludes computed fields like actorHandle) 1127 + fn get_sortable_field_names_for_sorting(record_type: RecordType) -> List(String) { 1128 + // Filter properties to only sortable types, then get their field names 1129 + let sortable_property_names = 1130 + list.filter_map(record_type.properties, fn(prop) { 1131 + let #(field_name, property) = prop 1132 + case is_sortable_property(property) { 1133 + True -> Ok(field_name) 1134 + False -> Error(Nil) 1135 + } 1136 + }) 1137 + 1138 + // Add standard sortable fields from AT Protocol 1139 + // Note: actorHandle is NOT included because it's a computed field that requires a join 1140 + // and can't be used directly in ORDER BY clauses 1141 + let standard_sortable_fields = ["uri", "cid", "did", "collection", "indexedAt"] 1142 + list.append(standard_sortable_fields, sortable_property_names) 1143 + } 1144 + 1145 + /// Get all filterable field names for WHERE inputs 1146 + /// Returns all primitive fields including computed fields like actorHandle 1147 + fn 
get_filterable_field_names(record_type: RecordType) -> List(String) { 1148 + // Filter properties to only sortable types, then get their field names 1149 + let filterable_property_names = 1150 + list.filter_map(record_type.properties, fn(prop) { 1151 + let #(field_name, property) = prop 1152 + case is_sortable_property(property) { 1153 + True -> Ok(field_name) 1154 + False -> Error(Nil) 1155 + } 1156 + }) 1157 + 1158 + // Add standard filterable fields from AT Protocol 1159 + // Note: actorHandle IS included because WHERE clauses support filtering by it 1160 + // via a join with the actor table 1161 + let standard_filterable_fields = [ 1162 + "uri", 1163 + "cid", 1164 + "did", 1165 + "collection", 1166 + "indexedAt", 1167 + "actorHandle", 1168 + ] 1169 + list.append(standard_filterable_fields, filterable_property_names) 1170 + } 1171 + 210 1172 /// Build a SortFieldEnum for a record type with all its sortable fields 1173 + /// Only includes primitive fields (string, integer, boolean, number) from the original lexicon 1174 + /// Excludes complex types (blob, ref), join fields, and computed fields like actorHandle 211 1175 fn build_sort_field_enum(record_type: RecordType) -> schema.Type { 212 - // Get field names from the record type, excluding non-sortable fields 213 - let field_names = 214 - list.map(record_type.fields, fn(field) { schema.field_name(field) }) 215 - |> list.filter(fn(name) { name != "actorHandle" }) 1176 + // Get all sortable field names (excludes actorHandle and other computed fields) 1177 + let sortable_fields = get_sortable_field_names_for_sorting(record_type) 216 1178 217 1179 // Convert field names to enum values 218 1180 let enum_values = 219 - list.map(field_names, fn(field_name) { 1181 + list.map(sortable_fields, fn(field_name) { 220 1182 schema.enum_value(field_name, "Sort by " <> field_name) 221 1183 }) 222 1184 ··· 228 1190 } 229 1191 230 1192 /// Build a WhereInput type for a record type with all its filterable fields 1193 + /// Only 
includes primitive fields (string, integer, boolean, number) from the original lexicon 1194 + /// Excludes complex types (blob, ref) and join fields, but includes computed fields like actorHandle 231 1195 fn build_where_input_type(record_type: RecordType) -> schema.Type { 232 - // Get field names from the record type 233 - let field_names = 234 - list.map(record_type.fields, fn(field) { schema.field_name(field) }) 1196 + // Get filterable field names (includes actorHandle and other filterable computed fields) 1197 + let field_names = get_filterable_field_names(record_type) 235 1198 236 1199 // Use the connection module to build the where input type 237 1200 lexicon_connection.build_where_input_type(record_type.type_name, field_names) ··· 240 1203 /// Build the root Query type with fields for each record type 241 1204 fn build_query_type( 242 1205 record_types: List(RecordType), 1206 + object_types: dict.Dict(String, schema.Type), 243 1207 fetcher: RecordFetcher, 244 1208 ) -> schema.Type { 245 1209 let query_fields = 246 1210 list.map(record_types, fn(record_type) { 247 - // Build the object type for this record 248 - let object_type = 249 - schema.object_type( 250 - record_type.type_name, 251 - "Record type: " <> record_type.nsid, 252 - record_type.fields, 253 - ) 1211 + // Get the pre-built object type 1212 + let assert Ok(object_type) = dict.get(object_types, record_type.nsid) 254 1213 255 1214 // Create Connection types 256 1215 let edge_type = connection.edge_type(record_type.type_name, object_type) ··· 266 1225 // Build custom connection args with type-specific sort field enum and where input 267 1226 let connection_args = 268 1227 lexicon_connection.lexicon_connection_args_with_field_enum_and_where( 1228 + record_type.type_name, 269 1229 sort_field_enum, 270 1230 where_input_type, 271 1231 ) ··· 329 1289 } 330 1290 331 1291 /// Extract pagination parameters from GraphQL context 332 - fn extract_pagination_params(ctx: schema.Context) -> PaginationParams { 
1292 + fn extract_pagination_params(ctx: schema.Context) -> dataloader.PaginationParams { 333 1293 // Extract sortBy argument 334 1294 let sort_by = case schema.get_argument(ctx, "sortBy") { 335 1295 option.Some(value.List(items)) -> { ··· 369 1329 370 1330 // Extract first/after/last/before arguments 371 1331 let first = case schema.get_argument(ctx, "first") { 1332 + option.Some(value.Int(n)) -> option.Some(n) 372 1333 option.Some(value.String(s)) -> { 373 1334 case int.parse(s) { 374 1335 Ok(n) -> option.Some(n) 375 - Error(_) -> option.Some(50) 1336 + Error(_) -> option.None 376 1337 } 377 1338 } 378 - _ -> option.Some(50) 1339 + _ -> option.None 379 1340 } 380 1341 381 1342 let after = case schema.get_argument(ctx, "after") { ··· 384 1345 } 385 1346 386 1347 let last = case schema.get_argument(ctx, "last") { 1348 + option.Some(value.Int(n)) -> option.Some(n) 387 1349 option.Some(value.String(s)) -> { 388 1350 case int.parse(s) { 389 1351 Ok(n) -> option.Some(n) ··· 410 1372 _ -> option.None 411 1373 } 412 1374 413 - PaginationParams( 1375 + dataloader.PaginationParams( 414 1376 first: first, 415 1377 after: after, 416 1378 last: last, ··· 437 1399 } 438 1400 439 1401 /// Helper to extract a nested field value from resolver context 440 - fn get_nested_field_from_context( 1402 + /// Returns the value as a GraphQL value, handling all types (String, Int, Float, Boolean) 1403 + fn get_nested_field_value_from_context( 441 1404 ctx: schema.Context, 442 1405 parent_field: String, 443 1406 field_name: String, 444 - ) -> Result(String, Nil) { 1407 + ) -> Result(value.Value, Nil) { 445 1408 case ctx.data { 446 1409 option.Some(value.Object(fields)) -> { 447 1410 case list.key_find(fields, parent_field) { 448 1411 Ok(value.Object(nested_fields)) -> { 449 1412 case list.key_find(nested_fields, field_name) { 450 - Ok(value.String(val)) -> Ok(val) 1413 + Ok(val) -> Ok(val) 451 1414 _ -> Error(Nil) 452 1415 } 453 1416 } ··· 457 1420 _ -> Error(Nil) 458 1421 } 459 1422 } 1423 
+ /// Helper to extract a nested field value as Dynamic from resolver context 1424 + /// This version returns the raw Dynamic value for processing by uri_extractor 1425 + fn get_nested_field_from_context_dynamic( 1426 + ctx: schema.Context, 1427 + parent_field: String, 1428 + field_name: String, 1429 + ) -> Result(Dynamic, Nil) { 1430 + case ctx.data { 1431 + option.Some(value.Object(fields)) -> { 1432 + case list.key_find(fields, parent_field) { 1433 + Ok(value.Object(nested_fields)) -> { 1434 + case list.key_find(nested_fields, field_name) { 1435 + Ok(field_value) -> { 1436 + // Convert the GraphQL Value to Dynamic for uri_extractor 1437 + Ok(value_to_dynamic(field_value)) 1438 + } 1439 + _ -> Error(Nil) 1440 + } 1441 + } 1442 + _ -> Error(Nil) 1443 + } 1444 + } 1445 + _ -> Error(Nil) 1446 + } 1447 + } 1448 + 1449 + /// Convert a GraphQL Value to Dynamic 1450 + fn value_to_dynamic(v: value.Value) -> Dynamic { 1451 + case v { 1452 + value.String(s) -> unsafe_coerce_to_dynamic(s) 1453 + value.Int(i) -> unsafe_coerce_to_dynamic(i) 1454 + value.Float(f) -> unsafe_coerce_to_dynamic(f) 1455 + value.Boolean(b) -> unsafe_coerce_to_dynamic(b) 1456 + value.Null -> unsafe_coerce_to_dynamic(option.None) 1457 + value.Object(fields) -> { 1458 + // Convert object fields to a dict-like structure 1459 + let field_map = 1460 + list.fold(fields, dict.new(), fn(acc, field) { 1461 + let #(key, val) = field 1462 + dict.insert(acc, key, value_to_dynamic(val)) 1463 + }) 1464 + unsafe_coerce_to_dynamic(field_map) 1465 + } 1466 + value.List(items) -> { 1467 + let converted = list.map(items, value_to_dynamic) 1468 + unsafe_coerce_to_dynamic(converted) 1469 + } 1470 + value.Enum(name) -> unsafe_coerce_to_dynamic(name) 1471 + } 1472 + } 1473 + 1474 + @external(erlang, "dataloader_ffi", "identity") 1475 + fn unsafe_coerce_to_dynamic(value: a) -> Dynamic 460 1476 461 1477 /// Extract blob data from AT Protocol format and convert to Blob type format 462 1478 /// AT Protocol blob format:
+78 -14
lexicon_graphql/src/lexicon_graphql/lexicon_parser.gleam
··· 6 6 import gleam/dynamic/decode 7 7 import gleam/json 8 8 import gleam/list 9 + import gleam/option.{None} 9 10 import gleam/result 10 11 import lexicon_graphql/types 11 12 ··· 24 25 25 26 /// Create a decoder for the defs object 26 27 fn decode_defs() -> decode.Decoder(types.Defs) { 27 - use main <- decode.field("main", decode_record_def()) 28 - decode.success(types.Defs(main:)) 28 + use main <- decode.optional_field("main", None, decode.optional(decode_record_def())) 29 + use defs_dict <- decode.then(decode.dict(decode.string, decode.dynamic)) 30 + 31 + // Filter out "main" and parse remaining defs 32 + let others_list = 33 + defs_dict 34 + |> dict.to_list 35 + |> list.filter(fn(entry) { entry.0 != "main" }) 36 + |> list.filter_map(fn(entry) { 37 + let #(name, def_dyn) = entry 38 + case decode_def(def_dyn) { 39 + Ok(def) -> Ok(#(name, def)) 40 + Error(_) -> Error(Nil) 41 + } 42 + }) 43 + 44 + let others = dict.from_list(others_list) 45 + decode.success(types.Defs(main:, others:)) 46 + } 47 + 48 + /// Decode a definition (either record or object) 49 + fn decode_def(dyn: decode.Dynamic) -> Result(types.Def, List(decode.DecodeError)) { 50 + // Try to decode as object first (simpler structure) 51 + case decode_object_def_inner(dyn) { 52 + Ok(obj_def) -> Ok(types.Object(obj_def)) 53 + Error(_) -> { 54 + // Try as record 55 + case decode.run(dyn, decode_record_def()) { 56 + Ok(rec_def) -> Ok(types.Record(rec_def)) 57 + Error(e) -> Error(e) 58 + } 59 + } 60 + } 61 + } 62 + 63 + /// Decode an object definition (inner, returns ObjectDef not Def) 64 + fn decode_object_def_inner(dyn: decode.Dynamic) -> Result(types.ObjectDef, List(decode.DecodeError)) { 65 + let decoder = { 66 + use type_ <- decode.field("type", decode.string) 67 + use required_fields <- decode.optional_field("required", [], decode.list(decode.string)) 68 + use properties_dict <- decode.field("properties", decode.dict(decode.string, decode.dynamic)) 69 + 70 + // Convert dict to list of properties 71 + 
let properties = 72 + properties_dict 73 + |> dict.to_list 74 + |> list.map(fn(entry) { 75 + let #(name, prop_dyn) = entry 76 + let is_required = list.contains(required_fields, name) 77 + 78 + let #(prop_type, prop_format, prop_ref) = case decode_property(prop_dyn) { 79 + Ok(#(t, f, r)) -> #(t, f, r) 80 + Error(_) -> #("string", None, None) 81 + } 82 + 83 + #(name, types.Property(prop_type, is_required, prop_format, prop_ref)) 84 + }) 85 + 86 + decode.success(types.ObjectDef(type_:, required_fields:, properties:)) 87 + } 88 + decode.run(dyn, decoder) 29 89 } 30 90 31 91 /// Create a decoder for a record definition 32 92 fn decode_record_def() -> decode.Decoder(types.RecordDef) { 33 93 use type_ <- decode.field("type", decode.string) 94 + use key <- decode.optional_field("key", None, decode.optional(decode.string)) 34 95 use record <- decode.field("record", decode_record_object()) 35 - decode.success(types.RecordDef(type_:, properties: record)) 96 + decode.success(types.RecordDef(type_:, key:, properties: record)) 36 97 } 37 98 38 99 /// Create a decoder for the record object which contains properties ··· 56 117 let #(name, prop_dyn) = entry 57 118 let is_required = list.contains(required_list, name) 58 119 59 - // Extract the type from the property 60 - let prop_type = case decode_property_type(prop_dyn) { 61 - Ok(type_) -> type_ 62 - Error(_) -> "string" 120 + // Extract type, format, and ref from the property 121 + let #(prop_type, prop_format, prop_ref) = case decode_property(prop_dyn) { 122 + Ok(#(t, f, r)) -> #(t, f, r) 123 + Error(_) -> #("string", None, None) 63 124 // Default fallback 64 125 } 65 126 66 - #(name, types.Property(prop_type, is_required)) 127 + #(name, types.Property(prop_type, is_required, prop_format, prop_ref)) 67 128 }) 68 129 69 130 decode.success(properties) 70 131 } 71 132 72 - /// Decode a property's type field 73 - fn decode_property_type( 133 + /// Decode a property's type, format, and ref fields 134 + fn decode_property( 74 135 dyn: 
decode.Dynamic, 75 - ) -> Result(String, List(decode.DecodeError)) { 76 - let type_decoder = { 136 + ) -> Result(#(String, option.Option(String), option.Option(String)), List(decode.DecodeError)) { 137 + let property_decoder = { 77 138 use type_ <- decode.field("type", decode.string) 78 - decode.success(type_) 139 + use format <- decode.optional_field("format", None, decode.optional(decode.string)) 140 + use ref <- decode.optional_field("ref", None, decode.optional(decode.string)) 141 + 142 + decode.success(#(type_, format, ref)) 79 143 } 80 - decode.run(dyn, type_decoder) 144 + decode.run(dyn, property_decoder) 81 145 }
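The parser change above widens each property from a bare type string to a `#(type, format, ref)` triple. As a rough illustration (in Python, since the surrounding code is Gleam; the helper name mirrors the Gleam one but the dict shapes are assumptions), the decode logic amounts to:

```python
# Minimal sketch of the decode_property change: pull "type" plus the
# optional "format" and "ref" keys, with the same plain-string fallback
# the Gleam code uses when a property fails to decode.

def decode_property(prop) -> tuple:
    """Extract (type, format, ref) from a raw lexicon property dict."""
    try:
        return (prop["type"], prop.get("format"), prop.get("ref"))
    except (TypeError, KeyError):
        # Mirrors the Gleam fallback: Error(_) -> #("string", None, None)
        return ("string", None, None)

properties = {
    "text": {"type": "string", "maxLength": 300},
    "aspectRatio": {"type": "ref", "ref": "social.grain.defs#aspectRatio"},
}
decoded = {name: decode_property(raw) for name, raw in properties.items()}
# decoded["aspectRatio"] -> ("ref", None, "social.grain.defs#aspectRatio")
```

Carrying `format` and `ref` through the parser is what lets later stages distinguish `at-uri` strings (join candidates) from ordinary strings, and resolve `ref` properties against the registry.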
+82
lexicon_graphql/src/lexicon_graphql/lexicon_registry.gleam
··· 1 + /// Lexicon Registry 2 + /// 3 + /// Provides cross-lexicon ref resolution for object types. 4 + /// This allows looking up definitions like "social.grain.defs#aspectRatio" 5 + /// across all loaded lexicons. 6 + import gleam/dict.{type Dict} 7 + import gleam/list 8 + import gleam/option.{type Option} 9 + import gleam/string 10 + import lexicon_graphql/types 11 + 12 + /// Registry that holds all lexicons and allows ref lookups 13 + pub type Registry { 14 + Registry( 15 + /// All lexicons indexed by ID 16 + lexicons: Dict(String, types.Lexicon), 17 + /// All object definitions indexed by fully-qualified ref (e.g., "social.grain.defs#aspectRatio") 18 + object_defs: Dict(String, types.ObjectDef), 19 + ) 20 + } 21 + 22 + /// Create a registry from a list of lexicons 23 + pub fn from_lexicons(lexicons: List(types.Lexicon)) -> Registry { 24 + // Build lexicons dict 25 + let lexicons_dict = 26 + lexicons 27 + |> list.map(fn(lex) { #(lex.id, lex) }) 28 + |> dict.from_list 29 + 30 + // Build object defs dict by extracting all object definitions from all lexicons 31 + let object_defs_dict = 32 + lexicons 33 + |> list.flat_map(fn(lex) { 34 + // Extract all object definitions from this lexicon's "others" dict 35 + lex.defs.others 36 + |> dict.to_list 37 + |> list.filter_map(fn(entry) { 38 + let #(name, def) = entry 39 + case def { 40 + types.Object(obj_def) -> { 41 + // Create fully-qualified ref: "lexicon.id#defName" 42 + let ref = lex.id <> "#" <> name 43 + Ok(#(ref, obj_def)) 44 + } 45 + types.Record(_) -> Error(Nil) 46 + } 47 + }) 48 + }) 49 + |> dict.from_list 50 + 51 + Registry(lexicons: lexicons_dict, object_defs: object_defs_dict) 52 + } 53 + 54 + /// Look up an object definition by ref (e.g., "social.grain.defs#aspectRatio") 55 + pub fn get_object_def( 56 + registry: Registry, 57 + ref: String, 58 + ) -> Option(types.ObjectDef) { 59 + dict.get(registry.object_defs, ref) 60 + |> option.from_result 61 + } 62 + 63 + /// Look up a lexicon by ID 64 + pub fn 
get_lexicon(registry: Registry, id: String) -> Option(types.Lexicon) { 65 + dict.get(registry.lexicons, id) 66 + |> option.from_result 67 + } 68 + 69 + /// Parse a ref into lexicon ID and definition name 70 + /// Example: "social.grain.defs#aspectRatio" -> #("social.grain.defs", "aspectRatio") 71 + pub fn parse_ref(ref: String) -> Option(#(String, String)) { 72 + case string.split(ref, "#") { 73 + [lexicon_id, def_name] -> option.Some(#(lexicon_id, def_name)) 74 + _ -> option.None 75 + } 76 + } 77 + 78 + /// Get all object definition refs from the registry 79 + pub fn get_all_object_refs(registry: Registry) -> List(String) { 80 + registry.object_defs 81 + |> dict.keys 82 + }
+16 -98
lexicon_graphql/src/lexicon_graphql/mutation_builder.gleam
··· 5 5 /// 6 6 /// Note: The resolvers are currently stubs. Actual mutation logic should be 7 7 /// implemented in the server layer by extracting data from the GraphQL context. 8 + import gleam/dict 8 9 import gleam/list 9 10 import gleam/option 10 11 import graphql/schema ··· 35 36 /// Resolver factories are optional - if None, mutations will return errors 36 37 pub fn build_mutation_type( 37 38 lexicons: List(types.Lexicon), 39 + object_types: dict.Dict(String, schema.Type), 38 40 create_factory: option.Option(ResolverFactory), 39 41 update_factory: option.Option(ResolverFactory), 40 42 delete_factory: option.Option(ResolverFactory), ··· 43 45 // Extract record types 44 46 let record_types = extract_record_types(lexicons) 45 47 46 - // Build mutation fields for each record type 48 + // Build mutation fields for each record type using complete object types 47 49 let record_mutation_fields = 48 50 list.flat_map(record_types, fn(record) { 49 51 build_mutations_for_record( 50 52 record, 53 + object_types, 51 54 create_factory, 52 55 update_factory, 53 56 delete_factory, ··· 85 88 case lexicon { 86 89 types.Lexicon( 87 90 id, 88 - types.Defs(types.RecordDef("record", properties)), 91 + types.Defs(option.Some(types.RecordDef("record", _, props)), _others), 89 92 ) -> { 90 93 let type_name = nsid.to_type_name(id) 91 - Ok(RecordInfo(nsid: id, type_name: type_name, properties: properties)) 94 + Ok(RecordInfo(nsid: id, type_name: type_name, properties: props)) 92 95 } 93 96 _ -> Error(Nil) 94 97 } ··· 98 101 /// Build all three mutations (create, update, delete) for a record type 99 102 fn build_mutations_for_record( 100 103 record: RecordInfo, 104 + object_types: dict.Dict(String, schema.Type), 101 105 create_factory: option.Option(ResolverFactory), 102 106 update_factory: option.Option(ResolverFactory), 103 107 delete_factory: option.Option(ResolverFactory), 104 108 ) -> List(schema.Field) { 105 - let create_mutation = build_create_mutation(record, create_factory) 106 - 
let update_mutation = build_update_mutation(record, update_factory) 109 + let create_mutation = build_create_mutation(record, object_types, create_factory) 110 + let update_mutation = build_update_mutation(record, object_types, update_factory) 107 111 let delete_mutation = build_delete_mutation(record, delete_factory) 108 112 109 113 [create_mutation, update_mutation, delete_mutation] ··· 113 117 /// Signature: create{TypeName}(input: {TypeName}Input!, rkey: String): {TypeName} 114 118 fn build_create_mutation( 115 119 record: RecordInfo, 120 + object_types: dict.Dict(String, schema.Type), 116 121 factory: option.Option(ResolverFactory), 117 122 ) -> schema.Field { 118 123 let mutation_name = "create" <> record.type_name ··· 121 126 // Build the input type 122 127 let input_type = build_input_type(input_type_name, record.properties) 123 128 124 - // Build the return type (the record object type) 125 - let return_type = build_record_object_type(record) 129 + // Get the complete object type from the dict (includes all join fields) 130 + let assert Ok(return_type) = dict.get(object_types, record.nsid) 126 131 127 132 // Create arguments 128 133 let arguments = [ ··· 168 173 /// Signature: update{TypeName}(rkey: String!, input: {TypeName}Input!): {TypeName} 169 174 fn build_update_mutation( 170 175 record: RecordInfo, 176 + object_types: dict.Dict(String, schema.Type), 171 177 factory: option.Option(ResolverFactory), 172 178 ) -> schema.Field { 173 179 let mutation_name = "update" <> record.type_name ··· 176 182 // Build the input type 177 183 let input_type = build_input_type(input_type_name, record.properties) 178 184 179 - // Build the return type (the record object type) 180 - let return_type = build_record_object_type(record) 185 + // Get the complete object type from the dict (includes all join fields) 186 + let assert Ok(return_type) = dict.get(object_types, record.nsid) 181 187 182 188 // Create arguments 183 189 let arguments = [ ··· 271 277 ) -> schema.Type { 
272 278 let input_fields = 273 279 list.map(properties, fn(prop) { 274 - let #(name, types.Property(type_, required)) = prop 280 + let #(name, types.Property(type_, required, _, _)) = prop 275 281 // Use map_input_type to get input-compatible types (e.g., BlobInput instead of Blob) 276 282 let graphql_type = type_mapper.map_input_type(type_) 277 283 ··· 308 314 ] 309 315 310 316 schema.object_type("DeleteResult", "Result of a delete mutation", fields) 311 - } 312 - 313 - /// Build an ObjectType representing a record 314 - fn build_record_object_type(record: RecordInfo) -> schema.Type { 315 - // Build standard AT Proto fields that extract data from parent context 316 - let standard_fields = [ 317 - schema.field("uri", schema.string_type(), "Record URI", fn(ctx) { 318 - // Extract from parent object data 319 - case ctx.data { 320 - option.Some(value.Object(fields)) -> { 321 - case list.key_find(fields, "uri") { 322 - Ok(val) -> Ok(val) 323 - Error(_) -> Ok(value.String("at://did:plc:example/collection/rkey")) 324 - } 325 - } 326 - _ -> Ok(value.String("at://did:plc:example/collection/rkey")) 327 - } 328 - }), 329 - schema.field("cid", schema.string_type(), "Record CID", fn(ctx) { 330 - case ctx.data { 331 - option.Some(value.Object(fields)) -> { 332 - case list.key_find(fields, "cid") { 333 - Ok(val) -> Ok(val) 334 - Error(_) -> Ok(value.String("bafyreicid")) 335 - } 336 - } 337 - _ -> Ok(value.String("bafyreicid")) 338 - } 339 - }), 340 - schema.field("did", schema.string_type(), "DID of record author", fn(ctx) { 341 - case ctx.data { 342 - option.Some(value.Object(fields)) -> { 343 - case list.key_find(fields, "did") { 344 - Ok(val) -> Ok(val) 345 - Error(_) -> Ok(value.String("did:plc:example")) 346 - } 347 - } 348 - _ -> Ok(value.String("did:plc:example")) 349 - } 350 - }), 351 - schema.field( 352 - "indexedAt", 353 - schema.string_type(), 354 - "When record was indexed", 355 - fn(ctx) { 356 - case ctx.data { 357 - option.Some(value.Object(fields)) -> { 358 - 
case list.key_find(fields, "indexedAt") { 359 - Ok(val) -> Ok(val) 360 - Error(_) -> Ok(value.String("2024-01-01T00:00:00Z")) 361 - } 362 - } 363 - _ -> Ok(value.String("2024-01-01T00:00:00Z")) 364 - } 365 - }, 366 - ), 367 - schema.field("collection", schema.string_type(), "Collection name", fn(ctx) { 368 - case ctx.data { 369 - option.Some(value.Object(fields)) -> { 370 - case list.key_find(fields, "collection") { 371 - Ok(val) -> Ok(val) 372 - Error(_) -> Ok(value.Null) 373 - } 374 - } 375 - _ -> Ok(value.Null) 376 - } 377 - }), 378 - ] 379 - 380 - // Build fields from lexicon properties 381 - let lexicon_fields = 382 - list.map(record.properties, fn(prop) { 383 - let #(name, types.Property(type_, _required)) = prop 384 - let graphql_type = type_mapper.map_type(type_) 385 - 386 - schema.field(name, graphql_type, "Field from lexicon", fn(_ctx) { 387 - Ok(value.Null) 388 - }) 389 - }) 390 - 391 - // Combine all fields 392 - let all_fields = list.append(standard_fields, lexicon_fields) 393 - 394 - schema.object_type( 395 - record.type_name, 396 - "Record type: " <> record.nsid, 397 - all_fields, 398 - ) 399 317 } 400 318 401 319 /// Build BlobUploadResponse type
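The net effect of this hunk is that mutations stop building their own stub return type and instead share the pre-built object types with the query builder. A small illustrative sketch (Python, with hypothetical stand-in values for the GraphQL types) of the lookup pattern:

```python
# Sketch of the shared object-types dict: built once, keyed by NSID,
# consulted by both query and mutation builders so create/update
# mutations return the complete type (join fields included) rather
# than a locally rebuilt copy.

def build_object_types(record_types):
    return {r["nsid"]: f"ObjectType({r['type_name']})" for r in record_types}

record_types = [{"nsid": "app.bsky.feed.post", "type_name": "AppBskyFeedPost"}]
object_types = build_object_types(record_types)

# Mutation builders now do a lookup instead of reconstructing the type:
return_type = object_types["app.bsky.feed.post"]
mutation_name = "create" + "AppBskyFeedPost"
```

Sharing one dict also guarantees the query type and mutation return types are the *same* GraphQL type object, avoiding duplicate type-name registrations in the schema.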
+100
lexicon_graphql/src/lexicon_graphql/object_type_builder.gleam
··· 1 + /// Object Type Builder 2 + /// 3 + /// Builds GraphQL object types from ObjectDef definitions. 4 + /// Used for nested object types like aspectRatio that are defined 5 + /// as refs in lexicons (e.g., "social.grain.defs#aspectRatio") 6 + import gleam/dict.{type Dict} 7 + import gleam/list 8 + import gleam/option 9 + import gleam/string 10 + import graphql/schema 11 + import graphql/value 12 + import lexicon_graphql/lexicon_registry 13 + import lexicon_graphql/nsid 14 + import lexicon_graphql/type_mapper 15 + import lexicon_graphql/types 16 + 17 + /// Build a GraphQL object type from an ObjectDef 18 + /// object_types_dict is used to resolve refs to other object types 19 + pub fn build_object_type( 20 + obj_def: types.ObjectDef, 21 + type_name: String, 22 + object_types_dict: Dict(String, schema.Type), 23 + ) -> schema.Type { 24 + let fields = build_object_fields(obj_def.properties, object_types_dict) 25 + 26 + schema.object_type( 27 + type_name, 28 + "Object type from lexicon definition", 29 + fields, 30 + ) 31 + } 32 + 33 + /// Build GraphQL fields from object properties 34 + fn build_object_fields( 35 + properties: List(#(String, types.Property)), 36 + object_types_dict: Dict(String, schema.Type), 37 + ) -> List(schema.Field) { 38 + list.map(properties, fn(prop) { 39 + let #(name, types.Property(type_, required, format, ref)) = prop 40 + 41 + // Map the type, using the object_types_dict to resolve refs 42 + let graphql_type = 43 + type_mapper.map_type_with_registry(type_, format, ref, object_types_dict) 44 + 45 + // Make required fields non-null 46 + let field_type = case required { 47 + True -> schema.non_null(graphql_type) 48 + False -> graphql_type 49 + } 50 + 51 + // Create field with a resolver that extracts the value from context 52 + schema.field(name, field_type, "Field from object definition", fn(ctx) { 53 + case ctx.data { 54 + option.Some(value.Object(fields)) -> { 55 + case list.key_find(fields, name) { 56 + Ok(val) -> Ok(val) 57 + Error(_) 
-> Ok(value.Null) 58 + } 59 + } 60 + _ -> Ok(value.Null) 61 + } 62 + }) 63 + }) 64 + } 65 + 66 + /// Build a dict of all object types from the registry 67 + /// Keys are the fully-qualified refs (e.g., "social.grain.defs#aspectRatio") 68 + /// 69 + /// Note: This builds types recursively. Object types that reference other object types 70 + /// will have those refs resolved using the same dict (which gets built incrementally). 71 + pub fn build_all_object_types( 72 + registry: lexicon_registry.Registry, 73 + ) -> Dict(String, schema.Type) { 74 + let object_refs = lexicon_registry.get_all_object_refs(registry) 75 + 76 + // Build all object types in a single pass 77 + // For simple cases (no circular refs), this works fine 78 + // TODO: Handle circular refs if needed 79 + list.fold(object_refs, dict.new(), fn(acc, ref) { 80 + case lexicon_registry.get_object_def(registry, ref) { 81 + option.Some(obj_def) -> { 82 + // Generate a GraphQL type name from the ref 83 + // e.g., "social.grain.defs#aspectRatio" -> "SocialGrainDefsAspectRatio" 84 + let type_name = ref_to_type_name(ref) 85 + // Pass acc as the object_types_dict so we can resolve refs to previously built types 86 + let object_type = build_object_type(obj_def, type_name, acc) 87 + dict.insert(acc, ref, object_type) 88 + } 89 + option.None -> acc 90 + } 91 + }) 92 + } 93 + 94 + /// Convert a ref to a GraphQL type name 95 + /// Example: "social.grain.defs#aspectRatio" -> "SocialGrainDefsAspectRatio" 96 + fn ref_to_type_name(ref: String) -> String { 97 + // Replace # with . first, then convert to PascalCase 98 + let normalized = string.replace(ref, "#", ".") 99 + nsid.to_type_name(normalized) 100 + }
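The `ref_to_type_name` conversion at the end of this file can be sketched in Python, assuming `nsid.to_type_name` PascalCases each dot-separated segment (consistent with `"app.bsky.feed.post"` mapping to `AppBskyFeedPost` elsewhere in the diff):

```python
# Sketch of ref_to_type_name: "#" is normalized to "." first, then the
# whole identifier is PascalCased segment by segment.

def to_type_name(nsid: str) -> str:
    """Assumed behavior of nsid.to_type_name."""
    return "".join(seg[:1].upper() + seg[1:] for seg in nsid.split("."))

def ref_to_type_name(ref: str) -> str:
    # "social.grain.defs#aspectRatio" -> "SocialGrainDefsAspectRatio"
    return to_type_name(ref.replace("#", "."))
```

Normalizing `#` to `.` before casing keeps the def name (`aspectRatio`) as just another segment, so refs and plain NSIDs share one naming path.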
+30 -12
lexicon_graphql/src/lexicon_graphql/schema_builder.gleam
··· 2 2 /// 3 3 /// Builds GraphQL schemas from AT Protocol lexicon definitions. 4 4 /// Simplified MVP version - handles basic record types only. 5 + import gleam/dict 5 6 import gleam/list 6 7 import gleam/option 7 8 import graphql/schema ··· 45 46 // Extract record types from lexicons 46 47 let record_types = extract_record_types(lexicons) 47 48 49 + // Build object types dict for sharing between queries and mutations 50 + let object_types = build_object_types_dict(record_types) 51 + 48 52 // Build the query type with fields for each record 49 - let query_type = build_query_type(record_types) 53 + let query_type = build_query_type(record_types, object_types) 50 54 51 - // Build the mutation type with stub resolvers 55 + // Build the mutation type with stub resolvers, using shared object types 52 56 let mutation_type = 53 57 mutation_builder.build_mutation_type( 54 58 lexicons, 59 + object_types, 55 60 option.None, 56 61 option.None, 57 62 option.None, ··· 73 78 /// Parse a single lexicon into a RecordType 74 79 fn parse_lexicon(lexicon: Lexicon) -> Result(RecordType, Nil) { 75 80 case lexicon { 76 - types.Lexicon(id, types.Defs(types.RecordDef("record", properties))) -> { 81 + types.Lexicon(id, types.Defs(option.Some(types.RecordDef("record", _, properties)), _)) -> { 77 82 let type_name = nsid.to_type_name(id) 78 83 let field_name = nsid.to_field_name(id) 79 84 let fields = build_fields(properties) ··· 113 118 // Build fields from lexicon properties 114 119 let lexicon_fields = 115 120 list.map(properties, fn(prop) { 116 - let #(name, types.Property(type_, _required)) = prop 121 + let #(name, types.Property(type_, _required, _, _)) = prop 117 122 let graphql_type = type_mapper.map_type(type_) 118 123 119 124 schema.field(name, graphql_type, "Field from lexicon", fn(_ctx) { ··· 125 130 list.append(standard_fields, lexicon_fields) 126 131 } 127 132 133 + /// Build a dict of object types keyed by nsid 134 + fn build_object_types_dict( 135 + record_types: 
List(RecordType), 136 + ) -> dict.Dict(String, schema.Type) { 137 + list.fold(record_types, dict.new(), fn(acc, record_type) { 138 + let object_type = 139 + schema.object_type( 140 + record_type.type_name, 141 + "Record type: " <> record_type.nsid, 142 + record_type.fields, 143 + ) 144 + dict.insert(acc, record_type.nsid, object_type) 145 + }) 146 + } 147 + 128 148 /// Build the root Query type with fields for each record type 129 - fn build_query_type(record_types: List(RecordType)) -> schema.Type { 149 + fn build_query_type( 150 + record_types: List(RecordType), 151 + object_types: dict.Dict(String, schema.Type), 152 + ) -> schema.Type { 130 153 let query_fields = 131 154 list.map(record_types, fn(record_type) { 132 - // Build the object type for this record 133 - let object_type = 134 - schema.object_type( 135 - record_type.type_name, 136 - "Record type: " <> record_type.nsid, 137 - record_type.fields, 138 - ) 155 + // Get the pre-built object type from dict 156 + let assert Ok(object_type) = dict.get(object_types, record_type.nsid) 139 157 140 158 // Create a list type for the query field 141 159 let list_type = schema.list_type(object_type)
+30
lexicon_graphql/src/lexicon_graphql/type_mapper.gleam
··· 4 4 /// Simplified MVP version - handles basic types only. 5 5 /// 6 6 /// Based on the Elixir implementation but adapted for the pure Gleam GraphQL library. 7 + import gleam/dict.{type Dict} 8 + import gleam/option.{type Option} 7 9 import graphql/schema 8 10 import lexicon_graphql/blob_type 9 11 ··· 76 78 pub fn get_blob_type() -> schema.Type { 77 79 blob_type.create_blob_type() 78 80 } 81 + 82 + /// Maps a lexicon type to a GraphQL type, resolving refs using a registry 83 + /// and object types dict. 84 + /// 85 + /// This function handles: 86 + /// - Regular types: maps using map_type() 87 + /// - Refs: looks up the ref in object_types_dict to get the actual object type 88 + /// 89 + /// Used by object_type_builder to build nested object types. 90 + pub fn map_type_with_registry( 91 + lexicon_type: String, 92 + _format: Option(String), 93 + ref: Option(String), 94 + object_types_dict: Dict(String, schema.Type), 95 + ) -> schema.Type { 96 + case lexicon_type { 97 + "ref" -> 98 + case ref { 99 + option.Some(ref_str) -> 100 + case dict.get(object_types_dict, ref_str) { 101 + Ok(object_type) -> object_type 102 + Error(_) -> schema.string_type() 103 + } 104 + option.None -> schema.string_type() 105 + } 106 + _ -> map_type(lexicon_type) 107 + } 108 + }
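To make the ref-resolution and fallback behaviour of `map_type_with_registry` concrete, a hedged sketch (the `aspect_ratio_type` parameter stands in for a type already built by `object_type_builder`; it is an assumption of this example, not code from the diff):

```gleam
import gleam/dict
import gleam/option
import lexicon_graphql/type_mapper

pub fn resolve_ref_sketch(aspect_ratio_type) {
  let object_types =
    dict.from_list([#("social.grain.defs#aspectRatio", aspect_ratio_type)])

  // Known ref: resolves to the pre-built object type from the dict.
  let _resolved =
    type_mapper.map_type_with_registry(
      "ref",
      option.None,
      option.Some("social.grain.defs#aspectRatio"),
      object_types,
    )

  // Unknown ref: degrades to String instead of failing schema generation.
  type_mapper.map_type_with_registry(
    "ref",
    option.None,
    option.Some("unknown.defs#missing"),
    object_types,
  )
}
```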
+32 -4
lexicon_graphql/src/lexicon_graphql/types.gleam
··· 2 2 /// 3 3 /// Common type definitions used across lexicon_graphql modules. 4 4 /// This module exists to break import cycles between schema_builder and mutation_builder. 5 + import gleam/dict.{type Dict} 6 + import gleam/option.{type Option} 5 7 6 8 /// Lexicon definition structure (simplified) 7 9 pub type Lexicon { ··· 9 11 } 10 12 11 13 /// Lexicon definitions container 14 + /// Contains an optional main record definition and any other named definitions (e.g., object types) 15 + /// Some lexicons (like social.grain.defs) only contain helper object types, not a main record 12 16 pub type Defs { 13 - Defs(main: RecordDef) 17 + Defs(main: Option(RecordDef), others: Dict(String, Def)) 14 18 } 15 19 16 - /// Record definition 20 + /// A definition can be either a record or an object 21 + pub type Def { 22 + Record(RecordDef) 23 + Object(ObjectDef) 24 + } 25 + 26 + /// Record definition (a collection/record type) 17 27 pub type RecordDef { 18 - RecordDef(type_: String, properties: List(#(String, Property))) 28 + RecordDef( 29 + type_: String, 30 + key: Option(String), 31 + properties: List(#(String, Property)), 32 + ) 33 + } 34 + 35 + /// Object definition (a nested object type like aspectRatio) 36 + pub type ObjectDef { 37 + ObjectDef( 38 + type_: String, 39 + required_fields: List(String), 40 + properties: List(#(String, Property)), 41 + ) 19 42 } 20 43 21 44 /// Property definition 22 45 pub type Property { 23 - Property(type_: String, required: Bool) 46 + Property( 47 + type_: String, 48 + required: Bool, 49 + format: Option(String), 50 + ref: Option(String), 51 + ) 24 52 }
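A sketch of how a defs-only lexicon (such as `social.grain.defs`, which carries helper object types but no main record) is represented with the new `Defs` shape; the `aspectRatio` properties here are illustrative:

```gleam
import gleam/dict
import gleam/option.{None}
import lexicon_graphql/types

pub fn defs_only_lexicon() -> types.Lexicon {
  types.Lexicon(
    id: "social.grain.defs",
    defs: types.Defs(
      main: None,
      others: dict.from_list([
        #(
          "aspectRatio",
          types.Object(types.ObjectDef(
            type_: "object",
            required_fields: ["width", "height"],
            properties: [
              #(
                "width",
                types.Property(type_: "integer", required: True, format: None, ref: None),
              ),
              #(
                "height",
                types.Property(type_: "integer", required: True, format: None, ref: None),
              ),
            ],
          )),
        ),
      ]),
    ),
  )
}
```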
+79
lexicon_graphql/src/lexicon_graphql/uri_extractor.gleam
··· 1 + /// URI Extraction Utility 2 + /// 3 + /// Extracts AT Protocol URIs from both strongRef objects and plain at-uri strings. 4 + /// This is used at runtime to resolve forward joins by extracting the target URI 5 + /// from a record's field value. 6 + import gleam/dynamic.{type Dynamic} 7 + import gleam/dynamic/decode 8 + import gleam/option.{type Option, None, Some} 9 + import gleam/string 10 + 11 + /// Extract a URI from a dynamic value 12 + /// 13 + /// Handles two cases: 14 + /// 1. strongRef object: { "$type": "com.atproto.repo.strongRef", "uri": "at://...", "cid": "..." } 15 + /// 2. Plain at-uri string: "at://did:plc:abc123/collection/rkey" 16 + /// 17 + /// Returns None if the value is not a valid URI or strongRef 18 + pub fn extract_uri(value: Dynamic) -> Option(String) { 19 + // Try to decode as a string first (at-uri) 20 + case decode.run(value, decode.string) { 21 + Ok(uri_str) -> { 22 + case is_valid_at_uri(uri_str) { 23 + True -> Some(uri_str) 24 + False -> None 25 + } 26 + } 27 + Error(_) -> { 28 + // Not a string, try as strongRef object 29 + extract_from_strong_ref(value) 30 + } 31 + } 32 + } 33 + 34 + /// Extract URI from a strongRef object 35 + /// strongRef format: { "$type": "com.atproto.repo.strongRef", "uri": "at://...", "cid": "..." 
} 36 + fn extract_from_strong_ref(value: Dynamic) -> Option(String) { 37 + // Try to decode as an object with uri field 38 + let uri_decoder = { 39 + use uri <- decode.field("uri", decode.string) 40 + decode.success(uri) 41 + } 42 + 43 + case decode.run(value, uri_decoder) { 44 + Ok(uri) -> { 45 + case is_valid_at_uri(uri) { 46 + True -> Some(uri) 47 + False -> None 48 + } 49 + } 50 + Error(_) -> None 51 + } 52 + } 53 + 54 + /// Check if a string is a valid AT Protocol URI 55 + /// AT URIs have the format: at://did/collection/rkey 56 + fn is_valid_at_uri(uri: String) -> Bool { 57 + string.starts_with(uri, "at://") 58 + } 59 + 60 + /// Check if a value is a strongRef object 61 + pub fn is_strong_ref(value: Dynamic) -> Bool { 62 + let type_decoder = { 63 + use type_str <- decode.field("$type", decode.string) 64 + decode.success(type_str) 65 + } 66 + 67 + case decode.run(value, type_decoder) { 68 + Ok(type_str) -> type_str == "com.atproto.repo.strongRef" 69 + Error(_) -> False 70 + } 71 + } 72 + 73 + /// Check if a value is a plain at-uri string 74 + pub fn is_at_uri_string(value: Dynamic) -> Bool { 75 + case decode.run(value, decode.string) { 76 + Ok(uri_str) -> is_valid_at_uri(uri_str) 77 + Error(_) -> False 78 + } 79 + }
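Expected behaviour of `extract_uri`, sketched as comments (`dynamic_of` is a hypothetical helper standing in for Dynamic construction, which the test suite notes is awkward to do directly; inputs are shown as JSON for readability):

```gleam
// extract_uri(dynamic_of("at://did:plc:abc/app.bsky.feed.post/3k"))
//   -> Some("at://did:plc:abc/app.bsky.feed.post/3k")
//
// extract_uri(dynamic_of({
//   "$type": "com.atproto.repo.strongRef",
//   "uri": "at://did:plc:abc/app.bsky.feed.post/3k",
//   "cid": "bafy..."
// }))
//   -> Some("at://did:plc:abc/app.bsky.feed.post/3k")
//
// extract_uri(dynamic_of("https://example.com"))
//   -> None  // not an at:// URI
```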
+281
lexicon_graphql/test/collection_meta_test.gleam
··· 1 + /// Tests for Collection Metadata Extraction 2 + /// 3 + /// Tests that we correctly identify forward and reverse join fields from lexicons 4 + import gleam/dict 5 + import gleam/option.{None, Some} 6 + import gleeunit/should 7 + import lexicon_graphql/collection_meta 8 + import lexicon_graphql/types 9 + 10 + // Test extracting metadata from a lexicon with strongRef fields 11 + pub fn extract_metadata_with_strong_ref_test() { 12 + let lexicon = 13 + types.Lexicon( 14 + id: "app.bsky.actor.profile", 15 + defs: types.Defs( 16 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 17 + #( 18 + "displayName", 19 + types.Property( 20 + type_: "string", 21 + required: True, 22 + format: None, 23 + ref: None, 24 + ), 25 + ), 26 + #( 27 + "pinnedPost", 28 + types.Property( 29 + type_: "ref", 30 + required: False, 31 + format: None, 32 + ref: Some("com.atproto.repo.strongRef"), 33 + ), 34 + ), 35 + #( 36 + "joinedViaStarterPack", 37 + types.Property( 38 + type_: "ref", 39 + required: False, 40 + format: None, 41 + ref: Some("com.atproto.repo.strongRef"), 42 + ), 43 + ), 44 + ])), 45 + others: dict.new(), 46 + ), 47 + ) 48 + 49 + let meta = collection_meta.extract_metadata(lexicon) 50 + 51 + // Check basic metadata 52 + meta.nsid 53 + |> should.equal("app.bsky.actor.profile") 54 + 55 + meta.type_name 56 + |> should.equal("AppBskyActorProfile") 57 + 58 + // Check forward join fields 59 + collection_meta.get_forward_join_field_names(meta) 60 + |> should.equal(["joinedViaStarterPack", "pinnedPost"]) 61 + 62 + // Check that they're identified as strongRef fields 63 + collection_meta.is_strong_ref_field(meta, "pinnedPost") 64 + |> should.be_true 65 + 66 + collection_meta.is_strong_ref_field(meta, "joinedViaStarterPack") 67 + |> should.be_true 68 + 69 + // Check reverse join fields (none in this case) 70 + meta.reverse_join_fields 71 + |> should.equal([]) 72 + } 73 + 74 + // Test extracting metadata from a lexicon with at-uri fields 75 + pub fn 
extract_metadata_with_at_uri_test() { 76 + let lexicon = 77 + types.Lexicon( 78 + id: "app.bsky.feed.like", 79 + defs: types.Defs( 80 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 81 + #( 82 + "subject", 83 + types.Property( 84 + type_: "string", 85 + required: True, 86 + format: Some("at-uri"), 87 + ref: None, 88 + ), 89 + ), 90 + #( 91 + "createdAt", 92 + types.Property( 93 + type_: "string", 94 + required: True, 95 + format: Some("datetime"), 96 + ref: None, 97 + ), 98 + ), 99 + ])), 100 + others: dict.new(), 101 + ), 102 + ) 103 + 104 + let meta = collection_meta.extract_metadata(lexicon) 105 + 106 + // Check basic metadata 107 + meta.nsid 108 + |> should.equal("app.bsky.feed.like") 109 + 110 + meta.type_name 111 + |> should.equal("AppBskyFeedLike") 112 + 113 + // Check forward join fields (at-uri string) 114 + collection_meta.get_forward_join_field_names(meta) 115 + |> should.equal(["subject"]) 116 + 117 + // Check that it's identified as at-uri field 118 + collection_meta.is_at_uri_field(meta, "subject") 119 + |> should.be_true 120 + 121 + collection_meta.is_strong_ref_field(meta, "subject") 122 + |> should.be_false 123 + 124 + // Check reverse join fields (at-uri fields can also be used for reverse joins) 125 + meta.reverse_join_fields 126 + |> should.equal(["subject"]) 127 + } 128 + 129 + // Test extracting metadata from a lexicon with both types 130 + pub fn extract_metadata_with_mixed_fields_test() { 131 + let lexicon = 132 + types.Lexicon( 133 + id: "app.bsky.feed.post", 134 + defs: types.Defs( 135 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 136 + #( 137 + "text", 138 + types.Property( 139 + type_: "string", 140 + required: True, 141 + format: None, 142 + ref: None, 143 + ), 144 + ), 145 + #( 146 + "reply", 147 + types.Property( 148 + type_: "ref", 149 + required: False, 150 + format: None, 151 + ref: Some("com.atproto.repo.strongRef"), 152 + ), 153 + ), 154 + #( 155 + "via", 156 + types.Property( 157 + 
type_: "string", 158 + required: False, 159 + format: Some("at-uri"), 160 + ref: None, 161 + ), 162 + ), 163 + ])), 164 + others: dict.new(), 165 + ), 166 + ) 167 + 168 + let meta = collection_meta.extract_metadata(lexicon) 169 + 170 + // Check forward join fields (should have both strongRef and at-uri) 171 + let forward_field_names = collection_meta.get_forward_join_field_names(meta) 172 + 173 + // Should contain both fields; should.equal also pins the extractor's ordering 174 + forward_field_names 175 + |> should.equal(["via", "reply"]) 176 + 177 + // Check types 178 + collection_meta.is_strong_ref_field(meta, "reply") 179 + |> should.be_true 180 + 181 + collection_meta.is_at_uri_field(meta, "via") 182 + |> should.be_true 183 + 184 + // Check reverse join fields (only at-uri) 185 + meta.reverse_join_fields 186 + |> should.equal(["via"]) 187 + } 188 + 189 + // Test extracting metadata from a lexicon with no join fields 190 + pub fn extract_metadata_with_no_join_fields_test() { 191 + let lexicon = 192 + types.Lexicon( 193 + id: "xyz.statusphere.status", 194 + defs: types.Defs( 195 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 196 + #( 197 + "status", 198 + types.Property( 199 + type_: "string", 200 + required: True, 201 + format: None, 202 + ref: None, 203 + ), 204 + ), 205 + #( 206 + "createdAt", 207 + types.Property( 208 + type_: "string", 209 + required: True, 210 + format: Some("datetime"), 211 + ref: None, 212 + ), 213 + ), 214 + ])), 215 + others: dict.new(), 216 + ), 217 + ) 218 + 219 + let meta = collection_meta.extract_metadata(lexicon) 220 + 221 + // Check that no join fields are identified 222 + collection_meta.get_forward_join_field_names(meta) 223 + |> should.equal([]) 224 + 225 + meta.reverse_join_fields 226 + |> should.equal([]) 227 + } 228 + 229 + // Test helper function: is_strong_ref_field with non-existent field 230 + pub fn is_strong_ref_field_non_existent_test() { 231 + let lexicon = 232 + types.Lexicon( 233 + id:
"test.collection", 234 + defs: types.Defs( 235 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 236 + #( 237 + "text", 238 + types.Property( 239 + type_: "string", 240 + required: True, 241 + format: None, 242 + ref: None, 243 + ), 244 + ), 245 + ])), 246 + others: dict.new(), 247 + ), 248 + ) 249 + 250 + let meta = collection_meta.extract_metadata(lexicon) 251 + 252 + collection_meta.is_strong_ref_field(meta, "nonexistent") 253 + |> should.be_false 254 + } 255 + 256 + // Test helper function: is_at_uri_field with non-existent field 257 + pub fn is_at_uri_field_non_existent_test() { 258 + let lexicon = 259 + types.Lexicon( 260 + id: "test.collection", 261 + defs: types.Defs( 262 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 263 + #( 264 + "text", 265 + types.Property( 266 + type_: "string", 267 + required: True, 268 + format: None, 269 + ref: None, 270 + ), 271 + ), 272 + ])), 273 + others: dict.new(), 274 + ), 275 + ) 276 + 277 + let meta = collection_meta.extract_metadata(lexicon) 278 + 279 + collection_meta.is_at_uri_field(meta, "nonexistent") 280 + |> should.be_false 281 + }
+101
lexicon_graphql/test/connection_test.gleam
··· 1 + /// Tests for lexicon_graphql/connection module 2 + /// 3 + /// Tests the creation of unique SortFieldInput types per collection 4 + import gleeunit/should 5 + import graphql/schema 6 + import lexicon_graphql/connection as lexicon_connection 7 + 8 + pub fn sort_field_input_type_with_enum_creates_types_test() { 9 + // Test: sort_field_input_type_with_enum should create input object types 10 + // Since Type is opaque, we just verify the function executes without error 11 + 12 + let enum_type = 13 + schema.enum_type("TestSortField", "Sort fields", [ 14 + schema.enum_value("field1", "Field 1"), 15 + ]) 16 + 17 + // Create input type - should not crash 18 + let _input_type = 19 + lexicon_connection.sort_field_input_type_with_enum("TestCollection", enum_type) 20 + 21 + // If we got here without crashing, test passes 22 + True 23 + |> should.be_true() 24 + } 25 + 26 + pub fn lexicon_connection_args_with_field_enum_and_where_creates_args_test() { 27 + // Test: Connection args function should create a list of arguments including sortBy 28 + 29 + let sort_enum = 30 + schema.enum_type("TestCollectionSortField", "Sort fields", [ 31 + schema.enum_value("field1", "Field 1"), 32 + ]) 33 + 34 + let where_input = 35 + schema.input_object_type("TestCollectionWhereInput", "Where input", []) 36 + 37 + let args = 38 + lexicon_connection.lexicon_connection_args_with_field_enum_and_where( 39 + "TestCollection", 40 + sort_enum, 41 + where_input, 42 + ) 43 + 44 + // Should create multiple args (first, last, after, before, sortBy, where) 45 + // Verify we got a non-empty list 46 + args 47 + |> should.not_equal([]) 48 + } 49 + 50 + pub fn lexicon_connection_args_with_field_enum_creates_args_test() { 51 + // Test: Backward compat function should also work 52 + 53 + let sort_enum = 54 + schema.enum_type("TestCollectionSortField", "Sort fields", [ 55 + schema.enum_value("field1", "Field 1"), 56 + ]) 57 + 58 + let args = 59 + lexicon_connection.lexicon_connection_args_with_field_enum( 60 
+ "TestCollection", 61 + sort_enum, 62 + ) 63 + 64 + // Should create multiple args (first, last, after, before, sortBy) 65 + args 66 + |> should.not_equal([]) 67 + } 68 + 69 + pub fn different_collections_can_have_different_sort_input_types_test() { 70 + // Test: Multiple collections should be able to create their own SortFieldInput types 71 + // This ensures the fix for the enum validation bug works 72 + 73 + let enum_type = 74 + schema.enum_type("GenericSortField", "Generic sort", [ 75 + schema.enum_value("createdAt", "Created at"), 76 + ]) 77 + 78 + // Create SortFieldInput for multiple collections - should not crash 79 + let _gallery_input = 80 + lexicon_connection.sort_field_input_type_with_enum( 81 + "SocialGrainGalleryItem", 82 + enum_type, 83 + ) 84 + 85 + let _favorite_input = 86 + lexicon_connection.sort_field_input_type_with_enum( 87 + "SocialGrainFavorite", 88 + enum_type, 89 + ) 90 + 91 + let _photo_input = 92 + lexicon_connection.sort_field_input_type_with_enum( 93 + "SocialGrainPhoto", 94 + enum_type, 95 + ) 96 + 97 + // If we got here without crashing, test passes 98 + // The real verification happens in the integration test 99 + True 100 + |> should.be_true() 101 + }
+549
lexicon_graphql/test/dataloader_test.gleam
··· 1 + /// Tests for DataLoader batching logic 2 + /// 3 + /// Verifies that URIs are correctly grouped and batched to prevent N+1 queries 4 + import gleam/dict 5 + import gleam/int 6 + import gleam/list 7 + import gleam/option.{None, Some} 8 + import gleeunit/should 9 + import graphql/value 10 + import lexicon_graphql/collection_meta 11 + import lexicon_graphql/dataloader 12 + import lexicon_graphql/types 13 + 14 + // Test URI-to-collection extraction 15 + pub fn uri_to_collection_test() { 16 + // Valid URI 17 + dataloader.uri_to_collection("at://did:plc:abc123/app.bsky.feed.post/3k7h8") 18 + |> should.equal(Some("app.bsky.feed.post")) 19 + 20 + // Valid URI with different collection 21 + dataloader.uri_to_collection( 22 + "at://did:plc:xyz789/app.bsky.actor.profile/self", 23 + ) 24 + |> should.equal(Some("app.bsky.actor.profile")) 25 + 26 + // Invalid URI format 27 + dataloader.uri_to_collection("https://example.com") 28 + |> should.equal(None) 29 + 30 + // Empty string 31 + dataloader.uri_to_collection("") 32 + |> should.equal(None) 33 + } 34 + 35 + // Test batch fetching by URI with mock fetcher 36 + pub fn batch_fetch_by_uri_test() { 37 + // Create a mock fetcher that returns records based on URI collection 38 + let mock_fetcher = fn(uris: List(String), collection: String, _field) { 39 + // Simulate fetching records - return one record per URI 40 + let _records = 41 + list.map(uris, fn(uri) { 42 + value.Object([ 43 + #("uri", value.String(uri)), 44 + #("collection", value.String(collection)), 45 + #("text", value.String("Test post")), 46 + ]) 47 + }) 48 + 49 + // Group by URI for the result 50 + let result = 51 + list.fold(uris, dict.new(), fn(acc, uri) { 52 + let record = 53 + value.Object([ 54 + #("uri", value.String(uri)), 55 + #("collection", value.String(collection)), 56 + #("text", value.String("Test post for " <> uri)), 57 + ]) 58 + dict.insert(acc, uri, [record]) 59 + }) 60 + 61 + Ok(result) 62 + } 63 + 64 + // Test with URIs from the same collection 
65 + let uris = [ 66 + "at://did:plc:a/app.bsky.feed.post/1", 67 + "at://did:plc:b/app.bsky.feed.post/2", 68 + ] 69 + 70 + case dataloader.batch_fetch_by_uri(uris, mock_fetcher) { 71 + Ok(results) -> { 72 + // Should have 2 results 73 + dict.size(results) 74 + |> should.equal(2) 75 + 76 + // Each URI should have its record 77 + dict.has_key(results, "at://did:plc:a/app.bsky.feed.post/1") 78 + |> should.be_true 79 + 80 + dict.has_key(results, "at://did:plc:b/app.bsky.feed.post/2") 81 + |> should.be_true 82 + } 83 + Error(_) -> should.fail() 84 + } 85 + } 86 + 87 + // Test batch fetching with mixed collections 88 + pub fn batch_fetch_by_uri_mixed_collections_test() { 89 + // Mock fetcher that tracks which collections were queried 90 + let mock_fetcher = fn(uris: List(String), collection: String, _field) { 91 + let result = 92 + list.fold(uris, dict.new(), fn(acc, uri) { 93 + let record = 94 + value.Object([ 95 + #("uri", value.String(uri)), 96 + #("collection", value.String(collection)), 97 + ]) 98 + dict.insert(acc, uri, [record]) 99 + }) 100 + Ok(result) 101 + } 102 + 103 + // URIs from different collections 104 + let uris = [ 105 + "at://did:plc:a/app.bsky.feed.post/1", 106 + "at://did:plc:b/app.bsky.actor.profile/self", 107 + ] 108 + 109 + case dataloader.batch_fetch_by_uri(uris, mock_fetcher) { 110 + Ok(results) -> { 111 + // Should batch by collection and fetch both 112 + dict.size(results) 113 + |> should.equal(2) 114 + } 115 + Error(_) -> should.fail() 116 + } 117 + } 118 + 119 + // Test reverse join batching 120 + pub fn batch_fetch_by_reverse_join_test() { 121 + // Mock fetcher that simulates finding records that reference parent URIs 122 + let mock_fetcher = fn(parent_uris: List(String), _collection: String, field) { 123 + case field { 124 + Some(ref_field) -> { 125 + // Simulate finding records that reference each parent URI 126 + let result = 127 + list.fold(parent_uris, dict.new(), fn(acc, parent_uri) { 128 + // Create 2 child records per parent 129 + 
let child1 = 130 + value.Object([ 131 + #("uri", value.String("at://did:plc:child1/collection/key1")), 132 + #(ref_field, value.String(parent_uri)), 133 + ]) 134 + let child2 = 135 + value.Object([ 136 + #("uri", value.String("at://did:plc:child2/collection/key2")), 137 + #(ref_field, value.String(parent_uri)), 138 + ]) 139 + dict.insert(acc, parent_uri, [child1, child2]) 140 + }) 141 + Ok(result) 142 + } 143 + None -> Error("Reference field required for reverse joins") 144 + } 145 + } 146 + 147 + let parent_uris = ["at://did:plc:parent/app.bsky.feed.post/1"] 148 + 149 + case 150 + dataloader.batch_fetch_by_reverse_join( 151 + parent_uris, 152 + "app.bsky.feed.like", 153 + "subject", 154 + mock_fetcher, 155 + ) 156 + { 157 + Ok(results) -> { 158 + // Should have results for the parent URI 159 + case dict.get(results, "at://did:plc:parent/app.bsky.feed.post/1") { 160 + Ok(children) -> { 161 + // Should have 2 child records 162 + list.length(children) 163 + |> should.equal(2) 164 + } 165 + Error(_) -> should.fail() 166 + } 167 + } 168 + Error(_) -> should.fail() 169 + } 170 + } 171 + 172 + // Test extracting URIs from records 173 + pub fn extract_uris_from_records_test() { 174 + // Create a test lexicon for likes with subject field 175 + let lexicon = 176 + types.Lexicon( 177 + id: "app.bsky.feed.like", 178 + defs: types.Defs( 179 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 180 + #( 181 + "subject", 182 + types.Property( 183 + type_: "string", 184 + required: True, 185 + format: Some("at-uri"), 186 + ref: None, 187 + ), 188 + ), 189 + ])), 190 + others: dict.new(), 191 + ), 192 + ) 193 + 194 + let meta = collection_meta.extract_metadata(lexicon) 195 + 196 + // Create test records with at-uri subject fields 197 + let records = [ 198 + value.Object([ 199 + #("uri", value.String("at://did:plc:a/app.bsky.feed.like/1")), 200 + #("subject", value.String("at://did:plc:target/app.bsky.feed.post/1")), 201 + ]), 202 + value.Object([ 203 + #("uri", 
value.String("at://did:plc:b/app.bsky.feed.like/2")), 204 + #("subject", value.String("at://did:plc:target/app.bsky.feed.post/2")), 205 + ]), 206 + ] 207 + 208 + let uris = dataloader.extract_uris_from_records(records, "subject", meta) 209 + 210 + // Should extract 2 URIs 211 + list.length(uris) 212 + |> should.equal(2) 213 + 214 + // Should contain the expected URIs 215 + list.contains(uris, "at://did:plc:target/app.bsky.feed.post/1") 216 + |> should.be_true 217 + 218 + list.contains(uris, "at://did:plc:target/app.bsky.feed.post/2") 219 + |> should.be_true 220 + } 221 + 222 + // Test extracting URIs from records with strongRef 223 + pub fn extract_uris_from_records_with_strong_ref_test() { 224 + // Create a test lexicon with strongRef field 225 + let lexicon = 226 + types.Lexicon( 227 + id: "app.bsky.actor.profile", 228 + defs: types.Defs( 229 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 230 + #( 231 + "pinnedPost", 232 + types.Property( 233 + type_: "ref", 234 + required: False, 235 + format: None, 236 + ref: Some("com.atproto.repo.strongRef"), 237 + ), 238 + ), 239 + ])), 240 + others: dict.new(), 241 + ), 242 + ) 243 + 244 + let meta = collection_meta.extract_metadata(lexicon) 245 + 246 + // Create test record with strongRef - using test_helpers to create the nested object 247 + // For now, we'll skip this test since it requires more complex Dynamic construction 248 + // TODO: Implement once we have better strongRef test helpers 249 + 250 + // Placeholder assertion 251 + meta.nsid 252 + |> should.equal("app.bsky.actor.profile") 253 + } 254 + 255 + // Test error handling when fetcher fails 256 + pub fn batch_fetch_error_handling_test() { 257 + // Mock fetcher that always fails 258 + let failing_fetcher = fn(_uris, _collection, _field) { 259 + Error("Database connection failed") 260 + } 261 + 262 + let uris = ["at://did:plc:a/app.bsky.feed.post/1"] 263 + 264 + case dataloader.batch_fetch_by_uri(uris, failing_fetcher) { 265 + Ok(_) -> 
should.fail() 266 + Error(msg) -> msg |> should.equal("Database connection failed") 267 + } 268 + } 269 + 270 + // Test with empty URI list 271 + pub fn batch_fetch_empty_list_test() { 272 + let mock_fetcher = fn(_uris, _collection, _field) { Ok(dict.new()) } 273 + 274 + case dataloader.batch_fetch_by_uri([], mock_fetcher) { 275 + Ok(results) -> { 276 + dict.size(results) 277 + |> should.equal(0) 278 + } 279 + Error(_) -> should.fail() 280 + } 281 + } 282 + 283 + // Test paginated reverse join batching 284 + pub fn batch_fetch_by_reverse_join_paginated_test() { 285 + // Mock paginated fetcher that simulates paginated results 286 + let mock_paginated_fetcher = fn( 287 + parent_uri: String, 288 + _collection: String, 289 + field: option.Option(String), 290 + params: dataloader.PaginationParams, 291 + ) { 292 + case field { 293 + Some(_ref_field) -> { 294 + // Create mock edges based on pagination parameters 295 + let page_size = case params.first { 296 + Some(n) -> n 297 + None -> 10 298 + } 299 + 300 + // Create mock records with cursors 301 + let edges = 302 + list.range(1, page_size) 303 + |> list.map(fn(i) { 304 + let record = 305 + value.Object([ 306 + #( 307 + "uri", 308 + value.String( 309 + "at://did:plc:child" <> int.to_string(i) <> "/collection/key", 310 + ), 311 + ), 312 + #("parentUri", value.String(parent_uri)), 313 + ]) 314 + let cursor = "cursor_" <> int.to_string(i) 315 + #(record, cursor) 316 + }) 317 + 318 + Ok(dataloader.PaginatedBatchResult( 319 + edges: edges, 320 + has_next_page: True, 321 + has_previous_page: False, 322 + total_count: Some(20), 323 + )) 324 + } 325 + None -> Error("Reference field required for paginated reverse joins") 326 + } 327 + } 328 + 329 + let parent_uri = "at://did:plc:parent/app.bsky.feed.post/1" 330 + let params = 331 + dataloader.PaginationParams( 332 + first: Some(5), 333 + after: None, 334 + last: None, 335 + before: None, 336 + sort_by: None, 337 + where: None, 338 + ) 339 + 340 + case 341 + 
dataloader.batch_fetch_by_reverse_join_paginated( 342 + parent_uri, 343 + "app.bsky.feed.like", 344 + "subject", 345 + params, 346 + mock_paginated_fetcher, 347 + ) 348 + { 349 + Ok(result) -> { 350 + // Should have 5 edges (as requested by first: 5) 351 + list.length(result.edges) 352 + |> should.equal(5) 353 + 354 + // Should have next page 355 + result.has_next_page 356 + |> should.be_true 357 + 358 + // Should not have previous page 359 + result.has_previous_page 360 + |> should.be_false 361 + 362 + // Should have total count 363 + result.total_count 364 + |> should.equal(Some(20)) 365 + } 366 + Error(_) -> should.fail() 367 + } 368 + } 369 + 370 + // Test paginated DID batching 371 + pub fn batch_fetch_by_did_paginated_test() { 372 + // Mock paginated fetcher for DID-based queries 373 + let mock_paginated_fetcher = fn( 374 + did: String, 375 + _collection: String, 376 + _field: option.Option(String), 377 + params: dataloader.PaginationParams, 378 + ) { 379 + let page_size = case params.first { 380 + Some(n) -> n 381 + None -> 10 382 + } 383 + 384 + // Simulate records belonging to the DID 385 + let edges = 386 + list.range(1, page_size) 387 + |> list.map(fn(i) { 388 + let record = 389 + value.Object([ 390 + #( 391 + "uri", 392 + value.String( 393 + "at://" <> did <> "/app.bsky.feed.post/" <> int.to_string(i), 394 + ), 395 + ), 396 + #("did", value.String(did)), 397 + #("text", value.String("Post " <> int.to_string(i))), 398 + ]) 399 + let cursor = "did_cursor_" <> int.to_string(i) 400 + #(record, cursor) 401 + }) 402 + 403 + Ok(dataloader.PaginatedBatchResult( 404 + edges: edges, 405 + has_next_page: True, 406 + has_previous_page: False, 407 + total_count: Some(50), 408 + )) 409 + } 410 + 411 + let did = "did:plc:abc123" 412 + let params = 413 + dataloader.PaginationParams( 414 + first: Some(3), 415 + after: None, 416 + last: None, 417 + before: None, 418 + sort_by: None, 419 + where: None, 420 + ) 421 + 422 + case 423 + 
dataloader.batch_fetch_by_did_paginated( 424 + did, 425 + "app.bsky.feed.post", 426 + params, 427 + mock_paginated_fetcher, 428 + ) 429 + { 430 + Ok(result) -> { 431 + // Should have 3 edges (as requested by first: 3) 432 + list.length(result.edges) 433 + |> should.equal(3) 434 + 435 + // Should have next page 436 + result.has_next_page 437 + |> should.be_true 438 + 439 + // Should have total count 440 + result.total_count 441 + |> should.equal(Some(50)) 442 + } 443 + Error(_) -> should.fail() 444 + } 445 + } 446 + 447 + // Test paginated error handling 448 + pub fn batch_fetch_paginated_error_handling_test() { 449 + // Mock fetcher that always fails 450 + let failing_fetcher = fn(_key, _collection, _field, _params) { 451 + Error("Pagination query failed") 452 + } 453 + 454 + let params = 455 + dataloader.PaginationParams( 456 + first: Some(10), 457 + after: None, 458 + last: None, 459 + before: None, 460 + sort_by: None, 461 + where: None, 462 + ) 463 + 464 + case 465 + dataloader.batch_fetch_by_did_paginated( 466 + "did:plc:test", 467 + "app.bsky.feed.post", 468 + params, 469 + failing_fetcher, 470 + ) 471 + { 472 + Ok(_) -> should.fail() 473 + Error(msg) -> msg |> should.equal("Pagination query failed") 474 + } 475 + } 476 + 477 + // Test backward pagination parameters 478 + pub fn batch_fetch_backward_pagination_test() { 479 + // Mock paginated fetcher that handles backward pagination 480 + let mock_paginated_fetcher = fn( 481 + _did: String, 482 + _collection: String, 483 + _field: option.Option(String), 484 + params: dataloader.PaginationParams, 485 + ) { 486 + // Check if backward pagination is requested 487 + let is_backward = case params.last { 488 + Some(_) -> True 489 + None -> False 490 + } 491 + 492 + let page_size = case params.last { 493 + Some(n) -> n 494 + None -> 5 495 + } 496 + 497 + let edges = 498 + list.range(1, page_size) 499 + |> list.map(fn(i) { 500 + let record = 501 + value.Object([ 502 + #("uri", 
value.String("at://did:plc:test/collection/" <> int.to_string(i))), 503 + ]) 504 + let cursor = "cursor_" <> int.to_string(i) 505 + #(record, cursor) 506 + }) 507 + 508 + Ok(dataloader.PaginatedBatchResult( 509 + edges: edges, 510 + has_next_page: False, 511 + has_previous_page: is_backward, 512 + total_count: Some(10), 513 + )) 514 + } 515 + 516 + let params = 517 + dataloader.PaginationParams( 518 + first: None, 519 + after: None, 520 + last: Some(3), 521 + before: Some("cursor_10"), 522 + sort_by: None, 523 + where: None, 524 + ) 525 + 526 + case 527 + dataloader.batch_fetch_by_did_paginated( 528 + "did:plc:test", 529 + "app.bsky.feed.post", 530 + params, 531 + mock_paginated_fetcher, 532 + ) 533 + { 534 + Ok(result) -> { 535 + // Should have 3 edges (as requested by last: 3) 536 + list.length(result.edges) 537 + |> should.equal(3) 538 + 539 + // Should not have next page (at the end) 540 + result.has_next_page 541 + |> should.be_false 542 + 543 + // Should have previous page (backward pagination) 544 + result.has_previous_page 545 + |> should.be_true 546 + } 547 + Error(_) -> should.fail() 548 + } 549 + }
+287
lexicon_graphql/test/did_join_test.gleam
··· 1 + /// Tests for DID-based join field generation 2 + /// 3 + /// Verifies that DID join fields are added to GraphQL schemas correctly: 4 + /// - All collections get DID join fields to all other collections 5 + /// - Cardinality is determined by has_unique_did (literal:self key) 6 + /// - Field naming follows {TypeName}ByDid pattern 7 + import gleam/dict 8 + import gleam/option 9 + import gleam/string 10 + import gleeunit/should 11 + import graphql/introspection 12 + import graphql/schema 13 + import graphql/sdl 14 + import lexicon_graphql/db_schema_builder 15 + import lexicon_graphql/types 16 + 17 + // Helper to create a test schema with a mock fetcher 18 + fn create_test_schema_from_lexicons( 19 + lexicons: List(types.Lexicon), 20 + ) -> schema.Schema { 21 + // Mock fetcher that returns empty results (we're only testing schema generation) 22 + let fetcher = fn(_collection, _params) { 23 + Ok(#([], option.None, False, False, option.None)) 24 + } 25 + 26 + case 27 + db_schema_builder.build_schema_with_fetcher( 28 + lexicons, 29 + fetcher, 30 + option.None, 31 + option.None, 32 + option.None, 33 + option.None, 34 + option.None, 35 + option.None, 36 + ) 37 + { 38 + Ok(s) -> s 39 + Error(_) -> panic as "Failed to build test schema" 40 + } 41 + } 42 + 43 + // Test that collections get DID join fields to other collections 44 + pub fn collections_get_did_join_fields_test() { 45 + // Create two collections: a status and a profile (with literal:self) 46 + let status_lexicon = 47 + types.Lexicon( 48 + id: "xyz.statusphere.status", 49 + defs: types.Defs( 50 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 51 + #( 52 + "text", 53 + types.Property(type_: "string", required: True, format: option.None, ref: option.None), 54 + ), 55 + ])), 56 + others: dict.new(), 57 + ), 58 + ) 59 + 60 + let profile_lexicon = 61 + types.Lexicon( 62 + id: "app.bsky.actor.profile", 63 + defs: types.Defs( 64 + main: option.Some(types.RecordDef( 65 + type_: 
"record", 66 + key: option.Some("literal:self"), 67 + properties: [ 68 + #( 69 + "displayName", 70 + types.Property( 71 + type_: "string", 72 + required: False, 73 + format: option.None, 74 + ref: option.None, 75 + ), 76 + ), 77 + ], 78 + )), 79 + others: dict.new(), 80 + ), 81 + ) 82 + 83 + let test_schema = 84 + create_test_schema_from_lexicons([status_lexicon, profile_lexicon]) 85 + 86 + // Get all types and serialize to SDL 87 + let all_types = introspection.get_all_schema_types(test_schema) 88 + let serialized = sdl.print_types(all_types) 89 + 90 + // Verify that Status has a DID join field to Profile 91 + string.contains(serialized, "appBskyActorProfileByDid") 92 + |> should.be_true 93 + 94 + // Verify that Profile has a DID join field to Status 95 + string.contains(serialized, "xyzStatusphereStatusByDid") 96 + |> should.be_true 97 + } 98 + 99 + // Test that literal:self collections return single nullable objects 100 + pub fn literal_self_returns_single_object_test() { 101 + let status_lexicon = 102 + types.Lexicon( 103 + id: "xyz.statusphere.status", 104 + defs: types.Defs( 105 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 106 + #( 107 + "text", 108 + types.Property(type_: "string", required: True, format: option.None, ref: option.None), 109 + ), 110 + ])), 111 + others: dict.new(), 112 + ), 113 + ) 114 + 115 + let profile_lexicon = 116 + types.Lexicon( 117 + id: "app.bsky.actor.profile", 118 + defs: types.Defs( 119 + main: option.Some(types.RecordDef( 120 + type_: "record", 121 + key: option.Some("literal:self"), 122 + properties: [ 123 + #( 124 + "displayName", 125 + types.Property( 126 + type_: "string", 127 + required: False, 128 + format: option.None, 129 + ref: option.None, 130 + ), 131 + ), 132 + ], 133 + )), 134 + others: dict.new(), 135 + ), 136 + ) 137 + 138 + let test_schema = 139 + create_test_schema_from_lexicons([status_lexicon, profile_lexicon]) 140 + 141 + let all_types = 
introspection.get_all_schema_types(test_schema) 142 + let serialized = sdl.print_types(all_types) 143 + 144 + // Profile should be returned as single object (not list) from Status 145 + // We check that it's NOT wrapped in a list (no brackets) 146 + // The field should be: appBskyActorProfileByDid: AppBskyActorProfile 147 + string.contains(serialized, "appBskyActorProfileByDid: AppBskyActorProfile") 148 + |> should.be_true 149 + } 150 + 151 + // Test that non-literal:self collections return lists 152 + pub fn non_literal_self_returns_list_test() { 153 + let status_lexicon = 154 + types.Lexicon( 155 + id: "xyz.statusphere.status", 156 + defs: types.Defs( 157 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 158 + #( 159 + "text", 160 + types.Property(type_: "string", required: True, format: option.None, ref: option.None), 161 + ), 162 + ])), 163 + others: dict.new(), 164 + ), 165 + ) 166 + 167 + let post_lexicon = 168 + types.Lexicon( 169 + id: "app.bsky.feed.post", 170 + defs: types.Defs( 171 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 172 + #( 173 + "text", 174 + types.Property(type_: "string", required: True, format: option.None, ref: option.None), 175 + ), 176 + ])), 177 + others: dict.new(), 178 + ), 179 + ) 180 + 181 + let test_schema = 182 + create_test_schema_from_lexicons([status_lexicon, post_lexicon]) 183 + 184 + let all_types = introspection.get_all_schema_types(test_schema) 185 + let serialized = sdl.print_types(all_types) 186 + 187 + // Post should be returned as a connection (paginated list) from Status 188 + // The field should be: appBskyFeedPostByDid: AppBskyFeedPostConnection 189 + string.contains(serialized, "appBskyFeedPostByDid: AppBskyFeedPostConnection") 190 + |> should.be_true 191 + } 192 + 193 + // Test that multiple collections all get DID joins to each other 194 + pub fn multiple_collections_get_cross_joins_test() { 195 + let status_lexicon = 196 + types.Lexicon( 
197 + id: "xyz.statusphere.status", 198 + defs: types.Defs( 199 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 200 + #( 201 + "text", 202 + types.Property(type_: "string", required: True, format: option.None, ref: option.None), 203 + ), 204 + ])), 205 + others: dict.new(), 206 + ), 207 + ) 208 + 209 + let post_lexicon = 210 + types.Lexicon( 211 + id: "app.bsky.feed.post", 212 + defs: types.Defs( 213 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 214 + #( 215 + "text", 216 + types.Property(type_: "string", required: True, format: option.None, ref: option.None), 217 + ), 218 + ])), 219 + others: dict.new(), 220 + ), 221 + ) 222 + 223 + let like_lexicon = 224 + types.Lexicon( 225 + id: "app.bsky.feed.like", 226 + defs: types.Defs( 227 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 228 + #( 229 + "subject", 230 + types.Property( 231 + type_: "string", 232 + required: True, 233 + format: option.Some("at-uri"), 234 + ref: option.None, 235 + ), 236 + ), 237 + ])), 238 + others: dict.new(), 239 + ), 240 + ) 241 + 242 + let test_schema = 243 + create_test_schema_from_lexicons([status_lexicon, post_lexicon, like_lexicon]) 244 + 245 + let all_types = introspection.get_all_schema_types(test_schema) 246 + let serialized = sdl.print_types(all_types) 247 + 248 + // Status should have joins to Post and Like 249 + string.contains(serialized, "appBskyFeedPostByDid") 250 + |> should.be_true 251 + 252 + string.contains(serialized, "appBskyFeedLikeByDid") 253 + |> should.be_true 254 + 255 + // Post should have joins to Status and Like 256 + string.contains(serialized, "xyzStatusphereStatusByDid") 257 + |> should.be_true 258 + 259 + // Like should have joins to Status and Post 260 + // (already checked above) 261 + } 262 + 263 + // Test that collections don't get DID join fields to themselves 264 + pub fn no_self_join_test() { 265 + let status_lexicon = 266 + types.Lexicon( 
267 + id: "xyz.statusphere.status", 268 + defs: types.Defs( 269 + main: option.Some(types.RecordDef(type_: "record", key: option.None, properties: [ 270 + #( 271 + "text", 272 + types.Property(type_: "string", required: True, format: option.None, ref: option.None), 273 + ), 274 + ])), 275 + others: dict.new(), 276 + ), 277 + ) 278 + 279 + let test_schema = create_test_schema_from_lexicons([status_lexicon]) 280 + 281 + let all_types = introspection.get_all_schema_types(test_schema) 282 + let serialized = sdl.print_types(all_types) 283 + 284 + // Status should NOT have a join field to itself 285 + string.contains(serialized, "xyzStatusphereStatusByDid") 286 + |> should.be_false 287 + }
+273
lexicon_graphql/test/forward_join_test.gleam
··· 1 + /// Tests for forward join field generation 2 + /// 3 + /// Verifies that forward join fields are added to GraphQL schemas based on lexicon metadata 4 + import gleam/dict 5 + import gleam/option.{None, Some} 6 + import gleam/string 7 + import gleeunit/should 8 + import graphql/introspection 9 + import graphql/schema 10 + import graphql/sdl 11 + import lexicon_graphql/db_schema_builder 12 + import lexicon_graphql/types 13 + 14 + // Helper to create a test schema with a mock fetcher 15 + fn create_test_schema_from_lexicons( 16 + lexicons: List(types.Lexicon), 17 + ) -> schema.Schema { 18 + // Mock fetcher that returns empty results (we're only testing schema generation) 19 + let fetcher = fn(_collection, _params) { 20 + Ok(#([], option.None, False, False, option.None)) 21 + } 22 + 23 + case 24 + db_schema_builder.build_schema_with_fetcher( 25 + lexicons, 26 + fetcher, 27 + option.None, 28 + option.None, 29 + option.None, 30 + option.None, 31 + option.None, 32 + option.None, 33 + ) 34 + { 35 + Ok(s) -> s 36 + Error(_) -> panic as "Failed to build test schema" 37 + } 38 + } 39 + 40 + // Test that strongRef fields generate forward join fields 41 + pub fn strong_ref_generates_forward_join_field_test() { 42 + let lexicon = 43 + types.Lexicon( 44 + id: "app.bsky.actor.profile", 45 + defs: types.Defs( 46 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 47 + #( 48 + "displayName", 49 + types.Property(type_: "string", required: True, format: None, ref: None), 50 + ), 51 + #( 52 + "pinnedPost", 53 + types.Property( 54 + type_: "ref", 55 + required: False, 56 + format: None, 57 + ref: Some("com.atproto.repo.strongRef"), 58 + ), 59 + ), 60 + ])), 61 + others: dict.new(), 62 + ), 63 + ) 64 + 65 + let test_schema = create_test_schema_from_lexicons([lexicon]) 66 + 67 + // Get all types and serialize to SDL 68 + let all_types = introspection.get_all_schema_types(test_schema) 69 + let serialized = sdl.print_types(all_types) 70 + 71 + // Verify that 
pinnedPostResolved field appears in the schema 72 + string.contains(serialized, "pinnedPostResolved") 73 + |> should.be_true 74 + } 75 + 76 + // Test that at-uri fields generate forward join fields 77 + pub fn at_uri_generates_forward_join_field_test() { 78 + let lexicon = 79 + types.Lexicon( 80 + id: "app.bsky.feed.like", 81 + defs: types.Defs( 82 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 83 + #( 84 + "subject", 85 + types.Property( 86 + type_: "string", 87 + required: True, 88 + format: Some("at-uri"), 89 + ref: None, 90 + ), 91 + ), 92 + #( 93 + "createdAt", 94 + types.Property( 95 + type_: "string", 96 + required: True, 97 + format: Some("datetime"), 98 + ref: None, 99 + ), 100 + ), 101 + ])), 102 + others: dict.new(), 103 + ), 104 + ) 105 + 106 + let test_schema = create_test_schema_from_lexicons([lexicon]) 107 + 108 + let all_types = introspection.get_all_schema_types(test_schema) 109 + let serialized = sdl.print_types(all_types) 110 + 111 + // Verify that subjectResolved field appears in the schema 112 + string.contains(serialized, "subjectResolved") 113 + |> should.be_true 114 + } 115 + 116 + // Test that multiple forward join fields are all generated 117 + pub fn multiple_forward_join_fields_test() { 118 + let lexicon = 119 + types.Lexicon( 120 + id: "app.bsky.feed.post", 121 + defs: types.Defs( 122 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 123 + #( 124 + "text", 125 + types.Property(type_: "string", required: True, format: None, ref: None), 126 + ), 127 + #( 128 + "reply", 129 + types.Property( 130 + type_: "ref", 131 + required: False, 132 + format: None, 133 + ref: Some("com.atproto.repo.strongRef"), 134 + ), 135 + ), 136 + #( 137 + "via", 138 + types.Property( 139 + type_: "string", 140 + required: False, 141 + format: Some("at-uri"), 142 + ref: None, 143 + ), 144 + ), 145 + ])), 146 + others: dict.new(), 147 + ), 148 + ) 149 + 150 + let test_schema = create_test_schema_from_lexicons([lexicon]) 
151 + 152 + let all_types = introspection.get_all_schema_types(test_schema) 153 + let serialized = sdl.print_types(all_types) 154 + 155 + // Check both forward join fields exist 156 + string.contains(serialized, "replyResolved") 157 + |> should.be_true 158 + 159 + string.contains(serialized, "viaResolved") 160 + |> should.be_true 161 + } 162 + 163 + // Test that collections without join fields don't generate extra fields 164 + pub fn no_join_fields_test() { 165 + let lexicon = 166 + types.Lexicon( 167 + id: "xyz.statusphere.status", 168 + defs: types.Defs( 169 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 170 + #( 171 + "status", 172 + types.Property(type_: "string", required: True, format: None, ref: None), 173 + ), 174 + #( 175 + "createdAt", 176 + types.Property( 177 + type_: "string", 178 + required: True, 179 + format: Some("datetime"), 180 + ref: None, 181 + ), 182 + ), 183 + ])), 184 + others: dict.new(), 185 + ), 186 + ) 187 + 188 + let test_schema = create_test_schema_from_lexicons([lexicon]) 189 + 190 + let all_types = introspection.get_all_schema_types(test_schema) 191 + let serialized = sdl.print_types(all_types) 192 + 193 + // Should not have any "Resolved" fields for non-join fields 194 + let has_status_resolved = string.contains(serialized, "statusResolved") 195 + let has_created_at_resolved = string.contains(serialized, "createdAtResolved") 196 + 197 + has_status_resolved 198 + |> should.be_false 199 + 200 + has_created_at_resolved 201 + |> should.be_false 202 + } 203 + 204 + // Test that Record union is generated for forward joins 205 + pub fn record_union_type_exists_test() { 206 + let lexicon = 207 + types.Lexicon( 208 + id: "app.bsky.feed.post", 209 + defs: types.Defs( 210 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 211 + #( 212 + "text", 213 + types.Property(type_: "string", required: True, format: None, ref: None), 214 + ), 215 + #( 216 + "reply", 217 + types.Property( 218 + type_: "ref", 219 
+ required: False, 220 + format: None, 221 + ref: Some("com.atproto.repo.strongRef"), 222 + ), 223 + ), 224 + ])), 225 + others: dict.new(), 226 + ), 227 + ) 228 + 229 + let test_schema = create_test_schema_from_lexicons([lexicon]) 230 + 231 + let all_types = introspection.get_all_schema_types(test_schema) 232 + let serialized = sdl.print_types(all_types) 233 + 234 + // Verify that Record union exists 235 + string.contains(serialized, "union Record") 236 + |> should.be_true 237 + } 238 + 239 + // Test that forward join fields have Record union type 240 + pub fn forward_join_field_has_union_type_test() { 241 + let lexicon = 242 + types.Lexicon( 243 + id: "app.bsky.feed.post", 244 + defs: types.Defs( 245 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 246 + #( 247 + "text", 248 + types.Property(type_: "string", required: True, format: None, ref: None), 249 + ), 250 + #( 251 + "reply", 252 + types.Property( 253 + type_: "ref", 254 + required: False, 255 + format: None, 256 + ref: Some("com.atproto.repo.strongRef"), 257 + ), 258 + ), 259 + ])), 260 + others: dict.new(), 261 + ), 262 + ) 263 + 264 + let test_schema = create_test_schema_from_lexicons([lexicon]) 265 + 266 + let all_types = introspection.get_all_schema_types(test_schema) 267 + let serialized = sdl.print_types(all_types) 268 + 269 + // Verify that replyResolved field has Record type 270 + string.contains(serialized, "replyResolved: Record") 271 + |> should.be_true 272 + } 273 +
+6 -2
lexicon_graphql/test/lexicon_parser_test.gleam
··· 2 2 /// 3 3 /// Parses AT Protocol lexicon JSON into structured Lexicon types 4 4 import gleam/list 5 + import gleam/option 5 6 import gleeunit/should 6 7 import lexicon_graphql/lexicon_parser 7 8 import lexicon_graphql/types ··· 37 38 should.equal(lexicon.id, "xyz.statusphere.status") 38 39 // Verify it has properties 39 40 case lexicon.defs.main { 40 - types.RecordDef(type_: "record", properties: props) -> { 41 + option.Some(types.RecordDef(type_: "record", key: _, properties: props)) -> { 41 42 // Should have at least text and createdAt properties 42 43 should.be_true(list.length(props) >= 2) 43 44 } 44 - types.RecordDef(type_: _, properties: _) -> { 45 + option.Some(types.RecordDef(type_: _, key: _, properties: _)) -> { 46 + should.fail() 47 + } 48 + option.None -> { 45 49 should.fail() 46 50 } 47 51 }
+11 -10
lexicon_graphql/test/mutation_builder_test.gleam
··· 1 1 /// Tests for Mutation Builder - uploadBlob mutation 2 2 /// 3 3 /// Tests the uploadBlob mutation and BlobUploadResponse type with flat structure 4 + import gleam/dict 4 5 import gleam/list 5 6 import gleam/option.{None, Some} 6 7 import gleeunit/should ··· 25 26 26 27 // Build mutation type with uploadBlob factory 27 28 let mutation_type = 28 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 29 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 29 30 30 31 // Verify the mutation type has uploadBlob field 31 32 let fields = schema.get_fields(mutation_type) ··· 41 42 pub fn build_mutation_type_without_upload_blob_test() { 42 43 // Build mutation type without uploadBlob factory 43 44 let mutation_type = 44 - mutation_builder.build_mutation_type([], None, None, None, None) 45 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, None) 45 46 46 47 // Verify the mutation type does NOT have uploadBlob field 47 48 let fields = schema.get_fields(mutation_type) ··· 70 71 71 72 // Build mutation type 72 73 let mutation_type = 73 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 74 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 74 75 75 76 // Get uploadBlob field 76 77 let fields = schema.get_fields(mutation_type) ··· 112 113 } 113 114 114 115 let mutation_type = 115 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 116 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 116 117 117 118 // Get uploadBlob field 118 119 let fields = schema.get_fields(mutation_type) ··· 163 164 } 164 165 165 166 let mutation_type = 166 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 167 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 167 168 168 169 // Get 
uploadBlob field 169 170 let fields = schema.get_fields(mutation_type) ··· 214 215 } 215 216 216 217 let mutation_type = 217 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 218 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 218 219 219 220 // Get uploadBlob field 220 221 let fields = schema.get_fields(mutation_type) ··· 255 256 } 256 257 257 258 let mutation_type = 258 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 259 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 259 260 260 261 // Get uploadBlob field 261 262 let fields = schema.get_fields(mutation_type) ··· 296 297 } 297 298 298 299 let mutation_type = 299 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 300 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 300 301 301 302 // Get uploadBlob field 302 303 let fields = schema.get_fields(mutation_type) ··· 337 338 } 338 339 339 340 let mutation_type = 340 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 341 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 341 342 342 343 // Get uploadBlob field 343 344 let fields = schema.get_fields(mutation_type) ··· 378 379 } 379 380 380 381 let mutation_type = 381 - mutation_builder.build_mutation_type([], None, None, None, Some(upload_factory)) 382 + mutation_builder.build_mutation_type([], dict.new(), None, None, None, Some(upload_factory)) 382 383 383 384 // Get uploadBlob field 384 385 let fields = schema.get_fields(mutation_type)
+20 -14
lexicon_graphql/test/ref_resolver_test.gleam
··· 1 1 /// Tests for Lexicon Reference Resolver 2 2 /// 3 3 /// Resolves ref types in lexicon definitions to their actual types 4 + import gleam/dict 5 + import gleam/option.{None, Some} 4 6 import gleeunit/should 5 7 import lexicon_graphql/ref_resolver 6 8 import lexicon_graphql/types ··· 12 14 types.Lexicon( 13 15 id: "xyz.statusphere.post", 14 16 defs: types.Defs( 15 - main: types.RecordDef(type_: "record", properties: [ 16 - #("text", types.Property("string", True)), 17 - #("embed", types.Property("ref", False)), 18 - ]), 17 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 18 + #("text", types.Property(type_: "string", required: True, format: None, ref: None)), 19 + #("embed", types.Property(type_: "ref", required: False, format: None, ref: None)), 20 + ])), 21 + others: dict.new(), 19 22 ), 20 23 ) 21 24 ··· 31 34 types.Lexicon( 32 35 id: "xyz.statusphere.post", 33 36 defs: types.Defs( 34 - main: types.RecordDef(type_: "record", properties: [ 35 - #("text", types.Property("string", True)), 36 - #("author", types.Property("ref", False)), 37 - ]), 37 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 38 + #("text", types.Property(type_: "string", required: True, format: None, ref: None)), 39 + #("author", types.Property(type_: "ref", required: False, format: None, ref: None)), 40 + ])), 41 + others: dict.new(), 38 42 ), 39 43 ) 40 44 ··· 42 46 types.Lexicon( 43 47 id: "xyz.statusphere.profile", 44 48 defs: types.Defs( 45 - main: types.RecordDef(type_: "record", properties: [ 46 - #("displayName", types.Property("string", True)), 47 - ]), 49 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 50 + #("displayName", types.Property(type_: "string", required: True, format: None, ref: None)), 51 + ])), 52 + others: dict.new(), 48 53 ), 49 54 ) 50 55 ··· 64 69 types.Lexicon( 65 70 id: "xyz.statusphere.post", 66 71 defs: types.Defs( 67 - main: types.RecordDef(type_: "record", properties: [ 68 - #("text", 
types.Property("string", True)), 69 - ]), 72 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 73 + #("text", types.Property(type_: "string", required: True, format: None, ref: None)), 74 + ])), 75 + others: dict.new(), 70 76 ), 71 77 ) 72 78
+251
lexicon_graphql/test/reverse_join_test.gleam
··· 1 + /// Tests for reverse join field generation 2 + /// 3 + /// Verifies that reverse join fields are discovered and added to GraphQL schemas 4 + import gleam/dict 5 + import gleam/option.{None, Some} 6 + import gleam/string 7 + import gleeunit/should 8 + import graphql/introspection 9 + import graphql/schema 10 + import graphql/sdl 11 + import lexicon_graphql/db_schema_builder 12 + import lexicon_graphql/types 13 + 14 + // Helper to create a test schema with a mock fetcher 15 + fn create_test_schema_from_lexicons( 16 + lexicons: List(types.Lexicon), 17 + ) -> schema.Schema { 18 + // Mock fetcher that returns empty results (we're only testing schema generation) 19 + let fetcher = fn(_collection, _params) { 20 + Ok(#([], option.None, False, False, option.None)) 21 + } 22 + 23 + case 24 + db_schema_builder.build_schema_with_fetcher( 25 + lexicons, 26 + fetcher, 27 + option.None, 28 + option.None, 29 + option.None, 30 + option.None, 31 + option.None, 32 + option.None, 33 + ) 34 + { 35 + Ok(s) -> s 36 + Error(_) -> panic as "Failed to build test schema" 37 + } 38 + } 39 + 40 + // Test that a forward join in one collection creates a reverse join field in the target 41 + pub fn forward_join_creates_reverse_join_test() { 42 + // Create a Post collection (target) 43 + let post_lexicon = 44 + types.Lexicon( 45 + id: "app.bsky.feed.post", 46 + defs: types.Defs( 47 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 48 + #( 49 + "text", 50 + types.Property(type_: "string", required: True, format: None, ref: None), 51 + ), 52 + ])), 53 + others: dict.new(), 54 + ), 55 + ) 56 + 57 + // Create a Like collection with a subject field that references posts 58 + let like_lexicon = 59 + types.Lexicon( 60 + id: "app.bsky.feed.like", 61 + defs: types.Defs( 62 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 63 + #( 64 + "subject", 65 + types.Property( 66 + type_: "string", 67 + required: True, 68 + format: Some("at-uri"), 69 + ref: None, 70 
+ ), 71 + ), 72 + ])), 73 + others: dict.new(), 74 + ), 75 + ) 76 + 77 + let test_schema = create_test_schema_from_lexicons([post_lexicon, like_lexicon]) 78 + 79 + // Get all types and serialize to SDL 80 + let all_types = introspection.get_all_schema_types(test_schema) 81 + let serialized = sdl.print_types(all_types) 82 + 83 + // Verify that the Post type has a reverse join field for likes 84 + // Field name should be: appBskyFeedLikeViaSubject (camelCase) 85 + string.contains(serialized, "appBskyFeedLikeViaSubject") 86 + |> should.be_true 87 + } 88 + 89 + // Test that strongRef fields also create reverse joins 90 + pub fn strong_ref_creates_reverse_join_test() { 91 + // Create a Post collection (target) 92 + let post_lexicon = 93 + types.Lexicon( 94 + id: "app.bsky.feed.post", 95 + defs: types.Defs( 96 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 97 + #( 98 + "text", 99 + types.Property(type_: "string", required: True, format: None, ref: None), 100 + ), 101 + ])), 102 + others: dict.new(), 103 + ), 104 + ) 105 + 106 + // Create a Profile collection with a pinnedPost strongRef field 107 + let profile_lexicon = 108 + types.Lexicon( 109 + id: "app.bsky.actor.profile", 110 + defs: types.Defs( 111 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 112 + #( 113 + "pinnedPost", 114 + types.Property( 115 + type_: "ref", 116 + required: False, 117 + format: None, 118 + ref: Some("com.atproto.repo.strongRef"), 119 + ), 120 + ), 121 + ])), 122 + others: dict.new(), 123 + ), 124 + ) 125 + 126 + let test_schema = 127 + create_test_schema_from_lexicons([post_lexicon, profile_lexicon]) 128 + 129 + let all_types = introspection.get_all_schema_types(test_schema) 130 + let serialized = sdl.print_types(all_types) 131 + 132 + // Verify that the Post type has a reverse join field for pinned posts 133 + // Field name should be: appBskyActorProfileViaPinnedPost (camelCase) 134 + string.contains(serialized, 
"appBskyActorProfileViaPinnedPost") 135 + |> should.be_true 136 + } 137 + 138 + // Test that multiple reverse joins are all generated 139 + pub fn multiple_reverse_joins_test() { 140 + // Create a Post collection (target) 141 + let post_lexicon = 142 + types.Lexicon( 143 + id: "app.bsky.feed.post", 144 + defs: types.Defs( 145 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 146 + #( 147 + "text", 148 + types.Property(type_: "string", required: True, format: None, ref: None), 149 + ), 150 + ])), 151 + others: dict.new(), 152 + ), 153 + ) 154 + 155 + // Create a Like collection 156 + let like_lexicon = 157 + types.Lexicon( 158 + id: "app.bsky.feed.like", 159 + defs: types.Defs( 160 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 161 + #( 162 + "subject", 163 + types.Property( 164 + type_: "string", 165 + required: True, 166 + format: Some("at-uri"), 167 + ref: None, 168 + ), 169 + ), 170 + ])), 171 + others: dict.new(), 172 + ), 173 + ) 174 + 175 + // Create a Repost collection 176 + let repost_lexicon = 177 + types.Lexicon( 178 + id: "app.bsky.feed.repost", 179 + defs: types.Defs( 180 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 181 + #( 182 + "subject", 183 + types.Property( 184 + type_: "string", 185 + required: True, 186 + format: Some("at-uri"), 187 + ref: None, 188 + ), 189 + ), 190 + ])), 191 + others: dict.new(), 192 + ), 193 + ) 194 + 195 + let test_schema = 196 + create_test_schema_from_lexicons([post_lexicon, like_lexicon, repost_lexicon]) 197 + 198 + let all_types = introspection.get_all_schema_types(test_schema) 199 + let serialized = sdl.print_types(all_types) 200 + 201 + // Check both reverse join fields exist on Post type (camelCase) 202 + string.contains(serialized, "appBskyFeedLikeViaSubject") 203 + |> should.be_true 204 + 205 + string.contains(serialized, "appBskyFeedRepostViaSubject") 206 + |> should.be_true 207 + } 208 + 209 + // Test that collections without forward join 
fields don't appear in reverse joins 210 + pub fn no_false_positive_reverse_joins_test() { 211 + // Create a Post collection 212 + let post_lexicon = 213 + types.Lexicon( 214 + id: "app.bsky.feed.post", 215 + defs: types.Defs( 216 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 217 + #( 218 + "text", 219 + types.Property(type_: "string", required: True, format: None, ref: None), 220 + ), 221 + ])), 222 + others: dict.new(), 223 + ), 224 + ) 225 + 226 + // Create a Status collection with no join fields 227 + let status_lexicon = 228 + types.Lexicon( 229 + id: "xyz.statusphere.status", 230 + defs: types.Defs( 231 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 232 + #( 233 + "message", 234 + types.Property(type_: "string", required: True, format: None, ref: None), 235 + ), 236 + ])), 237 + others: dict.new(), 238 + ), 239 + ) 240 + 241 + let test_schema = 242 + create_test_schema_from_lexicons([post_lexicon, status_lexicon]) 243 + 244 + let all_types = introspection.get_all_schema_types(test_schema) 245 + let serialized = sdl.print_types(all_types) 246 + 247 + // Status collection has no forward joins, so Post should not have a reverse join field 248 + // that references Status. Reverse join fields are camelCase, so we check that the 249 + // prefix xyzStatusphereStatusVia* doesn't appear 250 + string.contains(serialized, "xyzStatusphereStatusVia") 251 + |> should.be_false 252 + }
+71
lexicon_graphql/test/schema_builder_integration_test.gleam
··· 1 + /// Integration test for the new topological sort-based schema builder 2 + /// This test verifies that types with circular dependencies (via joins) are built correctly 3 + import gleeunit/should 4 + 5 + pub fn extract_dependencies_from_metadata_test() { 6 + // Test: Given collection metadata with forward/reverse/DID joins, 7 + // extract a list of type dependencies 8 + // 9 + // Example: If SocialGrainGallery has: 10 + // - Forward join to SocialGrainActorProfile (via creator field) 11 + // - Reverse join from SocialGrainGalleryItem (via gallery field) 12 + // - DID join to SocialGrainPhoto (via did field) 13 + // 14 + // Then SocialGrainGallery depends on: [SocialGrainActorProfile, SocialGrainPhoto] 15 + // (Reverse joins don't create dependencies - the source depends on the target) 16 + 17 + // This is a placeholder test - actual implementation will work with real CollectionMeta 18 + True 19 + |> should.be_true() 20 + } 21 + 22 + pub fn build_types_in_topological_order_test() { 23 + // Test: Given a list of TypeNodes with dependencies, 24 + // build GraphQL types in the correct order 25 + // 26 + // Example: If we have: 27 + // - TypeA depends on TypeB 28 + // - TypeB depends on TypeC 29 + // - TypeC has no dependencies 30 + // 31 + // Then build order should be: C, B, A 32 + // And each type should have access to previously built types when creating join fields 33 + 34 + // This is a placeholder test - actual implementation will work with GraphQL schema types 35 + True 36 + |> should.be_true() 37 + } 38 + 39 + pub fn circular_reference_via_reverse_joins_test() { 40 + // Test: Circular references via reverse joins should work 41 + // 42 + // Example: 43 + // - SocialGrainGallery has reverse join from SocialGrainGalleryItem 44 + // - SocialGrainGalleryItem has forward join to SocialGrainGallery 45 + // 46 + // The topological sort should handle this because: 47 + // - Forward joins create dependencies (GalleryItem depends on Gallery) 48 + // - Reverse 
joins don't (Gallery doesn't depend on GalleryItem) 49 + // 50 + // Build order: Gallery, then GalleryItem 51 + // Then add reverse join field to Gallery pointing to GalleryItem 52 + 53 + True 54 + |> should.be_true() 55 + } 56 + 57 + pub fn circular_reference_via_did_joins_test() { 58 + // Test: Circular references via DID joins should work 59 + // 60 + // Example: 61 + // - SocialGrainActorProfile has DID join to SocialGrainGallery 62 + // - SocialGrainGallery has DID join to SocialGrainActorProfile 63 + // 64 + // Both types have the same DID field, so neither depends on the other structurally. 65 + // They can be built in any order, then DID join fields added. 66 + // 67 + // This should NOT create a circular dependency in the graph. 68 + 69 + True 70 + |> should.be_true() 71 + }
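The ordering rule these placeholder tests describe can be illustrated with a small Python sketch (the real implementation works on Gleam `CollectionMeta`; the type names here are the hypothetical ones from the test comments). Only forward joins create dependency edges, which is exactly what breaks the Gallery ↔ GalleryItem cycle:

```python
# Sketch of the build-order rule: forward joins create edges in the
# dependency graph, reverse joins do not.
from graphlib import TopologicalSorter

deps = {
    # GalleryItem has a forward join to Gallery, so it depends on it.
    "SocialGrainGalleryItem": {"SocialGrainGallery"},
    # Gallery only has a *reverse* join from GalleryItem, which adds
    # no edge - so there is no cycle to sort around.
    "SocialGrainGallery": set(),
}

# static_order() yields each node after all of its dependencies.
order = list(TopologicalSorter(deps).static_order())
assert order == ["SocialGrainGallery", "SocialGrainGalleryItem"]
```

After the sorted build pass, the reverse-join field can be attached to the already-built Gallery type pointing at GalleryItem, as the test comment outlines.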
+25 -18
lexicon_graphql/test/schema_builder_test.gleam
··· 4 4 /// Uses birdie to capture and verify the generated schemas 5 5 6 6 import birdie 7 + import gleam/dict 8 + import gleam/option.{None, Some} 7 9 import gleeunit/should 8 10 import graphql/introspection 9 11 import graphql/schema ··· 18 20 types.Lexicon( 19 21 id: "xyz.statusphere.status", 20 22 defs: types.Defs( 21 - main: types.RecordDef(type_: "record", properties: [ 22 - #("text", types.Property("string", False)), 23 - #("createdAt", types.Property("string", True)), 24 - ]), 23 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 24 + #("text", types.Property(type_: "string", required: False, format: None, ref: None)), 25 + #("createdAt", types.Property(type_: "string", required: True, format: None, ref: None)), 26 + ])), 27 + others: dict.new(), 25 28 ), 26 29 ) 27 30 ··· 44 47 types.Lexicon( 45 48 id: "xyz.statusphere.status", 46 49 defs: types.Defs( 47 - main: types.RecordDef(type_: "record", properties: [ 48 - #("text", types.Property("string", False)), 49 - ]), 50 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 51 + #("text", types.Property(type_: "string", required: False, format: None, ref: None)), 52 + ])), 53 + others: dict.new(), 50 54 ), 51 55 ) 52 56 ··· 54 58 types.Lexicon( 55 59 id: "xyz.statusphere.profile", 56 60 defs: types.Defs( 57 - main: types.RecordDef(type_: "record", properties: [ 58 - #("displayName", types.Property("string", False)), 59 - ]), 61 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 62 + #("displayName", types.Property(type_: "string", required: False, format: None, ref: None)), 63 + ])), 64 + others: dict.new(), 60 65 ), 61 66 ) 62 67 ··· 79 84 types.Lexicon( 80 85 id: "app.bsky.feed.post", 81 86 defs: types.Defs( 82 - main: types.RecordDef(type_: "record", properties: [ 83 - #("text", types.Property("string", True)), 84 - #("replyCount", types.Property("integer", False)), 85 - ]), 87 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 
88 + #("text", types.Property(type_: "string", required: True, format: None, ref: None)), 89 + #("replyCount", types.Property(type_: "integer", required: False, format: None, ref: None)), 90 + ])), 91 + others: dict.new(), 86 92 ), 87 93 ) 88 94 ··· 113 119 types.Lexicon( 114 120 id: "xyz.statusphere.status", 115 121 defs: types.Defs( 116 - main: types.RecordDef(type_: "record", properties: [ 117 - #("text", types.Property("string", False)), 118 - #("createdAt", types.Property("string", True)), 119 - ]), 122 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 123 + #("text", types.Property(type_: "string", required: False, format: None, ref: None)), 124 + #("createdAt", types.Property(type_: "string", required: True, format: None, ref: None)), 125 + ])), 126 + others: dict.new(), 120 127 ), 121 128 ) 122 129
+134 -23
lexicon_graphql/test/sorting_test.gleam
··· 8 8 /// 9 9 /// Uses birdie to capture and verify the generated schemas 10 10 import birdie 11 + import gleam/dict 11 12 import gleam/list 12 - import gleam/option.{Some} 13 + import gleam/option.{None, Some} 13 14 import gleeunit/should 14 15 import graphql/introspection 15 16 import graphql/schema ··· 35 36 option.None, 36 37 option.None, 37 38 option.None, 39 + option.None, 40 + option.None, 38 41 ) 39 42 { 40 43 Ok(s) -> s ··· 48 51 types.Lexicon( 49 52 "xyz.statusphere.status", 50 53 types.Defs( 51 - types.RecordDef("record", [ 52 - #("status", types.Property("string", False)), 53 - #("createdAt", types.Property("string", False)), 54 - ]), 54 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 55 + #("status", types.Property(type_: "string", required: False, format: None, ref: None)), 56 + #("createdAt", types.Property(type_: "string", required: False, format: None, ref: None)), 57 + ])), 58 + others: dict.new(), 55 59 ), 56 60 ) 57 61 ··· 72 76 types.Lexicon( 73 77 "xyz.statusphere.status", 74 78 types.Defs( 75 - types.RecordDef("record", [ 76 - #("status", types.Property("string", False)), 77 - #("createdAt", types.Property("string", False)), 78 - ]), 79 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 80 + #("status", types.Property(type_: "string", required: False, format: None, ref: None)), 81 + #("createdAt", types.Property(type_: "string", required: False, format: None, ref: None)), 82 + ])), 83 + others: dict.new(), 79 84 ), 80 85 ) 81 86 ··· 83 88 types.Lexicon( 84 89 "app.bsky.feed.post", 85 90 types.Defs( 86 - types.RecordDef("record", [ 87 - #("text", types.Property("string", False)), 88 - #("likeCount", types.Property("integer", False)), 89 - ]), 91 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 92 + #("text", types.Property(type_: "string", required: False, format: None, ref: None)), 93 + #("likeCount", types.Property(type_: "integer", required: False, format: None, ref: None)), 
94 + ])), 95 + others: dict.new(), 90 96 ), 91 97 ) 92 98 ··· 107 113 types.Lexicon( 108 114 "xyz.statusphere.status", 109 115 types.Defs( 110 - types.RecordDef("record", [ 111 - #("status", types.Property("string", False)), 112 - ]), 116 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 117 + #("status", types.Property(type_: "string", required: False, format: None, ref: None)), 118 + ])), 119 + others: dict.new(), 113 120 ), 114 121 ) 115 122 ··· 140 147 types.Lexicon( 141 148 "xyz.statusphere.status", 142 149 types.Defs( 143 - types.RecordDef("record", [ 144 - #("status", types.Property("string", False)), 145 - ]), 150 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 151 + #("status", types.Property(type_: "string", required: False, format: None, ref: None)), 152 + ])), 153 + others: dict.new(), 146 154 ), 147 155 ) 148 156 ··· 171 179 types.Lexicon( 172 180 "xyz.statusphere.status", 173 181 types.Defs( 174 - types.RecordDef("record", [ 175 - #("text", types.Property("string", False)), 176 - #("createdAt", types.Property("string", False)), 177 - ]), 182 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 183 + #("text", types.Property(type_: "string", required: False, format: None, ref: None)), 184 + #("createdAt", types.Property(type_: "string", required: False, format: None, ref: None)), 185 + ])), 186 + others: dict.new(), 178 187 ), 179 188 ) 180 189 ··· 189 198 content: serialized, 190 199 ) 191 200 } 201 + 202 + // Test: Sort enum only includes primitive types (string, integer, boolean, number) 203 + pub fn sort_enum_excludes_blob_and_ref_types_test() { 204 + let lexicon = 205 + types.Lexicon( 206 + "app.bsky.test.record", 207 + types.Defs( 208 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 209 + #("stringField", types.Property(type_: "string", required: False, format: None, ref: None)), 210 + #("intField", types.Property(type_: "integer", required: False, format: None, 
ref: None)), 211 + #("boolField", types.Property(type_: "boolean", required: False, format: None, ref: None)), 212 + #("numberField", types.Property(type_: "number", required: False, format: None, ref: None)), 213 + #("uriField", types.Property(type_: "string", required: False, format: Some("at-uri"), ref: None)), 214 + // Non-sortable types that should be excluded 215 + #("blobField", types.Property(type_: "blob", required: False, format: None, ref: None)), 216 + #("refField", types.Property(type_: "ref", required: False, format: None, ref: Some("app.bsky.test.object"))), 217 + ])), 218 + others: dict.new(), 219 + ), 220 + ) 221 + 222 + let test_schema = create_test_schema_from_lexicons([lexicon]) 223 + let all_types = introspection.get_all_schema_types(test_schema) 224 + 225 + // Find the SortField enum 226 + let sort_enum = 227 + list.find(all_types, fn(t) { 228 + schema.type_name(t) == "AppBskyTestRecordSortField" 229 + }) 230 + 231 + case sort_enum { 232 + Ok(enum_type) -> { 233 + let enum_values = schema.get_enum_values(enum_type) 234 + let value_names = list.map(enum_values, schema.enum_value_name) 235 + 236 + // Should include primitive fields 237 + should.be_true(list.contains(value_names, "stringField")) 238 + should.be_true(list.contains(value_names, "intField")) 239 + should.be_true(list.contains(value_names, "boolField")) 240 + should.be_true(list.contains(value_names, "numberField")) 241 + should.be_true(list.contains(value_names, "uriField")) 242 + 243 + // Should include standard fields 244 + should.be_true(list.contains(value_names, "uri")) 245 + should.be_true(list.contains(value_names, "cid")) 246 + should.be_true(list.contains(value_names, "did")) 247 + should.be_true(list.contains(value_names, "collection")) 248 + should.be_true(list.contains(value_names, "indexedAt")) 249 + 250 + // Should NOT include blob or ref fields 251 + should.be_false(list.contains(value_names, "blobField")) 252 + should.be_false(list.contains(value_names, "refField")) 
253 + 254 + // Should NOT include actorHandle (it's a computed field, not sortable) 255 + should.be_false(list.contains(value_names, "actorHandle")) 256 + } 257 + Error(_) -> should.fail() 258 + } 259 + } 260 + 261 + // Snapshot test: Sort enum with mixed field types 262 + pub fn sort_enum_with_mixed_field_types_snapshot_test() { 263 + let lexicon = 264 + types.Lexicon( 265 + "app.bsky.test.record", 266 + types.Defs( 267 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 268 + // Sortable primitive types 269 + #("stringField", types.Property(type_: "string", required: False, format: None, ref: None)), 270 + #("intField", types.Property(type_: "integer", required: False, format: None, ref: None)), 271 + #("boolField", types.Property(type_: "boolean", required: False, format: None, ref: None)), 272 + #("numberField", types.Property(type_: "number", required: False, format: None, ref: None)), 273 + #("datetimeField", types.Property(type_: "string", required: False, format: Some("datetime"), ref: None)), 274 + #("uriField", types.Property(type_: "string", required: False, format: Some("at-uri"), ref: None)), 275 + // Non-sortable types 276 + #("blobField", types.Property(type_: "blob", required: False, format: None, ref: None)), 277 + #("refField", types.Property(type_: "ref", required: False, format: None, ref: Some("com.atproto.repo.strongRef"))), 278 + ])), 279 + others: dict.new(), 280 + ), 281 + ) 282 + 283 + let test_schema = create_test_schema_from_lexicons([lexicon]) 284 + let all_types = introspection.get_all_schema_types(test_schema) 285 + 286 + // Find and print the SortField enum 287 + let sort_enum = 288 + list.find(all_types, fn(t) { 289 + schema.type_name(t) == "AppBskyTestRecordSortField" 290 + }) 291 + 292 + case sort_enum { 293 + Ok(enum_type) -> { 294 + let serialized = sdl.print_type(enum_type) 295 + birdie.snap( 296 + title: "SortField enum with mixed types - only includes primitives", 297 + content: serialized, 298 + ) 299 + } 
300 + Error(_) -> should.fail() 301 + } 302 + }
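The selection rule these sort-enum tests pin down can be sketched in a few lines of Python (function and constant names here are hypothetical illustrations, not the Gleam implementation): a custom field is sortable only when its lexicon type is a primitive, and the standard record fields are always included.

```python
# Hypothetical sketch of the sortable-field rule from the tests above.
SORTABLE_TYPES = {"string", "integer", "boolean", "number"}
STANDARD_FIELDS = ["uri", "cid", "did", "collection", "indexedAt"]

def sort_field_names(properties):
    # properties: list of (field_name, lexicon_type) pairs
    custom = [name for name, type_ in properties if type_ in SORTABLE_TYPES]
    return STANDARD_FIELDS + custom

names = sort_field_names([
    ("stringField", "string"),  # included: primitive
    ("blobField", "blob"),      # excluded: not a primitive
    ("refField", "ref"),        # excluded: not a primitive
])
assert "stringField" in names and "uri" in names
assert "blobField" not in names and "refField" not in names
```

Note that `at-uri` formatted strings remain sortable (their underlying type is still `string`), while computed fields like `actorHandle` are excluded from the sort enum even though they appear in the where input.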
+7
lexicon_graphql/test/test_helpers.gleam
··· 1 + /// Test helpers for creating Dynamic values in tests 2 + import gleam/dynamic.{type Dynamic} 3 + 4 + /// Convert any Gleam value to Dynamic for testing purposes 5 + /// This uses FFI to unsafely coerce values - only use in tests! 6 + @external(erlang, "test_helpers_ffi", "to_dynamic") 7 + pub fn to_dynamic(value: a) -> Dynamic
+6
lexicon_graphql/test/test_helpers_ffi.erl
··· 1 + -module(test_helpers_ffi). 2 + -export([to_dynamic/1]). 3 + 4 + %% Convert any value to Dynamic (which is just the identity function in Erlang) 5 + %% since all Erlang values are already "dynamic" 6 + to_dynamic(Value) -> Value.
+201
lexicon_graphql/test/uri_extractor_test.gleam
··· 1 + /// Tests for URI Extraction 2 + /// 3 + /// Tests that we correctly extract URIs from strongRef objects and at-uri strings 4 + import gleam/dict 5 + import gleam/option.{None, Some} 6 + import gleeunit/should 7 + import lexicon_graphql/uri_extractor 8 + import test_helpers 9 + 10 + // Test extracting URI from a strongRef object 11 + pub fn extract_uri_from_strong_ref_test() { 12 + // Create a strongRef object as a dynamic value 13 + let strong_ref = 14 + dict.from_list([ 15 + #("$type", test_helpers.to_dynamic("com.atproto.repo.strongRef")), 16 + #( 17 + "uri", 18 + test_helpers.to_dynamic("at://did:plc:abc123/app.bsky.feed.post/3k7h8xyz"), 19 + ), 20 + #("cid", test_helpers.to_dynamic("bafyreiabc123...")), 21 + ]) 22 + |> test_helpers.to_dynamic 23 + 24 + case uri_extractor.extract_uri(strong_ref) { 25 + Some(uri) -> 26 + uri 27 + |> should.equal("at://did:plc:abc123/app.bsky.feed.post/3k7h8xyz") 28 + None -> panic as "Expected to extract URI from strongRef" 29 + } 30 + } 31 + 32 + // Test extracting URI from a plain at-uri string 33 + pub fn extract_uri_from_at_uri_string_test() { 34 + let at_uri = 35 + test_helpers.to_dynamic("at://did:plc:xyz789/app.bsky.actor.profile/self") 36 + 37 + case uri_extractor.extract_uri(at_uri) { 38 + Some(uri) -> 39 + uri 40 + |> should.equal("at://did:plc:xyz789/app.bsky.actor.profile/self") 41 + None -> panic as "Expected to extract URI from at-uri string" 42 + } 43 + } 44 + 45 + // Test that non-URI strings return None 46 + pub fn extract_uri_from_non_uri_string_test() { 47 + let non_uri = test_helpers.to_dynamic("not-an-at-uri") 48 + 49 + uri_extractor.extract_uri(non_uri) 50 + |> should.equal(None) 51 + } 52 + 53 + // Test that invalid objects return None 54 + pub fn extract_uri_from_invalid_object_test() { 55 + let invalid_obj = 56 + dict.from_list([ 57 + #("foo", test_helpers.to_dynamic("bar")), 58 + #("baz", test_helpers.to_dynamic("qux")), 59 + ]) 60 + |> test_helpers.to_dynamic 61 + 62 + 
uri_extractor.extract_uri(invalid_obj) 63 + |> should.equal(None) 64 + } 65 + 66 + // Test that null/None returns None 67 + pub fn extract_uri_from_null_test() { 68 + let null_val = test_helpers.to_dynamic(None) 69 + 70 + uri_extractor.extract_uri(null_val) 71 + |> should.equal(None) 72 + } 73 + 74 + // Test strongRef without $type field (should still work) 75 + pub fn extract_uri_from_strong_ref_without_type_test() { 76 + let strong_ref_no_type = 77 + dict.from_list([ 78 + #("uri", test_helpers.to_dynamic("at://did:plc:test/collection/rkey")), 79 + #("cid", test_helpers.to_dynamic("bafyrei...")), 80 + ]) 81 + |> test_helpers.to_dynamic 82 + 83 + case uri_extractor.extract_uri(strong_ref_no_type) { 84 + Some(uri) -> uri |> should.equal("at://did:plc:test/collection/rkey") 85 + None -> panic as "Expected to extract URI even without $type" 86 + } 87 + } 88 + 89 + // Test is_strong_ref helper 90 + pub fn is_strong_ref_test() { 91 + let strong_ref = 92 + dict.from_list([ 93 + #("$type", test_helpers.to_dynamic("com.atproto.repo.strongRef")), 94 + #("uri", test_helpers.to_dynamic("at://did:plc:abc/collection/key")), 95 + #("cid", test_helpers.to_dynamic("bafyrei...")), 96 + ]) 97 + |> test_helpers.to_dynamic 98 + 99 + uri_extractor.is_strong_ref(strong_ref) 100 + |> should.be_true 101 + } 102 + 103 + // Test is_strong_ref with non-strongRef 104 + pub fn is_strong_ref_negative_test() { 105 + let not_strong_ref = 106 + dict.from_list([#("$type", test_helpers.to_dynamic("some.other.type"))]) 107 + |> test_helpers.to_dynamic 108 + 109 + uri_extractor.is_strong_ref(not_strong_ref) 110 + |> should.be_false 111 + } 112 + 113 + // Test is_at_uri_string helper 114 + pub fn is_at_uri_string_test() { 115 + let at_uri = test_helpers.to_dynamic("at://did:plc:test/app.bsky.feed.post/123") 116 + 117 + uri_extractor.is_at_uri_string(at_uri) 118 + |> should.be_true 119 + } 120 + 121 + // Test is_at_uri_string with non-at-uri 122 + pub fn is_at_uri_string_negative_test() { 123 + let 
not_at_uri = test_helpers.to_dynamic("https://example.com") 124 + 125 + uri_extractor.is_at_uri_string(not_at_uri) 126 + |> should.be_false 127 + } 128 + 129 + // Test is_at_uri_string with object 130 + pub fn is_at_uri_string_with_object_test() { 131 + let obj = 132 + dict.from_list([#("uri", test_helpers.to_dynamic("at://..."))]) 133 + |> test_helpers.to_dynamic 134 + 135 + uri_extractor.is_at_uri_string(obj) 136 + |> should.be_false 137 + } 138 + 139 + // Test extracting from strongRef with wrong $type 140 + pub fn extract_uri_from_wrong_type_test() { 141 + let wrong_type = 142 + dict.from_list([ 143 + #("$type", test_helpers.to_dynamic("wrong.type")), 144 + #("uri", test_helpers.to_dynamic("at://did:plc:test/collection/key")), 145 + ]) 146 + |> test_helpers.to_dynamic 147 + 148 + // Should still extract the URI even if $type is wrong, as long as uri field exists 149 + case uri_extractor.extract_uri(wrong_type) { 150 + Some(uri) -> uri |> should.equal("at://did:plc:test/collection/key") 151 + None -> panic as "Expected to extract URI with wrong $type" 152 + } 153 + } 154 + 155 + // Test extracting from number (should return None) 156 + pub fn extract_uri_from_number_test() { 157 + let number = test_helpers.to_dynamic(42) 158 + 159 + uri_extractor.extract_uri(number) 160 + |> should.equal(None) 161 + } 162 + 163 + // Test extracting from boolean (should return None) 164 + pub fn extract_uri_from_boolean_test() { 165 + let boolean = test_helpers.to_dynamic(True) 166 + 167 + uri_extractor.extract_uri(boolean) 168 + |> should.equal(None) 169 + } 170 + 171 + // Test extracting from empty string (should return None) 172 + pub fn extract_uri_from_empty_string_test() { 173 + let empty = test_helpers.to_dynamic("") 174 + 175 + uri_extractor.extract_uri(empty) 176 + |> should.equal(None) 177 + } 178 + 179 + // Test extracting URI with various valid formats 180 + pub fn extract_uri_various_formats_test() { 181 + // With tid 182 + let uri1 = 
test_helpers.to_dynamic("at://did:plc:abc/app.bsky.feed.post/3k7h8...") 183 + case uri_extractor.extract_uri(uri1) { 184 + Some(_) -> Nil 185 + None -> panic as "Expected valid URI with tid" 186 + } 187 + 188 + // With literal:self 189 + let uri2 = test_helpers.to_dynamic("at://did:plc:abc/app.bsky.actor.profile/self") 190 + case uri_extractor.extract_uri(uri2) { 191 + Some(_) -> Nil 192 + None -> panic as "Expected valid URI with literal:self" 193 + } 194 + 195 + // With custom rkey 196 + let uri3 = test_helpers.to_dynamic("at://did:plc:abc/collection/custom-key-123") 197 + case uri_extractor.extract_uri(uri3) { 198 + Some(_) -> Nil 199 + None -> panic as "Expected valid URI with custom key" 200 + } 201 + }
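Taken together, these tests pin down a simple contract for `extract_uri`, sketched here in Python for clarity (the real extractor operates on Gleam `Dynamic` values): accept either a plain `at://` string or a strongRef-shaped object with a string `uri` field, and return `None` for everything else.

```python
# Python sketch of the extraction contract the tests above describe.
def extract_uri(value):
    if isinstance(value, str):
        # Plain at-uri string; empty and non-at:// strings are rejected.
        return value if value.startswith("at://") else None
    if isinstance(value, dict):
        # strongRef-shaped object: $type is optional, and a wrong $type
        # is tolerated as long as a valid "uri" field is present.
        uri = value.get("uri")
        if isinstance(uri, str) and uri.startswith("at://"):
            return uri
        return None
    # Numbers, booleans, None, and anything else yield no URI.
    return None

assert extract_uri("at://did:plc:xyz789/app.bsky.actor.profile/self") is not None
assert extract_uri({"uri": "at://did:plc:test/collection/rkey"}) == "at://did:plc:test/collection/rkey"
assert extract_uri("not-an-at-uri") is None
assert extract_uri(42) is None
```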
+140 -1
lexicon_graphql/test/where_schema_test.gleam
··· 4 4 /// generated for where input filtering 5 5 6 6 import birdie 7 - import lexicon_graphql/connection 7 + import gleam/dict 8 + import gleam/list 9 + import gleam/option.{None, Some} 8 10 import gleeunit 11 + import gleeunit/should 12 + import graphql/introspection 9 13 import graphql/schema 10 14 import graphql/sdl 15 + import lexicon_graphql/connection 16 + import lexicon_graphql/db_schema_builder 17 + import lexicon_graphql/types 11 18 12 19 pub fn main() { 13 20 gleeunit.main() 21 + } 22 + 23 + // Helper to create a test schema with a mock fetcher 24 + fn create_test_schema_from_lexicons( 25 + lexicons: List(types.Lexicon), 26 + ) -> schema.Schema { 27 + // Mock fetcher that returns empty results (we're only testing schema generation) 28 + let fetcher = fn(_collection, _params) { 29 + Ok(#([], option.None, False, False, option.None)) 30 + } 31 + 32 + case 33 + db_schema_builder.build_schema_with_fetcher( 34 + lexicons, 35 + fetcher, 36 + option.None, 37 + option.None, 38 + option.None, 39 + option.None, 40 + option.None, 41 + option.None, 42 + ) 43 + { 44 + Ok(s) -> s 45 + Error(_) -> panic as "Failed to build test schema" 46 + } 14 47 } 15 48 16 49 // ===== Simple Record Type ===== ··· 167 200 content: serialized, 168 201 ) 169 202 } 203 + 204 + // ===== Integration Tests with db_schema_builder ===== 205 + 206 + // Test: WHERE input only includes primitive types (string, integer, boolean, number) 207 + pub fn where_input_excludes_blob_and_ref_types_test() { 208 + let lexicon = 209 + types.Lexicon( 210 + "app.bsky.test.record", 211 + types.Defs( 212 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 213 + #("stringField", types.Property(type_: "string", required: False, format: None, ref: None)), 214 + #("intField", types.Property(type_: "integer", required: False, format: None, ref: None)), 215 + #("boolField", types.Property(type_: "boolean", required: False, format: None, ref: None)), 216 + #("numberField", types.Property(type_: 
"number", required: False, format: None, ref: None)), 217 + #("uriField", types.Property(type_: "string", required: False, format: Some("at-uri"), ref: None)), 218 + // Non-sortable types that should be excluded 219 + #("blobField", types.Property(type_: "blob", required: False, format: None, ref: None)), 220 + #("refField", types.Property(type_: "ref", required: False, format: None, ref: Some("app.bsky.test.object"))), 221 + ])), 222 + others: dict.new(), 223 + ), 224 + ) 225 + 226 + let test_schema = create_test_schema_from_lexicons([lexicon]) 227 + let all_types = introspection.get_all_schema_types(test_schema) 228 + 229 + // Find the WhereInput type 230 + let where_input = 231 + list.find(all_types, fn(t) { 232 + schema.type_name(t) == "AppBskyTestRecordWhereInput" 233 + }) 234 + 235 + case where_input { 236 + Ok(input_type) -> { 237 + let input_fields = schema.get_input_fields(input_type) 238 + let field_names = list.map(input_fields, schema.input_field_name) 239 + 240 + // Should include primitive fields 241 + should.be_true(list.contains(field_names, "stringField")) 242 + should.be_true(list.contains(field_names, "intField")) 243 + should.be_true(list.contains(field_names, "boolField")) 244 + should.be_true(list.contains(field_names, "numberField")) 245 + should.be_true(list.contains(field_names, "uriField")) 246 + 247 + // Should include standard fields 248 + should.be_true(list.contains(field_names, "uri")) 249 + should.be_true(list.contains(field_names, "cid")) 250 + should.be_true(list.contains(field_names, "did")) 251 + should.be_true(list.contains(field_names, "collection")) 252 + should.be_true(list.contains(field_names, "indexedAt")) 253 + should.be_true(list.contains(field_names, "actorHandle")) 254 + 255 + // Should include AND/OR fields 256 + should.be_true(list.contains(field_names, "and")) 257 + should.be_true(list.contains(field_names, "or")) 258 + 259 + // Should NOT include blob or ref fields 260 + should.be_false(list.contains(field_names, 
"blobField")) 261 + should.be_false(list.contains(field_names, "refField")) 262 + } 263 + Error(_) -> should.fail() 264 + } 265 + } 266 + 267 + // Snapshot test: WHERE input with mixed field types 268 + pub fn where_input_with_mixed_field_types_snapshot_test() { 269 + let lexicon = 270 + types.Lexicon( 271 + "app.bsky.test.record", 272 + types.Defs( 273 + main: Some(types.RecordDef(type_: "record", key: None, properties: [ 274 + // Sortable primitive types 275 + #("stringField", types.Property(type_: "string", required: False, format: None, ref: None)), 276 + #("intField", types.Property(type_: "integer", required: False, format: None, ref: None)), 277 + #("boolField", types.Property(type_: "boolean", required: False, format: None, ref: None)), 278 + #("numberField", types.Property(type_: "number", required: False, format: None, ref: None)), 279 + #("datetimeField", types.Property(type_: "string", required: False, format: Some("datetime"), ref: None)), 280 + #("uriField", types.Property(type_: "string", required: False, format: Some("at-uri"), ref: None)), 281 + // Non-sortable types 282 + #("blobField", types.Property(type_: "blob", required: False, format: None, ref: None)), 283 + #("refField", types.Property(type_: "ref", required: False, format: None, ref: Some("com.atproto.repo.strongRef"))), 284 + ])), 285 + others: dict.new(), 286 + ), 287 + ) 288 + 289 + let test_schema = create_test_schema_from_lexicons([lexicon]) 290 + let all_types = introspection.get_all_schema_types(test_schema) 291 + 292 + // Find and print the WhereInput type 293 + let where_input = 294 + list.find(all_types, fn(t) { 295 + schema.type_name(t) == "AppBskyTestRecordWhereInput" 296 + }) 297 + 298 + case where_input { 299 + Ok(input_type) -> { 300 + let serialized = sdl.print_type(input_type) 301 + birdie.snap( 302 + title: "WhereInput with mixed types - only includes primitives", 303 + content: serialized, 304 + ) 305 + } 306 + Error(_) -> should.fail() 307 + } 308 + }
server/priv/lexicons/app/bsky/actor/profile.json server/test/fixtures/grain/lexicons/app/bsky/actor/profile.json
server/priv/lexicons/com/atproto/label/defs.json server/test/fixtures/grain/lexicons/com/atproto/label/defs.json
server/priv/lexicons/com/atproto/repo/strongRef.json server/test/fixtures/grain/lexicons/com/atproto/repo/strongRef.json
server/priv/lexicons/xyz/statusphere/status.json server/test/fixtures/statusphere/lexicons/xyz/statusphere/status.json
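The `database.gleam` changes below fix backward (`last`/`before`) pagination by flipping the sort direction for the SQL query, over-fetching by one row to detect another page, trimming, and then reversing the page so callers still see the original order. A minimal in-memory sketch of that flow (hypothetical data, cursor handling omitted):

```python
# In-memory sketch of the backward-pagination strategy: query with the
# reversed sort, over-fetch by one, trim, then reverse back.
rows = [{"indexed_at": t} for t in ["t1", "t2", "t3", "t4", "t5"]]

def page_backward(rows, last):
    # Query step: ascending instead of the default descending sort,
    # fetching last+1 rows so we can compute hasPreviousPage.
    fetched = sorted(rows, key=lambda r: r["indexed_at"])[: last + 1]
    has_previous = len(fetched) > last
    trimmed = fetched[:last]
    # Restore the original (descending) order before returning.
    return list(reversed(trimmed)), has_previous

page, has_prev = page_backward(rows, 2)
# The "last 2" of the descending sequence t5..t1 are t2, t1 - in order.
assert [r["indexed_at"] for r in page] == ["t2", "t1"]
assert has_prev is True
```

Without the final reverse, a `last: 2` query would hand back the correct rows but in inverted order, which is the bug the `is_forward` branch below addresses.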
+537 -3
server/src/database.gleam
··· 700 700 701 701 // Check if there are more results 702 702 let has_more = list.length(records) > limit 703 - let final_records = case has_more { 703 + let trimmed_records = case has_more { 704 704 True -> list.take(records, limit) 705 705 False -> records 706 + } 707 + 708 + // For backward pagination, reverse the results to restore original order 709 + let final_records = case is_forward { 710 + True -> trimmed_records 711 + False -> list.reverse(trimmed_records) 706 712 } 707 713 708 714 // Calculate hasNextPage and hasPreviousPage ··· 755 761 None -> [#("indexed_at", "desc")] 756 762 } 757 763 764 + // For backward pagination (last/before), reverse the sort order 765 + let query_sort_fields = case is_forward { 766 + True -> sort_fields 767 + False -> reverse_sort_fields(sort_fields) 768 + } 769 + 758 770 // Check if we need to join with actor table 759 771 let needs_actor_join = case where { 760 772 Some(wc) -> where_clause.requires_actor_join(wc) ··· 762 774 } 763 775 764 776 // Build the ORDER BY clause (with table prefix if doing a join) 765 - let order_by_clause = build_order_by(sort_fields, needs_actor_join) 777 + let order_by_clause = build_order_by(query_sort_fields, needs_actor_join) 766 778 767 779 // Build FROM clause with optional LEFT JOIN 768 780 let from_clause = case needs_actor_join { ··· 856 868 857 869 // Check if there are more results 858 870 let has_more = list.length(records) > limit 859 - let final_records = case has_more { 871 + let trimmed_records = case has_more { 860 872 True -> list.take(records, limit) 861 873 False -> records 874 + } 875 + 876 + // For backward pagination, reverse the results to restore original order 877 + let final_records = case is_forward { 878 + True -> trimmed_records 879 + False -> list.reverse(trimmed_records) 862 880 } 863 881 864 882 // Calculate hasNextPage and hasPreviousPage ··· 960 978 961 979 /// Builds an ORDER BY clause from sort fields 962 980 /// use_table_prefix: if True, prefixes table 
columns with "record." for joins 981 + /// Reverse sort direction for backward pagination 982 + fn reverse_sort_direction(direction: String) -> String { 983 + case string.lowercase(direction) { 984 + "asc" -> "desc" 985 + "desc" -> "asc" 986 + _ -> "asc" 987 + } 988 + } 989 + 990 + /// Reverse sort fields for backward pagination 991 + fn reverse_sort_fields( 992 + sort_fields: List(#(String, String)), 993 + ) -> List(#(String, String)) { 994 + list.map(sort_fields, fn(field) { 995 + let #(field_name, direction) = field 996 + #(field_name, reverse_sort_direction(direction)) 997 + }) 998 + } 999 + 963 1000 fn build_order_by( 964 1001 sort_fields: List(#(String, String)), 965 1002 use_table_prefix: Bool, ··· 1007 1044 False -> string.join(order_parts, ", ") 1008 1045 } 1009 1046 } 1047 + 1048 + /// Get records by a list of URIs (for forward joins / DataLoader) 1049 + /// Returns records in any order - caller must group them 1050 + pub fn get_records_by_uris( 1051 + conn: sqlight.Connection, 1052 + uris: List(String), 1053 + ) -> Result(List(Record), sqlight.Error) { 1054 + case uris { 1055 + [] -> Ok([]) 1056 + _ -> { 1057 + // Build placeholders for SQL IN clause 1058 + let placeholders = 1059 + list.map(uris, fn(_) { "?" 
}) 1060 + |> string.join(", ") 1061 + 1062 + let sql = 1063 + " 1064 + SELECT uri, cid, did, collection, json, indexed_at 1065 + FROM record 1066 + WHERE uri IN (" <> placeholders <> ") 1067 + " 1068 + 1069 + // Convert URIs to sqlight.Value list 1070 + let params = list.map(uris, sqlight.text) 1071 + 1072 + let decoder = { 1073 + use uri <- decode.field(0, decode.string) 1074 + use cid <- decode.field(1, decode.string) 1075 + use did <- decode.field(2, decode.string) 1076 + use collection <- decode.field(3, decode.string) 1077 + use json <- decode.field(4, decode.string) 1078 + use indexed_at <- decode.field(5, decode.string) 1079 + decode.success(Record(uri:, cid:, did:, collection:, json:, indexed_at:)) 1080 + } 1081 + 1082 + sqlight.query(sql, on: conn, with: params, expecting: decoder) 1083 + } 1084 + } 1085 + } 1086 + 1087 + /// Get records by reference field (for reverse joins / DataLoader) 1088 + /// Finds all records in a collection where a field references one of the parent URIs 1089 + /// Note: This does a JSON field extraction, so it may be slow on large datasets 1090 + pub fn get_records_by_reference_field( 1091 + conn: sqlight.Connection, 1092 + collection: String, 1093 + field_name: String, 1094 + parent_uris: List(String), 1095 + ) -> Result(List(Record), sqlight.Error) { 1096 + case parent_uris { 1097 + [] -> Ok([]) 1098 + _ -> { 1099 + // Build placeholders for SQL IN clause 1100 + let placeholders = 1101 + list.map(parent_uris, fn(_) { "?" }) 1102 + |> string.join(", ") 1103 + 1104 + // Use SQLite JSON extraction to find records where field_name matches parent URIs 1105 + // This supports both simple string fields and strongRef objects with a "uri" field 1106 + let sql = 1107 + " 1108 + SELECT uri, cid, did, collection, json, indexed_at 1109 + FROM record 1110 + WHERE collection = ? 1111 + AND ( 1112 + json_extract(json, '$." <> field_name <> "') IN (" <> placeholders <> ") 1113 + OR json_extract(json, '$." 
<> field_name <> ".uri') IN (" <> placeholders <> ") 1114 + ) 1115 + " 1116 + 1117 + // Build params: collection + parent_uris twice (once for direct match, once for strongRef) 1118 + let params = 1119 + list.flatten([ 1120 + [sqlight.text(collection)], 1121 + list.map(parent_uris, sqlight.text), 1122 + list.map(parent_uris, sqlight.text), 1123 + ]) 1124 + 1125 + let decoder = { 1126 + use uri <- decode.field(0, decode.string) 1127 + use cid <- decode.field(1, decode.string) 1128 + use did <- decode.field(2, decode.string) 1129 + use collection <- decode.field(3, decode.string) 1130 + use json <- decode.field(4, decode.string) 1131 + use indexed_at <- decode.field(5, decode.string) 1132 + decode.success(Record(uri:, cid:, did:, collection:, json:, indexed_at:)) 1133 + } 1134 + 1135 + sqlight.query(sql, on: conn, with: params, expecting: decoder) 1136 + } 1137 + } 1138 + } 1139 + 1140 + /// Get records by reference field with pagination (for reverse joins with connections) 1141 + /// Similar to get_records_by_reference_field but supports cursor-based pagination 1142 + /// Returns: (records, next_cursor, has_next_page, has_previous_page, total_count) 1143 + pub fn get_records_by_reference_field_paginated( 1144 + conn: sqlight.Connection, 1145 + collection: String, 1146 + field_name: String, 1147 + parent_uri: String, 1148 + first: Option(Int), 1149 + after: Option(String), 1150 + last: Option(Int), 1151 + before: Option(String), 1152 + sort_by: Option(List(#(String, String))), 1153 + where_clause: Option(where_clause.WhereClause), 1154 + ) -> Result( 1155 + #(List(Record), Option(String), Bool, Bool, Option(Int)), 1156 + sqlight.Error, 1157 + ) { 1158 + // Validate pagination arguments 1159 + let #(limit, is_forward, cursor_opt) = case first, last { 1160 + Some(f), None -> #(f, True, after) 1161 + None, Some(l) -> #(l, False, before) 1162 + Some(f), Some(_) -> #(f, True, after) 1163 + None, None -> #(50, True, None) 1164 + } 1165 + 1166 + // Default sort order if not 
specified 1167 + let sort_fields = case sort_by { 1168 + Some(fields) -> fields 1169 + None -> [#("indexed_at", "desc")] 1170 + } 1171 + 1172 + // For backward pagination (last/before), reverse the sort order 1173 + let query_sort_fields = case is_forward { 1174 + True -> sort_fields 1175 + False -> reverse_sort_fields(sort_fields) 1176 + } 1177 + 1178 + // Build the ORDER BY clause 1179 + let order_by_clause = build_order_by(query_sort_fields, False) 1180 + 1181 + // Build WHERE clause parts for reference field matching 1182 + let base_where_parts = [ 1183 + "collection = ?", 1184 + "(json_extract(json, '$." <> field_name <> "') = ? OR json_extract(json, '$." <> field_name <> ".uri') = ?)", 1185 + ] 1186 + let base_bind_values = [ 1187 + sqlight.text(collection), 1188 + sqlight.text(parent_uri), 1189 + sqlight.text(parent_uri), 1190 + ] 1191 + 1192 + // Add where clause conditions if present 1193 + let #(with_where_parts, with_where_values) = case where_clause { 1194 + Some(clause) -> { 1195 + let #(where_sql, where_params) = 1196 + where_clause.build_where_sql(clause, False) 1197 + case where_sql { 1198 + "" -> #(base_where_parts, base_bind_values) 1199 + _ -> #( 1200 + list.append(base_where_parts, [where_sql]), 1201 + list.append(base_bind_values, where_params), 1202 + ) 1203 + } 1204 + } 1205 + None -> #(base_where_parts, base_bind_values) 1206 + } 1207 + 1208 + // Add cursor condition if present 1209 + let #(final_where_parts, final_bind_values) = case cursor_opt { 1210 + Some(cursor_str) -> { 1211 + case cursor.decode_cursor(cursor_str, sort_by) { 1212 + Ok(decoded_cursor) -> { 1213 + let #(cursor_where, cursor_params) = 1214 + cursor.build_cursor_where_clause( 1215 + decoded_cursor, 1216 + sort_by, 1217 + !is_forward, 1218 + ) 1219 + 1220 + let new_where = list.append(with_where_parts, [cursor_where]) 1221 + let new_binds = 1222 + list.append(with_where_values, list.map(cursor_params, sqlight.text)) 1223 + #(new_where, new_binds) 1224 + } 1225 + Error(_) -> 
#(with_where_parts, with_where_values) 1226 + } 1227 + } 1228 + None -> #(with_where_parts, with_where_values) 1229 + } 1230 + 1231 + // Fetch limit + 1 to detect if there are more pages 1232 + let fetch_limit = limit + 1 1233 + 1234 + // Build the SQL query 1235 + let sql = 1236 + " 1237 + SELECT uri, cid, did, collection, json, indexed_at 1238 + FROM record 1239 + WHERE " 1240 + <> string.join(final_where_parts, " AND ") 1241 + <> " 1242 + ORDER BY " 1243 + <> order_by_clause 1244 + <> " 1245 + LIMIT " 1246 + <> int.to_string(fetch_limit) 1247 + 1248 + // Execute query 1249 + let decoder = { 1250 + use uri <- decode.field(0, decode.string) 1251 + use cid <- decode.field(1, decode.string) 1252 + use did <- decode.field(2, decode.string) 1253 + use collection <- decode.field(3, decode.string) 1254 + use json <- decode.field(4, decode.string) 1255 + use indexed_at <- decode.field(5, decode.string) 1256 + decode.success(Record(uri:, cid:, did:, collection:, json:, indexed_at:)) 1257 + } 1258 + 1259 + use records <- result.try(sqlight.query( 1260 + sql, 1261 + on: conn, 1262 + with: final_bind_values, 1263 + expecting: decoder, 1264 + )) 1265 + 1266 + // Check if there are more results 1267 + let has_more = list.length(records) > limit 1268 + let trimmed_records = case has_more { 1269 + True -> list.take(records, limit) 1270 + False -> records 1271 + } 1272 + 1273 + // For backward pagination, reverse the results to restore original order 1274 + let final_records = case is_forward { 1275 + True -> trimmed_records 1276 + False -> list.reverse(trimmed_records) 1277 + } 1278 + 1279 + // Calculate hasNextPage and hasPreviousPage 1280 + let has_next_page = case is_forward { 1281 + True -> has_more 1282 + False -> option.is_some(cursor_opt) 1283 + } 1284 + 1285 + let has_previous_page = case is_forward { 1286 + True -> option.is_some(cursor_opt) 1287 + False -> has_more 1288 + } 1289 + 1290 + // Generate next cursor if there are more results 1291 + let next_cursor = case 
has_more, list.last(final_records) { 1292 + True, Ok(last_record) -> { 1293 + let record_like = record_to_record_like(last_record) 1294 + Some(cursor.generate_cursor_from_record(record_like, sort_by)) 1295 + } 1296 + _, _ -> None 1297 + } 1298 + 1299 + // Get total count using the WHERE clause (with where conditions, but without cursor conditions) 1300 + let count_sql = 1301 + "SELECT COUNT(*) FROM record WHERE " <> string.join(with_where_parts, " AND ") 1302 + 1303 + let count_decoder = { 1304 + use count <- decode.field(0, decode.int) 1305 + decode.success(count) 1306 + } 1307 + 1308 + let total_count = case 1309 + sqlight.query( 1310 + count_sql, 1311 + on: conn, 1312 + with: with_where_values, 1313 + expecting: count_decoder, 1314 + ) 1315 + { 1316 + Ok([count]) -> Some(count) 1317 + _ -> None 1318 + } 1319 + 1320 + Ok(#(final_records, next_cursor, has_next_page, has_previous_page, total_count)) 1321 + } 1322 + 1323 + /// Get records by DIDs and collection (for DID joins / DataLoader) 1324 + /// Finds all records in a specific collection that belong to any of the given DIDs 1325 + /// Uses the idx_record_did_collection index for efficient lookup 1326 + pub fn get_records_by_dids_and_collection( 1327 + conn: sqlight.Connection, 1328 + dids: List(String), 1329 + collection: String, 1330 + ) -> Result(List(Record), sqlight.Error) { 1331 + case dids { 1332 + [] -> Ok([]) 1333 + _ -> { 1334 + // Build placeholders for SQL IN clause 1335 + let placeholders = 1336 + list.map(dids, fn(_) { "?" }) 1337 + |> string.join(", ") 1338 + 1339 + let sql = 1340 + " 1341 + SELECT uri, cid, did, collection, json, indexed_at 1342 + FROM record 1343 + WHERE did IN (" <> placeholders <> ") 1344 + AND collection = ? 
1345 + ORDER BY indexed_at DESC 1346 + " 1347 + 1348 + // Build params: DIDs + collection 1349 + let params = 1350 + list.flatten([ 1351 + list.map(dids, sqlight.text), 1352 + [sqlight.text(collection)], 1353 + ]) 1354 + 1355 + let decoder = { 1356 + use uri <- decode.field(0, decode.string) 1357 + use cid <- decode.field(1, decode.string) 1358 + use did <- decode.field(2, decode.string) 1359 + use collection <- decode.field(3, decode.string) 1360 + use json <- decode.field(4, decode.string) 1361 + use indexed_at <- decode.field(5, decode.string) 1362 + decode.success(Record(uri:, cid:, did:, collection:, json:, indexed_at:)) 1363 + } 1364 + 1365 + sqlight.query(sql, on: conn, with: params, expecting: decoder) 1366 + } 1367 + } 1368 + } 1369 + 1370 + /// Get records by DID and collection with pagination (for DID joins with connections) 1371 + /// Similar to get_records_by_dids_and_collection but for a single DID with cursor-based pagination 1372 + /// Returns: (records, next_cursor, has_next_page, has_previous_page, total_count) 1373 + pub fn get_records_by_dids_and_collection_paginated( 1374 + conn: sqlight.Connection, 1375 + did: String, 1376 + collection: String, 1377 + first: Option(Int), 1378 + after: Option(String), 1379 + last: Option(Int), 1380 + before: Option(String), 1381 + sort_by: Option(List(#(String, String))), 1382 + where_clause: Option(where_clause.WhereClause), 1383 + ) -> Result( 1384 + #(List(Record), Option(String), Bool, Bool, Option(Int)), 1385 + sqlight.Error, 1386 + ) { 1387 + // Validate pagination arguments 1388 + let #(limit, is_forward, cursor_opt) = case first, last { 1389 + Some(f), None -> #(f, True, after) 1390 + None, Some(l) -> #(l, False, before) 1391 + Some(f), Some(_) -> #(f, True, after) 1392 + None, None -> #(50, True, None) 1393 + } 1394 + 1395 + // Default sort order if not specified 1396 + let sort_fields = case sort_by { 1397 + Some(fields) -> fields 1398 + None -> [#("indexed_at", "desc")] 1399 + } 1400 + 1401 + // For 
backward pagination (last/before), reverse the sort order 1402 + let query_sort_fields = case is_forward { 1403 + True -> sort_fields 1404 + False -> reverse_sort_fields(sort_fields) 1405 + } 1406 + 1407 + // Build the ORDER BY clause 1408 + let order_by_clause = build_order_by(query_sort_fields, False) 1409 + 1410 + // Build WHERE clause parts for DID and collection matching 1411 + let base_where_parts = ["did = ?", "collection = ?"] 1412 + let base_bind_values = [sqlight.text(did), sqlight.text(collection)] 1413 + 1414 + // Add where clause conditions if present 1415 + let #(with_where_parts, with_where_values) = case where_clause { 1416 + Some(clause) -> { 1417 + let #(where_sql, where_params) = 1418 + where_clause.build_where_sql(clause, False) 1419 + case where_sql { 1420 + "" -> #(base_where_parts, base_bind_values) 1421 + _ -> #( 1422 + list.append(base_where_parts, [where_sql]), 1423 + list.append(base_bind_values, where_params), 1424 + ) 1425 + } 1426 + } 1427 + None -> #(base_where_parts, base_bind_values) 1428 + } 1429 + 1430 + // Add cursor condition if present 1431 + let #(final_where_parts, final_bind_values) = case cursor_opt { 1432 + Some(cursor_str) -> { 1433 + case cursor.decode_cursor(cursor_str, sort_by) { 1434 + Ok(decoded_cursor) -> { 1435 + let #(cursor_where, cursor_params) = 1436 + cursor.build_cursor_where_clause( 1437 + decoded_cursor, 1438 + sort_by, 1439 + !is_forward, 1440 + ) 1441 + 1442 + let new_where = list.append(with_where_parts, [cursor_where]) 1443 + let new_binds = 1444 + list.append(with_where_values, list.map(cursor_params, sqlight.text)) 1445 + #(new_where, new_binds) 1446 + } 1447 + Error(_) -> #(with_where_parts, with_where_values) 1448 + } 1449 + } 1450 + None -> #(with_where_parts, with_where_values) 1451 + } 1452 + 1453 + // Fetch limit + 1 to detect if there are more pages 1454 + let fetch_limit = limit + 1 1455 + 1456 + // Build the SQL query 1457 + let sql = 1458 + " 1459 + SELECT uri, cid, did, collection, json, 
indexed_at 1460 + FROM record 1461 + WHERE " 1462 + <> string.join(final_where_parts, " AND ") 1463 + <> " 1464 + ORDER BY " 1465 + <> order_by_clause 1466 + <> " 1467 + LIMIT " 1468 + <> int.to_string(fetch_limit) 1469 + 1470 + // Execute query 1471 + let decoder = { 1472 + use uri <- decode.field(0, decode.string) 1473 + use cid <- decode.field(1, decode.string) 1474 + use did <- decode.field(2, decode.string) 1475 + use collection <- decode.field(3, decode.string) 1476 + use json <- decode.field(4, decode.string) 1477 + use indexed_at <- decode.field(5, decode.string) 1478 + decode.success(Record(uri:, cid:, did:, collection:, json:, indexed_at:)) 1479 + } 1480 + 1481 + use records <- result.try(sqlight.query( 1482 + sql, 1483 + on: conn, 1484 + with: final_bind_values, 1485 + expecting: decoder, 1486 + )) 1487 + 1488 + // Check if there are more results 1489 + let has_more = list.length(records) > limit 1490 + let trimmed_records = case has_more { 1491 + True -> list.take(records, limit) 1492 + False -> records 1493 + } 1494 + 1495 + // For backward pagination, reverse the results to restore original order 1496 + let final_records = case is_forward { 1497 + True -> trimmed_records 1498 + False -> list.reverse(trimmed_records) 1499 + } 1500 + 1501 + // Calculate hasNextPage and hasPreviousPage 1502 + let has_next_page = case is_forward { 1503 + True -> has_more 1504 + False -> option.is_some(cursor_opt) 1505 + } 1506 + 1507 + let has_previous_page = case is_forward { 1508 + True -> option.is_some(cursor_opt) 1509 + False -> has_more 1510 + } 1511 + 1512 + // Generate next cursor if there are more results 1513 + let next_cursor = case has_more, list.last(final_records) { 1514 + True, Ok(last_record) -> { 1515 + let record_like = record_to_record_like(last_record) 1516 + Some(cursor.generate_cursor_from_record(record_like, sort_by)) 1517 + } 1518 + _, _ -> None 1519 + } 1520 + 1521 + // Get total count using the WHERE clause (with where conditions, but without 
cursor conditions) 1522 + let count_sql = 1523 + "SELECT COUNT(*) FROM record WHERE " <> string.join(with_where_parts, " AND ") 1524 + 1525 + let count_decoder = { 1526 + use count <- decode.field(0, decode.int) 1527 + decode.success(count) 1528 + } 1529 + 1530 + let total_count = case 1531 + sqlight.query( 1532 + count_sql, 1533 + on: conn, 1534 + with: with_where_values, 1535 + expecting: count_decoder, 1536 + ) 1537 + { 1538 + Ok([count]) -> Some(count) 1539 + _ -> None 1540 + } 1541 + 1542 + Ok(#(final_records, next_cursor, has_next_page, has_previous_page, total_count)) 1543 + }
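The paginated fetchers in this file all share the same over-fetch pattern: query with `LIMIT limit + 1` (in reversed sort order for backward pagination), trim to `limit`, and use the presence of the extra row to derive `hasNextPage`/`hasPreviousPage`. A minimal Python sketch of that window logic, under the same conventions as the Gleam code above (the name `paginate_window` is illustrative, not from this PR):

```python
# Sketch of the "limit + 1" window logic used by the paginated fetchers.
# `rows` were fetched with LIMIT limit + 1, in reversed sort order when
# paginating backward, so a full window signals that another page exists.

def paginate_window(rows, limit, is_forward, has_cursor):
    """Trim an over-fetched result set and derive Relay-style page info."""
    has_more = len(rows) > limit
    page = rows[:limit]
    if not is_forward:
        # Backward pagination queried in reversed order; restore it.
        page = list(reversed(page))
    has_next_page = has_more if is_forward else has_cursor
    has_previous_page = has_cursor if is_forward else has_more
    return page, has_next_page, has_previous_page
```

Note that, as in the Gleam code, a cursor's mere presence is what sets the "other direction" flag: paginating forward with an `after` cursor implies a previous page exists.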
+227 -2
server/src/graphql_gleam.gleam
··· 15 15 import graphql/executor 16 16 import graphql/schema 17 17 import graphql/value 18 + import lexicon_graphql/dataloader 18 19 import lexicon_graphql/db_schema_builder 19 20 import lexicon_graphql/lexicon_parser 20 21 import mutation_resolvers ··· 41 42 // Step 2: Parse lexicon JSON into structured Lexicon types 42 43 let parsed_lexicons = 43 44 lexicon_records 44 - |> list.filter_map(fn(lex) { lexicon_parser.parse_lexicon(lex.json) }) 45 + |> list.filter_map(fn(lex) { 46 + case lexicon_parser.parse_lexicon(lex.json) { 47 + Ok(parsed) -> Ok(parsed) 48 + Error(_) -> Error(Nil) 49 + } 50 + }) 45 51 46 52 // Check if we got any valid lexicons 47 53 case parsed_lexicons { ··· 50 56 // Step 3: Create a record fetcher function that queries the database with pagination 51 57 let record_fetcher = fn( 52 58 collection_nsid: String, 53 - pagination_params: db_schema_builder.PaginationParams, 59 + pagination_params: dataloader.PaginationParams, 54 60 ) -> Result( 55 61 #( 56 62 List(#(value.Value, String)), ··· 117 123 } 118 124 } 119 125 126 + // Step 3.5: Create a batch fetcher function for join operations 127 + let batch_fetcher = fn( 128 + uris: List(String), 129 + collection: String, 130 + field: option.Option(String), 131 + ) -> Result(dataloader.BatchResult, String) { 132 + // Check if this is a forward join (field is None) or reverse join (field is Some) 133 + case field { 134 + option.None -> { 135 + // Determine if we're dealing with DIDs or URIs 136 + case uris { 137 + [] -> Ok(dict.new()) 138 + [first, ..] 
-> { 139 + case string.starts_with(first, "did:") { 140 + True -> { 141 + // DID join: fetch records by DID and collection 142 + case database.get_records_by_dids_and_collection(db, uris, collection) { 143 + Ok(records) -> { 144 + // Group records by DID 145 + let grouped = 146 + list.fold(records, dict.new(), fn(acc, record) { 147 + let graphql_value = record_to_graphql_value(record, db) 148 + let existing = dict.get(acc, record.did) |> result.unwrap([]) 149 + dict.insert(acc, record.did, [graphql_value, ..existing]) 150 + }) 151 + Ok(grouped) 152 + } 153 + Error(_) -> Error("Failed to fetch records by DIDs") 154 + } 155 + } 156 + False -> { 157 + // Forward join: fetch records by their URIs 158 + case database.get_records_by_uris(db, uris) { 159 + Ok(records) -> { 160 + // Group records by URI 161 + let grouped = 162 + list.fold(records, dict.new(), fn(acc, record) { 163 + let graphql_value = record_to_graphql_value(record, db) 164 + // For forward joins, return single record per URI 165 + dict.insert(acc, record.uri, [graphql_value]) 166 + }) 167 + Ok(grouped) 168 + } 169 + Error(_) -> Error("Failed to fetch records by URIs") 170 + } 171 + } 172 + } 173 + } 174 + } 175 + } 176 + option.Some(reference_field) -> { 177 + // Reverse join: fetch records that reference the parent URIs 178 + case 179 + database.get_records_by_reference_field( 180 + db, 181 + collection, 182 + reference_field, 183 + uris, 184 + ) 185 + { 186 + Ok(records) -> { 187 + // Group records by the parent URI they reference 188 + // Parse each record's JSON to extract the reference field value 189 + let grouped = 190 + list.fold(records, dict.new(), fn(acc, record) { 191 + let graphql_value = record_to_graphql_value(record, db) 192 + // Extract the reference field from the record JSON to find parent URI 193 + case extract_reference_uri(record.json, reference_field) { 194 + Ok(parent_uri) -> { 195 + let existing = 196 + dict.get(acc, parent_uri) |> result.unwrap([]) 197 + dict.insert(acc, 
parent_uri, [graphql_value, ..existing]) 198 + } 199 + Error(_) -> acc 200 + } 201 + }) 202 + Ok(grouped) 203 + } 204 + Error(_) -> 205 + Error( 206 + "Failed to fetch records by reference field: " <> reference_field, 207 + ) 208 + } 209 + } 210 + } 211 + } 212 + 213 + // Step 3.6: Create a paginated batch fetcher function for join operations with pagination 214 + let paginated_batch_fetcher = fn( 215 + key: String, 216 + collection: String, 217 + field: option.Option(String), 218 + pagination_params: dataloader.PaginationParams, 219 + ) -> Result(dataloader.PaginatedBatchResult, String) { 220 + // Convert pagination params to database pagination params 221 + let db_first = pagination_params.first 222 + let db_after = pagination_params.after 223 + let db_last = pagination_params.last 224 + let db_before = pagination_params.before 225 + let db_sort_by = pagination_params.sort_by 226 + 227 + // Convert where clause from GraphQL to database format 228 + let db_where = case pagination_params.where { 229 + option.Some(where_clause) -> 230 + option.Some(where_converter.convert_where_clause(where_clause)) 231 + option.None -> option.None 232 + } 233 + 234 + // Check if this is a DID join (field is None) or reverse join (field is Some) 235 + case field { 236 + option.None -> { 237 + // DID join: key is the DID 238 + case 239 + database.get_records_by_dids_and_collection_paginated( 240 + db, 241 + key, 242 + collection, 243 + db_first, 244 + db_after, 245 + db_last, 246 + db_before, 247 + db_sort_by, 248 + db_where, 249 + ) 250 + { 251 + Ok(#(records, _next_cursor, has_next_page, has_previous_page, total_count)) -> { 252 + // Convert records to GraphQL values with cursors 253 + let edges = 254 + list.map(records, fn(record) { 255 + let graphql_value = record_to_graphql_value(record, db) 256 + let cursor = 257 + cursor.generate_cursor_from_record( 258 + database.record_to_record_like(record), 259 + db_sort_by, 260 + ) 261 + #(graphql_value, cursor) 262 + }) 263 + 264 + 
Ok(dataloader.PaginatedBatchResult( 265 + edges: edges, 266 + has_next_page: has_next_page, 267 + has_previous_page: has_previous_page, 268 + total_count: total_count, 269 + )) 270 + } 271 + Error(_) -> Error("Failed to fetch paginated records by DID") 272 + } 273 + } 274 + option.Some(reference_field) -> { 275 + // Reverse join: key is the parent URI 276 + case 277 + database.get_records_by_reference_field_paginated( 278 + db, 279 + collection, 280 + reference_field, 281 + key, 282 + db_first, 283 + db_after, 284 + db_last, 285 + db_before, 286 + db_sort_by, 287 + db_where, 288 + ) 289 + { 290 + Ok(#(records, _next_cursor, has_next_page, has_previous_page, total_count)) -> { 291 + // Convert records to GraphQL values with cursors 292 + let edges = 293 + list.map(records, fn(record) { 294 + let graphql_value = record_to_graphql_value(record, db) 295 + let cursor = 296 + cursor.generate_cursor_from_record( 297 + database.record_to_record_like(record), 298 + db_sort_by, 299 + ) 300 + #(graphql_value, cursor) 301 + }) 302 + 303 + Ok(dataloader.PaginatedBatchResult( 304 + edges: edges, 305 + has_next_page: has_next_page, 306 + has_previous_page: has_previous_page, 307 + total_count: total_count, 308 + )) 309 + } 310 + Error(_) -> 311 + Error( 312 + "Failed to fetch paginated records by reference field: " 313 + <> reference_field, 314 + ) 315 + } 316 + } 317 + } 318 + } 319 + 120 320 // Step 4: Create mutation resolver factories 121 321 let mutation_ctx = 122 322 mutation_resolvers.MutationContext(db: db, auth_base_url: auth_base_url) ··· 146 346 db_schema_builder.build_schema_with_fetcher( 147 347 parsed_lexicons, 148 348 record_fetcher, 349 + option.Some(batch_fetcher), 350 + option.Some(paginated_batch_fetcher), 149 351 create_factory, 150 352 update_factory, 151 353 delete_factory, ··· 372 574 } 373 575 } 374 576 } 577 + 578 + /// Extract a reference URI from a record's JSON 579 + /// This handles both simple string fields (at-uri) and strongRef objects 580 + fn 
extract_reference_uri(json_str: String, field_name: String) -> Result(String, Nil) { 581 + // Parse the JSON 582 + case parse_json_to_value(json_str) { 583 + Ok(value.Object(fields)) -> { 584 + // Find the field 585 + case list.key_find(fields, field_name) { 586 + Ok(value.String(uri)) -> Ok(uri) 587 + Ok(value.Object(ref_fields)) -> { 588 + // Handle strongRef: { "uri": "...", "cid": "..." } 589 + case list.key_find(ref_fields, "uri") { 590 + Ok(value.String(uri)) -> Ok(uri) 591 + _ -> Error(Nil) 592 + } 593 + } 594 + _ -> Error(Nil) 595 + } 596 + } 597 + _ -> Error(Nil) 598 + } 599 + }
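The reverse-join branch of `batch_fetcher` fetches all referencing records in one query and then buckets them by the parent URI extracted from each record's JSON, mirroring `extract_reference_uri`'s handling of both plain at-uri strings and strongRef objects. A rough Python equivalent of that grouping step (names and dict-based record shape are illustrative):

```python
# Sketch of the reverse-join grouping done by batch_fetcher: records are
# fetched in one query, then bucketed by the parent URI each one references.
# The reference field may be a plain at-uri string or a strongRef object.

def extract_reference_uri(record_json: dict, field_name: str):
    """Return the referenced at-uri, handling both string and strongRef shapes."""
    value = record_json.get(field_name)
    if isinstance(value, str):
        return value
    if isinstance(value, dict):  # strongRef: { "uri": ..., "cid": ... }
        uri = value.get("uri")
        if isinstance(uri, str):
            return uri
    return None  # unrecognized shape: skip, as the Gleam code does

def group_by_parent(records, field_name):
    """Bucket records by the parent URI they reference (DataLoader batch shape)."""
    grouped = {}
    for record in records:
        parent = extract_reference_uri(record["json"], field_name)
        if parent is not None:
            grouped.setdefault(parent, []).append(record)
    return grouped
```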
-4
server/src/graphql_handler.gleam
···
47 47           Ok(body) -> {
48 48             case bit_array.to_string(body) {
49 49               Ok(body_string) -> {
50    -               // Log the query for debugging (truncate if too long)
51    -               // io.println("Request body length: " <> string.inspect(string.length(body_string)))
52    -
53 50               // Parse JSON to extract query and variables
54 51               case extract_request_from_json(body_string) {
55 52                 Ok(#(query, variables)) -> {
56    -                 // io.println("Query: " <> query)
57 53                   execute_graphql_query(db, query, variables, auth_token, auth_base_url)
58 54                 }
59 55                 Error(err) -> bad_request_response("Invalid JSON: " <> err)
+73
server/test/fixtures/grain/lexicons/app/bsky/richtext/factet.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "app.bsky.richtext.facet", 4 + "defs": { 5 + "tag": { 6 + "type": "object", 7 + "required": ["tag"], 8 + "properties": { 9 + "tag": { 10 + "type": "string", 11 + "maxLength": 640, 12 + "maxGraphemes": 64 13 + } 14 + }, 15 + "description": "Facet feature for a hashtag. The text usually includes a '#' prefix, but the facet reference should not (except in the case of 'double hash tags')." 16 + }, 17 + "link": { 18 + "type": "object", 19 + "required": ["uri"], 20 + "properties": { 21 + "uri": { 22 + "type": "string", 23 + "format": "uri" 24 + } 25 + }, 26 + "description": "Facet feature for a URL. The text URL may have been simplified or truncated, but the facet reference should be a complete URL." 27 + }, 28 + "main": { 29 + "type": "object", 30 + "required": ["index", "features"], 31 + "properties": { 32 + "index": { 33 + "ref": "#byteSlice", 34 + "type": "ref" 35 + }, 36 + "features": { 37 + "type": "array", 38 + "items": { 39 + "refs": ["#mention", "#link", "#tag"], 40 + "type": "union" 41 + } 42 + } 43 + }, 44 + "description": "Annotation of a sub-string within rich text." 45 + }, 46 + "mention": { 47 + "type": "object", 48 + "required": ["did"], 49 + "properties": { 50 + "did": { 51 + "type": "string", 52 + "format": "did" 53 + } 54 + }, 55 + "description": "Facet feature for mention of another account. The text is usually a handle, including a '@' prefix, but the facet reference is a DID." 56 + }, 57 + "byteSlice": { 58 + "type": "object", 59 + "required": ["byteStart", "byteEnd"], 60 + "properties": { 61 + "byteEnd": { 62 + "type": "integer", 63 + "minimum": 0 64 + }, 65 + "byteStart": { 66 + "type": "integer", 67 + "minimum": 0 68 + } 69 + }, 70 + "description": "Specifies the sub-string range a facet feature applies to. Start index is inclusive, end index is exclusive. Indices are zero-indexed, counting bytes of the UTF-8 encoded text. 
NOTE: some languages, like Javascript, use UTF-16 or Unicode codepoints for string slice indexing; in these languages, convert to byte arrays before working with facets." 71 + } 72 + } 73 + }
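The `byteSlice` caveat in that description is easy to get wrong in practice: facet indices count UTF-8 bytes, not characters or UTF-16 code units. A small Python sketch of slicing facet text correctly (the function name is illustrative):

```python
# Facet byteSlice indices are zero-indexed UTF-8 byte offsets:
# start inclusive, end exclusive. Slice the encoded bytes, not the string.

def facet_text(text: str, byte_start: int, byte_end: int) -> str:
    """Extract the sub-string a facet's byteSlice covers."""
    return text.encode("utf-8")[byte_start:byte_end].decode("utf-8")
```

With a multi-byte character before the facet ("é" is two bytes), character indices and byte indices diverge, which is exactly the case the lexicon note warns about.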
+77
server/test/fixtures/grain/lexicons/social/grain/actor/defs.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.actor.defs", 4 + "defs": { 5 + "profileView": { 6 + "type": "object", 7 + "required": ["cid", "did", "handle"], 8 + "properties": { 9 + "cid": { "type": "string", "format": "cid" }, 10 + "did": { "type": "string", "format": "did" }, 11 + "handle": { "type": "string", "format": "handle" }, 12 + "displayName": { 13 + "type": "string", 14 + "maxGraphemes": 64, 15 + "maxLength": 640 16 + }, 17 + "description": { 18 + "type": "string", 19 + "maxLength": 2560, 20 + "maxGraphemes": 256 21 + }, 22 + "labels": { 23 + "type": "array", 24 + "items": { 25 + "ref": "com.atproto.label.defs#label", 26 + "type": "ref" 27 + } 28 + }, 29 + "avatar": { "type": "string", "format": "uri" }, 30 + "createdAt": { "type": "string", "format": "datetime" } 31 + } 32 + }, 33 + "profileViewDetailed": { 34 + "type": "object", 35 + "required": ["cid", "did", "handle"], 36 + "properties": { 37 + "cid": { "type": "string", "format": "cid" }, 38 + "did": { "type": "string", "format": "did" }, 39 + "handle": { "type": "string", "format": "handle" }, 40 + "displayName": { 41 + "type": "string", 42 + "maxGraphemes": 64, 43 + "maxLength": 640 44 + }, 45 + "description": { 46 + "type": "string", 47 + "maxGraphemes": 256, 48 + "maxLength": 2560 49 + }, 50 + "avatar": { "type": "string", "format": "uri" }, 51 + "cameras": { 52 + "type": "array", 53 + "items": { "type": "string" }, 54 + "description": "List of camera make and models used by this actor derived from EXIF data of photos linked to galleries." 
55 + }, 56 + "followersCount": { "type": "integer" }, 57 + "followsCount": { "type": "integer" }, 58 + "galleryCount": { "type": "integer" }, 59 + "indexedAt": { "type": "string", "format": "datetime" }, 60 + "createdAt": { "type": "string", "format": "datetime" }, 61 + "viewer": { "type": "ref", "ref": "#viewerState" }, 62 + "labels": { 63 + "type": "array", 64 + "items": { "type": "ref", "ref": "com.atproto.label.defs#label" } 65 + } 66 + } 67 + }, 68 + "viewerState": { 69 + "type": "object", 70 + "description": "Metadata about the requesting account's relationship with the subject account. Only has meaningful content for authed requests.", 71 + "properties": { 72 + "following": { "type": "string", "format": "at-uri" }, 73 + "followedBy": { "type": "string", "format": "at-uri" } 74 + } 75 + } 76 + } 77 + }
+34
server/test/fixtures/grain/lexicons/social/grain/actor/profile.json
···
 1 + {
 2 +   "lexicon": 1,
 3 +   "id": "social.grain.actor.profile",
 4 +   "defs": {
 5 +     "main": {
 6 +       "type": "record",
 7 +       "description": "A declaration of a basic account profile.",
 8 +       "key": "literal:self",
 9 +       "record": {
10 +         "type": "object",
11 +         "properties": {
12 +           "displayName": {
13 +             "type": "string",
14 +             "maxGraphemes": 64,
15 +             "maxLength": 640
16 +           },
17 +           "description": {
18 +             "type": "string",
19 +             "description": "Free-form profile description text.",
20 +             "maxGraphemes": 256,
21 +             "maxLength": 2560
22 +           },
23 +           "avatar": {
24 +             "type": "blob",
25 +             "description": "Small image to be displayed next to posts from account. AKA, 'profile picture'",
26 +             "accept": ["image/png", "image/jpeg"],
27 +             "maxSize": 1000000
28 +           },
29 +           "createdAt": { "type": "string", "format": "datetime" }
30 +         }
31 +       }
32 +     }
33 +   }
34 + }
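The `avatar` blob in this fixture constrains both MIME type (`accept`) and size (`maxSize`). As a usage sketch, a hypothetical client-side pre-check built from nothing beyond the constraints in this lexicon (function name illustrative):

```python
# Hypothetical pre-upload check against social.grain.actor.profile's
# avatar constraints: accept = ["image/png", "image/jpeg"], maxSize = 1000000.

def validate_avatar(mime: str, size: int) -> bool:
    """Return True if a blob satisfies the avatar field's constraints."""
    return mime in ("image/png", "image/jpeg") and size <= 1_000_000
```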
+42
server/test/fixtures/grain/lexicons/social/grain/comment/comment.json
···
 1 + {
 2 +   "lexicon": 1,
 3 +   "id": "social.grain.comment",
 4 +   "defs": {
 5 +     "main": {
 6 +       "type": "record",
 7 +       "key": "tid",
 8 +       "record": {
 9 +         "type": "object",
10 +         "required": ["text", "subject", "createdAt"],
11 +         "properties": {
12 +           "text": {
13 +             "type": "string",
14 +             "maxLength": 3000,
15 +             "maxGraphemes": 300
16 +           },
17 +           "facets": {
18 +             "type": "array",
19 +             "description": "Annotations of description text (mentions and URLs, hashtags, etc)",
20 +             "items": { "type": "ref", "ref": "app.bsky.richtext.facet" }
21 +           },
22 +           "subject": {
23 +             "type": "string",
24 +             "format": "at-uri"
25 +           },
26 +           "focus": {
27 +             "type": "string",
28 +             "format": "at-uri"
29 +           },
30 +           "replyTo": {
31 +             "type": "string",
32 +             "format": "at-uri"
33 +           },
34 +           "createdAt": {
35 +             "type": "string",
36 +             "format": "datetime"
37 +           }
38 +         }
39 +       }
40 +     }
41 +   }
42 + }
+52
server/test/fixtures/grain/lexicons/social/grain/comment/defs.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.comment.defs", 4 + "defs": { 5 + "commentView": { 6 + "type": "object", 7 + "required": ["uri", "cid", "author", "text", "createdAt"], 8 + "properties": { 9 + "uri": { "type": "string", "format": "at-uri" }, 10 + "cid": { "type": "string", "format": "cid" }, 11 + "author": { 12 + "type": "ref", 13 + "ref": "social.grain.actor.defs#profileView" 14 + }, 15 + "record": { "type": "unknown" }, 16 + "text": { 17 + "type": "string", 18 + "maxLength": 3000, 19 + "maxGraphemes": 300 20 + }, 21 + "facets": { 22 + "type": "array", 23 + "description": "Annotations of description text (mentions and URLs, hashtags, etc)", 24 + "items": { "type": "ref", "ref": "app.bsky.richtext.facet" } 25 + }, 26 + "subject": { 27 + "type": "union", 28 + "refs": [ 29 + "social.grain.gallery.defs#galleryView" 30 + ], 31 + "description": "The subject of the comment, which can be a gallery or a photo." 32 + }, 33 + "focus": { 34 + "type": "union", 35 + "refs": [ 36 + "social.grain.photo.defs#photoView" 37 + ], 38 + "description": "The photo that the comment is focused on, if applicable." 39 + }, 40 + "replyTo": { 41 + "type": "string", 42 + "format": "at-uri", 43 + "description": "The URI of the comment this comment is replying to, if applicable." 44 + }, 45 + "createdAt": { 46 + "type": "string", 47 + "format": "datetime" 48 + } 49 + } 50 + } 51 + } 52 + }
+15
server/test/fixtures/grain/lexicons/social/grain/defs.json
···
 1 + {
 2 +   "lexicon": 1,
 3 +   "id": "social.grain.defs",
 4 +   "defs": {
 5 +     "aspectRatio": {
 6 +       "type": "object",
 7 +       "description": "width:height represents an aspect ratio. It may be approximate, and may not correspond to absolute dimensions in any given unit.",
 8 +       "required": ["width", "height"],
 9 +       "properties": {
10 +         "width": { "type": "integer", "minimum": 1 },
11 +         "height": { "type": "integer", "minimum": 1 }
12 +       }
13 +     }
14 +   }
15 + }
+24
server/test/fixtures/grain/lexicons/social/grain/favorite/favorite.json
···
 1 + {
 2 +   "lexicon": 1,
 3 +   "id": "social.grain.favorite",
 4 +   "defs": {
 5 +     "main": {
 6 +       "type": "record",
 7 +       "key": "tid",
 8 +       "record": {
 9 +         "type": "object",
10 +         "required": ["createdAt", "subject"],
11 +         "properties": {
12 +           "createdAt": {
13 +             "type": "string",
14 +             "format": "datetime"
15 +           },
16 +           "subject": {
17 +             "type": "string",
18 +             "format": "at-uri"
19 +           }
20 +         }
21 +       }
22 +     }
23 +   }
24 + }
+56
server/test/fixtures/grain/lexicons/social/grain/gallery/defs.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.gallery.defs", 4 + "defs": { 5 + "galleryView": { 6 + "type": "object", 7 + "required": ["uri", "cid", "creator", "record", "indexedAt"], 8 + "properties": { 9 + "uri": { "type": "string", "format": "at-uri" }, 10 + "cid": { "type": "string", "format": "cid" }, 11 + "title": { "type": "string" }, 12 + "description": { "type": "string" }, 13 + "cameras": { 14 + "type": "array", 15 + "description": "List of camera make and models used in this gallery derived from EXIF data.", 16 + "items": { "type": "string" } 17 + }, 18 + "facets": { 19 + "type": "array", 20 + "description": "Annotations of description text (mentions, URLs, hashtags, etc)", 21 + "items": { "type": "ref", "ref": "app.bsky.richtext.facet" } 22 + }, 23 + "creator": { 24 + "type": "ref", 25 + "ref": "social.grain.actor.defs#profileView" 26 + }, 27 + "record": { "type": "unknown" }, 28 + "items": { 29 + "type": "array", 30 + "items": { 31 + "type": "union", 32 + "refs": [ 33 + "social.grain.photo.defs#photoView" 34 + ] 35 + } 36 + }, 37 + "favCount": { "type": "integer" }, 38 + "commentCount": { "type": "integer" }, 39 + "labels": { 40 + "type": "array", 41 + "items": { "type": "ref", "ref": "com.atproto.label.defs#label" } 42 + }, 43 + "createdAt": { "type": "string", "format": "datetime" }, 44 + "indexedAt": { "type": "string", "format": "datetime" }, 45 + "viewer": { "type": "ref", "ref": "#viewerState" } 46 + } 47 + }, 48 + "viewerState": { 49 + "type": "object", 50 + "description": "Metadata about the requesting account's relationship with the subject content. Only has meaningful content for authed requests.", 51 + "properties": { 52 + "fav": { "type": "string", "format": "at-uri" } 53 + } 54 + } 55 + } 56 + }
+30
server/test/fixtures/grain/lexicons/social/grain/gallery/gallery.json
···
 1 + {
 2 +   "lexicon": 1,
 3 +   "id": "social.grain.gallery",
 4 +   "defs": {
 5 +     "main": {
 6 +       "type": "record",
 7 +       "key": "tid",
 8 +       "record": {
 9 +         "type": "object",
10 +         "required": ["title", "createdAt"],
11 +         "properties": {
12 +           "title": { "type": "string", "maxLength": 100 },
13 +           "description": { "type": "string", "maxLength": 1000 },
14 +           "facets": {
15 +             "type": "array",
16 +             "description": "Annotations of description text (mentions, URLs, hashtags, etc)",
17 +             "items": { "type": "ref", "ref": "app.bsky.richtext.facet" }
18 +           },
19 +           "labels": {
20 +             "type": "union",
21 +             "description": "Self-label values for this post. Effectively content warnings.",
22 +             "refs": ["com.atproto.label.defs#selfLabels"]
23 +           },
24 +           "updatedAt": { "type": "string", "format": "datetime" },
25 +           "createdAt": { "type": "string", "format": "datetime" }
26 +         }
27 +       }
28 +     }
29 +   }
30 + }
+32
server/test/fixtures/grain/lexicons/social/grain/gallery/item.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.gallery.item", 4 + "defs": { 5 + "main": { 6 + "type": "record", 7 + "key": "tid", 8 + "record": { 9 + "type": "object", 10 + "required": ["createdAt", "gallery", "item"], 11 + "properties": { 12 + "createdAt": { 13 + "type": "string", 14 + "format": "datetime" 15 + }, 16 + "gallery": { 17 + "type": "string", 18 + "format": "at-uri" 19 + }, 20 + "item": { 21 + "type": "string", 22 + "format": "at-uri" 23 + }, 24 + "position": { 25 + "type": "integer", 26 + "default": 0 27 + } 28 + } 29 + } 30 + } 31 + } 32 + }
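Because `gallery` and `item` in this fixture are `at-uri` strings, the generated schema should expose a forward join for each, following the documented `{fieldName}Resolved` naming. A hedged sketch (the query field name `socialGrainGalleryItem` and the inline-fragment target types are assumptions based on the default generation rules, not confirmed output):

```graphql
query {
  socialGrainGalleryItem(first: 10) {
    edges {
      node {
        position
        # Forward join on the `gallery` at-uri field
        galleryResolved {
          ... on SocialGrainGallery {
            title
          }
        }
        # Forward join on the `item` at-uri field
        itemResolved {
          ... on SocialGrainPhoto {
            alt
          }
        }
      }
    }
  }
}
```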
+27
server/test/fixtures/grain/lexicons/social/grain/graph/follow.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.graph.follow", 4 + "defs": { 5 + "main": { 6 + "key": "tid", 7 + "type": "record", 8 + "record": { 9 + "type": "object", 10 + "required": [ 11 + "subject", 12 + "createdAt" 13 + ], 14 + "properties": { 15 + "subject": { 16 + "type": "string", 17 + "format": "did" 18 + }, 19 + "createdAt": { 20 + "type": "string", 21 + "format": "datetime" 22 + } 23 + } 24 + } 25 + } 26 + } 27 + }
+69
server/test/fixtures/grain/lexicons/social/grain/photo/defs.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.photo.defs", 4 + "defs": { 5 + "photoView": { 6 + "type": "object", 7 + "required": ["uri", "cid", "thumb", "fullsize", "aspectRatio"], 8 + "properties": { 9 + "uri": { "type": "string", "format": "at-uri" }, 10 + "cid": { "type": "string", "format": "cid" }, 11 + "thumb": { 12 + "type": "string", 13 + "format": "uri", 14 + "description": "Fully-qualified URL where a thumbnail of the image can be fetched. For example, CDN location provided by the App View." 15 + }, 16 + "fullsize": { 17 + "type": "string", 18 + "format": "uri", 19 + "description": "Fully-qualified URL where a large version of the image can be fetched. May or may not be the exact original blob. For example, CDN location provided by the App View." 20 + }, 21 + "alt": { 22 + "type": "string", 23 + "description": "Alt text description of the image, for accessibility." 24 + }, 25 + "aspectRatio": { 26 + "type": "ref", 27 + "ref": "social.grain.defs#aspectRatio" 28 + }, 29 + "exif": { 30 + "type": "ref", 31 + "ref": "social.grain.photo.defs#exifView", 32 + "description": "EXIF metadata for the photo, if available." 
33 + }, 34 + "gallery": { "type": "ref", "ref": "#galleryState" } 35 + } 36 + }, 37 + "exifView": { 38 + "type": "object", 39 + "required": ["uri", "cid", "photo", "record", "createdAt"], 40 + "properties": { 41 + "uri": { "type": "string", "format": "at-uri" }, 42 + "cid": { "type": "string", "format": "cid" }, 43 + "photo": { "type": "string", "format": "at-uri" }, 44 + "record": { "type": "unknown" }, 45 + "createdAt": { "type": "string", "format": "datetime" }, 46 + "dateTimeOriginal": { "type": "string" }, 47 + "exposureTime": { "type": "string" }, 48 + "fNumber": { "type": "string" }, 49 + "flash": { "type": "string" }, 50 + "focalLengthIn35mmFormat": { "type": "string" }, 51 + "iSO": { "type": "integer" }, 52 + "lensMake": { "type": "string" }, 53 + "lensModel": { "type": "string" }, 54 + "make": { "type": "string" }, 55 + "model": { "type": "string" } 56 + } 57 + }, 58 + "galleryState": { 59 + "type": "object", 60 + "required": ["item", "itemCreatedAt", "itemPosition"], 61 + "description": "Metadata about the photo's relationship with the subject content. Only has meaningful content when photo is attached to a gallery.", 62 + "properties": { 63 + "item": { "type": "string", "format": "at-uri" }, 64 + "itemCreatedAt": { "type": "string", "format": "datetime" }, 65 + "itemPosition": { "type": "integer" } 66 + } 67 + } 68 + } 69 + }
+32
server/test/fixtures/grain/lexicons/social/grain/photo/exif.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.photo.exif", 4 + "defs": { 5 + "main": { 6 + "type": "record", 7 + "description": "Basic EXIF metadata for a photo. Integers are scaled by 1000000 to accommodate decimal values and potentially other tags in the future.", 8 + "key": "tid", 9 + "record": { 10 + "type": "object", 11 + "required": [ 12 + "photo", 13 + "createdAt" 14 + ], 15 + "properties": { 16 + "photo": { "type": "string", "format": "at-uri" }, 17 + "createdAt": { "type": "string", "format": "datetime" }, 18 + "dateTimeOriginal": { "type": "string", "format": "datetime" }, 19 + "exposureTime": { "type": "integer" }, 20 + "fNumber": { "type": "integer" }, 21 + "flash": { "type": "string" }, 22 + "focalLengthIn35mmFormat": { "type": "integer" }, 23 + "iSO": { "type": "integer" }, 24 + "lensMake": { "type": "string" }, 25 + "lensModel": { "type": "string" }, 26 + "make": { "type": "string" }, 27 + "model": { "type": "string" } 28 + } 29 + } 30 + } 31 + } 32 + }
+30
server/test/fixtures/grain/lexicons/social/grain/photo/photo.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "social.grain.photo", 4 + "defs": { 5 + "main": { 6 + "type": "record", 7 + "key": "tid", 8 + "record": { 9 + "type": "object", 10 + "required": ["photo", "aspectRatio", "createdAt"], 11 + "properties": { 12 + "photo": { 13 + "type": "blob", 14 + "accept": ["image/*"], 15 + "maxSize": 1000000 16 + }, 17 + "alt": { 18 + "type": "string", 19 + "description": "Alt text description of the image, for accessibility." 20 + }, 21 + "aspectRatio": { 22 + "type": "ref", 23 + "ref": "social.grain.defs#aspectRatio" 24 + }, 25 + "createdAt": { "type": "string", "format": "datetime" } 26 + } 27 + } 28 + } 29 + } 30 + }
+74
server/test/fixtures/statusphere/lexicons/app/bsky/actor/profile.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "app.bsky.actor.profile", 4 + "defs": { 5 + "main": { 6 + "key": "literal:self", 7 + "type": "record", 8 + "record": { 9 + "type": "object", 10 + "properties": { 11 + "avatar": { 12 + "type": "blob", 13 + "accept": [ 14 + "image/png", 15 + "image/jpeg" 16 + ], 17 + "maxSize": 1000000, 18 + "description": "Small image to be displayed next to posts from account. AKA, 'profile picture'" 19 + }, 20 + "banner": { 21 + "type": "blob", 22 + "accept": [ 23 + "image/png", 24 + "image/jpeg" 25 + ], 26 + "maxSize": 1000000, 27 + "description": "Larger horizontal image to display behind profile view." 28 + }, 29 + "labels": { 30 + "refs": [ 31 + "com.atproto.label.defs#selfLabels" 32 + ], 33 + "type": "union", 34 + "description": "Self-label values, specific to the Bluesky application, on the overall account." 35 + }, 36 + "website": { 37 + "type": "string", 38 + "format": "uri" 39 + }, 40 + "pronouns": { 41 + "type": "string", 42 + "maxLength": 200, 43 + "description": "Free-form pronouns text.", 44 + "maxGraphemes": 20 45 + }, 46 + "createdAt": { 47 + "type": "string", 48 + "format": "datetime" 49 + }, 50 + "pinnedPost": { 51 + "ref": "com.atproto.repo.strongRef", 52 + "type": "ref" 53 + }, 54 + "description": { 55 + "type": "string", 56 + "maxLength": 2560, 57 + "description": "Free-form profile description text.", 58 + "maxGraphemes": 256 59 + }, 60 + "displayName": { 61 + "type": "string", 62 + "maxLength": 640, 63 + "maxGraphemes": 64 64 + }, 65 + "joinedViaStarterPack": { 66 + "ref": "com.atproto.repo.strongRef", 67 + "type": "ref" 68 + } 69 + } 70 + }, 71 + "description": "A declaration of a Bluesky account profile." 72 + } 73 + } 74 + }
+192
server/test/fixtures/statusphere/lexicons/com/atproto/label/defs.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "com.atproto.label.defs", 4 + "defs": { 5 + "label": { 6 + "type": "object", 7 + "required": [ 8 + "src", 9 + "uri", 10 + "val", 11 + "cts" 12 + ], 13 + "properties": { 14 + "cid": { 15 + "type": "string", 16 + "format": "cid", 17 + "description": "Optionally, CID specifying the specific version of 'uri' resource this label applies to." 18 + }, 19 + "cts": { 20 + "type": "string", 21 + "format": "datetime", 22 + "description": "Timestamp when this label was created." 23 + }, 24 + "exp": { 25 + "type": "string", 26 + "format": "datetime", 27 + "description": "Timestamp at which this label expires (no longer applies)." 28 + }, 29 + "neg": { 30 + "type": "boolean", 31 + "description": "If true, this is a negation label, overwriting a previous label." 32 + }, 33 + "sig": { 34 + "type": "bytes", 35 + "description": "Signature of dag-cbor encoded label." 36 + }, 37 + "src": { 38 + "type": "string", 39 + "format": "did", 40 + "description": "DID of the actor who created this label." 41 + }, 42 + "uri": { 43 + "type": "string", 44 + "format": "uri", 45 + "description": "AT URI of the record, repository (account), or other resource that this label applies to." 46 + }, 47 + "val": { 48 + "type": "string", 49 + "maxLength": 128, 50 + "description": "The short string name of the value or type of this label." 51 + }, 52 + "ver": { 53 + "type": "integer", 54 + "description": "The AT Protocol version of the label object." 55 + } 56 + }, 57 + "description": "Metadata tag on an atproto resource (eg, repo or record)." 58 + }, 59 + "selfLabel": { 60 + "type": "object", 61 + "required": [ 62 + "val" 63 + ], 64 + "properties": { 65 + "val": { 66 + "type": "string", 67 + "maxLength": 128, 68 + "description": "The short string name of the value or type of this label." 69 + } 70 + }, 71 + "description": "Metadata tag on an atproto record, published by the author within the record. Note that schemas should use #selfLabels, not #selfLabel." 
72 + }, 73 + "labelValue": { 74 + "type": "string", 75 + "knownValues": [ 76 + "!hide", 77 + "!no-promote", 78 + "!warn", 79 + "!no-unauthenticated", 80 + "dmca-violation", 81 + "doxxing", 82 + "porn", 83 + "sexual", 84 + "nudity", 85 + "nsfl", 86 + "gore" 87 + ] 88 + }, 89 + "selfLabels": { 90 + "type": "object", 91 + "required": [ 92 + "values" 93 + ], 94 + "properties": { 95 + "values": { 96 + "type": "array", 97 + "items": { 98 + "ref": "#selfLabel", 99 + "type": "ref" 100 + }, 101 + "maxLength": 10 102 + } 103 + }, 104 + "description": "Metadata tags on an atproto record, published by the author within the record." 105 + }, 106 + "labelValueDefinition": { 107 + "type": "object", 108 + "required": [ 109 + "identifier", 110 + "severity", 111 + "blurs", 112 + "locales" 113 + ], 114 + "properties": { 115 + "blurs": { 116 + "type": "string", 117 + "description": "What should this label hide in the UI, if applied? 'content' hides all of the target; 'media' hides the images/video/audio; 'none' hides nothing.", 118 + "knownValues": [ 119 + "content", 120 + "media", 121 + "none" 122 + ] 123 + }, 124 + "locales": { 125 + "type": "array", 126 + "items": { 127 + "ref": "#labelValueDefinitionStrings", 128 + "type": "ref" 129 + } 130 + }, 131 + "severity": { 132 + "type": "string", 133 + "description": "How should a client visually convey this label? 'inform' means neutral and informational; 'alert' means negative and warning; 'none' means show nothing.", 134 + "knownValues": [ 135 + "inform", 136 + "alert", 137 + "none" 138 + ] 139 + }, 140 + "adultOnly": { 141 + "type": "boolean", 142 + "description": "Does the user need to have adult content enabled in order to configure this label?" 143 + }, 144 + "identifier": { 145 + "type": "string", 146 + "maxLength": 100, 147 + "description": "The value of the label being defined. 
Must only include lowercase ascii and the '-' character ([a-z-]+).", 148 + "maxGraphemes": 100 149 + }, 150 + "defaultSetting": { 151 + "type": "string", 152 + "default": "warn", 153 + "description": "The default setting for this label.", 154 + "knownValues": [ 155 + "ignore", 156 + "warn", 157 + "hide" 158 + ] 159 + } 160 + }, 161 + "description": "Declares a label value and its expected interpretations and behaviors." 162 + }, 163 + "labelValueDefinitionStrings": { 164 + "type": "object", 165 + "required": [ 166 + "lang", 167 + "name", 168 + "description" 169 + ], 170 + "properties": { 171 + "lang": { 172 + "type": "string", 173 + "format": "language", 174 + "description": "The code of the language these strings are written in." 175 + }, 176 + "name": { 177 + "type": "string", 178 + "maxLength": 640, 179 + "description": "A short human-readable name for the label.", 180 + "maxGraphemes": 64 181 + }, 182 + "description": { 183 + "type": "string", 184 + "maxLength": 100000, 185 + "description": "A longer description of what the label means and why it might be applied.", 186 + "maxGraphemes": 10000 187 + } 188 + }, 189 + "description": "Strings which describe the label in the UI, localized into a specific language." 190 + } 191 + } 192 + }
+24
server/test/fixtures/statusphere/lexicons/com/atproto/repo/strongRef.json
··· 1 + { 2 + "lexicon": 1, 3 + "id": "com.atproto.repo.strongRef", 4 + "description": "A URI with a content-hash fingerprint.", 5 + "defs": { 6 + "main": { 7 + "type": "object", 8 + "required": [ 9 + "uri", 10 + "cid" 11 + ], 12 + "properties": { 13 + "cid": { 14 + "type": "string", 15 + "format": "cid" 16 + }, 17 + "uri": { 18 + "type": "string", 19 + "format": "at-uri" 20 + } 21 + } 22 + } 23 + } 24 + }
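Fields of type `ref` pointing at `com.atproto.repo.strongRef` also generate forward joins, as exercised by `forward_join_strong_ref_resolves_test` further down. A hedged sketch of the resulting query shape (the inline fragment assumes the pinned record happens to be a post; any record type could appear in the union):

```graphql
query {
  appBskyActorProfile {
    edges {
      node {
        displayName
        # Forward join on the strongRef `pinnedPost` field
        pinnedPostResolved {
          ... on AppBskyFeedPost {
            uri
            text
          }
        }
      }
    }
  }
}
```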
+495
server/test/graphql_introspection_did_join_test.gleam
··· 1 + /// Integration test for GraphQL introspection with DID joins 2 + /// 3 + /// This test verifies that DID join fields are properly generated in the GraphQL schema 4 + /// by running a full introspection query and checking for the expected join fields. 5 + import database 6 + import gleam/dynamic/decode 7 + import gleam/http 8 + import gleam/json 9 + import gleam/list 10 + import gleam/result 11 + import gleam/string 12 + import gleeunit/should 13 + import graphql_handler 14 + import importer 15 + import lexicon 16 + import simplifile 17 + import sqlight 18 + import wisp 19 + import wisp/simulate 20 + 21 + // Helper to load all lexicons from the grain fixtures directory 22 + fn load_grain_lexicons(db: sqlight.Connection) -> Result(Nil, String) { 23 + let lexicons_dir = "test/fixtures/grain/lexicons" 24 + 25 + // Scan directory for all JSON files 26 + use file_paths <- result.try(importer.scan_directory_recursive(lexicons_dir)) 27 + 28 + // Read all files first to get their content 29 + let file_contents = 30 + file_paths 31 + |> list.filter_map(fn(file_path) { 32 + case simplifile.read(file_path) { 33 + Ok(content) -> Ok(#(file_path, content)) 34 + Error(_) -> Error(Nil) 35 + } 36 + }) 37 + 38 + // Extract all JSON strings for validation 39 + let all_json_strings = list.map(file_contents, fn(pair) { pair.1 }) 40 + 41 + // Validate all schemas together (this allows cross-references to be resolved) 42 + use _ <- result.try(case lexicon.validate_schemas(all_json_strings) { 43 + Ok(_) -> Ok(Nil) 44 + Error(err) -> Error("Validation failed: " <> string.inspect(err)) 45 + }) 46 + 47 + // Import each lexicon (skipping individual validation since we already validated all together) 48 + use _ <- result.try(case 49 + list.try_map(file_contents, fn(pair) { 50 + let #(_file_path, json_content) = pair 51 + // Extract lexicon ID and insert into database 52 + case json.parse(json_content, { 53 + use id <- decode.field("id", decode.string) 54 + decode.success(id) 55 + }) { 
56 + Ok(lexicon_id) -> 57 + case database.insert_lexicon(db, lexicon_id, json_content) { 58 + Ok(_) -> Ok(Nil) 59 + Error(_) -> Error("Database insertion failed") 60 + } 61 + Error(_) -> Error("Missing 'id' field") 62 + } 63 + }) 64 + { 65 + Ok(_) -> Ok(Nil) 66 + Error(err) -> Error("Import failed: " <> err) 67 + }) 68 + 69 + Ok(Nil) 70 + } 71 + 72 + pub fn introspection_query_includes_did_join_fields_test() { 73 + // NOTE: This test previously documented a known limitation with GraphQL introspection: 74 + // types could appear in introspection without all their join fields if they were 75 + // encountered through a Connection in another type's join field before being 76 + // encountered through their Query field. 77 + // 78 + // That limitation was fixed by building types in topological order, so this test 79 + // now asserts that all DID join fields appear in introspection. 80 + 81 + // Create in-memory database 82 + let assert Ok(db) = sqlight.open(":memory:") 83 + let assert Ok(_) = database.create_lexicon_table(db) 84 + let assert Ok(_) = database.create_record_table(db) 85 + 86 + // Load all grain lexicons from fixtures 87 + let assert Ok(_) = load_grain_lexicons(db) 88 + 89 + // Use __type introspection to query the SocialGrainGallery type 90 + let introspection_query = 91 + "query IntrospectionQuery { 92 + __type(name: \"SocialGrainGallery\") { 93 + name 94 + kind 95 + fields { 96 + name 97 + type { 98 + name 99 + kind 100 + ofType { 101 + name 102 + kind 103 + } 104 + } 105 + } 106 + } 107 + }" 108 + 109 + let query = 110 + json.object([#("query", json.string(introspection_query))]) 111 + |> json.to_string 112 + 113 + let request = 114 + simulate.request(http.Post, "/graphql") 115 + |> simulate.string_body(query) 116 + |> simulate.header("content-type", "application/json") 117 + 118 + let response = 119 + graphql_handler.handle_graphql_request(request, db, "http://localhost:3000") 120 + 121 + // Verify response 122 + response.status 123
+ |> should.equal(200) 124 + 125 + // Get response body 126 + let assert wisp.Text(body) = response.body 127 + 128 + // Verify that introspection worked (no errors or empty errors array) 129 + let has_non_empty_errors = 130 + string.contains(body, "\"errors\":[") 131 + && !string.contains(body, "\"errors\":[]") 132 + has_non_empty_errors 133 + |> should.be_false 134 + 135 + // Verify the type was found 136 + string.contains(body, "SocialGrainGallery") 137 + |> should.be_true 138 + 139 + // NOTE: These assertions are enabled to verify DID join fields appear in introspection 140 + // after refactoring the type builder to use topological sort. 141 + 142 + string.contains(body, "socialGrainActorProfileByDid") 143 + |> should.be_true 144 + 145 + string.contains(body, "socialGrainPhotoByDid") 146 + |> should.be_true 147 + 148 + string.contains(body, "socialGrainCommentByDid") 149 + |> should.be_true 150 + 151 + string.contains(body, "socialGrainFavoriteByDid") 152 + |> should.be_true 153 + 154 + // Clean up 155 + let assert Ok(_) = sqlight.close(db) 156 + } 157 + 158 + pub fn introspection_query_profile_join_fields_test() { 159 + // This test verifies that the SocialGrainActorProfile type has the reverse DID join 160 + // field to galleries. 
161 + 162 + // Create in-memory database 163 + let assert Ok(db) = sqlight.open(":memory:") 164 + let assert Ok(_) = database.create_lexicon_table(db) 165 + let assert Ok(_) = database.create_record_table(db) 166 + 167 + // Load all grain lexicons from fixtures 168 + let assert Ok(_) = load_grain_lexicons(db) 169 + 170 + // Use __type introspection to query the SocialGrainActorProfile type 171 + let introspection_query = 172 + "query IntrospectionQuery { 173 + __type(name: \"SocialGrainActorProfile\") { 174 + name 175 + kind 176 + fields { 177 + name 178 + type { 179 + name 180 + kind 181 + ofType { 182 + name 183 + kind 184 + } 185 + } 186 + } 187 + } 188 + }" 189 + 190 + let query = 191 + json.object([#("query", json.string(introspection_query))]) 192 + |> json.to_string 193 + 194 + let request = 195 + simulate.request(http.Post, "/graphql") 196 + |> simulate.string_body(query) 197 + |> simulate.header("content-type", "application/json") 198 + 199 + let response = 200 + graphql_handler.handle_graphql_request(request, db, "http://localhost:3000") 201 + 202 + // Verify response 203 + response.status 204 + |> should.equal(200) 205 + 206 + // Get response body 207 + let assert wisp.Text(body) = response.body 208 + 209 + // Verify that introspection worked (no errors or empty errors array) 210 + let has_non_empty_errors = 211 + string.contains(body, "\"errors\":[") 212 + && !string.contains(body, "\"errors\":[]") 213 + has_non_empty_errors 214 + |> should.be_false 215 + 216 + // Verify the type was found 217 + string.contains(body, "SocialGrainActorProfile") 218 + |> should.be_true 219 + 220 + // Verify that the DID join field to galleries exists 221 + string.contains(body, "socialGrainGalleryByDid") 222 + |> should.be_true 223 + 224 + // Clean up 225 + let assert Ok(_) = sqlight.close(db) 226 + } 227 + 228 + pub fn introspection_query_did_join_field_structure_test() { 229 + // This test verifies DID joins work end-to-end by querying the actual fields 230 + // and 
retrieving data. This complements the introspection tests by verifying 231 + // that the join fields not only exist in the schema but also execute correctly. 232 + 233 + // Create in-memory database 234 + let assert Ok(db) = sqlight.open(":memory:") 235 + let assert Ok(_) = database.create_lexicon_table(db) 236 + let assert Ok(_) = database.create_record_table(db) 237 + 238 + // Load all grain lexicons from fixtures 239 + let assert Ok(_) = load_grain_lexicons(db) 240 + 241 + let test_did = "did:plc:test" 242 + 243 + // Insert a test profile record 244 + let profile_json = 245 + json.object([ 246 + #("displayName", json.string("Test User")), 247 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 248 + ]) 249 + |> json.to_string 250 + 251 + let assert Ok(_) = 252 + database.insert_record( 253 + db, 254 + "at://" <> test_did <> "/social.grain.actor.profile/self", 255 + "cid1", 256 + test_did, 257 + "social.grain.actor.profile", 258 + profile_json, 259 + ) 260 + 261 + // Insert a gallery record 262 + let gallery_json = 263 + json.object([ 264 + #("title", json.string("My Gallery")), 265 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 266 + ]) 267 + |> json.to_string 268 + 269 + let assert Ok(_) = 270 + database.insert_record( 271 + db, 272 + "at://" <> test_did <> "/social.grain.gallery/123", 273 + "cid2", 274 + test_did, 275 + "social.grain.gallery", 276 + gallery_json, 277 + ) 278 + 279 + // Insert a photo record 280 + let photo_json = 281 + json.object([ 282 + #("alt", json.string("Test Photo")), 283 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 284 + ]) 285 + |> json.to_string 286 + 287 + let assert Ok(_) = 288 + database.insert_record( 289 + db, 290 + "at://" <> test_did <> "/social.grain.photo/456", 291 + "cid3", 292 + test_did, 293 + "social.grain.photo", 294 + photo_json, 295 + ) 296 + 297 + // Insert a comment record 298 + let comment_json = 299 + json.object([ 300 + #("text", json.string("Test Comment")), 301 + #("createdAt", 
json.string("2024-01-01T00:00:00.000Z")), 302 + ]) 303 + |> json.to_string 304 + 305 + let assert Ok(_) = 306 + database.insert_record( 307 + db, 308 + "at://" <> test_did <> "/social.grain.comment/789", 309 + "cid4", 310 + test_did, 311 + "social.grain.comment", 312 + comment_json, 313 + ) 314 + 315 + // Insert a favorite record 316 + let favorite_json = 317 + json.object([ 318 + #("subject", json.string("at://" <> test_did <> "/social.grain.photo/456")), 319 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 320 + ]) 321 + |> json.to_string 322 + 323 + let assert Ok(_) = 324 + database.insert_record( 325 + db, 326 + "at://" <> test_did <> "/social.grain.favorite/abc", 327 + "cid5", 328 + test_did, 329 + "social.grain.favorite", 330 + favorite_json, 331 + ) 332 + 333 + // Query SocialGrainGallery and verify it has DID join fields to profile, photo, comment, and favorite 334 + let query_text = 335 + "{ 336 + socialGrainGallery { 337 + edges { 338 + node { 339 + title 340 + socialGrainActorProfileByDid { displayName } 341 + socialGrainPhotoByDid { edges { node { alt } } } 342 + socialGrainCommentByDid { edges { node { text } } } 343 + socialGrainFavoriteByDid { edges { node { createdAt } } } 344 + } 345 + } 346 + } 347 + }" 348 + 349 + let query = 350 + json.object([#("query", json.string(query_text))]) 351 + |> json.to_string 352 + 353 + let request = 354 + simulate.request(http.Post, "/graphql") 355 + |> simulate.string_body(query) 356 + |> simulate.header("content-type", "application/json") 357 + 358 + let response = 359 + graphql_handler.handle_graphql_request(request, db, "http://localhost:3000") 360 + 361 + // Verify response 362 + response.status 363 + |> should.equal(200) 364 + 365 + let assert wisp.Text(body) = response.body 366 + 367 + // Check for errors - if any DID join field doesn't exist, we'll get an error 368 + let has_field_error = string.contains(body, "Cannot query field") || string.contains(body, "not found") 369 + 370 + // If there are 
errors about fields not being found, the test should fail 371 + case has_field_error { 372 + True -> should.fail() 373 + False -> { 374 + // Verify response contains data 375 + string.contains(body, "data") 376 + |> should.be_true 377 + 378 + // Verify the gallery data is returned 379 + string.contains(body, "My Gallery") 380 + |> should.be_true 381 + 382 + // Verify the DID join fields are present and returned data 383 + string.contains(body, "Test User") 384 + |> should.be_true 385 + 386 + string.contains(body, "Test Photo") 387 + |> should.be_true 388 + 389 + string.contains(body, "Test Comment") 390 + |> should.be_true 391 + } 392 + } 393 + 394 + // Clean up 395 + let assert Ok(_) = sqlight.close(db) 396 + } 397 + 398 + pub fn did_join_field_query_execution_test() { 399 + // Create in-memory database 400 + let assert Ok(db) = sqlight.open(":memory:") 401 + let assert Ok(_) = database.create_lexicon_table(db) 402 + let assert Ok(_) = database.create_record_table(db) 403 + 404 + // Load all grain lexicons from fixtures 405 + let assert Ok(_) = load_grain_lexicons(db) 406 + 407 + // Insert a profile record 408 + let profile_json = 409 + json.object([ 410 + #("displayName", json.string("Test User")), 411 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 412 + ]) 413 + |> json.to_string 414 + 415 + let assert Ok(_) = 416 + database.insert_record( 417 + db, 418 + "at://did:plc:test/social.grain.actor.profile/self", 419 + "cid1", 420 + "did:plc:test", 421 + "social.grain.actor.profile", 422 + profile_json, 423 + ) 424 + 425 + // Insert gallery records for the same DID 426 + let gallery1_json = 427 + json.object([ 428 + #("title", json.string("Gallery 1")), 429 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 430 + ]) 431 + |> json.to_string 432 + 433 + let assert Ok(_) = 434 + database.insert_record( 435 + db, 436 + "at://did:plc:test/social.grain.gallery/123", 437 + "cid2", 438 + "did:plc:test", 439 + "social.grain.gallery", 440 + gallery1_json, 
441 + ) 442 + 443 + let gallery2_json = 444 + json.object([ 445 + #("title", json.string("Gallery 2")), 446 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 447 + ]) 448 + |> json.to_string 449 + 450 + let assert Ok(_) = 451 + database.insert_record( 452 + db, 453 + "at://did:plc:test/social.grain.gallery/456", 454 + "cid3", 455 + "did:plc:test", 456 + "social.grain.gallery", 457 + gallery2_json, 458 + ) 459 + 460 + // Query profile with DID join field 461 + let query_text = 462 + "{ socialGrainActorProfile { edges { node { displayName socialGrainGalleryByDid { edges { node { title } } } } } } }" 463 + 464 + let query = 465 + json.object([#("query", json.string(query_text))]) 466 + |> json.to_string 467 + 468 + let request = 469 + simulate.request(http.Post, "/graphql") 470 + |> simulate.string_body(query) 471 + |> simulate.header("content-type", "application/json") 472 + 473 + let response = 474 + graphql_handler.handle_graphql_request(request, db, "http://localhost:3000") 475 + 476 + // Verify response 477 + response.status 478 + |> should.equal(200) 479 + 480 + let assert wisp.Text(body) = response.body 481 + 482 + // Should contain the profile data 483 + string.contains(body, "Test User") 484 + |> should.be_true 485 + 486 + // Should contain the joined gallery data 487 + string.contains(body, "Gallery 1") 488 + |> should.be_true 489 + 490 + string.contains(body, "Gallery 2") 491 + |> should.be_true 492 + 493 + // Clean up 494 + let assert Ok(_) = sqlight.close(db) 495 + }
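The DID join exercised in `did_join_field_query_execution_test` can be issued directly against the API. A hedged sketch of the same query, expanded for readability (the `first` pagination argument is assumed to be available on DID join connections, following the connection conventions in the docs):

```graphql
query {
  socialGrainActorProfile {
    edges {
      node {
        displayName
        # DID join: galleries authored by the same DID (paginated connection)
        socialGrainGalleryByDid(first: 10) {
          edges {
            node {
              title
            }
          }
        }
      }
    }
  }
}
```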
+8
server/test/graphql_mutation_integration_test.gleam
··· 91 91 option.None, 92 92 option.None, 93 93 option.None, 94 + option.None, 95 + option.None, 94 96 ) 95 97 96 98 // Test: Verify that the schema has mutation type ··· 140 142 db_schema_builder.build_schema_with_fetcher( 141 143 parsed_lexicons, 142 144 empty_fetcher, 145 + option.None, 146 + option.None, 143 147 option.None, 144 148 option.None, 145 149 option.None, ··· 199 203 option.None, 200 204 option.None, 201 205 option.None, 206 + option.None, 207 + option.None, 202 208 ) 203 209 204 210 // Get mutation type and find updateXyzStatusphereStatus field ··· 249 255 db_schema_builder.build_schema_with_fetcher( 250 256 parsed_lexicons, 251 257 empty_fetcher, 258 + option.None, 259 + option.None, 252 260 option.None, 253 261 option.None, 254 262 option.None,
+1196
server/test/join_integration_test.gleam
···
/// Integration tests for joins and DataLoader with actual database
///
/// Tests verify that:
/// - Forward joins (at-uri and strongRef) resolve correctly
/// - Reverse joins discover and resolve relationships
/// - DataLoader batches queries efficiently
/// - All join types work with actual SQLite database queries
import database
import gleam/json
import gleam/string
import gleeunit/should
import graphql_gleam
import sqlight

// Helper to create a post lexicon JSON
fn create_post_lexicon() -> String {
  json.object([
    #("lexicon", json.int(1)),
    #("id", json.string("app.bsky.feed.post")),
    #(
      "defs",
      json.object([
        #(
          "main",
          json.object([
            #("type", json.string("record")),
            #("key", json.string("tid")),
            #(
              "record",
              json.object([
                #("type", json.string("object")),
                #(
                  "required",
                  json.array([json.string("text")], of: fn(x) { x }),
                ),
                #(
                  "properties",
                  json.object([
                    #(
                      "text",
                      json.object([
                        #("type", json.string("string")),
                        #("maxLength", json.int(300)),
                      ]),
                    ),
                    #(
                      "replyTo",
                      json.object([
                        #("type", json.string("string")),
                        #("format", json.string("at-uri")),
                      ]),
                    ),
                  ]),
                ),
              ]),
            ),
          ]),
        ),
      ]),
    ),
  ])
  |> json.to_string
}

// Helper to create a like lexicon JSON with subject field (at-uri)
fn create_like_lexicon() -> String {
  json.object([
    #("lexicon", json.int(1)),
    #("id", json.string("app.bsky.feed.like")),
    #(
      "defs",
      json.object([
        #(
          "main",
          json.object([
            #("type", json.string("record")),
            #("key", json.string("tid")),
            #(
              "record",
              json.object([
                #("type", json.string("object")),
                #(
                  "required",
                  json.array([json.string("subject")], of: fn(x) { x }),
                ),
                #(
                  "properties",
                  json.object([
                    #(
                      "subject",
                      json.object([
                        #("type", json.string("string")),
                        #("format", json.string("at-uri")),
                      ]),
                    ),
                    #(
                      "createdAt",
                      json.object([
                        #("type", json.string("string")),
                        #("format", json.string("datetime")),
                      ]),
                    ),
                  ]),
                ),
              ]),
            ),
          ]),
        ),
      ]),
    ),
  ])
  |> json.to_string
}

// Helper to create a profile lexicon with strongRef
fn create_profile_lexicon() -> String {
  json.object([
    #("lexicon", json.int(1)),
    #("id", json.string("app.bsky.actor.profile")),
    #(
      "defs",
      json.object([
        #(
          "main",
          json.object([
            #("type", json.string("record")),
            #("key", json.string("self")),
            #(
              "record",
              json.object([
                #("type", json.string("object")),
                #(
                  "properties",
                  json.object([
                    #(
                      "displayName",
                      json.object([#("type", json.string("string"))]),
                    ),
                    #(
                      "pinnedPost",
                      json.object([
                        #("type", json.string("ref")),
                        #("ref", json.string("com.atproto.repo.strongRef")),
                      ]),
                    ),
                  ]),
                ),
              ]),
            ),
          ]),
        ),
      ]),
    ),
  ])
  |> json.to_string
}

// Test: Forward join with at-uri field resolves correctly
pub fn forward_join_at_uri_resolves_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)

  // Insert test records
  // Parent post
  let parent_uri = "at://did:plc:parent123/app.bsky.feed.post/parent1"
  let parent_json =
    json.object([#("text", json.string("This is the parent post"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      parent_uri,
      "cid_parent",
      "did:plc:parent123",
      "app.bsky.feed.post",
      parent_json,
    )

  // Reply post that references parent
  let reply_uri = "at://did:plc:user456/app.bsky.feed.post/reply1"
  let reply_json =
    json.object([
      #("text", json.string("This is a reply")),
      #("replyTo", json.string(parent_uri)),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      reply_uri,
      "cid_reply",
      "did:plc:user456",
      "app.bsky.feed.post",
      reply_json,
    )

  // Execute GraphQL query with forward join
  let query =
    "
  {
    appBskyFeedPost {
      edges {
        node {
          uri
          replyToResolved {
            uri
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify the response contains resolved join with parent URI
  string.contains(response_json, reply_uri)
  |> should.be_true

  string.contains(response_json, parent_uri)
  |> should.be_true

  string.contains(response_json, "replyToResolved")
  |> should.be_true
}

// Test: Forward join with strongRef resolves correctly
pub fn forward_join_strong_ref_resolves_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let profile_lexicon = create_profile_lexicon()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon)

  // Insert test records
  // A post
  let post_uri = "at://did:plc:user123/app.bsky.feed.post/post1"
  let post_json =
    json.object([#("text", json.string("My favorite post"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post_uri,
      "cid_post1",
      "did:plc:user123",
      "app.bsky.feed.post",
      post_json,
    )

  // A profile that pins the post (using strongRef)
  let profile_uri = "at://did:plc:user123/app.bsky.actor.profile/self"
  let profile_json =
    json.object([
      #("displayName", json.string("Alice")),
      #(
        "pinnedPost",
        json.object([
          #("uri", json.string(post_uri)),
          #("cid", json.string("cid_post1")),
        ]),
      ),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile_uri,
      "cid_profile",
      "did:plc:user123",
      "app.bsky.actor.profile",
      profile_json,
    )

  // Execute GraphQL query with forward join on strongRef
  let query =
    "
  {
    appBskyActorProfile {
      edges {
        node {
          uri
          pinnedPostResolved {
            uri
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify the response contains resolved strongRef join with post URI
  string.contains(response_json, profile_uri)
  |> should.be_true

  string.contains(response_json, post_uri)
  |> should.be_true

  string.contains(response_json, "pinnedPostResolved")
  |> should.be_true
}

// Test: Reverse join discovers and resolves relationships
pub fn reverse_join_resolves_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let like_lexicon = create_like_lexicon()

  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)

  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.like", like_lexicon)

  // Insert test records
  // A post
  let post_uri = "at://did:plc:author789/app.bsky.feed.post/post1"
  let post_json =
    json.object([#("text", json.string("Great content!"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post_uri,
      "cid_post",
      "did:plc:author789",
      "app.bsky.feed.post",
      post_json,
    )

  // Multiple likes that reference the post (to test batching)
  let like1_uri = "at://did:plc:liker1/app.bsky.feed.like/like1"
  let like1_json =
    json.object([
      #("subject", json.string(post_uri)),
      #("createdAt", json.string("2024-01-01T12:00:00Z")),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      like1_uri,
      "cid_like1",
      "did:plc:liker1",
      "app.bsky.feed.like",
      like1_json,
    )

  let like2_uri = "at://did:plc:liker2/app.bsky.feed.like/like2"
  let like2_json =
    json.object([
      #("subject", json.string(post_uri)),
      #("createdAt", json.string("2024-01-01T12:05:00Z")),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      like2_uri,
      "cid_like2",
      "did:plc:liker2",
      "app.bsky.feed.like",
      like2_json,
    )

  // Execute GraphQL query with reverse join (now returns connection)
  let query =
    "
  {
    appBskyFeedPost {
      edges {
        node {
          uri
          appBskyFeedLikeViaSubject {
            edges {
              node {
                uri
              }
            }
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify the response contains reverse join results
  string.contains(response_json, post_uri)
  |> should.be_true

  string.contains(response_json, "appBskyFeedLikeViaSubject")
  |> should.be_true

  // Both likes should be in the response
  string.contains(response_json, like1_uri)
  |> should.be_true

  string.contains(response_json, like2_uri)
  |> should.be_true
}

// Test: DataLoader batches multiple forward joins efficiently
pub fn dataloader_batches_forward_joins_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)

  // Insert multiple parent posts
  let parent1_uri = "at://did:plc:user1/app.bsky.feed.post/parent1"
  let parent1_json =
    json.object([#("text", json.string("Parent post 1"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      parent1_uri,
      "cid_p1",
      "did:plc:user1",
      "app.bsky.feed.post",
      parent1_json,
    )

  let parent2_uri = "at://did:plc:user2/app.bsky.feed.post/parent2"
  let parent2_json =
    json.object([#("text", json.string("Parent post 2"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      parent2_uri,
      "cid_p2",
      "did:plc:user2",
      "app.bsky.feed.post",
      parent2_json,
    )

  // Insert multiple reply posts
  let reply1_uri = "at://did:plc:user3/app.bsky.feed.post/reply1"
  let reply1_json =
    json.object([
      #("text", json.string("Reply to post 1")),
      #("replyTo", json.string(parent1_uri)),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      reply1_uri,
      "cid_r1",
      "did:plc:user3",
      "app.bsky.feed.post",
      reply1_json,
    )

  let reply2_uri = "at://did:plc:user4/app.bsky.feed.post/reply2"
  let reply2_json =
    json.object([
      #("text", json.string("Reply to post 2")),
      #("replyTo", json.string(parent2_uri)),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      reply2_uri,
      "cid_r2",
      "did:plc:user4",
      "app.bsky.feed.post",
      reply2_json,
    )

  // Execute GraphQL query that fetches multiple posts with joins
  // DataLoader should batch the replyToResolved lookups
  let query =
    "
  {
    appBskyFeedPost {
      edges {
        node {
          uri
          replyToResolved {
            uri
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify all posts appear
  string.contains(response_json, reply1_uri)
  |> should.be_true

  string.contains(response_json, reply2_uri)
  |> should.be_true

  string.contains(response_json, parent1_uri)
  |> should.be_true

  string.contains(response_json, parent2_uri)
  |> should.be_true
  // Note: To truly verify batching, we'd need to instrument the database
  // layer to count queries. For now, this test ensures correctness.
}

// Test: Reverse joins work with strongRef fields
pub fn reverse_join_with_strong_ref_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let profile_lexicon = create_profile_lexicon()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon)

  // Insert a post
  let post_uri = "at://did:plc:creator/app.bsky.feed.post/amazing"
  let post_json =
    json.object([#("text", json.string("Amazing post"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post_uri,
      "cid_amazing",
      "did:plc:creator",
      "app.bsky.feed.post",
      post_json,
    )

  // Multiple profiles pin this post (using strongRef)
  let profile1_uri = "at://did:plc:user1/app.bsky.actor.profile/self"
  let profile1_json =
    json.object([
      #("displayName", json.string("User One")),
      #(
        "pinnedPost",
        json.object([
          #("uri", json.string(post_uri)),
          #("cid", json.string("cid_amazing")),
        ]),
      ),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile1_uri,
      "cid_prof1",
      "did:plc:user1",
      "app.bsky.actor.profile",
      profile1_json,
    )

  let profile2_uri = "at://did:plc:user2/app.bsky.actor.profile/self"
  let profile2_json =
    json.object([
      #("displayName", json.string("User Two")),
      #(
        "pinnedPost",
        json.object([
          #("uri", json.string(post_uri)),
          #("cid", json.string("cid_amazing")),
        ]),
      ),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile2_uri,
      "cid_prof2",
      "did:plc:user2",
      "app.bsky.actor.profile",
      profile2_json,
    )

  // Query post with reverse join to find all profiles that pinned it (now returns connection)
  let query =
    "
  {
    appBskyFeedPost {
      edges {
        node {
          uri
          appBskyActorProfileViaPinnedPost {
            edges {
              node {
                uri
              }
            }
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify the reverse join through strongRef works
  string.contains(response_json, post_uri)
  |> should.be_true

  string.contains(response_json, "appBskyActorProfileViaPinnedPost")
  |> should.be_true

  string.contains(response_json, profile1_uri)
  |> should.be_true

  string.contains(response_json, profile2_uri)
  |> should.be_true
}

// Test: Forward join with union type and inline fragments
pub fn forward_join_union_inline_fragments_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let like_lexicon = create_like_lexicon()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.like", like_lexicon)

  // Insert a parent post
  let parent_post_uri = "at://did:plc:parent/app.bsky.feed.post/parent1"
  let parent_post_json =
    json.object([#("text", json.string("This is the parent post"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      parent_post_uri,
      "cid_parent_post",
      "did:plc:parent",
      "app.bsky.feed.post",
      parent_post_json,
    )

  // Insert a like that will be referenced
  let target_like_uri = "at://did:plc:liker/app.bsky.feed.like/like1"
  let target_like_json =
    json.object([
      #("subject", json.string(parent_post_uri)),
      #("createdAt", json.string("2024-01-01T12:00:00Z")),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      target_like_uri,
      "cid_like",
      "did:plc:liker",
      "app.bsky.feed.like",
      target_like_json,
    )

  // Insert a reply post that references the parent (post)
  let reply_to_post_uri = "at://did:plc:user1/app.bsky.feed.post/reply1"
  let reply_to_post_json =
    json.object([
      #("text", json.string("Reply to a post")),
      #("replyTo", json.string(parent_post_uri)),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      reply_to_post_uri,
      "cid_reply1",
      "did:plc:user1",
      "app.bsky.feed.post",
      reply_to_post_json,
    )

  // Insert a reply post that references the like
  let reply_to_like_uri = "at://did:plc:user2/app.bsky.feed.post/reply2"
  let reply_to_like_json =
    json.object([
      #("text", json.string("Reply to a like")),
      #("replyTo", json.string(target_like_uri)),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      reply_to_like_uri,
      "cid_reply2",
      "did:plc:user2",
      "app.bsky.feed.post",
      reply_to_like_json,
    )

  // Execute GraphQL query with inline fragments to access type-specific fields
  let query =
    "
  {
    appBskyFeedPost {
      edges {
        node {
          uri
          text
          replyToResolved {
            ... on AppBskyFeedPost {
              uri
              text
            }
            ... on AppBskyFeedLike {
              uri
              subject
              createdAt
            }
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify we can access type-specific fields through inline fragments

  // For the post reply, we should see the parent post's text
  string.contains(response_json, reply_to_post_uri)
  |> should.be_true

  string.contains(response_json, "Reply to a post")
  |> should.be_true

  string.contains(response_json, "This is the parent post")
  |> should.be_true

  // For the like reply, we should see the like's subject and createdAt
  string.contains(response_json, reply_to_like_uri)
  |> should.be_true

  string.contains(response_json, "Reply to a like")
  |> should.be_true

  string.contains(response_json, "2024-01-01T12:00:00Z")
  |> should.be_true

  // Verify the resolved records have the correct URIs
  string.contains(response_json, parent_post_uri)
  |> should.be_true

  string.contains(response_json, target_like_uri)
  |> should.be_true
}

// Helper to create a profile lexicon with literal:self key
fn create_profile_lexicon_with_literal_self() -> String {
  json.object([
    #("lexicon", json.int(1)),
    #("id", json.string("app.bsky.actor.profile")),
    #(
      "defs",
      json.object([
        #(
          "main",
          json.object([
            #("type", json.string("record")),
            #("key", json.string("literal:self")),
            #(
              "record",
              json.object([
                #("type", json.string("object")),
                #(
                  "properties",
                  json.object([
                    #(
                      "displayName",
                      json.object([#("type", json.string("string"))]),
                    ),
                    #("bio", json.object([#("type", json.string("string"))])),
                  ]),
                ),
              ]),
            ),
          ]),
        ),
      ]),
    ),
  ])
  |> json.to_string
}

// Test: DID join to literal:self collection returns single nullable object
pub fn did_join_to_literal_self_returns_single_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let profile_lexicon = create_profile_lexicon_with_literal_self()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon)

  // Insert a profile with literal:self key
  let profile_uri = "at://did:plc:user123/app.bsky.actor.profile/self"
  let profile_json =
    json.object([
      #("displayName", json.string("Alice")),
      #("bio", json.string("Software engineer")),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile_uri,
      "cid_profile",
      "did:plc:user123",
      "app.bsky.actor.profile",
      profile_json,
    )

  // Insert a post by the same DID
  let post_uri = "at://did:plc:user123/app.bsky.feed.post/post1"
  let post_json =
    json.object([#("text", json.string("My first post"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post_uri,
      "cid_post1",
      "did:plc:user123",
      "app.bsky.feed.post",
      post_json,
    )

  // Execute GraphQL query with DID join from post to profile
  let query =
    "
  {
    appBskyFeedPost {
      edges {
        node {
          uri
          text
          appBskyActorProfileByDid {
            uri
            displayName
            bio
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify the response contains the DID-joined profile as a single object (not array)
  string.contains(response_json, post_uri)
  |> should.be_true

  string.contains(response_json, "appBskyActorProfileByDid")
  |> should.be_true

  string.contains(response_json, profile_uri)
  |> should.be_true

  string.contains(response_json, "Alice")
  |> should.be_true

  string.contains(response_json, "Software engineer")
  |> should.be_true
}

// Test: DID join to non-literal:self collection returns list
pub fn did_join_to_non_literal_self_returns_list_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let profile_lexicon = create_profile_lexicon_with_literal_self()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon)

  // Insert a profile
  let profile_uri = "at://did:plc:author/app.bsky.actor.profile/self"
  let profile_json =
    json.object([
      #("displayName", json.string("Bob")),
      #("bio", json.string("Writes a lot")),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile_uri,
      "cid_profile",
      "did:plc:author",
      "app.bsky.actor.profile",
      profile_json,
    )

  // Insert multiple posts by the same DID
  let post1_uri = "at://did:plc:author/app.bsky.feed.post/post1"
  let post1_json =
    json.object([#("text", json.string("First post"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post1_uri,
      "cid_post1",
      "did:plc:author",
      "app.bsky.feed.post",
      post1_json,
    )

  let post2_uri = "at://did:plc:author/app.bsky.feed.post/post2"
  let post2_json =
    json.object([#("text", json.string("Second post"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post2_uri,
      "cid_post2",
      "did:plc:author",
      "app.bsky.feed.post",
      post2_json,
    )

  // Execute GraphQL query with DID join from profile to posts (now returns connection)
  let query =
    "
  {
    appBskyActorProfile {
      edges {
        node {
          uri
          displayName
          appBskyFeedPostByDid {
            edges {
              node {
                uri
                text
              }
            }
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify the response contains the DID-joined posts as a list
  string.contains(response_json, profile_uri)
  |> should.be_true

  string.contains(response_json, "appBskyFeedPostByDid")
  |> should.be_true

  string.contains(response_json, post1_uri)
  |> should.be_true

  string.contains(response_json, "First post")
  |> should.be_true

  string.contains(response_json, post2_uri)
  |> should.be_true

  string.contains(response_json, "Second post")
  |> should.be_true
}

// Test: DID join batches queries efficiently for multiple records
pub fn did_join_batches_queries_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let post_lexicon = create_post_lexicon()
  let profile_lexicon = create_profile_lexicon_with_literal_self()
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon)

  // Insert multiple profiles
  let profile1_uri = "at://did:plc:user1/app.bsky.actor.profile/self"
  let profile1_json =
    json.object([
      #("displayName", json.string("User One")),
      #("bio", json.string("First user")),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile1_uri,
      "cid_profile1",
      "did:plc:user1",
      "app.bsky.actor.profile",
      profile1_json,
    )

  let profile2_uri = "at://did:plc:user2/app.bsky.actor.profile/self"
  let profile2_json =
    json.object([
      #("displayName", json.string("User Two")),
      #("bio", json.string("Second user")),
    ])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile2_uri,
      "cid_profile2",
      "did:plc:user2",
      "app.bsky.actor.profile",
      profile2_json,
    )

  // Insert posts by different DIDs
  let post1_uri = "at://did:plc:user1/app.bsky.feed.post/post1"
  let post1_json =
    json.object([#("text", json.string("Post by user 1"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post1_uri,
      "cid_post1",
      "did:plc:user1",
      "app.bsky.feed.post",
      post1_json,
    )

  let post2_uri = "at://did:plc:user2/app.bsky.feed.post/post2"
  let post2_json =
    json.object([#("text", json.string("Post by user 2"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      post2_uri,
      "cid_post2",
      "did:plc:user2",
      "app.bsky.feed.post",
      post2_json,
    )

  // Execute GraphQL query that fetches multiple posts with DID joins to profiles
  // DataLoader should batch the profile lookups by DID
  let query =
    "
  {
    appBskyFeedPost {
      edges {
        node {
          uri
          text
          appBskyActorProfileByDid {
            uri
            displayName
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify all posts and their associated profiles appear
  string.contains(response_json, post1_uri)
  |> should.be_true

  string.contains(response_json, "Post by user 1")
  |> should.be_true

  string.contains(response_json, profile1_uri)
  |> should.be_true

  string.contains(response_json, "User One")
  |> should.be_true

  string.contains(response_json, post2_uri)
  |> should.be_true

  string.contains(response_json, "Post by user 2")
  |> should.be_true

  string.contains(response_json, profile2_uri)
  |> should.be_true

  string.contains(response_json, "User Two")
  |> should.be_true
  // Note: To truly verify batching, we'd need to instrument the database
  // layer to count queries. For now, this test ensures correctness.
}
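Both batching tests above end on the same caveat: they assert correctness but never count queries. A minimal sketch of the instrumentation those notes describe, assuming a hypothetical `counted_query` wrapper and a test-only `query_log` scratch table (neither exists in the current `database` module), could look like this:

```gleam
import gleam/dynamic/decode
import sqlight

/// Hypothetical wrapper: route record lookups through this function so a
/// batching test can assert that resolving N joins issued one batched SQL
/// statement rather than N. `query_log` is a test-only scratch table created
/// by the test itself; it is not part of the real schema.
pub fn counted_query(
  db: sqlight.Connection,
  sql: String,
  args: List(sqlight.Value),
  decoder: decode.Decoder(a),
) -> Result(List(a), sqlight.Error) {
  // Tally the call before running it, then delegate to sqlight as usual.
  let assert Ok(Nil) =
    sqlight.exec("INSERT INTO query_log DEFAULT VALUES", db)
  sqlight.query(sql, on: db, with: args, expecting: decoder)
}
```

A test could then read `SELECT count(*) FROM query_log` after executing the GraphQL query and assert the expected number of batched statements.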
+16
server/test/mutation_resolver_integration_test.gleam
···
204 204     db_schema_builder.build_schema_with_fetcher(
205 205       parsed_lexicons,
206 206       empty_fetcher,
    207 +     option.None,
    208 +     option.None,
207 209       create_factory,
208 210       option.None,
209 211       option.None,
···
252 254     db_schema_builder.build_schema_with_fetcher(
253 255       parsed_lexicons,
254 256       empty_fetcher,
    257 +     option.None,
    258 +     option.None,
255 259       create_factory,
256 260       option.None,
257 261       option.None,
···
318 322       parsed_lexicons,
319 323       empty_fetcher,
320 324       option.None,
    325 +     option.None,
    326 +     option.None,
321 327       update_factory,
322 328       option.None,
323 329       option.None,
···
381 387     db_schema_builder.build_schema_with_fetcher(
382 388       parsed_lexicons,
383 389       empty_fetcher,
    390 +     option.None,
    391 +     option.None,
384 392       option.None,
385 393       option.None,
386 394       delete_factory,
···
446 454       parsed_lexicons,
447 455       empty_fetcher,
448 456       option.None,
    457 +     option.None,
449 458       update_factory,
    459 +     option.None,
450 460       option.None,
451 461       option.None,
452 462     )
···
494 504       empty_fetcher,
495 505       option.None,
496 506       option.None,
    507 +     option.None,
497 508       delete_factory,
    509 +     option.None,
498 510       option.None,
499 511     )
···
553 565       option.None,
554 566       option.None,
555 567       option.None,
    568 +     option.None,
    569 +     option.None,
556 570       upload_blob_factory,
557 571     )
···
626 640     db_schema_builder.build_schema_with_fetcher(
627 641       parsed_lexicons,
628 642       empty_fetcher,
    643 +     option.None,
    644 +     option.None,
629 645       option.None,
630 646       option.None,
631 647       option.None,
+785
server/test/nested_join_sortby_where_test.gleam
···
/// Integration tests for sortBy and where on nested join connections
///
/// Tests verify that:
/// - sortBy works on nested DID joins and reverse joins
/// - where filters work on nested joins
/// - totalCount reflects filtered results
/// - Combination of sortBy + where works correctly
import database
import gleam/int
import gleam/json
import gleam/list
import gleam/string
import gleeunit/should
import graphql_gleam
import sqlight

// Helper to create a status lexicon with createdAt field
fn create_status_lexicon() -> String {
  json.object([
    #("lexicon", json.int(1)),
    #("id", json.string("xyz.statusphere.status")),
    #(
      "defs",
      json.object([
        #(
          "main",
          json.object([
            #("type", json.string("record")),
            #("key", json.string("tid")),
            #(
              "record",
              json.object([
                #("type", json.string("object")),
                #(
                  "required",
                  json.array([json.string("status")], of: fn(x) { x }),
                ),
                #(
                  "properties",
                  json.object([
                    #(
                      "status",
                      json.object([
                        #("type", json.string("string")),
                        #("maxLength", json.int(300)),
                      ]),
                    ),
                    #(
                      "createdAt",
                      json.object([
                        #("type", json.string("string")),
                        #("format", json.string("datetime")),
                      ]),
                    ),
                  ]),
                ),
              ]),
            ),
          ]),
        ),
      ]),
    ),
  ])
  |> json.to_string
}

// Helper to create a profile lexicon
fn create_profile_lexicon() -> String {
  json.object([
    #("lexicon", json.int(1)),
    #("id", json.string("app.bsky.actor.profile")),
    #(
      "defs",
      json.object([
        #(
          "main",
          json.object([
            #("type", json.string("record")),
            #("key", json.string("literal:self")),
            #(
              "record",
              json.object([
                #("type", json.string("object")),
                #(
                  "properties",
                  json.object([
                    #(
                      "displayName",
                      json.object([#("type", json.string("string"))]),
                    ),
                  ]),
                ),
              ]),
            ),
          ]),
        ),
      ]),
    ),
  ])
  |> json.to_string
}

// Test: DID join with sortBy on createdAt DESC
pub fn did_join_sortby_createdat_desc_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let status_lexicon = create_status_lexicon()
  let profile_lexicon = create_profile_lexicon()
  let assert Ok(_) =
    database.insert_lexicon(db, "xyz.statusphere.status", status_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon)

  // Insert a profile
  let profile_uri = "at://did:plc:user1/app.bsky.actor.profile/self"
  let profile_json =
    json.object([#("displayName", json.string("User One"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile_uri,
      "cid_profile",
      "did:plc:user1",
      "app.bsky.actor.profile",
      profile_json,
    )

  // Insert 5 statuses with DIFFERENT createdAt timestamps
  // Status 1: 2024-01-01 (oldest)
  // Status 2: 2024-01-02
  // Status 3: 2024-01-03
  // Status 4: 2024-01-04
  // Status 5: 2024-01-05 (newest)
  list.range(1, 5)
  |> list.each(fn(i) {
    let status_uri =
      "at://did:plc:user1/xyz.statusphere.status/status" <> int.to_string(i)
    let status_json =
      json.object([
        #("status", json.string("Status #" <> int.to_string(i))),
        #(
          "createdAt",
          json.string("2024-01-0" <> int.to_string(i) <> "T12:00:00Z"),
        ),
      ])
      |> json.to_string

    let assert Ok(_) =
      database.insert_record(
        db,
        status_uri,
        "cid_status" <> int.to_string(i),
        "did:plc:user1",
        "xyz.statusphere.status",
        status_json,
      )
    Nil
  })

  // Execute GraphQL query with sortBy createdAt DESC and first:3
  let query =
    "
  {
    appBskyActorProfile {
      edges {
        node {
          uri
          statuses: xyzStatusphereStatusByDid(
            first: 3
            sortBy: [{field: \"createdAt\", direction: DESC}]
          ) {
            totalCount
            edges {
              node {
                status
                createdAt
              }
            }
          }
        }
      }
    }
  }
  "

  let assert Ok(response_json) =
    graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "")

  // Verify totalCount is 5 (all statuses)
  {
    string.contains(response_json, "\"totalCount\":5")
    || string.contains(response_json, "\"totalCount\": 5")
  }
  |> should.be_true

  // With sortBy createdAt DESC, first:3 should return Status 5, 4, 3 (newest first)
  string.contains(response_json, "Status #5")
  |> should.be_true

  string.contains(response_json, "Status #4")
  |> should.be_true

  string.contains(response_json, "Status #3")
  |> should.be_true

  // Should NOT contain Status 1 or 2 (they're older)
  string.contains(response_json, "Status #1")
  |> should.be_false

  string.contains(response_json, "Status #2")
  |> should.be_false

  // Verify order: Status 5 should appear before Status 4
  let pos_5 = case string.split(response_json, "Status #5") {
    [before, ..] -> string.length(before)
    [] -> 999_999
  }

  let pos_4 = case string.split(response_json, "Status #4") {
    [before, ..] -> string.length(before)
    [] -> 999_999
  }

  let pos_3 = case string.split(response_json, "Status #3") {
    [before, ..] -> string.length(before)
    [] -> 999_999
  }

  // Status 5 should come before Status 4
  { pos_5 < pos_4 }
  |> should.be_true

  // Status 4 should come before Status 3
  { pos_4 < pos_3 }
  |> should.be_true
}

// Test: DID join with sortBy createdAt ASC
pub fn did_join_sortby_createdat_asc_test() {
  // Setup database
  let assert Ok(db) = sqlight.open(":memory:")
  let assert Ok(_) = database.create_lexicon_table(db)
  let assert Ok(_) = database.create_record_table(db)
  let assert Ok(_) = database.create_actor_table(db)

  // Insert lexicons
  let status_lexicon = create_status_lexicon()
  let profile_lexicon = create_profile_lexicon()
  let assert Ok(_) =
    database.insert_lexicon(db, "xyz.statusphere.status", status_lexicon)
  let assert Ok(_) =
    database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon)

  // Insert a profile
  let profile_uri = "at://did:plc:user1/app.bsky.actor.profile/self"
  let profile_json =
    json.object([#("displayName", json.string("User One"))])
    |> json.to_string

  let assert Ok(_) =
    database.insert_record(
      db,
      profile_uri,
      "cid_profile",
      "did:plc:user1",
      "app.bsky.actor.profile",
      profile_json,
    )

  // Insert 5 statuses with different timestamps
  list.range(1, 5)
  |> list.each(fn(i) {
    let status_uri =
      "at://did:plc:user1/xyz.statusphere.status/status" <> int.to_string(i)
    let status_json =
      json.object([
        #("status", json.string("Status #" <> int.to_string(i))),
        #(
          "createdAt",
          json.string("2024-01-0" <> int.to_string(i) <> "T12:00:00Z"),
        ),
      ])
      |> json.to_string

    let assert Ok(_) =
      database.insert_record(
        db,
        status_uri,
        "cid_status" <> int.to_string(i),
        "did:plc:user1",
"xyz.statusphere.status", 299 + status_json, 300 + ) 301 + Nil 302 + }) 303 + 304 + // Execute GraphQL query with sortBy createdAt ASC and first:3 305 + let query = 306 + " 307 + { 308 + appBskyActorProfile { 309 + edges { 310 + node { 311 + statuses: xyzStatusphereStatusByDid( 312 + first: 3 313 + sortBy: [{field: \"createdAt\", direction: ASC}] 314 + ) { 315 + totalCount 316 + edges { 317 + node { 318 + status 319 + createdAt 320 + } 321 + } 322 + } 323 + } 324 + } 325 + } 326 + } 327 + " 328 + 329 + let assert Ok(response_json) = 330 + graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 331 + 332 + // With sortBy createdAt ASC, first:3 should return Status 1, 2, 3 (oldest first) 333 + string.contains(response_json, "Status #1") 334 + |> should.be_true 335 + 336 + string.contains(response_json, "Status #2") 337 + |> should.be_true 338 + 339 + string.contains(response_json, "Status #3") 340 + |> should.be_true 341 + 342 + // Should NOT contain Status 4 or 5 343 + string.contains(response_json, "Status #4") 344 + |> should.be_false 345 + 346 + string.contains(response_json, "Status #5") 347 + |> should.be_false 348 + } 349 + 350 + // Test: DID join with where filter on status field 351 + pub fn did_join_where_filter_test() { 352 + // Setup database 353 + let assert Ok(db) = sqlight.open(":memory:") 354 + let assert Ok(_) = database.create_lexicon_table(db) 355 + let assert Ok(_) = database.create_record_table(db) 356 + let assert Ok(_) = database.create_actor_table(db) 357 + 358 + // Insert lexicons 359 + let status_lexicon = create_status_lexicon() 360 + let profile_lexicon = create_profile_lexicon() 361 + let assert Ok(_) = 362 + database.insert_lexicon(db, "xyz.statusphere.status", status_lexicon) 363 + let assert Ok(_) = 364 + database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon) 365 + 366 + // Insert a profile 367 + let profile_uri = "at://did:plc:user1/app.bsky.actor.profile/self" 368 + let profile_json = 369 + 
json.object([#("displayName", json.string("User One"))]) 370 + |> json.to_string 371 + 372 + let assert Ok(_) = 373 + database.insert_record( 374 + db, 375 + profile_uri, 376 + "cid_profile", 377 + "did:plc:user1", 378 + "app.bsky.actor.profile", 379 + profile_json, 380 + ) 381 + 382 + // Insert 5 statuses: 3 containing "gleam", 2 not 383 + let statuses = [ 384 + #("Status about gleam programming", "2024-01-01T12:00:00Z"), 385 + #("Random post", "2024-01-02T12:00:00Z"), 386 + #("Learning gleam today", "2024-01-03T12:00:00Z"), 387 + #("Another random post", "2024-01-04T12:00:00Z"), 388 + #("Gleam is awesome", "2024-01-05T12:00:00Z"), 389 + ] 390 + 391 + list.index_map(statuses, fn(status_data, idx) { 392 + let #(status_text, created_at) = status_data 393 + let i = idx + 1 394 + let status_uri = 395 + "at://did:plc:user1/xyz.statusphere.status/status" <> int.to_string(i) 396 + let status_json = 397 + json.object([ 398 + #("status", json.string(status_text)), 399 + #("createdAt", json.string(created_at)), 400 + ]) 401 + |> json.to_string 402 + 403 + let assert Ok(_) = 404 + database.insert_record( 405 + db, 406 + status_uri, 407 + "cid_status" <> int.to_string(i), 408 + "did:plc:user1", 409 + "xyz.statusphere.status", 410 + status_json, 411 + ) 412 + Nil 413 + }) 414 + 415 + // Execute GraphQL query with where filter: status contains "gleam" 416 + let query = 417 + " 418 + { 419 + appBskyActorProfile { 420 + edges { 421 + node { 422 + statuses: xyzStatusphereStatusByDid( 423 + sortBy: [{field: \"createdAt\", direction: DESC}] 424 + where: {status: {contains: \"gleam\"}} 425 + ) { 426 + totalCount 427 + edges { 428 + node { 429 + status 430 + createdAt 431 + } 432 + } 433 + } 434 + } 435 + } 436 + } 437 + } 438 + " 439 + 440 + let assert Ok(response_json) = 441 + graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 442 + 443 + // totalCount should be 3 (only statuses containing "gleam") 444 + { 445 + string.contains(response_json, "\"totalCount\":3") 
446 + || string.contains(response_json, "\"totalCount\": 3") 447 + } 448 + |> should.be_true 449 + 450 + // Should contain statuses with "gleam" 451 + string.contains(response_json, "gleam programming") 452 + |> should.be_true 453 + 454 + string.contains(response_json, "Learning gleam") 455 + |> should.be_true 456 + 457 + string.contains(response_json, "Gleam is awesome") 458 + |> should.be_true 459 + 460 + // Should NOT contain "Random post" 461 + string.contains(response_json, "Random post") 462 + |> should.be_false 463 + } 464 + 465 + // Test: Combination of sortBy + where + first 466 + pub fn did_join_sortby_where_first_test() { 467 + // Setup database 468 + let assert Ok(db) = sqlight.open(":memory:") 469 + let assert Ok(_) = database.create_lexicon_table(db) 470 + let assert Ok(_) = database.create_record_table(db) 471 + let assert Ok(_) = database.create_actor_table(db) 472 + 473 + // Insert lexicons 474 + let status_lexicon = create_status_lexicon() 475 + let profile_lexicon = create_profile_lexicon() 476 + let assert Ok(_) = 477 + database.insert_lexicon(db, "xyz.statusphere.status", status_lexicon) 478 + let assert Ok(_) = 479 + database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon) 480 + 481 + // Insert a profile 482 + let profile_uri = "at://did:plc:user1/app.bsky.actor.profile/self" 483 + let profile_json = 484 + json.object([#("displayName", json.string("User One"))]) 485 + |> json.to_string 486 + 487 + let assert Ok(_) = 488 + database.insert_record( 489 + db, 490 + profile_uri, 491 + "cid_profile", 492 + "did:plc:user1", 493 + "app.bsky.actor.profile", 494 + profile_json, 495 + ) 496 + 497 + // Insert 5 statuses: 3 containing "rust", 2 not 498 + let statuses = [ 499 + #("Learning rust basics", "2024-01-01T12:00:00Z"), 500 + #("Random gleam post", "2024-01-02T12:00:00Z"), 501 + #("Rust ownership is cool", "2024-01-03T12:00:00Z"), 502 + #("Another elixir post", "2024-01-04T12:00:00Z"), 503 + #("Rust async programming", 
"2024-01-05T12:00:00Z"), 504 + ] 505 + 506 + list.index_map(statuses, fn(status_data, idx) { 507 + let #(status_text, created_at) = status_data 508 + let i = idx + 1 509 + let status_uri = 510 + "at://did:plc:user1/xyz.statusphere.status/status" <> int.to_string(i) 511 + let status_json = 512 + json.object([ 513 + #("status", json.string(status_text)), 514 + #("createdAt", json.string(created_at)), 515 + ]) 516 + |> json.to_string 517 + 518 + let assert Ok(_) = 519 + database.insert_record( 520 + db, 521 + status_uri, 522 + "cid_status" <> int.to_string(i), 523 + "did:plc:user1", 524 + "xyz.statusphere.status", 525 + status_json, 526 + ) 527 + Nil 528 + }) 529 + 530 + // Execute query: where status contains "rust", sortBy createdAt DESC, first: 2 531 + // Should return the 2 newest rust posts 532 + let query = 533 + " 534 + { 535 + appBskyActorProfile { 536 + edges { 537 + node { 538 + statuses: xyzStatusphereStatusByDid( 539 + first: 2 540 + sortBy: [{field: \"createdAt\", direction: DESC}] 541 + where: {status: {contains: \"rust\"}} 542 + ) { 543 + totalCount 544 + edges { 545 + node { 546 + status 547 + createdAt 548 + } 549 + } 550 + pageInfo { 551 + hasNextPage 552 + } 553 + } 554 + } 555 + } 556 + } 557 + } 558 + " 559 + 560 + let assert Ok(response_json) = 561 + graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 562 + 563 + // totalCount should be 3 (all rust posts) 564 + { 565 + string.contains(response_json, "\"totalCount\":3") 566 + || string.contains(response_json, "\"totalCount\": 3") 567 + } 568 + |> should.be_true 569 + 570 + // Should contain the 2 newest rust posts 571 + string.contains(response_json, "Rust async programming") 572 + |> should.be_true 573 + 574 + string.contains(response_json, "Rust ownership") 575 + |> should.be_true 576 + 577 + // Should NOT contain the oldest rust post 578 + string.contains(response_json, "Learning rust basics") 579 + |> should.be_false 580 + 581 + // Should NOT contain non-rust posts 582 + 
string.contains(response_json, "elixir") 583 + |> should.be_false 584 + 585 + string.contains(response_json, "gleam") 586 + |> should.be_false 587 + 588 + // hasNextPage should be true (1 more rust post available) 589 + { 590 + string.contains(response_json, "\"hasNextPage\":true") 591 + || string.contains(response_json, "\"hasNextPage\": true") 592 + } 593 + |> should.be_true 594 + } 595 + 596 + // Test: Exact query pattern from user - top-level where + nested sortBy 597 + pub fn user_query_pattern_test() { 598 + // Setup database 599 + let assert Ok(db) = sqlight.open(":memory:") 600 + let assert Ok(_) = database.create_lexicon_table(db) 601 + let assert Ok(_) = database.create_record_table(db) 602 + let assert Ok(_) = database.create_actor_table(db) 603 + 604 + // Insert lexicons 605 + let status_lexicon = create_status_lexicon() 606 + let profile_lexicon = create_profile_lexicon() 607 + let assert Ok(_) = 608 + database.insert_lexicon(db, "xyz.statusphere.status", status_lexicon) 609 + let assert Ok(_) = 610 + database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon) 611 + 612 + // Insert 2 profiles with different handles 613 + let profile1_uri = "at://did:plc:user1/app.bsky.actor.profile/self" 614 + let profile1_json = 615 + json.object([#("displayName", json.string("User One"))]) 616 + |> json.to_string 617 + 618 + let assert Ok(_) = 619 + database.insert_record( 620 + db, 621 + profile1_uri, 622 + "cid_profile1", 623 + "did:plc:user1", 624 + "app.bsky.actor.profile", 625 + profile1_json, 626 + ) 627 + 628 + let assert Ok(_) = 629 + database.upsert_actor(db, "did:plc:user1", "chadtmiller.com") 630 + 631 + let profile2_uri = "at://did:plc:user2/app.bsky.actor.profile/self" 632 + let profile2_json = 633 + json.object([#("displayName", json.string("User Two"))]) 634 + |> json.to_string 635 + 636 + let assert Ok(_) = 637 + database.insert_record( 638 + db, 639 + profile2_uri, 640 + "cid_profile2", 641 + "did:plc:user2", 642 + 
"app.bsky.actor.profile", 643 + profile2_json, 644 + ) 645 + 646 + let assert Ok(_) = database.upsert_actor(db, "did:plc:user2", "other.com") 647 + 648 + // Insert statuses for user1 (chadtmiller.com) 649 + let statuses1 = [ 650 + #("First status", "2024-01-01T12:00:00Z"), 651 + #("Second status", "2024-01-02T12:00:00Z"), 652 + #("Third status", "2024-01-03T12:00:00Z"), 653 + ] 654 + 655 + list.index_map(statuses1, fn(status_data, idx) { 656 + let #(status_text, created_at) = status_data 657 + let i = idx + 1 658 + let status_uri = 659 + "at://did:plc:user1/xyz.statusphere.status/status" <> int.to_string(i) 660 + let status_json = 661 + json.object([ 662 + #("status", json.string(status_text)), 663 + #("createdAt", json.string(created_at)), 664 + ]) 665 + |> json.to_string 666 + 667 + let assert Ok(_) = 668 + database.insert_record( 669 + db, 670 + status_uri, 671 + "cid_status1_" <> int.to_string(i), 672 + "did:plc:user1", 673 + "xyz.statusphere.status", 674 + status_json, 675 + ) 676 + Nil 677 + }) 678 + 679 + // Insert statuses for user2 (should be filtered out) 680 + let assert Ok(_) = 681 + database.insert_record( 682 + db, 683 + "at://did:plc:user2/xyz.statusphere.status/status1", 684 + "cid_status2_1", 685 + "did:plc:user2", 686 + "xyz.statusphere.status", 687 + json.object([ 688 + #("status", json.string("Other user status")), 689 + #("createdAt", json.string("2024-01-01T12:00:00Z")), 690 + ]) 691 + |> json.to_string, 692 + ) 693 + 694 + // Execute query matching user's pattern: top-level where + nested sortBy 695 + let query = 696 + " 697 + { 698 + appBskyActorProfile(where: {actorHandle: {eq: \"chadtmiller.com\"}}) { 699 + totalCount 700 + edges { 701 + node { 702 + actorHandle 703 + statuses: xyzStatusphereStatusByDid( 704 + first: 12 705 + sortBy: [{field: \"createdAt\", direction: DESC}] 706 + ) { 707 + totalCount 708 + edges { 709 + node { 710 + status 711 + createdAt 712 + } 713 + } 714 + } 715 + } 716 + } 717 + } 718 + } 719 + " 720 + 721 + let 
assert Ok(response_json) = 722 + graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 723 + 724 + // Should only return 1 profile (chadtmiller.com) 725 + { 726 + string.contains(response_json, "\"totalCount\":1") 727 + || string.contains(response_json, "\"totalCount\": 1") 728 + } 729 + |> should.be_true 730 + 731 + // Should contain chadtmiller.com handle 732 + string.contains(response_json, "chadtmiller.com") 733 + |> should.be_true 734 + 735 + // Should NOT contain other.com 736 + string.contains(response_json, "other.com") 737 + |> should.be_false 738 + 739 + // Nested statuses should have totalCount 3 740 + // Count occurrences of totalCount in response 741 + let total_count_occurrences = 742 + string.split(response_json, "\"totalCount\"") 743 + |> list.length 744 + |> fn(x) { x - 1 } 745 + 746 + // Should have 2 totalCount fields (one for profiles, one for statuses) 747 + total_count_occurrences 748 + |> should.equal(2) 749 + 750 + // Statuses should be in DESC order: Third, Second, First 751 + string.contains(response_json, "Third status") 752 + |> should.be_true 753 + 754 + string.contains(response_json, "Second status") 755 + |> should.be_true 756 + 757 + string.contains(response_json, "First status") 758 + |> should.be_true 759 + 760 + // Verify order: Third should come before Second 761 + let pos_third = case string.split(response_json, "Third status") { 762 + [before, ..] -> string.length(before) 763 + [] -> 999_999 764 + } 765 + 766 + let pos_second = case string.split(response_json, "Second status") { 767 + [before, ..] -> string.length(before) 768 + [] -> 999_999 769 + } 770 + 771 + let pos_first = case string.split(response_json, "First status") { 772 + [before, ..] 
-> string.length(before) 773 + [] -> 999_999 774 + } 775 + 776 + { pos_third < pos_second } 777 + |> should.be_true 778 + 779 + { pos_second < pos_first } 780 + |> should.be_true 781 + 782 + // Should NOT contain other user's status 783 + string.contains(response_json, "Other user status") 784 + |> should.be_false 785 + }
+578
server/test/paginated_join_test.gleam
··· 1 + /// Integration tests for paginated joins (connections) 2 + /// 3 + /// Tests verify that: 4 + /// - DID joins return paginated connections with first/after/last/before 5 + /// - Reverse joins return paginated connections 6 + /// - PageInfo is correctly populated 7 + /// - Cursors work for pagination 8 + import database 9 + import gleam/int 10 + import gleam/json 11 + import gleam/list 12 + import gleam/string 13 + import gleeunit/should 14 + import graphql_gleam 15 + import sqlight 16 + 17 + // Helper to create a post lexicon JSON 18 + fn create_post_lexicon() -> String { 19 + json.object([ 20 + #("lexicon", json.int(1)), 21 + #("id", json.string("app.bsky.feed.post")), 22 + #( 23 + "defs", 24 + json.object([ 25 + #( 26 + "main", 27 + json.object([ 28 + #("type", json.string("record")), 29 + #("key", json.string("tid")), 30 + #( 31 + "record", 32 + json.object([ 33 + #("type", json.string("object")), 34 + #( 35 + "required", 36 + json.array([json.string("text")], of: fn(x) { x }), 37 + ), 38 + #( 39 + "properties", 40 + json.object([ 41 + #( 42 + "text", 43 + json.object([ 44 + #("type", json.string("string")), 45 + #("maxLength", json.int(300)), 46 + ]), 47 + ), 48 + ]), 49 + ), 50 + ]), 51 + ), 52 + ]), 53 + ), 54 + ]), 55 + ), 56 + ]) 57 + |> json.to_string 58 + } 59 + 60 + // Helper to create a like lexicon JSON with subject field 61 + fn create_like_lexicon() -> String { 62 + json.object([ 63 + #("lexicon", json.int(1)), 64 + #("id", json.string("app.bsky.feed.like")), 65 + #( 66 + "defs", 67 + json.object([ 68 + #( 69 + "main", 70 + json.object([ 71 + #("type", json.string("record")), 72 + #("key", json.string("tid")), 73 + #( 74 + "record", 75 + json.object([ 76 + #("type", json.string("object")), 77 + #( 78 + "required", 79 + json.array([json.string("subject")], of: fn(x) { x }), 80 + ), 81 + #( 82 + "properties", 83 + json.object([ 84 + #( 85 + "subject", 86 + json.object([ 87 + #("type", json.string("string")), 88 + #("format", 
json.string("at-uri")), 89 + ]), 90 + ), 91 + #( 92 + "createdAt", 93 + json.object([ 94 + #("type", json.string("string")), 95 + #("format", json.string("datetime")), 96 + ]), 97 + ), 98 + ]), 99 + ), 100 + ]), 101 + ), 102 + ]), 103 + ), 104 + ]), 105 + ), 106 + ]) 107 + |> json.to_string 108 + } 109 + 110 + // Helper to create a profile lexicon with literal:self key 111 + fn create_profile_lexicon() -> String { 112 + json.object([ 113 + #("lexicon", json.int(1)), 114 + #("id", json.string("app.bsky.actor.profile")), 115 + #( 116 + "defs", 117 + json.object([ 118 + #( 119 + "main", 120 + json.object([ 121 + #("type", json.string("record")), 122 + #("key", json.string("literal:self")), 123 + #( 124 + "record", 125 + json.object([ 126 + #("type", json.string("object")), 127 + #( 128 + "properties", 129 + json.object([ 130 + #( 131 + "displayName", 132 + json.object([#("type", json.string("string"))]), 133 + ), 134 + ]), 135 + ), 136 + ]), 137 + ), 138 + ]), 139 + ), 140 + ]), 141 + ), 142 + ]) 143 + |> json.to_string 144 + } 145 + 146 + // Test: DID join with first:1 returns only 1 result 147 + pub fn did_join_first_one_test() { 148 + // Setup database 149 + let assert Ok(db) = sqlight.open(":memory:") 150 + let assert Ok(_) = database.create_lexicon_table(db) 151 + let assert Ok(_) = database.create_record_table(db) 152 + let assert Ok(_) = database.create_actor_table(db) 153 + 154 + // Insert lexicons 155 + let post_lexicon = create_post_lexicon() 156 + let profile_lexicon = create_profile_lexicon() 157 + let assert Ok(_) = 158 + database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon) 159 + let assert Ok(_) = 160 + database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon) 161 + 162 + // Insert a profile 163 + let profile_uri = "at://did:plc:author/app.bsky.actor.profile/self" 164 + let profile_json = 165 + json.object([#("displayName", json.string("Author"))]) 166 + |> json.to_string 167 + 168 + let assert Ok(_) = 169 + 
database.insert_record( 170 + db, 171 + profile_uri, 172 + "cid_profile", 173 + "did:plc:author", 174 + "app.bsky.actor.profile", 175 + profile_json, 176 + ) 177 + 178 + // Insert 5 posts by the same DID 179 + list.range(1, 5) 180 + |> list.each(fn(i) { 181 + let post_uri = 182 + "at://did:plc:author/app.bsky.feed.post/post" <> int.to_string(i) 183 + let post_json = 184 + json.object([ 185 + #("text", json.string("Post number " <> int.to_string(i))), 186 + ]) 187 + |> json.to_string 188 + 189 + let assert Ok(_) = 190 + database.insert_record( 191 + db, 192 + post_uri, 193 + "cid_post" <> int.to_string(i), 194 + "did:plc:author", 195 + "app.bsky.feed.post", 196 + post_json, 197 + ) 198 + Nil 199 + }) 200 + 201 + // Execute GraphQL query with DID join and first:1 202 + let query = 203 + " 204 + { 205 + appBskyActorProfile { 206 + edges { 207 + node { 208 + uri 209 + appBskyFeedPostByDid(first: 1) { 210 + edges { 211 + node { 212 + uri 213 + text 214 + } 215 + } 216 + pageInfo { 217 + hasNextPage 218 + hasPreviousPage 219 + } 220 + } 221 + } 222 + } 223 + } 224 + } 225 + " 226 + 227 + let assert Ok(response_json) = 228 + graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 229 + 230 + // Verify only 1 post is returned 231 + string.contains(response_json, "\"edges\"") 232 + |> should.be_true 233 + 234 + // Count how many post URIs appear (should be 1) 235 + let post_count = 236 + list.range(1, 5) 237 + |> list.filter(fn(i) { 238 + string.contains( 239 + response_json, 240 + "at://did:plc:author/app.bsky.feed.post/post" <> int.to_string(i), 241 + ) 242 + }) 243 + |> list.length 244 + 245 + post_count 246 + |> should.equal(1) 247 + 248 + // Verify hasNextPage is true (more posts available) 249 + { 250 + string.contains(response_json, "\"hasNextPage\":true") 251 + || string.contains(response_json, "\"hasNextPage\": true") 252 + } 253 + |> should.be_true 254 + } 255 + 256 + // Test: DID join with first:2 returns only 2 results 257 + pub fn 
did_join_first_two_test() { 258 + // Setup database 259 + let assert Ok(db) = sqlight.open(":memory:") 260 + let assert Ok(_) = database.create_lexicon_table(db) 261 + let assert Ok(_) = database.create_record_table(db) 262 + let assert Ok(_) = database.create_actor_table(db) 263 + 264 + // Insert lexicons 265 + let post_lexicon = create_post_lexicon() 266 + let profile_lexicon = create_profile_lexicon() 267 + let assert Ok(_) = 268 + database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon) 269 + let assert Ok(_) = 270 + database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon) 271 + 272 + // Insert a profile 273 + let profile_uri = "at://did:plc:author/app.bsky.actor.profile/self" 274 + let profile_json = 275 + json.object([#("displayName", json.string("Author"))]) 276 + |> json.to_string 277 + 278 + let assert Ok(_) = 279 + database.insert_record( 280 + db, 281 + profile_uri, 282 + "cid_profile", 283 + "did:plc:author", 284 + "app.bsky.actor.profile", 285 + profile_json, 286 + ) 287 + 288 + // Insert 5 posts by the same DID 289 + list.range(1, 5) 290 + |> list.each(fn(i) { 291 + let post_uri = 292 + "at://did:plc:author/app.bsky.feed.post/post" <> int.to_string(i) 293 + let post_json = 294 + json.object([ 295 + #("text", json.string("Post number " <> int.to_string(i))), 296 + ]) 297 + |> json.to_string 298 + 299 + let assert Ok(_) = 300 + database.insert_record( 301 + db, 302 + post_uri, 303 + "cid_post" <> int.to_string(i), 304 + "did:plc:author", 305 + "app.bsky.feed.post", 306 + post_json, 307 + ) 308 + Nil 309 + }) 310 + 311 + // Execute GraphQL query with DID join and first:2 312 + let query = 313 + " 314 + { 315 + appBskyActorProfile { 316 + edges { 317 + node { 318 + uri 319 + appBskyFeedPostByDid(first: 2) { 320 + edges { 321 + node { 322 + uri 323 + text 324 + } 325 + } 326 + pageInfo { 327 + hasNextPage 328 + hasPreviousPage 329 + } 330 + } 331 + } 332 + } 333 + } 334 + } 335 + " 336 + 337 + let assert Ok(response_json) = 338 + 
graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 339 + 340 + // Count how many post URIs appear (should be 2) 341 + let post_count = 342 + list.range(1, 5) 343 + |> list.filter(fn(i) { 344 + string.contains( 345 + response_json, 346 + "at://did:plc:author/app.bsky.feed.post/post" <> int.to_string(i), 347 + ) 348 + }) 349 + |> list.length 350 + 351 + post_count 352 + |> should.equal(2) 353 + 354 + // Verify hasNextPage is true (more posts available) 355 + { 356 + string.contains(response_json, "\"hasNextPage\":true") 357 + || string.contains(response_json, "\"hasNextPage\": true") 358 + } 359 + |> should.be_true 360 + } 361 + 362 + // Test: Reverse join with first:1 returns only 1 result 363 + pub fn reverse_join_first_one_test() { 364 + // Setup database 365 + let assert Ok(db) = sqlight.open(":memory:") 366 + let assert Ok(_) = database.create_lexicon_table(db) 367 + let assert Ok(_) = database.create_record_table(db) 368 + let assert Ok(_) = database.create_actor_table(db) 369 + 370 + // Insert lexicons 371 + let post_lexicon = create_post_lexicon() 372 + let like_lexicon = create_like_lexicon() 373 + let assert Ok(_) = 374 + database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon) 375 + let assert Ok(_) = 376 + database.insert_lexicon(db, "app.bsky.feed.like", like_lexicon) 377 + 378 + // Insert a post 379 + let post_uri = "at://did:plc:author/app.bsky.feed.post/post1" 380 + let post_json = 381 + json.object([#("text", json.string("Great post!"))]) 382 + |> json.to_string 383 + 384 + let assert Ok(_) = 385 + database.insert_record( 386 + db, 387 + post_uri, 388 + "cid_post", 389 + "did:plc:author", 390 + "app.bsky.feed.post", 391 + post_json, 392 + ) 393 + 394 + // Insert 5 likes that reference the post 395 + list.range(1, 5) 396 + |> list.each(fn(i) { 397 + let like_uri = 398 + "at://did:plc:liker" 399 + <> int.to_string(i) 400 + <> "/app.bsky.feed.like/like" 401 + <> int.to_string(i) 402 + let like_json = 403 + json.object([ 404 + 
#("subject", json.string(post_uri)), 405 + #("createdAt", json.string("2024-01-01T12:00:00Z")), 406 + ]) 407 + |> json.to_string 408 + 409 + let assert Ok(_) = 410 + database.insert_record( 411 + db, 412 + like_uri, 413 + "cid_like" <> int.to_string(i), 414 + "did:plc:liker" <> int.to_string(i), 415 + "app.bsky.feed.like", 416 + like_json, 417 + ) 418 + Nil 419 + }) 420 + 421 + // Execute GraphQL query with reverse join and first:1 422 + let query = 423 + " 424 + { 425 + appBskyFeedPost { 426 + edges { 427 + node { 428 + uri 429 + appBskyFeedLikeViaSubject(first: 1) { 430 + edges { 431 + node { 432 + uri 433 + } 434 + } 435 + pageInfo { 436 + hasNextPage 437 + hasPreviousPage 438 + } 439 + } 440 + } 441 + } 442 + } 443 + } 444 + " 445 + 446 + let assert Ok(response_json) = 447 + graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 448 + 449 + // Count how many like URIs appear (should be 1) 450 + let like_count = 451 + list.range(1, 5) 452 + |> list.filter(fn(i) { 453 + string.contains( 454 + response_json, 455 + "at://did:plc:liker" 456 + <> int.to_string(i) 457 + <> "/app.bsky.feed.like/like" 458 + <> int.to_string(i), 459 + ) 460 + }) 461 + |> list.length 462 + 463 + like_count 464 + |> should.equal(1) 465 + 466 + // Verify hasNextPage is true (more likes available) 467 + { 468 + string.contains(response_json, "\"hasNextPage\":true") 469 + || string.contains(response_json, "\"hasNextPage\": true") 470 + } 471 + |> should.be_true 472 + } 473 + 474 + // Test: DID join with no pagination args defaults to first:50 475 + pub fn did_join_default_pagination_test() { 476 + // Setup database 477 + let assert Ok(db) = sqlight.open(":memory:") 478 + let assert Ok(_) = database.create_lexicon_table(db) 479 + let assert Ok(_) = database.create_record_table(db) 480 + let assert Ok(_) = database.create_actor_table(db) 481 + 482 + // Insert lexicons 483 + let post_lexicon = create_post_lexicon() 484 + let profile_lexicon = create_profile_lexicon() 485 + let 
assert Ok(_) = 486 + database.insert_lexicon(db, "app.bsky.feed.post", post_lexicon) 487 + let assert Ok(_) = 488 + database.insert_lexicon(db, "app.bsky.actor.profile", profile_lexicon) 489 + 490 + // Insert a profile 491 + let profile_uri = "at://did:plc:author/app.bsky.actor.profile/self" 492 + let profile_json = 493 + json.object([#("displayName", json.string("Author"))]) 494 + |> json.to_string 495 + 496 + let assert Ok(_) = 497 + database.insert_record( 498 + db, 499 + profile_uri, 500 + "cid_profile", 501 + "did:plc:author", 502 + "app.bsky.actor.profile", 503 + profile_json, 504 + ) 505 + 506 + // Insert 3 posts by the same DID 507 + list.range(1, 3) 508 + |> list.each(fn(i) { 509 + let post_uri = 510 + "at://did:plc:author/app.bsky.feed.post/post" <> int.to_string(i) 511 + let post_json = 512 + json.object([ 513 + #("text", json.string("Post number " <> int.to_string(i))), 514 + ]) 515 + |> json.to_string 516 + 517 + let assert Ok(_) = 518 + database.insert_record( 519 + db, 520 + post_uri, 521 + "cid_post" <> int.to_string(i), 522 + "did:plc:author", 523 + "app.bsky.feed.post", 524 + post_json, 525 + ) 526 + Nil 527 + }) 528 + 529 + // Execute GraphQL query with DID join and NO pagination args (should default to first:50) 530 + let query = 531 + " 532 + { 533 + appBskyActorProfile { 534 + edges { 535 + node { 536 + uri 537 + appBskyFeedPostByDid { 538 + edges { 539 + node { 540 + uri 541 + text 542 + } 543 + } 544 + pageInfo { 545 + hasNextPage 546 + hasPreviousPage 547 + } 548 + } 549 + } 550 + } 551 + } 552 + } 553 + " 554 + 555 + let assert Ok(response_json) = 556 + graphql_gleam.execute_query_with_db(db, query, "{}", Error(Nil), "") 557 + 558 + // All 3 posts should be returned (within default limit of 50) 559 + let post_count = 560 + list.range(1, 3) 561 + |> list.filter(fn(i) { 562 + string.contains( 563 + response_json, 564 + "at://did:plc:author/app.bsky.feed.post/post" <> int.to_string(i), 565 + ) 566 + }) 567 + |> list.length 568 + 569 + 
post_count 570 + |> should.equal(3) 571 + 572 + // Verify hasNextPage is false (no more posts) 573 + { 574 + string.contains(response_json, "\"hasNextPage\":false") 575 + || string.contains(response_json, "\"hasNextPage\": false") 576 + } 577 + |> should.be_true 578 + }
+598
server/test/reverse_join_field_resolution_test.gleam
··· 1 + /// Regression tests for reverse join field resolution bugs 2 + /// 3 + /// Tests verify fixes for: 4 + /// 1. Forward join fields (like itemResolved) available through reverse joins 5 + /// 2. Integer and object fields resolved correctly (not always converted to strings) 6 + /// 3. Nested queries work correctly: profile → galleries → items → photos 7 + import database 8 + import gleam/json 9 + import gleam/string 10 + import gleeunit/should 11 + import graphql_gleam 12 + import sqlight 13 + import gleam/bool 14 + 15 + // Helper to create gallery lexicon 16 + fn create_gallery_lexicon() -> String { 17 + json.object([ 18 + #("lexicon", json.int(1)), 19 + #("id", json.string("social.grain.gallery")), 20 + #( 21 + "defs", 22 + json.object([ 23 + #( 24 + "main", 25 + json.object([ 26 + #("type", json.string("record")), 27 + #("key", json.string("tid")), 28 + #( 29 + "record", 30 + json.object([ 31 + #("type", json.string("object")), 32 + #( 33 + "required", 34 + json.array([json.string("title"), json.string("createdAt")], of: fn( 35 + x, 36 + ) { x }), 37 + ), 38 + #( 39 + "properties", 40 + json.object([ 41 + #( 42 + "title", 43 + json.object([ 44 + #("type", json.string("string")), 45 + #("maxLength", json.int(100)), 46 + ]), 47 + ), 48 + #( 49 + "createdAt", 50 + json.object([ 51 + #("type", json.string("string")), 52 + #("format", json.string("datetime")), 53 + ]), 54 + ), 55 + ]), 56 + ), 57 + ]), 58 + ), 59 + ]), 60 + ), 61 + ]), 62 + ), 63 + ]) 64 + |> json.to_string 65 + } 66 + 67 + // Helper to create gallery item lexicon with position field (integer) 68 + fn create_gallery_item_lexicon() -> String { 69 + json.object([ 70 + #("lexicon", json.int(1)), 71 + #("id", json.string("social.grain.gallery.item")), 72 + #( 73 + "defs", 74 + json.object([ 75 + #( 76 + "main", 77 + json.object([ 78 + #("type", json.string("record")), 79 + #("key", json.string("tid")), 80 + #( 81 + "record", 82 + json.object([ 83 + #("type", json.string("object")), 84 + #( 85 + 
"required", 86 + json.array( 87 + [ 88 + json.string("createdAt"), 89 + json.string("gallery"), 90 + json.string("item"), 91 + ], 92 + of: fn(x) { x }, 93 + ), 94 + ), 95 + #( 96 + "properties", 97 + json.object([ 98 + #( 99 + "createdAt", 100 + json.object([ 101 + #("type", json.string("string")), 102 + #("format", json.string("datetime")), 103 + ]), 104 + ), 105 + #( 106 + "gallery", 107 + json.object([ 108 + #("type", json.string("string")), 109 + #("format", json.string("at-uri")), 110 + ]), 111 + ), 112 + #( 113 + "item", 114 + json.object([ 115 + #("type", json.string("string")), 116 + #("format", json.string("at-uri")), 117 + ]), 118 + ), 119 + #( 120 + "position", 121 + json.object([ 122 + #("type", json.string("integer")), 123 + #("default", json.int(0)), 124 + ]), 125 + ), 126 + ]), 127 + ), 128 + ]), 129 + ), 130 + ]), 131 + ), 132 + ]), 133 + ), 134 + ]) 135 + |> json.to_string 136 + } 137 + 138 + // Helper to create photo lexicon 139 + fn create_photo_lexicon() -> String { 140 + json.object([ 141 + #("lexicon", json.int(1)), 142 + #("id", json.string("social.grain.photo")), 143 + #( 144 + "defs", 145 + json.object([ 146 + #( 147 + "main", 148 + json.object([ 149 + #("type", json.string("record")), 150 + #("key", json.string("tid")), 151 + #( 152 + "record", 153 + json.object([ 154 + #("type", json.string("object")), 155 + #( 156 + "required", 157 + json.array([json.string("createdAt")], of: fn(x) { x }), 158 + ), 159 + #( 160 + "properties", 161 + json.object([ 162 + #( 163 + "alt", 164 + json.object([#("type", json.string("string"))]), 165 + ), 166 + #( 167 + "createdAt", 168 + json.object([ 169 + #("type", json.string("string")), 170 + #("format", json.string("datetime")), 171 + ]), 172 + ), 173 + ]), 174 + ), 175 + ]), 176 + ), 177 + ]), 178 + ), 179 + ]), 180 + ), 181 + ]) 182 + |> json.to_string 183 + } 184 + 185 + // Helper to create profile lexicon 186 + fn create_profile_lexicon() -> String { 187 + json.object([ 188 + #("lexicon", json.int(1)), 
189 + #("id", json.string("social.grain.actor.profile")), 190 + #( 191 + "defs", 192 + json.object([ 193 + #( 194 + "main", 195 + json.object([ 196 + #("type", json.string("record")), 197 + #("key", json.string("literal:self")), 198 + #( 199 + "record", 200 + json.object([ 201 + #("type", json.string("object")), 202 + #( 203 + "required", 204 + json.array( 205 + [json.string("displayName"), json.string("createdAt")], 206 + of: fn(x) { x }, 207 + ), 208 + ), 209 + #( 210 + "properties", 211 + json.object([ 212 + #( 213 + "displayName", 214 + json.object([ 215 + #("type", json.string("string")), 216 + #("maxGraphemes", json.int(64)), 217 + ]), 218 + ), 219 + #( 220 + "createdAt", 221 + json.object([ 222 + #("type", json.string("string")), 223 + #("format", json.string("datetime")), 224 + ]), 225 + ), 226 + ]), 227 + ), 228 + ]), 229 + ), 230 + ]), 231 + ), 232 + ]), 233 + ), 234 + ]) 235 + |> json.to_string 236 + } 237 + 238 + /// Test that forward join fields (itemResolved) are available through reverse joins 239 + /// This tests the fix for the circular dependency issue in schema building 240 + pub fn reverse_join_includes_forward_join_fields_test() { 241 + let assert Ok(conn) = sqlight.open(":memory:") 242 + 243 + let assert Ok(_) = database.create_lexicon_table(conn) 244 + let assert Ok(_) = database.create_record_table(conn) 245 + let assert Ok(_) = database.create_actor_table(conn) 246 + 247 + // Insert lexicons 248 + let assert Ok(_) = 249 + database.insert_lexicon(conn, "social.grain.gallery", create_gallery_lexicon()) 250 + let assert Ok(_) = 251 + database.insert_lexicon( 252 + conn, 253 + "social.grain.gallery.item", 254 + create_gallery_item_lexicon(), 255 + ) 256 + let assert Ok(_) = 257 + database.insert_lexicon(conn, "social.grain.photo", create_photo_lexicon()) 258 + 259 + // Create test data 260 + let did1 = "did:test:user1" 261 + let gallery_uri = "at://" <> did1 <> "/social.grain.gallery/gallery1" 262 + let item_uri = "at://" <> did1 <> 
"/social.grain.gallery.item/item1" 263 + let photo_uri = "at://" <> did1 <> "/social.grain.photo/photo1" 264 + 265 + // Insert gallery 266 + let gallery_json = 267 + json.object([ 268 + #("title", json.string("Test Gallery")), 269 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 270 + ]) 271 + |> json.to_string 272 + let assert Ok(_) = 273 + database.insert_record( 274 + conn, 275 + gallery_uri, 276 + "cid1", 277 + did1, 278 + "social.grain.gallery", 279 + gallery_json, 280 + ) 281 + 282 + // Insert photo 283 + let photo_json = 284 + json.object([ 285 + #("alt", json.string("A beautiful sunset")), 286 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 287 + ]) 288 + |> json.to_string 289 + let assert Ok(_) = 290 + database.insert_record( 291 + conn, 292 + photo_uri, 293 + "cid2", 294 + did1, 295 + "social.grain.photo", 296 + photo_json, 297 + ) 298 + 299 + // Insert gallery item linking gallery to photo 300 + let item_json = 301 + json.object([ 302 + #("gallery", json.string(gallery_uri)), 303 + #("item", json.string(photo_uri)), 304 + #("position", json.int(0)), 305 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 306 + ]) 307 + |> json.to_string 308 + let assert Ok(_) = 309 + database.insert_record( 310 + conn, 311 + item_uri, 312 + "cid3", 313 + did1, 314 + "social.grain.gallery.item", 315 + item_json, 316 + ) 317 + 318 + // Query gallery with reverse join to items, then forward join to photos 319 + let query = 320 + "{ 321 + socialGrainGallery { 322 + edges { 323 + node { 324 + title 325 + socialGrainGalleryItemViaGallery { 326 + edges { 327 + node { 328 + uri 329 + itemResolved { 330 + ... 
on SocialGrainPhoto { 331 + uri 332 + alt 333 + } 334 + } 335 + } 336 + } 337 + } 338 + } 339 + } 340 + } 341 + }" 342 + 343 + let assert Ok(response_json) = 344 + graphql_gleam.execute_query_with_db(conn, query, "{}", Error(Nil), "") 345 + 346 + // Verify the response includes the gallery 347 + string.contains(response_json, "Test Gallery") 348 + |> should.be_true 349 + 350 + // Verify the reverse join worked (gallery item is present) 351 + string.contains(response_json, item_uri) 352 + |> should.be_true 353 + 354 + // CRITICAL: Verify itemResolved field exists and resolved the photo 355 + // This tests the fix for forward join fields being available through reverse joins 356 + string.contains(response_json, photo_uri) 357 + |> should.be_true 358 + 359 + string.contains(response_json, "A beautiful sunset") 360 + |> should.be_true 361 + } 362 + 363 + /// Test that integer fields are correctly resolved (not converted to strings) 364 + /// This tests the fix for field value type handling 365 + pub fn integer_field_resolves_correctly_test() { 366 + let assert Ok(conn) = sqlight.open(":memory:") 367 + let assert Ok(_) = database.create_lexicon_table(conn) 368 + let assert Ok(_) = database.create_record_table(conn) 369 + let assert Ok(_) = database.create_actor_table(conn) 370 + 371 + let assert Ok(_) = 372 + database.insert_lexicon( 373 + conn, 374 + "social.grain.gallery.item", 375 + create_gallery_item_lexicon(), 376 + ) 377 + 378 + let did1 = "did:test:user1" 379 + let gallery_uri = "at://" <> did1 <> "/social.grain.gallery/gallery1" 380 + let item_uri = "at://" <> did1 <> "/social.grain.gallery.item/item1" 381 + let photo_uri = "at://" <> did1 <> "/social.grain.photo/photo1" 382 + 383 + // Insert gallery item with position = 42 384 + let item_json = 385 + json.object([ 386 + #("gallery", json.string(gallery_uri)), 387 + #("item", json.string(photo_uri)), 388 + #("position", json.int(42)), 389 + #("createdAt", json.string("2024-01-01T00:00:00.000Z")), 390 + ]) 391 + 
|> json.to_string 392 + 393 + let assert Ok(_) = 394 + database.insert_record( 395 + conn, 396 + item_uri, 397 + "cid1", 398 + did1, 399 + "social.grain.gallery.item", 400 + item_json, 401 + ) 402 + 403 + let query = 404 + "{ 405 + socialGrainGalleryItem { 406 + edges { 407 + node { 408 + uri 409 + position 410 + } 411 + } 412 + } 413 + }" 414 + 415 + let assert Ok(response_json) = 416 + graphql_gleam.execute_query_with_db(conn, query, "{}", Error(Nil), "") 417 + 418 + // Verify position is returned as integer, not string or null 419 + { string.contains(response_json, "\"position\":42") } 420 + |> bool.or(string.contains(response_json, "\"position\": 42")) 421 + |> should.be_true 422 + 423 + // Ensure it's not returned as null 424 + { string.contains(response_json, "\"position\":null") } 425 + |> bool.or(string.contains(response_json, "\"position\": null")) 426 + |> should.be_false 427 + } 428 + 429 + /// Test complete nested query: profile → galleries → items → photos with sorting 430 + /// This is the actual use case that was failing before the fixes 431 + pub fn nested_query_profile_to_photos_test() { 432 + let assert Ok(conn) = sqlight.open(":memory:") 433 + let assert Ok(_) = database.create_lexicon_table(conn) 434 + let assert Ok(_) = database.create_record_table(conn) 435 + let assert Ok(_) = database.create_actor_table(conn) 436 + 437 + // Insert all lexicons 438 + let assert Ok(_) = 439 + database.insert_lexicon( 440 + conn, 441 + "social.grain.actor.profile", 442 + create_profile_lexicon(), 443 + ) 444 + let assert Ok(_) = 445 + database.insert_lexicon(conn, "social.grain.gallery", create_gallery_lexicon()) 446 + let assert Ok(_) = 447 + database.insert_lexicon( 448 + conn, 449 + "social.grain.gallery.item", 450 + create_gallery_item_lexicon(), 451 + ) 452 + let assert Ok(_) = 453 + database.insert_lexicon(conn, "social.grain.photo", create_photo_lexicon()) 454 + 455 + let did1 = "did:test:alice" 456 + let profile_uri = "at://" <> did1 <> 
"/social.grain.actor.profile/self" 457 + let gallery_uri = "at://" <> did1 <> "/social.grain.gallery/vacation" 458 + let photo1_uri = "at://" <> did1 <> "/social.grain.photo/photo1" 459 + let photo2_uri = "at://" <> did1 <> "/social.grain.photo/photo2" 460 + let item1_uri = "at://" <> did1 <> "/social.grain.gallery.item/item1" 461 + let item2_uri = "at://" <> did1 <> "/social.grain.gallery.item/item2" 462 + 463 + // Insert profile 464 + let assert Ok(_) = 465 + database.insert_record( 466 + conn, 467 + profile_uri, 468 + "cid1", 469 + did1, 470 + "social.grain.actor.profile", 471 + "{\"displayName\":\"Alice\",\"createdAt\":\"2024-01-01T00:00:00.000Z\"}", 472 + ) 473 + 474 + // Insert gallery 475 + let assert Ok(_) = 476 + database.insert_record( 477 + conn, 478 + gallery_uri, 479 + "cid2", 480 + did1, 481 + "social.grain.gallery", 482 + "{\"title\":\"Summer Vacation\",\"createdAt\":\"2024-01-01T00:00:00.000Z\"}", 483 + ) 484 + 485 + // Insert photos 486 + let assert Ok(_) = 487 + database.insert_record( 488 + conn, 489 + photo1_uri, 490 + "cid3", 491 + did1, 492 + "social.grain.photo", 493 + "{\"alt\":\"Beach\",\"createdAt\":\"2024-01-02T00:00:00.000Z\"}", 494 + ) 495 + let assert Ok(_) = 496 + database.insert_record( 497 + conn, 498 + photo2_uri, 499 + "cid4", 500 + did1, 501 + "social.grain.photo", 502 + "{\"alt\":\"Mountains\",\"createdAt\":\"2024-01-03T00:00:00.000Z\"}", 503 + ) 504 + 505 + // Insert gallery items with positions 506 + let assert Ok(_) = 507 + database.insert_record( 508 + conn, 509 + item1_uri, 510 + "cid5", 511 + did1, 512 + "social.grain.gallery.item", 513 + "{\"gallery\":\"" 514 + <> gallery_uri 515 + <> "\",\"item\":\"" 516 + <> photo1_uri 517 + <> "\",\"position\":1,\"createdAt\":\"2024-01-01T00:00:00.000Z\"}", 518 + ) 519 + let assert Ok(_) = 520 + database.insert_record( 521 + conn, 522 + item2_uri, 523 + "cid6", 524 + did1, 525 + "social.grain.gallery.item", 526 + "{\"gallery\":\"" 527 + <> gallery_uri 528 + <> "\",\"item\":\"" 529 + <> 
photo2_uri 530 + <> "\",\"position\":0,\"createdAt\":\"2024-01-01T00:00:00.000Z\"}", 531 + ) 532 + 533 + // The complete nested query that was failing 534 + let query = 535 + "{ 536 + socialGrainActorProfile { 537 + edges { 538 + node { 539 + displayName 540 + socialGrainGalleryByDid { 541 + edges { 542 + node { 543 + title 544 + socialGrainGalleryItemViaGallery( 545 + sortBy: [{ field: \"position\", direction: ASC }] 546 + ) { 547 + edges { 548 + node { 549 + position 550 + itemResolved { 551 + ... on SocialGrainPhoto { 552 + uri 553 + alt 554 + } 555 + } 556 + } 557 + } 558 + } 559 + } 560 + } 561 + } 562 + } 563 + } 564 + } 565 + }" 566 + 567 + let assert Ok(response_json) = 568 + graphql_gleam.execute_query_with_db(conn, query, "{}", Error(Nil), "") 569 + 570 + // Verify all levels of nesting work 571 + string.contains(response_json, "Alice") 572 + |> should.be_true 573 + 574 + string.contains(response_json, "Summer Vacation") 575 + |> should.be_true 576 + 577 + // Verify positions are integers 578 + { string.contains(response_json, "\"position\":0") } 579 + |> bool.or(string.contains(response_json, "\"position\": 0")) 580 + |> should.be_true 581 + 582 + { string.contains(response_json, "\"position\":1") } 583 + |> bool.or(string.contains(response_json, "\"position\": 1")) 584 + |> should.be_true 585 + 586 + // CRITICAL: Verify itemResolved works through the reverse join 587 + string.contains(response_json, photo1_uri) 588 + |> should.be_true 589 + 590 + string.contains(response_json, photo2_uri) 591 + |> should.be_true 592 + 593 + string.contains(response_json, "Beach") 594 + |> should.be_true 595 + 596 + string.contains(response_json, "Mountains") 597 + |> should.be_true 598 + }
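For reference, the integer-field assertion in `integer_field_resolves_correctly_test` amounts to expecting a response shape like the following (a hand-written sketch, not captured output; whitespace and key order may differ):

```json
{
  "data": {
    "socialGrainGalleryItem": {
      "edges": [
        {
          "node": {
            "uri": "at://did:test:user1/social.grain.gallery.item/item1",
            "position": 42
          }
        }
      ]
    }
  }
}
```

The key point is that `position` is emitted as the JSON number `42`, not the string `"42"` or `null`.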
+395
server/test/sorting_enum_validation_test.gleam
··· 1 + /// Integration test for sorting enum validation 2 + /// 3 + /// Verifies that each Connection field uses the correct collection-specific 4 + /// SortFieldInput type with the appropriate enum, fixing the bug where all 5 + /// fields were sharing a single global enum. 6 + import gleam/dict 7 + import gleam/list 8 + import gleam/option 9 + import gleam/result 10 + import gleeunit/should 11 + import graphql/executor 12 + import graphql/schema 13 + import graphql/value 14 + import lexicon_graphql/dataloader 15 + import lexicon_graphql/db_schema_builder 16 + import lexicon_graphql/lexicon_parser 17 + import lexicon_graphql/types 18 + 19 + pub fn sorting_enum_input_types_are_unique_per_collection_test() { 20 + // Test: Each collection should have its own SortFieldInput type 21 + // Using introspection query to verify SocialGrainGalleryItemSortFieldInput exists 22 + 23 + let lexicons = load_social_grain_lexicons() 24 + 25 + // Create a stub fetcher that won't actually be called 26 + let stub_fetcher = fn( 27 + _uri: String, 28 + _params: dataloader.PaginationParams, 29 + ) -> Result( 30 + #( 31 + List(#(value.Value, String)), 32 + option.Option(String), 33 + Bool, 34 + Bool, 35 + option.Option(Int), 36 + ), 37 + String, 38 + ) { 39 + Error("Not implemented for test") 40 + } 41 + 42 + let assert Ok(graphql_schema) = 43 + db_schema_builder.build_schema_with_fetcher( 44 + lexicons, 45 + stub_fetcher, 46 + option.None, 47 + option.None, 48 + option.None, 49 + option.None, 50 + option.None, 51 + option.None, 52 + ) 53 + 54 + // Introspection query to check if SocialGrainGalleryItemSortFieldInput exists 55 + let query = " 56 + { 57 + __type(name: \"SocialGrainGalleryItemSortFieldInput\") { 58 + name 59 + kind 60 + inputFields { 61 + name 62 + type { 63 + name 64 + kind 65 + ofType { 66 + name 67 + kind 68 + } 69 + } 70 + } 71 + } 72 + } 73 + " 74 + 75 + let ctx = schema.context(option.None) 76 + let result = executor.execute(query, graphql_schema, ctx) 77 + 78 + // 
Should successfully execute 79 + result 80 + |> should.be_ok() 81 + 82 + // Verify the type exists and has the correct structure 83 + case result { 84 + Ok(executor.Response(data, errors)) -> { 85 + // Should have no errors 86 + errors 87 + |> should.equal([]) 88 + 89 + case data { 90 + value.Object(fields) -> { 91 + case list.key_find(fields, "__type") { 92 + Ok(value.Object(type_fields)) -> { 93 + // Verify type name 94 + case list.key_find(type_fields, "name") { 95 + Ok(value.String(name)) -> { 96 + name 97 + |> should.equal("SocialGrainGalleryItemSortFieldInput") 98 + } 99 + _ -> should.fail() 100 + } 101 + 102 + // Verify it's an INPUT_OBJECT 103 + case list.key_find(type_fields, "kind") { 104 + Ok(value.String(kind)) -> { 105 + kind 106 + |> should.equal("INPUT_OBJECT") 107 + } 108 + _ -> should.fail() 109 + } 110 + 111 + // Verify it has a "field" input field that uses the correct enum 112 + case list.key_find(type_fields, "inputFields") { 113 + Ok(value.List(input_fields)) -> { 114 + // Find the "field" input field 115 + let field_input = 116 + list.find(input_fields, fn(f) { 117 + case f { 118 + value.Object(field_data) -> { 119 + case list.key_find(field_data, "name") { 120 + Ok(value.String("field")) -> True 121 + _ -> False 122 + } 123 + } 124 + _ -> False 125 + } 126 + }) 127 + 128 + case field_input { 129 + Ok(value.Object(field_data)) -> { 130 + // Check the type is SocialGrainGalleryItemSortField (wrapped in NON_NULL) 131 + case list.key_find(field_data, "type") { 132 + Ok(value.Object(type_data)) -> { 133 + case list.key_find(type_data, "ofType") { 134 + Ok(value.Object(inner_type)) -> { 135 + case list.key_find(inner_type, "name") { 136 + Ok(value.String(enum_name)) -> { 137 + enum_name 138 + |> should.equal("SocialGrainGalleryItemSortField") 139 + } 140 + _ -> should.fail() 141 + } 142 + } 143 + _ -> should.fail() 144 + } 145 + } 146 + _ -> should.fail() 147 + } 148 + } 149 + _ -> should.fail() 150 + } 151 + } 152 + _ -> should.fail() 153 + } 154 
+ } 155 + _ -> should.fail() 156 + } 157 + } 158 + _ -> should.fail() 159 + } 160 + } 161 + _ -> should.fail() 162 + } 163 + } 164 + 165 + pub fn did_join_uses_correct_sort_enum_test() { 166 + // Test: DID join fields should use the SOURCE collection's sort enum 167 + // Query SocialGrainGallery.socialGrainGalleryItemByDid's sortBy argument 168 + 169 + let lexicons = load_social_grain_lexicons() 170 + 171 + // Create a stub fetcher that won't actually be called 172 + let stub_fetcher = fn( 173 + _uri: String, 174 + _params: dataloader.PaginationParams, 175 + ) -> Result( 176 + #( 177 + List(#(value.Value, String)), 178 + option.Option(String), 179 + Bool, 180 + Bool, 181 + option.Option(Int), 182 + ), 183 + String, 184 + ) { 185 + Error("Not implemented for test") 186 + } 187 + 188 + let assert Ok(graphql_schema) = 189 + db_schema_builder.build_schema_with_fetcher( 190 + lexicons, 191 + stub_fetcher, 192 + option.None, 193 + option.None, 194 + option.None, 195 + option.None, 196 + option.None, 197 + option.None, 198 + ) 199 + 200 + // Introspection query to check socialGrainGalleryItemByDid's sortBy argument 201 + let query = " 202 + { 203 + __type(name: \"SocialGrainGallery\") { 204 + fields { 205 + name 206 + args { 207 + name 208 + type { 209 + kind 210 + ofType { 211 + kind 212 + ofType { 213 + name 214 + } 215 + } 216 + } 217 + } 218 + } 219 + } 220 + } 221 + " 222 + 223 + let ctx = schema.context(option.None) 224 + let result = executor.execute(query, graphql_schema, ctx) 225 + 226 + result 227 + |> should.be_ok() 228 + 229 + // Find the socialGrainGalleryItemByDid field and verify its sortBy arg 230 + case result { 231 + Ok(executor.Response(data, errors)) -> { 232 + errors 233 + |> should.equal([]) 234 + 235 + case data { 236 + value.Object(response_fields) -> { 237 + case list.key_find(response_fields, "__type") { 238 + Ok(value.Object(type_fields)) -> { 239 + case list.key_find(type_fields, "fields") { 240 + Ok(value.List(fields)) -> { 241 + // Find
socialGrainGalleryItemByDid field 242 + let did_join_field = 243 + list.find(fields, fn(field) { 244 + case field { 245 + value.Object(field_data) -> { 246 + case list.key_find(field_data, "name") { 247 + Ok(value.String("socialGrainGalleryItemByDid")) -> 248 + True 249 + _ -> False 250 + } 251 + } 252 + _ -> False 253 + } 254 + }) 255 + 256 + case did_join_field { 257 + Ok(value.Object(field_data)) -> { 258 + case list.key_find(field_data, "args") { 259 + Ok(value.List(args)) -> { 260 + // Find sortBy argument 261 + let sortby_arg = 262 + list.find(args, fn(arg) { 263 + case arg { 264 + value.Object(arg_data) -> { 265 + case list.key_find(arg_data, "name") { 266 + Ok(value.String("sortBy")) -> True 267 + _ -> False 268 + } 269 + } 270 + _ -> False 271 + } 272 + }) 273 + 274 + case sortby_arg { 275 + Ok(value.Object(arg_data)) -> { 276 + // Get the input type name: [SocialGrainGalleryItemSortFieldInput!] 277 + case list.key_find(arg_data, "type") { 278 + Ok(value.Object(type_data)) -> { 279 + case list.key_find(type_data, "ofType") { 280 + Ok(value.Object(non_null_data)) -> { 281 + case list.key_find(non_null_data, "ofType") { 282 + Ok(value.Object(input_type_data)) -> { 283 + case list.key_find(input_type_data, "name") { 284 + Ok(value.String(input_type_name)) -> { 285 + // Should use GalleryItem's input type, NOT Gallery's or Favorite's 286 + input_type_name 287 + |> should.equal( 288 + "SocialGrainGalleryItemSortFieldInput", 289 + ) 290 + } 291 + _ -> should.fail() 292 + } 293 + } 294 + _ -> should.fail() 295 + } 296 + } 297 + _ -> should.fail() 298 + } 299 + } 300 + _ -> should.fail() 301 + } 302 + } 303 + _ -> should.fail() 304 + } 305 + } 306 + _ -> should.fail() 307 + } 308 + } 309 + _ -> should.fail() 310 + } 311 + } 312 + _ -> should.fail() 313 + } 314 + } 315 + _ -> should.fail() 316 + } 317 + } 318 + _ -> should.fail() 319 + } 320 + } 321 + _ -> should.fail() 322 + } 323 + } 324 + 325 + // Helper to load social.grain lexicons for testing 326 + fn 
load_social_grain_lexicons() -> List(types.Lexicon) { 327 + let gallery_json = "{ 328 + \"lexicon\": 1, 329 + \"id\": \"social.grain.gallery\", 330 + \"defs\": { 331 + \"main\": { 332 + \"type\": \"record\", 333 + \"key\": \"tid\", 334 + \"record\": { 335 + \"type\": \"object\", 336 + \"required\": [\"title\"], 337 + \"properties\": { 338 + \"title\": {\"type\": \"string\"}, 339 + \"description\": {\"type\": \"string\"} 340 + } 341 + } 342 + } 343 + } 344 + }" 345 + 346 + let gallery_item_json = "{ 347 + \"lexicon\": 1, 348 + \"id\": \"social.grain.gallery.item\", 349 + \"defs\": { 350 + \"main\": { 351 + \"type\": \"record\", 352 + \"key\": \"tid\", 353 + \"record\": { 354 + \"type\": \"object\", 355 + \"required\": [\"gallery\", \"item\", \"position\"], 356 + \"properties\": { 357 + \"gallery\": {\"type\": \"string\"}, 358 + \"item\": {\"type\": \"string\"}, 359 + \"position\": {\"type\": \"integer\"} 360 + } 361 + } 362 + } 363 + } 364 + }" 365 + 366 + let favorite_json = "{ 367 + \"lexicon\": 1, 368 + \"id\": \"social.grain.favorite\", 369 + \"defs\": { 370 + \"main\": { 371 + \"type\": \"record\", 372 + \"key\": \"tid\", 373 + \"record\": { 374 + \"type\": \"object\", 375 + \"required\": [\"subject\"], 376 + \"properties\": { 377 + \"subject\": {\"type\": \"string\"}, 378 + \"createdAt\": {\"type\": \"string\"} 379 + } 380 + } 381 + } 382 + } 383 + }" 384 + 385 + [ 386 + lexicon_parser.parse_lexicon(gallery_json) |> result.unwrap(empty_lexicon()), 387 + lexicon_parser.parse_lexicon(gallery_item_json) 388 + |> result.unwrap(empty_lexicon()), 389 + lexicon_parser.parse_lexicon(favorite_json) |> result.unwrap(empty_lexicon()), 390 + ] 391 + } 392 + 393 + fn empty_lexicon() -> types.Lexicon { 394 + types.Lexicon(id: "empty", defs: types.Defs(main: option.None, others: dict.new())) 395 + }
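Seen from the client side, the schema property verified here means a `sortBy` on the DID join field must use the source collection's own sort input (`SocialGrainGalleryItemSortFieldInput`), not the parent collection's. A sketch (argument shape mirrors the nested-query test above; not an output-verified example):

```graphql
{
  socialGrainGallery {
    edges {
      node {
        title
        socialGrainGalleryItemByDid(
          sortBy: [{ field: "position", direction: ASC }]
        ) {
          edges {
            node {
              position
            }
          }
        }
      }
    }
  }
}
```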