How do I stream large result sets using pgx?

When working with large result sets in Go using the pgx library, memory management matters. Streaming lets you process each row as it is read from the database instead of loading the entire result set into memory at once, which keeps memory use roughly constant regardless of how many rows the query returns.

The pgx library streams by default: `Conn.Query()` returns a `pgx.Rows` value that reads rows from the connection as you iterate with `rows.Next()`, so each row can be processed and discarded without accumulating the whole result set. Two caveats: the connection is busy until the rows are fully read or closed, and any error that ends the stream early is only reported by `rows.Err()` after the loop.


package main

import (
    "context"
    "fmt"
    "github.com/jackc/pgx/v5"
    "log"
)

func main() {
    ctx := context.Background()

    conn, err := pgx.Connect(ctx, "your-database-url-here")
    if err != nil {
        log.Fatal(err)
    }
    defer conn.Close(ctx)

    // Query returns as soon as the statement is sent; rows are read
    // from the connection lazily as the loop below iterates.
    query := "SELECT id, name FROM large_table"
    rows, err := conn.Query(ctx, query)
    if err != nil {
        log.Fatal(err)
    }
    defer rows.Close()

    // Process one row at a time; only the current row is held in memory.
    for rows.Next() {
        var id int
        var name string
        if err := rows.Scan(&id, &name); err != nil {
            log.Fatal(err)
        }
        fmt.Printf("ID: %d, Name: %s\n", id, name)
    }

    // rows.Err() reports any error encountered during iteration,
    // including one that ended the loop early.
    if err := rows.Err(); err != nil {
        log.Fatal(err)
    }
}