This example shows how to stream a large JSON array using the Gin framework in Go. Streaming lets the server emit the array element by element, so the full payload never has to be buffered in memory.
Go, Gin, Streaming JSON, Large JSON Arrays, REST API
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"

	"github.com/gin-gonic/gin"
)

func main() {
	router := gin.Default()

	// Sample route to stream a large JSON array
	router.GET("/stream-json", func(c *gin.Context) {
		c.Header("Content-Type", "application/json")
		c.Status(http.StatusOK)

		// Gin's Stream takes only a step function; it calls the function
		// repeatedly, flushing after each step, until it returns false
		// or the client disconnects.
		i := 0
		c.Stream(func(w io.Writer) bool {
			if i == 0 {
				w.Write([]byte("["))
			}
			if i < 1000 {
				// Write one JSON object per step
				fmt.Fprintf(w, `{"id":%d,"value":"example"}`, i)
				// Add a comma after each item except the last
				if i < 999 {
					w.Write([]byte(","))
				}
				i++
				time.Sleep(10 * time.Millisecond) // Simulate delay
				return true // More data to send
			}
			w.Write([]byte("]"))
			return false // No more data to send
		})
	})

	router.Run(":8080")
}