Performing bulk inserts with GORM is straightforward: the `Create` method accepts a slice of structs, so multiple records are inserted in a single multi-row `INSERT` statement rather than one query per record.
```go
package main

import (
	"fmt"

	"gorm.io/driver/sqlite"
	"gorm.io/gorm"
)

type User struct {
	ID   uint `gorm:"primaryKey"`
	Name string
	Age  int
}

func main() {
	db, err := gorm.Open(sqlite.Open("test.db"), &gorm.Config{})
	if err != nil {
		panic("failed to connect database")
	}

	// Create the users table if it does not already exist
	if err := db.AutoMigrate(&User{}); err != nil {
		panic(err)
	}

	// Prepare a slice of users for bulk insert
	users := []User{
		{Name: "John Doe", Age: 30},
		{Name: "Jane Doe", Age: 25},
		{Name: "Sam Smith", Age: 22},
	}

	// Bulk insert: GORM generates a single multi-row INSERT
	result := db.Create(&users)
	if result.Error != nil {
		panic(result.Error)
	}

	// Print the number of records inserted
	fmt.Println("Inserted", result.RowsAffected, "users.")
}
```
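For very large slices, a single multi-row `INSERT` can exceed the database's statement or parameter limits. GORM also provides `CreateInBatches`, which splits the insert into fixed-size chunks. Here is a minimal sketch, reusing the `db` handle and `users` slice from the example above; the batch size of 100 is an arbitrary illustrative choice, not a GORM default:

```go
// Insert the slice in batches of 100 rows per INSERT statement.
// The batch size is illustrative; tune it to your database's limits.
result := db.CreateInBatches(&users, 100)
if result.Error != nil {
	panic(result.Error)
}
fmt.Println("Inserted", result.RowsAffected, "users in batches.")
```

Unless the default transaction is disabled via `SkipDefaultTransaction`, GORM runs all batches inside one transaction, so a failure in any batch rolls back the whole insert.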