In Go, you can deduplicate a slice by using a map as a set: map keys are unique, so inserting each element eliminates duplicates. One caveat: iterating over a map yields keys in a non-deterministic order, so to keep the elements in their original order you should check the map while walking the slice rather than ranging over the map afterward. Below is an example demonstrating this.
package main

import (
	"fmt"
)

// deduplicate returns a new slice with duplicates removed,
// preserving the order of first occurrence.
func deduplicate(slice []int) []int {
	seen := make(map[int]struct{}, len(slice))
	result := make([]int, 0, len(slice))
	for _, item := range slice {
		if _, ok := seen[item]; !ok {
			seen[item] = struct{}{}
			result = append(result, item)
		}
	}
	return result
}

func main() {
	nums := []int{1, 2, 2, 3, 4, 4, 5}
	uniqueNums := deduplicate(nums)
	fmt.Println(uniqueNums) // Output: [1 2 3 4 5]
}