In C++, serializing and deserializing a `std::set` is a common requirement when data must be saved to a file or sent over a network. Because `std::set` stores its elements in sorted order, writing them out and reading them back in is straightforward: the rebuilt set will contain the same elements in the same order. The example below shows how to do this using only the C++ standard library.
#include <iostream>
#include <set>
#include <fstream>
#include <string>

// Serialize a set to a text file, one element per line
void serializeSet(const std::set<int>& s, const std::string& filename) {
    std::ofstream ofs(filename);
    for (const auto& item : s) {
        ofs << item << '\n';
    }
}

// Deserialize a set from a text file
std::set<int> deserializeSet(const std::string& filename) {
    std::set<int> s;
    std::ifstream ifs(filename);
    int item;
    while (ifs >> item) {
        s.insert(item);
    }
    return s;
}

int main() {
    std::set<int> originalSet = {1, 2, 3, 4, 5};
    std::string filename = "set_data.txt";

    // Serialize the set
    serializeSet(originalSet, filename);

    // Deserialize the set into a new object
    std::set<int> newSet = deserializeSet(filename);

    // Output the contents of the new set
    for (const auto& item : newSet) {
        std::cout << item << " ";
    }
    std::cout << std::endl;
    return 0;
}