Serialization and deserialization are crucial processes in C++ for transferring data structures over networks or saving them to files: an object is converted into a format that can be easily stored and reconstructed later. Below is a simple example demonstrating how to serialize and deserialize a custom data structure to JSON using the nlohmann/json library.
// Example of serialization and deserialization in C++
// Requires the nlohmann/json single-header library.
#include <iostream>
#include <fstream>
#include <string>
#include <nlohmann/json.hpp>

using json = nlohmann::json;
struct Person {
    std::string name;
    int age;

    // Serialize this object to a JSON value
    json toJSON() const {
        return json{{"name", name}, {"age", age}};
    }

    // Deserialize a Person from a JSON value
    static Person fromJSON(const json& j) {
        Person p;
        j.at("name").get_to(p.name);
        j.at("age").get_to(p.age);
        return p;
    }
};

int main() {
    // Create a Person object
    Person person{"John Doe", 30};

    // Serialize the object and write it to a file
    json j = person.toJSON();
    std::ofstream file("person.json");
    file << j.dump(4); // dump(4) pretty-prints with 4-space indentation
    file.close();

    // Read the file back and deserialize the object
    std::ifstream inputFile("person.json");
    json j2;
    inputFile >> j2;
    Person newPerson = Person::fromJSON(j2);

    std::cout << "Name: " << newPerson.name << ", Age: " << newPerson.age << std::endl;
    return 0;
}