In Python, you can serialize dictionaries using the `json` or `pickle` modules, depending on your requirements. The `json` module produces portable, human-readable text but only supports basic types (dicts, lists, strings, numbers, booleans, None), whereas `pickle` can serialize almost any Python object but is Python-specific and should never be used to load untrusted data. Here's an example of both methods:
import json
import pickle
# Example dictionary
data = {'name': 'Alice', 'age': 30, 'city': 'New York'}
# Serialize using json
json_serialized = json.dumps(data)
# Serialize using pickle
pickle_serialized = pickle.dumps(data)
# Optionally, write to a file for persistence
with open('data.json', 'w') as json_file:
    json_file.write(json_serialized)

with open('data.pkl', 'wb') as pickle_file:
    pickle_file.write(pickle_serialized)
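If memory usage matters, note that `json.dump` and `pickle.dump` write directly to a file object, avoiding the intermediate in-memory string or bytes that `dumps` creates. A minimal round-trip sketch (the filenames `data.json` and `data.pkl` are just placeholders):

```python
import json
import pickle

data = {'name': 'Alice', 'age': 30, 'city': 'New York'}

# Stream directly to the file, skipping the intermediate serialized string.
with open('data.json', 'w') as json_file:
    json.dump(data, json_file)

with open('data.pkl', 'wb') as pickle_file:
    pickle.dump(data, pickle_file)

# Round-trip: read the data back and verify it matches.
with open('data.json') as json_file:
    restored_json = json.load(json_file)

with open('data.pkl', 'rb') as pickle_file:
    restored_pickle = pickle.load(pickle_file)

assert restored_json == data
assert restored_pickle == data
```

For large datasets, this streaming form is usually preferable, since the serialized representation never has to exist in memory all at once.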