Python generators are a special type of iterator that you create with a function containing the yield statement. Generators provide a way to iterate through data without storing the entire dataset in memory at once, which makes them memory efficient and well suited to large datasets.
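The memory savings are easy to observe: a list holds every element at once, while a generator object only holds its iteration state. A minimal sketch (the function name squares and the one-million count are illustrative, and exact sizes vary by Python version):

```python
import sys

def squares(n):
    for i in range(n):
        yield i * i

all_squares = [i * i for i in range(1_000_000)]  # entire dataset in memory
lazy_squares = squares(1_000_000)                # only iteration state in memory

# The list occupies megabytes; the generator object stays tiny
# no matter how many values it will eventually produce.
print(sys.getsizeof(all_squares))
print(sys.getsizeof(lazy_squares))
```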
When a generator function is called, it returns an iterator but does not begin executing immediately. Each time next() is called on that iterator, the generator function runs until it reaches a yield statement, which pauses the function and sends a value back to the caller. The function's state is saved, and execution resumes from that point when next() is called again.
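To make the pause-and-resume behavior concrete, here is a small sketch that drives a generator by hand with next(); the function name countdown is just an illustration:

```python
def countdown(n):
    while n > 0:
        yield n   # execution pauses here, sending n to the caller
        n -= 1    # resumes from this line on the next call to next()

it = countdown(2)
print(next(it))  # 2 -- runs the body until the first yield
print(next(it))  # 1 -- resumes after the yield, loops, yields again
# A third next(it) would raise StopIteration: the generator is exhausted.
```

The for loop does exactly this under the hood, calling next() repeatedly and stopping cleanly when StopIteration is raised.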
Here is a simple example of a generator function that yields the squares of numbers:
def square_generator(n):
    for i in range(n):
        yield i * i

# Using the generator
for square in square_generator(5):
    print(square)