Iterators and generators and the data they generate

fog37:
Hello,
I have been focusing on iterators and generators. I have understood a lot, but I still have some subtle questions...
An iterator, which is a Python object with both the __iter__ and the __next__ methods, saves its current state. Calling the built-in next() function on an iterator gives us one item of data at a time. On the other hand, when a regular Python list is created, all the data in the list is generated at once, which can take a lot of RAM. When an iterator is created, I believe we are essentially saving the "recipe" for how to create the data, and the data is generated one piece at a time, only upon our request. As we ask the iterator for data, step by step using next(), we are not creating and storing all the data in RAM: the iterator does not save the data ahead of time, nor the data it generates (unless we explicitly code for it), correct?
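For instance, here is a toy iterator I could write myself (the class name is made up) that produces squares on demand; as far as I understand, it only ever stores its current position, never the values it has produced:

```python
class Squares:
    """Produces 0, 1, 4, 9, ... for inputs below `limit`."""

    def __init__(self, limit):
        self.limit = limit
        self.current = 0          # the only state kept in RAM

    def __iter__(self):
        return self

    def __next__(self):
        if self.current >= self.limit:
            raise StopIteration   # signals the end of iteration
        value = self.current ** 2
        self.current += 1         # advance the saved state
        return value

squares = Squares(5)
print(next(squares))  # 0 -- computed right now, not stored beforehand
print(list(squares))  # [1, 4, 9, 16] -- remaining items, one at a time
```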
Example: the data the iterator is using may already exist in permanent storage. For instance, there may be a huge text file saved on disk, and the iterator may pick one line at a time from that file without loading the entire file into RAM.
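If I understand correctly, a file object in Python is itself an iterator over lines, so something like this (the filename is hypothetical) should only ever hold one line in RAM at a time:

```python
line_count = 0
with open("huge.txt") as f:   # "huge.txt" is a made-up filename
    for line in f:            # the file object is an iterator over its lines
        line_count += 1       # only the current line is held in RAM
print(line_count)
```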
The iterator may also generate its data dynamically. For example, when we use an iterator to produce an infinite sequence of numbers, we don't actually create those numbers in memory ahead of time, or even save them after they are generated... I believe.
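itertools.count() seems to be exactly this kind of iterator: it can produce numbers forever precisely because nothing is precomputed or retained:

```python
import itertools

naturals = itertools.count(start=0)  # conceptually infinite; nothing is stored
for n in naturals:
    if n > 4:
        break                        # without a break this would run forever
    print(n)                         # prints 0, 1, 2, 3, 4
```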
A generator function is a special type of function with the return statement replaced by the yield statement. Is a generator just a function whose result is an iterator? Is a generator essentially a way to create a custom iterator? Python has built-in lazy objects like map() (which returns an iterator) and range() (which, as I understand it, is a lazy iterable rather than an iterator). We can also turn iterable data structures, like lists, dictionaries, and tuples, into iterators using the built-in iter() function... Are generators a way to create flexible, custom iterators?
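To make my question concrete: as I understand it, calling a generator function does not run its body at all; it returns a generator object, which is an iterator (it has both __iter__ and __next__). A small sketch of what I mean:

```python
def countdown(n):
    """A generator function: yield instead of return."""
    while n > 0:
        yield n   # pause here, hand back n, remember the local state
        n -= 1

gen = countdown(3)       # no code in the body has run yet
print(iter(gen) is gen)  # True -- a generator is its own iterator
print(next(gen))         # 3
print(next(gen))         # 2
print(list(gen))         # [1] -- whatever is left
```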