In the previous article in our series on understanding transducers through Python, we showed how to support early termination of a reduction operation. This time, we'll demonstrate how transducers can produce more items than they consume. Although this may seem a minor capability, it has important consequences for implementing lazy evaluation of transducers, which is what we'll look at next time.

Consider a transducer `Repeating` which repeats each source item
multiple times into the output:

```
class Repeating:

    def __init__(self, reducer, num_times):
        self._reducer = reducer
        self._num_times = num_times

    def initial(self):
        return self._reducer.initial()

    def step(self, result, item):
        for _ in range(self._num_times):
            result = self._reducer.step(result, item)
        return result

    def complete(self, result):
        return self._reducer.complete(result)


def repeating(num_times):
    if num_times < 0:
        raise ValueError("num_times cannot be negative")

    def repeating_transducer(reducer):
        return Repeating(reducer, num_times)

    return repeating_transducer
```

The key point to notice here is that each call to `Repeating.step()`
results in multiple calls to the underlying reducer's
`self._reducer.step()`, thereby injecting more items into the output
series than are received in the input series.
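To see this in isolation, here is a minimal runnable sketch. The `Appending` reducer and `transduce` function below are stand-ins assumed to match the versions from earlier articles in this series, and `Repeating` is repeated from above so the sketch runs standalone:

```
# Minimal stand-ins for helpers from earlier articles (assumed interfaces).
class Appending:
    """A reducer which appends each item to a list."""

    def initial(self):
        return []

    def step(self, result, item):
        result.append(item)
        return result

    def complete(self, result):
        return result


def transduce(transducer, reducer, iterable):
    """Drive a reduction of iterable through a transduced reducer."""
    r = transducer(reducer)
    accumulator = r.initial()
    for item in iterable:
        accumulator = r.step(accumulator, item)
    return r.complete(accumulator)


# The Repeating transducer from above, repeated so this sketch is standalone.
class Repeating:

    def __init__(self, reducer, num_times):
        self._reducer = reducer
        self._num_times = num_times

    def initial(self):
        return self._reducer.initial()

    def step(self, result, item):
        # One incoming item becomes num_times outgoing items.
        for _ in range(self._num_times):
            result = self._reducer.step(result, item)
        return result

    def complete(self, result):
        return self._reducer.complete(result)


def repeating(num_times):
    if num_times < 0:
        raise ValueError("num_times cannot be negative")

    def repeating_transducer(reducer):
        return Repeating(reducer, num_times)

    return repeating_transducer


print(transduce(repeating(2), Appending(), [1, 2, 3]))  # [1, 1, 2, 2, 3, 3]
print(transduce(repeating(0), Appending(), [1, 2, 3]))  # [] - zero repeats emit nothing
```

Note that `repeating(1)` behaves as the identity transducer, while `repeating(0)` consumes every item but emits none.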

By composing it with our primality-checking filter from earlier in the series, we can use it to repeat each prime number three times:

```
>>> primes_repeating = compose(filtering(is_prime), repeating(3))
>>> transduce(primes_repeating, Appending(), range(100))
[2, 2, 2, 3, 3, 3, 5, 5, 5, 7, 7, 7, 11, 11, 11, 13, 13, 13, 17, 17,
17, 19, 19, 19, 23, 23, 23, 29, 29, 29, 31, 31, 31, 37, 37, 37, 41,
41, 41, 43, 43, 43, 47, 47, 47, 53, 53, 53, 59, 59, 59, 61, 61, 61,
67, 67, 67, 71, 71, 71, 73, 73, 73, 79, 79, 79, 83, 83, 83, 89, 89,
89, 97, 97, 97]
```

In the next article, we'll see
that although seemingly innocuous, support for item-injecting transducers
such as `Repeating` complicates lazy evaluation quite a bit!