Iterables for functional transformation
This post continues Map, Filter, Fold with optimized recursion, where basic functional transformations were implemented as simple lazy Iterables using trampolines: Iterables whose iterators don’t produce values until next() is called.
The basic map, filter and fold functions were demonstrated directly on Iterables, rather than reaching for Streams.
As an example:
Function<Integer, Integer> doubleUp = x -> 2 * x;
Predicate<Integer> isEven = x -> 0 == x % 2;
List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);
// Definition of a lazy transformation, as an Iterable, using map and filter
Iterable<Integer> doubleEven = map(doubleUp, filter(isEven, numbers));
// While the terminal fold function takes the Iterable, and
// sums the values, starting with a base accumulator of 0.
// Result becomes: 12 (0 + 4 + 8)
int result = fold((sum, x) -> sum + x,
0,
doubleEven);
Java Streams have their drawbacks: ordering is difficult to understand, a Stream throws an exception when consumed more than once, and so on. Iterables form promises that can be used over and over again.
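The single-use restriction, and the reusability of Iterables, can be demonstrated with the JDK alone; a small self-contained sketch:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.stream.Stream;

public class ReuseDemo {
    public static void main(String[] args) {
        // A Stream is consumed by its first terminal operation
        Stream<Integer> s = Stream.of(1, 2, 3);
        System.out.println(s.count()); // 3
        try {
            s.count(); // second terminal operation on the same Stream
        } catch (IllegalStateException e) {
            System.out.println("stream already operated upon");
        }
        // An Iterable hands out a fresh Iterator on every call
        Iterable<Integer> numbers = Arrays.asList(1, 2, 3);
        Iterator<Integer> a = numbers.iterator();
        Iterator<Integer> b = numbers.iterator();
        System.out.println(a.next() + ", " + b.next()); // 1, 1
    }
}
```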
This is not an attempt to beat Streams, but an example of how you can easily do functional transformation yourself on plain Iterables, like ArrayList, with simple means, as a library you can grow yourself. Functional transformation is at the core of functional style programming, and it’s pretty neat to have control of it, to be able to adjust to potential clashes with other libraries and so on. Everything presented here fits in one small Java source file, and we are not going functional extravaganza.
First, Rest and With
First, rest and with are even more basic in functional programming than map, filter and fold. They go by different names, like car, cdr and cons in traditional Lisps, but they are all basic operations on linked lists.
The first function takes the first value of the linked list, rest takes the rest of the list, everything except the first element, while with associates a new element with the list by placing it in the most natural place: the beginning, the head, of the list.
An Iterable is fundamentally a list, so first returns the first value of an iterator, while rest returns another Iterable with all elements except the first: an Iterable whose iterators omit the first value. Finally, with creates another Iterable whose iterators retrieve the added value before the rest of the elements.
first
First returns the first value of an Iterable.
// a is 1
int a = first(Arrays.asList(1, 2, 3));
// b is 2, as map increases each element
int b = first(map(x -> x + 1, Arrays.asList(1, 2, 3)));
First is very simple: it just creates an iterator and fetches the first value.
// The first value of an Iterable
public static <T>
T first(final Iterable<T> iterable) {
Iterator<T> iterator = iterable.iterator();
return iterator.hasNext()
? iterator.next()
: null;
}
rest
Rest lazily returns the Iterable without its first element.
Iterable<Integer> a = rest(Arrays.asList(1, 2, 3));
// a is [2, 3]
Iterable<Integer> b = rest(map(x->x+1, Arrays.asList(1,2,3)));
// b is [3, 4] due to map
The rest function is also simple, as no traversal happens. It just returns an Iterable whose iterators lazily skip the first element.
// transform to an Iterable where Iterators omit the first element, a lazily shorter Iterable
public static <T>
Iterable<T> rest(Iterable<T> iterable) {
requireNonNull(iterable);
return ()->{
Iterator<T> i = iterable.iterator();
if(!i.hasNext())
return emptyIterator();
i.next();
return i.hasNext() ? i : emptyIterator();
};
}
Funnily enough, this creates a new way to iterate over Iterables, the linked list way: looking at the current value and then at the rest of the list, in contrast to using an Iterator.
// Will print 1, 2 and 3
Iterable<Integer> l = Arrays.asList(1, 2, 3);
Integer v;
while ((v = first(l)) != null) {
println(v);
l = rest(l);
}
with
We can prepend elements to an Iterable as if it were a linked list. Adding a value first is the natural choice, as we would otherwise have to traverse a linked list to put it last.
public static <T>
Iterable<T> with(T t,
Iterable<? extends T> iterable) {
requireNonNull(iterable);
return () -> lazy(withI(t, iterable.iterator())).iterator();
}
private static <T>
Supplier<Continuation<T>> withI(T a, Iterator<? extends T> i) {
return () -> i.hasNext()
? seq(withI(i.next(), i), a)
: done(a);
}
The example below does not alter the ArrayList. The iterable returned from with is a lazy Iterable prepending 0 for all iterators.
// Will print 0, 1, 2 and 3, as a 0 is added before the arrayList
List<Integer> arrayList = Arrays.asList(1, 2, 3);
Iterable<Integer> iterable = with(0, arrayList);
Integer v;
while ((v = first(iterable)) != null) {
println(v);
iterable = rest(iterable);
}
withLast
But since with adds the element lazily, we can just as easily do it at the end, without having to traverse.
public static <T>
Iterable<T> withLast(Iterable<T> iterable, T b){
requireNonNull(iterable);
return ()-> lazy(withLastI( iterable.iterator(), b)).iterator();
}
private static <T> Supplier<Continuation<T>> withLastI(Iterator<T> a, T b){
return ()-> a.hasNext()
? seq(withLastI(a,b), a.next())
: done(b);
}
The added value is simply provided when the original Iterable is exhausted.
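A usage sketch may help. The trampolined helpers above aren’t repeated here; instead this is a simplified, non-trampolined stand-in with the same observable behavior, the extra value appearing only after the wrapped Iterable is exhausted:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.NoSuchElementException;

public class WithLastDemo {
    // Simplified, non-trampolined sketch of withLast: the extra value
    // is handed out only once the wrapped Iterable runs out.
    static <T> Iterable<T> withLast(Iterable<T> iterable, T last) {
        return () -> new Iterator<T>() {
            final Iterator<T> it = iterable.iterator();
            boolean lastEmitted = false;
            public boolean hasNext() { return it.hasNext() || !lastEmitted; }
            public T next() {
                if (it.hasNext()) return it.next();
                if (lastEmitted) throw new NoSuchElementException();
                lastEmitted = true;
                return last;
            }
        };
    }

    public static void main(String[] args) {
        // Prints 1 2 3 4; the original list is untouched
        for (int v : withLast(Arrays.asList(1, 2, 3), 4))
            System.out.print(v + " ");
    }
}
```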
reduce and reductions
To continue with fold, which accumulates values into a scalar value: it is usually called fold when the accumulator value is explicitly supplied, and reduce when the first element is taken as the initial accumulator value.
Java Streams use reduce in two forms:
T reduce(T identity, BinaryOperator<T> accumulator) // behaves like fold
Optional<T> reduce(BinaryOperator<T> accumulator) // needs empty handling
The first form is equivalent to what many functional languages call fold: a base case plus an accumulation function. It’s total and safe. The second form is more like classical reduce: it assumes the first element as base, and returns an Optional to account for emptiness.
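Both Stream forms can be seen side by side in a small self-contained example:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Optional;

public class ReduceForms {
    public static void main(String[] args) {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4);
        // fold-like form: explicit identity, total even for empty input
        int sum = nums.stream().reduce(0, Integer::sum);
        // reduce-like form: first element as base, Optional covers emptiness
        Optional<Integer> maybeSum = nums.stream().reduce(Integer::sum);
        Optional<Integer> none = Collections.<Integer>emptyList().stream().reduce(Integer::sum);
        System.out.println(sum);              // 10
        System.out.println(maybeSum.get());   // 10
        System.out.println(none.isPresent()); // false
    }
}
```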
In this post, we’ve written reduce as:
return fold(f, first(elements), rest(elements));
This gives us a clean value, or null if empty. A risk we consciously accept. But that choice also gives us a cleaner mental model.
If we want to be safe, we can layer Optional on top:
Optional<T> safe = elements.iterator().hasNext()
? Optional.of(reduce(...))
: Optional.empty();
The point is: fold deals in values. Optional deals in absence. Keeping those two concepts separate makes reasoning easier, and lets each abstraction stay true to its role.
The version where the first value in the sequence becomes the accumulator is called reduce. In Java Streams, both are called reduce.
public static <T>
T reduce(BiFunction<? super T, ? super T, ? extends T> f,
Iterable<? extends T> elements) {
return fold(f, first(elements), rest(elements));
}
The first(iterable) above picks the first value from the Iterable, while rest(iterable) creates an Iterable whose iterators discard the first value.
Fold was defined as:
public static <A, V>
A fold(BiFunction<? super A, ? super V, ? extends A> f,
A accumulator,
Iterable<? extends V> elements) {
requireNonNulls(f, elements);
// Accumulate by applying each with the accumulator to the BiFunction
for (V v : elements)
accumulator = f.apply(accumulator, v);
return accumulator;
}
reductions
It’s common to need the values of the folds as a sequence. We call that reductions.
reductions((acc,v)-> acc+v, 0, Arrays.asList(1,2,3,4))
// Creates an Iterable with iterators emitting 0, 1, 3, 6 and 10
The implementation is pretty simple, and nothing similar exists in Streams.
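The closest JDK relative I can point to is Arrays.parallelPrefix, which computes the running fold in place over an array rather than lazily over an Iterable:

```java
import java.util.Arrays;

public class PrefixDemo {
    public static void main(String[] args) {
        int[] values = {1, 2, 3, 4};
        // In-place running fold: each slot becomes the sum so far
        Arrays.parallelPrefix(values, Integer::sum);
        System.out.println(Arrays.toString(values)); // [1, 3, 6, 10]
    }
}
```

Unlike reductions, it mutates its input and cannot work on an infinite sequence.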
public static <A, V>
Iterable<V> reductions(final BiFunction<? super V, ? super A, ? extends V> f,
final V accumulator,
final Iterable<? extends A> elements) {
requireNonNulls(elements, f);
return () -> with(accumulator,
lazy(reductionsI(f, accumulator, elements.iterator()))).iterator();
}
private static <V, A>
Supplier<Continuation<V>> reductionsI(BiFunction<? super V, ? super A, ? extends V> f,
V a,
final Iterator<? extends A> elements) {
return () -> {
if (!elements.hasNext())
return stop();
V acc = f.apply(a, elements.next());
return elements.hasNext()
? seq(reductionsI(f, acc, elements), acc)
: done(acc);
};
}
flatmap
Flatmap, or bind as it is often called, is a little bit trickier to implement. Since the Iterable is some sort of list, we implement flatMap as for list monads: it processes each element to produce another list, and concatenates the results:
flatMap(x -> Arrays.asList( x , x*x),
Arrays.asList(5,6));
// Creates an Iterable with iterators producing 5, 25, 6 and 36
And the implementation is a little bit tricky:
public static <V, A>
Iterable<A> flatMap(Function<? super V, ? extends Iterable<? extends A>> f,
Iterable<? extends V> data) {
requireNonNulls(data, f);
return () -> lazy(flatMapI(f, null, data.iterator())).iterator();
}
private static <V, A>
Supplier<Continuation<A>> flatMapI(Function<? super V, ? extends Iterable<? extends A>> f,
Iterator<? extends A> current,
Iterator<? extends V> data) {
return () -> {
if (current != null && current.hasNext()) {
return seq(flatMapI(f, current, data), current.next());
}
while (data.hasNext()) {
Iterable<? extends A> nextChunk = f.apply(data.next());
if (nextChunk != null) {
Iterator<? extends A> nextIter = nextChunk.iterator();
if (nextIter.hasNext()) {
return recur(flatMapI(f, nextIter, data));
}
}
}
return stop();
};
}
Monads are not complete until you provide both variants of their usage: bind with return, and join with map. We have already used the Java-ish name for bind, flatMap. And return, used to lift a value into the monad, is a reserved word in Java. So we will call them flatMap and unit, versus join and map.
The join and unit variants:
/** Flattens Iterable<Iterable <T>> into Iterable<T> */
public static <T>
Iterable<T> join(Iterable<? extends Iterable<? extends T>> nested) {
requireNonNull(nested);
return flatMap(Function.identity(), nested);
}
/** creates a singleton list of a value, lifting the value into List context */
public static <T>
Iterable<T> unit(T value) {
return Collections.singletonList(value);
}
The function map, which just applies a function and makes the list a functor, is sometimes called fmap. Don’t confuse this with flatMap, which is bind. Java naming is a bit unfortunate for novice Java programmers.
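The difference is easiest to see in the result types; a plain Stream example, used here only for illustration:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MapVsFlatMap {
    public static void main(String[] args) {
        // map keeps the nesting: each element becomes a whole list
        List<List<Integer>> nested = Stream.of(1, 2)
                .map(x -> Arrays.asList(x, x * x))
                .collect(Collectors.toList()); // [[1, 1], [2, 4]]
        // flatMap (bind) flattens one level of nesting away
        List<Integer> flat = Stream.of(1, 2)
                .flatMap(x -> Stream.of(x, x * x))
                .collect(Collectors.toList()); // [1, 1, 2, 4]
        System.out.println(nested);
        System.out.println(flat);
    }
}
```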
examples
Our example already produced monadic values, lists, so we didn’t need unit.
// Creates an Iterable, with iterators producing 5,25,6 and 36
flatMap(x -> Arrays.asList( x , x*x),
Arrays.asList( 5,6));
But if it hadn’t, if the function just returned a simple unwrapped value, we would typically use unit to wrap it:
// Creates an Iterable with iterators producing 25 and 36
flatMap(x -> unit(x * x),
Arrays.asList(5, 6));
And if we used map instead of flatMap, we had to join the result:
Iterable<Iterable<Integer>> mapped = map(x -> Arrays.asList(x, x * x),
Arrays.asList(5,6));
Iterable<Integer> joined = join(mapped);
// joined is an Iterable of iterators producing 5, 25, 6 and 36
partitioning
Partitioning means chunking up the Iterable into an Iterable of Lists. The iterator applies a function to each value in the Iterable, splitting it each time the function returns a new value.
The following produces an Iterable whose iterators yield the lists [1,3], [4], [5,7] and [6], partitioned into consecutive runs of odd and even values.
Iterable<List<Integer>> groups = partitionBy(x -> x % 2,
Arrays.asList(1, 3, 4, 5, 7, 6));
PartitionBy is the most complex implementation here. Note that it’s not easy to create lazy sequences of lazy sequences from a lazy sequence: how do we know we don’t read ahead with the wrong lazy sequence? Hence partitionBy produces a lazy Iterable of concrete Lists.
// Splits an Iterable into consecutive groups based on the result of applying a function.
// Each time the function result changes, a new group (List) is started.
public static <T>
Iterable<List<T>> partitionBy(Function<? super T, Object> function,
Iterable<T> iterable) {
requireNonNulls(function, iterable);
// Returns a lazy Iterable where each group is formed by consecutive elements with the same key.
return () -> lazy(partitionByI(function,
new Group<>(null, null), // Initial empty group, no key yet
iterable.iterator())).iterator();
}
// A simple container to hold the current group key and list of values.
static private
class Group<T> {
final Object key;
final List<T> values;
Group(Object key, List<T> values) {
this.key = key;
this.values = values;
}
}
private static <T>
Supplier<Continuation<List<T>>> partitionByI(Function<? super T, Object> function,
Group<T> acc,
Iterator<T> it) {
return () -> {
if (it.hasNext()) {
T value = it.next();
Object key = function.apply(value);
// If key matches the current group, add value to current group
if (Objects.equals(acc.key, key)) {
List<T> current = acc.values != null ? acc.values : new ArrayList<>();
current.add(value);
return recur(partitionByI(function, new Group<>(key, current), it));
}
// If key changes and we have a previous group, emit it and start a new one
if (acc.values != null) {
List<T> newGroup = new ArrayList<>();
newGroup.add(value);
return seq(
partitionByI(function, new Group<>(key, newGroup), it),
Collections.unmodifiableList(acc.values) // Emit completed group
);
}
// No current group yet (first element), start new group and continue
List<T> firstGroup = new ArrayList<>();
firstGroup.add(value);
return recur(partitionByI(function, new Group<>(key, firstGroup), it));
}
// Iterator is exhausted, emit last group if it exists
return acc.values == null
? stop()
: done(Collections.unmodifiableList(acc.values));
};
}
As an example, what if we want to remove the consecutive runs of odd numbers that start with 5? Partition by odd, filter out the groups starting with 5, and join:
// As before partition by odd
Iterable<List<Integer>> groups = partitionBy(x -> x % 2,
Arrays.asList(1, 3, 4, 5, 7, 6));
// And those funny expressions you need when the type system can't play along:
// an Iterable of List is not an Iterable of Iterable, so manual casting is needed.
Iterable<Iterable<Integer>> safeGroups = map(x -> (Iterable<Integer>) x,
groups);
// Only the groups not starting with 5
Iterable<Iterable<Integer>> filtered = filter(x -> 5 != first(x),
safeGroups);
// join Iterables of [1,3], [4] and [6] to an Iterable of 1, 3, 4 and 6
Iterable<Integer> joined = join(filtered);
Iterators of the joined Iterable will partition and filter as needed, but in concrete chunks for each category.
infinite sequences
Lazy sequences can be infinite. Nothing prevents them from being generated lazily without an ending.
The iterate function iterates a function f over a value x, producing the sequence:
x,
f(x),
f(f(x)),
f(f(f(x)))
...and so on.
public static <T>
Iterable<T> iterate(Function<? super T, ? extends T> f,
T x) {
requireNonNull(f);
return () -> with(x, lazy(iterateI(f, x))).iterator();
}
private static <T>
Supplier<Continuation<T>> iterateI(Function<? super T, ? extends T> f,
T x) {
return () -> {
T n = f.apply(x);
return seq(iterateI(f,n), n);
};
}
Even though many sequences are easily generated with iterate, they are most often consumed with a limit. Take limits the Iterable so that its iterators only produce a certain number of values.
public static <T>
Iterable<T> take(final int num,
final Iterable<T> iterable) {
requireNonNull(iterable);
return () -> num < 1
? emptyIterator()
: lazy(takeI(num, iterable.iterator())).iterator();
}
private static <T>
Supplier<Continuation<T>> takeI(int num,
Iterator<? extends T> iterator) {
return () -> iterator.hasNext() && num > 0
? seq(takeI(num - 1, iterator), iterator.next())
: stop();
}
Take obviously works with any Iterable, not just iterate.
// Generates an Iterable that emits 0, 1, 2, 3 and 4
Iterable<Integer> taken = take(5, iterate(x -> x + 1, 0));
takeWhile and dropWhile
It is very likely that you want to take values until a condition is met, rather than just taking a fixed number of elements.
public static <T>
Iterable<T> takeWhile(Predicate<? super T> pred,
Iterable<T> iterable) {
requireNonNulls(pred,iterable);
return () -> lazy(takeWhileI(pred,
iterable.iterator())).iterator();
}
private static <T>
Supplier<Continuation<T>> takeWhileI(Predicate<? super T> pred,
Iterator<? extends T> iterator) {
return () -> {
if (!iterator.hasNext())
return stop();
T t = iterator.next();
return pred.test(t)
? seq(takeWhileI(pred, iterator), t)
: stop();
};
}
With algorithms that create infinite sequences, it is likely that you want to drop items until a condition is met.
dropWhile
public static <T> Iterable<T> dropWhile(Predicate<? super T> predicate,
Iterable<T> iterable) {
requireNonNulls(iterable, predicate);
return () -> lazy(dropWhileI(predicate,
true,
iterable.iterator())).iterator();
}
// Note recur rather than seq, while dropping elements.
private static <T>
Supplier<Continuation<T>> dropWhileI(Predicate<? super T> predicate,
boolean dropping,
Iterator<T> i){
return ()->{
if(i.hasNext()){
T v=i.next();
return (dropping && predicate.test(v))
? recur(dropWhileI(predicate, true, i))
: seq(dropWhileI(predicate, false, i), v);
}
return stop();
};
}
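For comparison, Streams gained takeWhile and dropWhile with the same semantics in Java 9; a self-contained example:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class TakeDropWhileDemo {
    public static void main(String[] args) {
        // takeWhile truncates an infinite stream at the first failing element
        List<Integer> taken = Stream.iterate(0, x -> x + 1)
                .takeWhile(x -> x < 5)
                .collect(Collectors.toList()); // [0, 1, 2, 3, 4]
        // dropWhile skips the leading matches, then keeps everything,
        // including later elements that match the predicate again
        List<Integer> dropped = Stream.of(1, 2, 3, 4, 1)
                .dropWhile(x -> x < 3)
                .collect(Collectors.toList()); // [3, 4, 1]
        System.out.println(taken);
        System.out.println(dropped);
    }
}
```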
Summary
Java Streams are powerful, but sometimes opaque. Their fixed API and implicit behavior can make certain transformations harder to reason about — especially when laziness, ordering, or reusability matters.
In contrast, building small transformation primitives on top of Iterable gives you full control. The implementations shown here are compact, readable, and easy to maintain. They introduce no dependencies, and live entirely within your own project.
This approach isn’t meant to replace Streams, but to offer a more explicit and adaptable alternative when the situation calls for it. You can inspect the code, understand every step, and modify the behavior to suit your needs — without surprises.
It’s a reminder that core control structures don’t have to be imported. Sometimes, it’s better when they’re just yours.
The implementation is provided as a single source file rather than a library, for you to integrate smoothly and extend to your needs.