A brief Public Service Announcement before today’s episode: I had the pleasure last night of being part of the tech crew for the Seattle edition (and final stop) on the Circus Now! tour. They are an advocacy group that seeks to have contemporary circus arts recognized as art forms deserving of study and funding, like other forms of theatre. If you compare the state of circus schools in the United States, with its handful of schools and no degree programs, to, say, France with its six hundred circus schools, it’s clear that there is huge untapped potential here. It was a genuine pleasure to meet Duncan Wall, who was our host, and I’m looking forward to reading his book.

I had intended to spend this episode on the history of the relationship between `SelectMany` and the LINQ query expression syntax, but before I do that I want to briefly address a point raised in the comments of the previous episode. We know that the key bits of the monad pattern are (1) the ability to construct a monad value that “wraps” an underlying value, and (2) the ability to apply a function from `A` to `M<R>` to a value of type `M<A>`. The first operation is traditionally called `unit`; the second is traditionally called `bind`, but is called `SelectMany` in LINQ. We saw how you could build `Select` out of only `SelectMany` and a helper method that effectively just calls `unit` — and indeed, you can build a `Select` for any monad because all you need is `unit` and `bind`; this operation is traditionally called `fmap`. I also showed how you could build `Where` — an inefficient `Where`, to be sure — out of `SelectMany` and a helper method, but reader “Medo” reasonably pointed out that there is no *general* monadic `where`, because my implementation actually required *three* operations: `unit`, `bind` and *make an empty sequence*.

This criticism is entirely justified; though every monad has an analog of the sequence monad’s `Select` and `SelectMany`, not every monad has an analog of `Where`.

Let’s see if we can generalize the `Where` pattern. The action of our `WhereHelper` method is a bit tricky to see because it uses `yield return` conditionally. Basically the action of the helper is: if the predicate is true then it uses the `unit` operation; it returns a sequence with one element. If the predicate is false then it returns a sequence with zero elements. **The key fact about a sequence with zero elements is that it can be concatenated onto any sequence without changing it.**

In arithmetic, the number zero has two really interesting properties. First, it is the *additive identity*; when you add zero to any number, the result you get is the number. Second, zero multiplied by anything produces zero. It sure looks like the empty sequence is the additive identity where “sequence addition” is defined as *concatenation of two sequences of the same type*.

Is the empty sequence also a zero in the *multiplicative* sense? What could that even mean? Well, we know how to take the product of two sequences:

from x in sequence1 from y in sequence2 select new {x, y}

We know that is the **Cartesian product** of the two sequences, which is a *product*, and that makes us think about *multiplication*. This is not a coincidence. What if we said:

from x in AnEmptySequence from y in AnyFunction(x) select y

Clearly that has to be an empty sequence no matter what sequence `AnyFunction(x)` returns. How about:

from x in AnySequence from y in AnEmptySequence select y

That also has to be an empty sequence no matter what `AnySequence` is. **The empty sequence is the multiplicative zero of the sequence monad when multiplication is defined as `SelectMany`!**
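Both claims are easy to check directly; the variable names below are just illustrative:

```csharp
using System;
using System.Linq;

var empty = Enumerable.Empty<int>();
var any = new[] { 1, 2, 3 };

// Empty on the left: there are no x values, so no pairs are produced at all.
var left = from x in empty from y in any select y;

// Empty on the right: each x contributes zero y values.
var right = from x in any from y in empty select y;

Console.WriteLine(left.Count());  // 0
Console.WriteLine(right.Count()); // 0
```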

What we’ve found here is a sub-pattern of the monad pattern. An “additive monad” is a monad with five additional rules on top of the existing monad pattern rules:

- There is a zero value for every construction of the monadic type.
- There is a way to add two monads with the same underlying type together. [1. Note that this typically does not have anything whatsoever to do with arithmetic, which can be confusing when thinking about monads that wrap numbers. When we “add” two sequences of ints together in this sense we are concatenating the sequences, not pairing the integers off and adding them together.]
- The zero value is the identity of the add operation. [2. Note that similar to the other monad laws, we require that zero + m, m + zero and m be *logically* the same value, not necessarily that they be *reference equal*.]
- Using bind to apply any function to the zero produces the zero. (Zero multiplied by anything is zero.)
- Using bind to apply the function `a => zero` to any `M<A>` produces the zero. (Anything multiplied by zero is zero.)

So we can produce an analog of `Where` for any additive monad; the helper method returns the zero if the predicate is false.
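For instance, taking null as the zero of the nullable monad, a `Where` analog falls out directly. (The name `WhereNullable` is mine; this is a sketch, not a BCL method.)

```csharp
using System;

// Hypothetical Where for the Nullable monad: null is the zero, so the
// helper returns the zero when the predicate is false.
T? WhereNullable<T>(T? source, Func<T, bool> predicate) where T : struct =>
    source.HasValue && predicate(source.Value) ? source : null;

int? kept = WhereNullable((int?)5, x => x > 3);     // 5
int? filtered = WhereNullable((int?)5, x => x < 3); // null
```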

I’ll leave you with an exercise:

Of the five monadic types we’ve been looking at in this series — `OnDemand<T>`, `Task<T>`, `Nullable<T>`, `Lazy<T>` and `IEnumerable<T>` — which ones are additive monads? Of those, what are their zero and monadic addition operations? [3. Reader Joker_vD gives some of the answers in the comments to the previous episode, so don’t read those if you want to avoid spoilers.]

**Next time on FAIC:** SelectMany in LINQ, as originally planned.

SPOILER ALERT:

The answer, I believe, is Nullable and IEnumerable. Addition for the Nullable monad is the coalesce operator. Not coincidentally, these are the two monads in the list that deal with multiplicity of values (0..1 and 0..Many).
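A quick check of the coalesce operator as the nullable monad’s addition, with null as the zero (variable names are illustrative):

```csharp
int? zero = null;
int? x = 42;

// ?? is the monadic addition; null is the additive identity.
int? a = zero ?? x;   // zero + x == x
int? b = x ?? zero;   // x + zero == x
int? c = x ?? (int?)7; // the asymmetry: the left operand wins when non-null
```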

Nice. And yes, if `x` and `y` are, say, nullable ints then we can choose the monadic addition `x ⊕ y` to have the semantics “if `x` is not null then `x`, if `x` is null then `y`”, which, though unfortunately asymmetrical, does meet the requirements for an additive monad. Of course in this definition, the null value is the zero. You are correct to note that `x ⊕ y` is implemented in C# by `x ?? y` when both operands are of the same nullable value type.

I’ve been following (not always fully understanding), but in this case if “AnySequence” actually performed side effects (bad practice, I know), then I’m not sure if you could necessarily “multiply” the “AnEmptySequence” against it.

In your first set:

from x in AnEmptySequence

from y in AnySequence

select y

Nothing would happen (multiply by 0)

But the second set:

from x in AnySequence

from y in AnEmptySequence

select y

Would execute AnySequence and its side effects.

Does this imply anything about the multiplicative sense/identity, or is it simply a good example of why iteration with side effects is bad?

The latter. Query expressions were designed with the idea that the operations in them do not produce side effects. If they do produce (or consume!) side effects then strange things can happen.

I’m not sure I follow what “consume side effects” means, unless it’s consuming input? But I’d still think that was producing a side effect.

Consider a delegate that returns `DateTime.Now`. It does not *produce* a side effect; it does not cause the system clock to change. It *consumes* the side effect.

I think Task could be additive too. If we define the zero value of Task as any faulted Task (i.e. it throws when awaited), and define addition as sequentially awaiting two tasks (effectively ignoring the return value of the first await), then the resulting task throws (= is zero) when either task throws (= was zero).

Ah, but addition has the property that x + zero is x. Your proposal has the property that x + zero is zero.

Right. Next try. Add would await task1 and catch all exceptions. Same with task2. If both tasks have thrown then throw. If no task has thrown, return task1. Else, return the task that did not throw. Multiplication, i.e. Bind would await the original task, pass its return value to the function and return the outcome.

Clarification for add: If both tasks throw OR if both tasks do not throw, return task1. Else return the not throwing task (I was mixing up an async implementation with a normal one)

You could more simply use

task1.ContinueWith(t => task2.Result, TaskContinuationOptions.OnlyOnFaulted);

You’d get the first of those that isn’t null, or the last error. This has a nice parallel to the Nullable monad. I suspect that’s how Haskell implements the Error monad as well.

Great insight, thanks.

It seems that Task is both additive and multiplicative, though each has different zeros. Please correct me if I’m wrong:

The addition operator is Task.WhenAll. The additive identity of Task is a completed Task. A completed Task preserves the semantics of the other task(s), in terms of their propagation of cancellation, faults, and of course time and value.

The multiplication operator is ContinueWith with TaskContinuationOptions.NotOnCanceled specified. The multiplicative zero is therefore a cancelled Task. Cancellation is propagated.

In terms of Where, both “zeros” seem to make sense, depending upon the intended semantics.

For example, the multiplicative zero (a cancelled Task) can be returned from Where when the function returns false in order to propagate cancellation. This is perhaps the most intuitive choice. Applying this kind of Where and returning false implies that you want the asynchronous workflow to end; i.e., you want it cancelled.

To the contrary, the additive identity (a completed Task) could be returned when the function returns false in order to remove the normal Task semantics; namely, the fact that it represents a hot, asynchronous function. This is true because a completed Task represents a cold, synchronous function. Applying this kind of Where and returning false implies that you want the asynchronous workflow’s current notion of time and propagation to end immediately. In terms of the multiplicative zero, binding to this result does seem to produce zero as well, though only in the sense that continuing the workflow after a task has been “filtered” is as if the filtered task never even existed in the first place.

Or am I reaching?

Hmm, small correction: I actually believe that a completed Task represents a “HOT”, synchronous function. So it seems all Tasks are hot. I don’t think that changes my point though.

Actually, I’m no longer sure about my final proposal: that a completed Task is relevant as the result of a Where implementation when its function returns false. The problem is that I’d expect the signature of the predicate function to be `Func<T, bool>`. In other words, it’s invoked with the value of the Task, so it must wait for it to complete. It means that it’s not possible to prematurely terminate the Task simply by returning false. Oh well :)

OnDemand is an additive monad with built-in support in C#. The zero value is null and addition is the + operator (the built-in multicast delegate operation). When the OnDemand is executed, all functions that have been added run, with the last one returning the value.
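Treating `OnDemand<T>` as `Func<T>`, the multicast behavior described here can be demonstrated directly:

```csharp
using System;
using System.Collections.Generic;

var log = new List<string>();
Func<int> first = () => { log.Add("first"); return 1; };
Func<int> second = () => { log.Add("second"); return 2; };

// Multicast + is the addition; invoking the sum runs every target,
// and the invocation's value is the last target's return value.
Func<int> sum = first + second;
int result = sum();           // log: "first", "second"; result: 2

// null is the zero: adding it gives back the original delegate unchanged.
Func<int> unchanged = first + null;
```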

You could define custom zero and addition behavior for Task and Lazy as some posters have attempted, but I don’t believe there is any built in support for these in C#.

For Task, the zero value is a faulted or cancelled task. We add two tasks together with TaskFactory::ContinueWhenAll or Task::WhenAll, for which a faulted or cancelled task is the identity since it’s already completed. If we pass a faulted or cancelled task to bind, then a function `a -> Task<R>` can’t be applied, as there is no `a` to extract from that task.

A solution to an Additive Task Monad would be to define zero as a task that never completes and addition using WhenAny. I believe that this would meet all the requirements of the Additive Monad. Addition of Tasks would logically mean you have two tasks that calculate the same thing and you are taking the result of the fastest one.
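A sketch of that proposal; the helper names `Zero` and `Add` are mine, and this is only one possible encoding:

```csharp
using System;
using System.Threading.Tasks;

// Proposed additive Task monad: the zero is a task that never completes,
// and addition races the two operands, taking whichever finishes first.
Task<T> Zero<T>() => new TaskCompletionSource<T>().Task; // never completes

async Task<T> Add<T>(Task<T> left, Task<T> right) =>
    await await Task.WhenAny(left, right);

// x + zero == x: the race is always won by the task that completes.
int result = await Add(Task.FromResult(42), Zero<int>());
Console.WriteLine(result); // 42
```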

Just to complete the additive monads for the last of the 5 examples: I think the best way to define an additive Lazy monad would be to use the same logic as OnDemand. The Lazy zero would be null and addition would chain non-null Lazy monads together so that each value would be evaluated and buffered, but only the last one returns its value.

So in conclusion, all 5 example monads could be Additive Monads, and the Additive behavior could be defined differently even for the same monad as long as it meets the 5 additive rules.

Except a “Lazy” monad has no null value. You only made it into an additive monad by turning it into a LazyAndNullable monad. You could, however, use ‘exception’ as the empty value like they’ve already done for OnDemand and Task. I’m not sure that using the exceptional state this way is 100% correct, however, since an exception isn’t really a single value, but rather a multiplicity of possible “empty” values. Null is null is null, but there are many possible exceptions that are not equal to each other. Furthermore, is this somewhat contorted construction of an additive monad useful? I could see some situations where this “error-coalesce” additive operator would be useful, but it seems to me that it would be fairly rare.

I agree with you, Kevin. I don’t like the idea of having null be a value of a monad. It was a stretch to make OnDemand or Lazy additive monads, and they probably would not have well-behaved characteristics if bind was used on a function that returned null as the result monad and then addition was also used. It was more of an interesting exercise to see what sort of additive properties made any sense. I always thought the + operation on delegates was very strange in C#, but it is kind of neat to see how closely it matches the additive monad properties.

Isn’t this a ring? Your code makes it seem so, but I didn’t carefully check.