After my previous MOI post, this one is going to seem a little like I just decided to explore a different subject simply because it has “incidence” in its name. This post introduces incidence algebras, as defined by Rota, which are generally used in algebraic combinatorics. Just to confuse matters, the “incidence” in Incidence Algebra has little to do (at least directly) with the incidence relation of Formal Concept Analysis. But the point is not just to explore another version of “incidence”; the point is that the incidence algebra gives us another way to consider lattices using mathematical machinery that is more computationally useful, or at least familiar.

The “unadulterated” in the title represents my aspiration to an implicit tagline of “Most of the Math, with none of the motivation.” But I am trying to remember to be a good mathematical writer and add definitions and background for the uninitiated; if I happen to miss anything, I would suggest you find a resource on Abstract Algebra (such as this first year graduate course text by Ash). Most of what is needed on posets and Formal Concept Analysis should be in the first post, and I give relevant definitions here. However, this post does have some blatant omissions in the proof department.

Rings, Fields and Algebras

I expect that most computational readers have learned linear algebra, but not necessarily abstract algebra. So let’s fill in the gaps with some definitions.

A ring is a set closed on addition and multiplication, where addition satisfies

  1. $(a + b) + c = a + (b + c)$ (associativity),
  2. $a + b = b + a$ (commutativity),
  3. has identity $0$: $a + 0 = a$, and
  4. inverses $-a$ for $a$: $a + (-a) = 0$; and

multiplication

  1. has identity $1$: $1a = a$, $a1 = a$, and
  2. distributes across addition, $a(b + c) = ab + ac$ and $(a + b)c = ac + bc$.

A ring is associative if multiplication is associative, and commutative if multiplication commutes. We will generally have associative rings, but incidence algebras are non-commutative.

A field is a commutative ring in which every nonzero element has a multiplicative inverse. The integers are a ring, while the real numbers are a field.

A vector space $V$ over a field $K$ is a set with a sum $u + v$ and scalar products $\alpha v$ for $u, v \in V$ and $\alpha \in K$ such that

  1. $\alpha(u + v) = \alpha u + \alpha v$,
  2. $(\alpha + \beta)v = \alpha v + \beta v$,
  3. $(\alpha\beta)v = \alpha(\beta v)$, and
  4. for the unit $1 \in K$, $1v = v$,

for $u, v \in V$ and $\alpha, \beta \in K$.

An algebra $A$ over a field $K$ is a set with sum, product and scalar multiplication such that

  1. $A$ is a ring,
  2. $A$ is a vector space over $K$,
  3. for all $\alpha \in K$ and $a, b \in A$, $\alpha(ab) = (\alpha a)b = a(\alpha b)$.

Incidence Algebra

We are interested mainly in lattices, but the definitions work for posets. Remember that the interval $[x, y] = \{ z \in P \mid x \le z \le y \}$, for elements $x \le y$ of $P$. Let $\mathrm{Int}(P)$ be the set of intervals in $P$. A poset is locally finite if every interval is finite (Rota64).

Now, let $R$ be an associative ring, and $P$ be a locally finite poset. The incidence algebra of $P$ over $R$, $I(P, R)$, is the set of functions mapping intervals of $P$ to $R$, which we write as $f(x, y)$ for $f([x, y])$, with the property that if $x \not\le y$, then $f(x, y) = 0$. The incidence algebra is closed on sums of functions

$$(f + g)(x, y) = f(x, y) + g(x, y),$$

scalar products of functions

$$(\alpha f)(x, y) = \alpha\, f(x, y)$$

for $\alpha \in R$, and products defined as the convolution

$$(f * g)(x, y) = \sum_{x \le z \le y} f(x, z)\, g(z, y).$$

Note that this sum ranges over the interval $[x, y]$, which is finite by the locally finite assumption.
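To make the convolution concrete, here is a minimal Python sketch (my own illustration, not code from the post or any library; the names `f_add`, `f_mul`, and the divisibility example are assumptions). It represents an element of the incidence algebra as a function of two arguments over a finite poset given by an element list and an order predicate:

```python
def f_add(f, g):
    """Pointwise sum: (f + g)(x, y) = f(x, y) + g(x, y)."""
    return lambda x, y: f(x, y) + g(x, y)

def f_scale(alpha, f):
    """Scalar product: (alpha f)(x, y) = alpha * f(x, y)."""
    return lambda x, y: alpha * f(x, y)

def f_mul(f, g, elements, leq):
    """Convolution: (f * g)(x, y) = sum over x <= z <= y of f(x, z) * g(z, y).
    The sum runs over the interval [x, y], which is finite because the poset
    is locally finite (here, simply finite)."""
    return lambda x, y: sum(f(x, z) * g(z, y)
                            for z in elements if leq(x, z) and leq(z, y))

# Example poset: divisibility on {1, ..., 12}.
elements = list(range(1, 13))
leq = lambda x, y: y % x == 0

# The indicator of the order convolved with itself counts the points of an interval.
ind = lambda x, y: 1 if leq(x, y) else 0
count = f_mul(ind, ind, elements, leq)
print(count(1, 12))  # the interval [1, 12] has 6 points: the divisors of 12
```

Nothing here depends on the particular example: any finite poset and any coefficient ring with a sum and product would do.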

This structure is an associative algebra, since the convolution product is associative. But the algebra is not commutative unless any two distinct elements of $P$ are incomparable, which means that all interesting examples are non-commutative.

The identity element is

$$\delta(x, y) = \begin{cases} 1 & \text{if } x = y, \\ 0 & \text{otherwise,} \end{cases}$$

for $x, y \in P$, and the idempotent elements are

$$e_w(x, y) = \begin{cases} 1 & \text{if } x = y = w, \\ 0 & \text{otherwise,} \end{cases}$$

for $w \in P$. An idempotent element satisfies $e_w * e_w = e_w$.
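One can check these claims numerically by continuing the earlier sketch (again my own illustration; it reuses the hypothetical `f_mul`, `elements`, and `leq` from above):

```python
# Reuses f_mul, elements and leq from the sketch above.
delta = lambda x, y: 1 if x == y else 0

def e(w):
    """The idempotent attached to the element w of the poset."""
    return lambda x, y: 1 if x == y == w else 0

f = lambda x, y: x * y if leq(x, y) else 0   # an arbitrary element of the algebra

# delta is a two-sided identity, and e(w) * e(w) = e(w).
left = f_mul(delta, f, elements, leq)
right = f_mul(f, delta, elements, leq)
e6 = f_mul(e(6), e(6), elements, leq)
pairs = [(x, y) for x in elements for y in elements]
assert all(left(x, y) == f(x, y) == right(x, y) for x, y in pairs)
assert all(e6(x, y) == e(6)(x, y) for x, y in pairs)
```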

From the combinatorics perspective, a couple of the elements of the incidence algebra are especially useful. The first is the zeta function $\zeta$, defined by $\zeta(x, y) = 1$ whenever $x \le y$ and is zero, otherwise. The second is the Möbius function $\mu$, which is the convolution inverse of $\zeta$, and generalizes the inclusion-exclusion principle that we see when defining the cardinality of a set union: $|A \cup B| = |A| + |B| - |A \cap B|$.
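For intuition about $\mu$: requiring $\mu * \zeta = \delta$ forces $\mu(x, x) = 1$ and $\mu(x, y) = -\sum_{x \le z < y} \mu(x, z)$ for $x < y$, which gives a direct recursive computation. Here is a sketch in the same style as above (my own; `mobius` and the subset example are not from the post):

```python
from functools import lru_cache

def mobius(elements, leq):
    """Mobius function of a finite poset: the convolution inverse of zeta,
    computed from mu(x, x) = 1 and mu(x, y) = -sum of mu(x, z) over x <= z < y."""
    @lru_cache(maxsize=None)
    def mu(x, y):
        if x == y:
            return 1
        if not leq(x, y):
            return 0
        return -sum(mu(x, z) for z in elements if leq(x, z) and leq(z, y) and z != y)
    return mu

# On the powerset of {a, b, c} ordered by inclusion, mu(S, T) = (-1) ** |T - S|
# for S a subset of T, which is exactly the sign pattern of inclusion-exclusion.
subsets = tuple(frozenset(s) for s in ['', 'a', 'b', 'c', 'ab', 'ac', 'bc', 'abc'])
mu = mobius(subsets, lambda s, t: s <= t)
print(mu(frozenset(), frozenset('abc')))  # -1
```

On the divisibility example from the earlier sketch, the same recursion recovers the classical number theoretic Möbius function as $\mu(1, n)$.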

Representations

We can define a formal representation of an incidence algebra by representing intervals symbolically: using $[x, y]$ as a symbol. We define the product as

$$[x, y] \cdot [u, v] = \begin{cases} [x, v] & \text{if } y = u, \\ 0 & \text{otherwise,} \end{cases}$$

and then the incidence algebra is $R$-linear combinations of intervals

$$f = \sum_{[x, y] \in \mathrm{Int}(P)} f(x, y)\, [x, y].$$

It is an easy switch to thinking of the symbolic intervals as matrices. To do this we have to enumerate the elements of $P$ as $x_1, \ldots, x_n$; then the interval $[x_i, x_j]$ corresponds to a matrix with the $(i, j)$-th entry equal to $1$ and all other entries zero. Then elements of the algebra can be defined as linear combinations like before, with the product being the matrix product.

OK, time for an example. Let’s go simple and use the powerset lattice of the three element set $\{a, b, c\}$, which can be drawn as shown here.

Powerset lattice for a three element set

To represent this lattice using matrices we have to decide how to order the elements. Let’s just use a linear extension of the inclusion order, with a lexicographic order on the elements where $a < b < c$:

$$\emptyset,\ \{a\},\ \{b\},\ \{c\},\ \{a, b\},\ \{a, c\},\ \{b, c\},\ \{a, b, c\}.$$

This gives us an order that we use to determine how elements correspond to rows and columns of the matrix. Here, just using the unit $1$ for the nonzero entries, we get

$$\begin{pmatrix}
1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 \\
0 & 1 & 0 & 0 & 1 & 1 & 0 & 1 \\
0 & 0 & 1 & 0 & 1 & 0 & 1 & 1 \\
0 & 0 & 0 & 1 & 0 & 1 & 1 & 1 \\
0 & 0 & 0 & 0 & 1 & 0 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 & 1 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 1
\end{pmatrix}.$$

In fact, the matrices for a powerset (or Boolean) lattice will always have this upper triangular structure when the rows and columns follow a linear extension of the inclusion ordering.
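As a sanity check on the matrix picture, here is a short sketch (my own, not from the post; the helper names are assumptions) that enumerates the subsets in the linear extension above, prints the zero/one pattern, and verifies that matrix units multiply the way the symbolic intervals do:

```python
from itertools import combinations

# Subsets of {a, b, c} in a linear extension of inclusion: by size, then
# lexicographically with a < b < c.
base = ['a', 'b', 'c']
order = [frozenset(c) for k in range(4) for c in combinations(base, k)]
index = {s: i for i, s in enumerate(order)}
n = len(order)

def unit_matrix(x, y):
    """Matrix for the interval symbol [x, y]: a single 1 in row index(x), column index(y)."""
    m = [[0] * n for _ in range(n)]
    m[index[x]][index[y]] = 1
    return m

def mat_mul(a, b):
    """Ordinary matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# The zero/one pattern of possible nonzero entries is upper triangular.
for x in order:
    print([1 if x <= y else 0 for y in order])

# [x, y] . [y, v] = [x, v], and the product is zero when the endpoints do not match.
x, y, v = frozenset(), frozenset('a'), frozenset('ab')
assert mat_mul(unit_matrix(x, y), unit_matrix(y, v)) == unit_matrix(x, v)
assert mat_mul(unit_matrix(x, v), unit_matrix(y, v)) == [[0] * n for _ in range(n)]
```

The printed rows reproduce the matrix displayed above.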

The fact we can switch into matrix representations is cool enough, but let’s pull in graphs too. This is going to seem like I just want to reference my dissertation (Keller97) in a post, but really that’s not (exactly) why. We are going to look at a different representation of the incidence algebra that we can also use computationally, which is something that the reference to my dissertation will hint at (but not directly address).

For this, let $K$ be a field and $\Gamma$ a directed graph; then the path algebra $K\Gamma$ consists of $K$-linear combinations of finite paths in the graph $\Gamma$, where the product of two paths is their concatenation when the first ends at the vertex where the second starts, and is zero otherwise. We can also define the incidence algebra this way.

First, remember that $x \prec y$ ($y$ covers $x$) means that $x < y$ and there is no $z$ such that $x < z < y$. Given a locally finite poset $P$, we can create a covering graph $C(P)$. The covering graph is a simple directed graph where we have an arc from $x$ to $y$ whenever $x \prec y$. This is the graph that we draw as the Hasse diagram of the lattice, except now we draw labeled arcs, and the vertices may be labeled differently. For our simple lattice, we would have the graph:

Graph of powerset lattice of three elements

For an interval $[x, y]$ of $P$, there is a subgraph of $C(P)$ consisting of the paths from $x$ to $y$. We use this set of paths to represent the interval as a sum of paths from $x$ to $y$. (Again, it is useful to order the terms, but we can skip that for now.) For instance, for the interval $[\{a\}, \{a, b, c\}]$ in the lattice, we have

$$[\{a\}, \{a, b, c\}] = (\{a\} \to \{a, b\} \to \{a, b, c\}) + (\{a\} \to \{a, c\} \to \{a, b, c\}).$$
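To tie the two pictures together computationally, here is one more sketch (again my own; `covers` and `paths` are hypothetical helpers) that builds the covering graph of the powerset lattice and enumerates the paths representing an interval:

```python
from itertools import combinations

base = ['a', 'b', 'c']
elements = [frozenset(c) for k in range(4) for c in combinations(base, k)]

def covers(x, y):
    """True when y covers x in the inclusion order."""
    return x < y and not any(x < z < y for z in elements)

# Arcs of the covering graph C(P): x -> y whenever y covers x.
arcs = {x: [y for y in elements if covers(x, y)] for x in elements}

def paths(x, y):
    """All directed paths from x to y in the covering graph."""
    if x == y:
        return [[x]]
    return [[x] + rest for z in arcs[x] for rest in paths(z, y)]

def show(s):
    return '{' + ','.join(sorted(s)) + '}' if s else '{}'

# The interval [{a}, {a,b,c}] is the sum of two paths, one through {a,b}
# and one through {a,c}.
for p in paths(frozenset('a'), frozenset('abc')):
    print(' -> '.join(show(s) for s in p))
```

Running this prints the two chains, one through $\{a, b\}$ and one through $\{a, c\}$.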

Epilogue

My goal is to eventually explain how we can compute over the Galois connection(s) derived from an incidence relation to discover dependencies within a data set viewed this way. We could stay in the lattice theoretic framework, but stepping sideways into an incidence algebra can help us by switching into settings where definitions/computations are already established – we just have to squint the right way. Things that could be more natural from this perspective include defining (information) measures over a concept lattice, studying the Galois connection as an algebraic object, and using constructions within the incidence algebra to suggest different algorithmic strategies. I’m not sure I will get to either of the latter two in these notes, but I will at least hint at them.

Before proceeding with incidence algebras, I will switch back to the concept analysis perspective to look at attribute logic, and relate it back to intervals and dependencies in the lattice.

Rota64 Rota, G.-C. On the foundations of combinatorial theory I: Theory of Möbius functions. Z. Wahrscheinlichkeitstheorie, 2 (1964) 340–368.

Keller97 Keller, B.J. Algorithms and Orders for Finding non-commutative Gröbner Bases, Ph.D. thesis, Virginia Tech, 1997.