Category theory and abstract algebra deal with the way functions can be combined with other functions. Complexity theory deals with how hard a function is to compute. It's weird to me that I haven't seen anyone merge these fields of study, since they seem like such natural pairs. Has anyone done this before?
As a motivating example, let's take a look at semigroups. Semigroups have an associative binary operation. It's well known that operations in semigroups can be parallelized due to the associativity. For example, if there are a billion numbers we want to add together, we could break the problem into ten subproblems of adding 100 million numbers each, then add the ten partial results.
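As a quick sketch of this idea (the helper name `chunked_sum` and the choice of ten chunks are my own, not anything standard), associativity is exactly what makes the regrouping valid:

```python
from concurrent.futures import ThreadPoolExecutor

def chunked_sum(xs, n_chunks=10):
    """Split xs into n_chunks pieces, sum each piece independently,
    then combine the partial results. Associativity of + is what
    guarantees the regrouped sum equals the sequential one."""
    if not xs:
        return 0
    size = (len(xs) + n_chunks - 1) // n_chunks
    chunks = [xs[i:i + size] for i in range(0, len(xs), size)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(sum, chunks))  # the ten subproblems
    return sum(partials)  # combine the partial sums
```

For example, `chunked_sum(list(range(1000)))` agrees with `sum(range(1000))`; the same shape of code works for any associative operation, not just `+`.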
But parallelizing the addition operator makes sense only because it can be computed in constant time and space. What if this weren't the case? For example, lists with the append operation form a common semigroup in functional programming, but appending takes $O(n)$ time and space, where $n$ is the length of the input lists.
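To make the contrast concrete, here is a hypothetical Python sketch (the function `append_cost` is my own illustration) that counts how many elements a left-to-right fold of appends copies. Unlike repeated integer addition, each append's cost grows with the accumulated length:

```python
def append_cost(lists):
    """Left-fold concatenation, counting elements copied.
    Each `total + xs` copies both operands, so the i-th append
    costs (accumulated length) + len(xs) -- in contrast to
    integer +, which is effectively constant time."""
    total, cost = [], 0
    for xs in lists:
        cost += len(total) + len(xs)  # O(n) copy per append
        total = total + xs
    return total, cost
```

Folding ten lists of length five this way copies $\sum_{i=0}^{9}(5i+5) = 275$ elements to build a result of length $50$, and the quadratic blow-up only gets worse as the inputs grow.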
It seems like there should be a convenient way to describe the differences between these two semigroups. In general, it seems like algebras that took into account the complexity of their operations would be very useful to computer scientists like myself.
Best Answer
It seems that the idea of growth of groups is extremely relevant here. In the example in the question the operation may be viewed as concatenation of words, and so is very similar to the standard way in which we view the free group on $m$ generators $F(\mathbf{x}_m)$. The operation $V\circ W$ takes $O(|V|+|W|)$-time* to evaluate in $F(\mathbf{x}_m)$.
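As a concrete illustration (a hypothetical Python sketch, not part of the answer): concatenating two reduced words, cancelling inverse pairs at the seam, takes time linear in the total word length, matching the $O(|V|+|W|)$ bound.

```python
def concat_reduce(v, w):
    """Concatenate two reduced words in a free group, freely
    cancelling x . x^{-1} pairs at the boundary. A letter is a
    (generator, exponent) pair with exponent +1 or -1.
    Runs in O(|v| + |w|) time."""
    out = list(v)
    for g, e in w:
        if out and out[-1] == (g, -e):
            out.pop()  # cancel an adjacent inverse pair
        else:
            out.append((g, e))
    return out
```

For instance, $ab \circ b^{-1}a$ reduces to $a^2$: `concat_reduce([('a', 1), ('b', 1)], [('b', -1), ('a', 1)])` yields `[('a', 1), ('a', 1)]`.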
Every finitely-generated group can be constructed by taking such a (finite-rank) free group $F(\mathbf{x}_m)$ and adding relators. These relators correspond to "shortcuts" in our computation. For example, consider the free group $F(a)$ and add the relator $a^3=\epsilon$ (here, $\epsilon$ denotes the empty word) to obtain the group with presentation $\langle a\mid a^3\rangle$. (This group is cyclic of order $3$.) Now, I could evaluate $a^{100}\circ a^{34}$ to get $a^{134}$, but why would I? As I am in my group I know that $a^{100}=a$ and $a^{34}=a$, so $a^{100}\circ a^{34}=a\circ a=a^2$.

The idea of growth of groups is to make sense of these shortcuts computationally. Roughly, the "growth" of a group counts how many group elements lie within distance $n$ of the identity element, where distance is word length with respect to a fixed finite generating set. Crucially, the growth rate does not depend on the choice of finite generating set (up to the natural equivalence of growth functions). Please see the above Wikipedia link for more details. However, I will leave you with some examples of groups and their growth rates (taken and amended from Wikipedia, of course!):

- A finite group has bounded growth.
- $\mathbb{Z}^d$ has polynomial growth of degree $d$.
- A free group of rank at least $2$ (and any group containing one) has exponential growth.
- By Gromov's theorem, a finitely-generated group has polynomial growth if and only if it is virtually nilpotent.
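The shortcut in the cyclic-group example can be sketched in a couple of lines (hypothetical Python, my own illustration): the relator $a^3=\epsilon$ lets us replace word concatenation with exponent arithmetic modulo the order.

```python
def compose(i, j, order=3):
    """Multiply a^i by a^j in the cyclic group <a | a^order = e>.
    The relator lets us reduce exponents mod `order` instead of
    building the length-(i + j) word a^(i + j)."""
    return (i + j) % order
```

So `compose(100, 34)` returns `2`, i.e. $a^{100}\circ a^{34}=a^2$, in constant time rather than time proportional to $134$.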
*This is an easily-computed upper bound. You can probably do something clever to reduce this though.