The purpose of this answer is to use the second moment method to make rigorous the heuristic argument of Michael Lugo. (Here is why his argument is only heuristic: If $N$ is a nonnegative integer random variable, such as the number of length-$r$ increasing consecutive sequences in a random permutation of $\{1,2,\ldots,n\}$, knowing that $E[N] \gg 1$ does not imply that $N$ is positive with high probability, because the large expectation could be caused by a rare event in which $N$ is very large.)
Theorem: The expected length of the longest increasing block in a random permutation of $\{1,2,\ldots,n\}$ is $r_0(n) + O(1)$ as $n \to \infty$, where $r_0(n)$ is the smallest positive integer such that $r_0(n)!>n$. ("Block" means consecutive subsequence $a_{i+1},a_{i+2},\ldots,a_{i+r}$ for some $i$ and $r$, with no conditions on the relative sizes of the $a_i$.)
Note: As Michael pointed out, $r_0(n)$ is of order $(\log n)/(\log \log n)$.
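As a quick sanity check (a Python sketch; the helper name `r0` is my own, not from the original argument), one can compute $r_0(n)$ directly and compare it with $(\log n)/(\log\log n)$:

```python
import math

def r0(n):
    """Smallest positive integer r with r! > n."""
    r, fact = 1, 1
    while fact <= n:
        r += 1
        fact *= r
    return r

# r0(n) and (log n)/(log log n) are of the same order of magnitude
for n in (10**3, 10**6, 10**9):
    print(n, r0(n), math.log(n) / math.log(math.log(n)))
```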
Proof of theorem: Let $P_r$ be the probability that there exists an increasing block of length at least $r$. The expected length of the longest increasing block is then $\sum_{r \ge 0} r(P_r-P_{r+1}) = \sum_{r \ge 1} P_r$. We will bound the latter sum from above and below.
Upper bound: The probability $P_r$ is at most the expected number of increasing blocks of length $r$, which is exactly $(n-r+1)/r!$, since for each of the $n-r+1$ values of $i$ in $\{0,\ldots,n-r\}$ the probability that $a_{i+1},\ldots,a_{i+r}$ are in increasing order is $1/r!$. Thus $P_r \le n/r!$. For $r \ge r_0(n)$ this bound shrinks by a factor of at least $2$ at each step, so by comparison with a geometric series of ratio $1/2$ we get $\sum_{r > r_0(n)} P_r \le n/r_0(n)! \le 1$. On the other hand, $\sum_{1 \le r \le r_0(n)} P_r \le \sum_{1 \le r \le r_0(n)} 1 = r_0(n)$, so $\sum_{r \ge 1} P_r \le r_0(n) + 1$.
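The count $(n-r+1)/r!$ can be verified by brute force for small $n$ (a Python sketch; the function name is mine, and the enumeration over all $n!$ permutations is exact rather than sampled):

```python
from itertools import permutations
from math import factorial

def expected_blocks(n, r):
    """Exact expected number of increasing blocks of length r,
    averaged over all n! permutations of {0, ..., n-1}."""
    total = 0
    for p in permutations(range(n)):
        total += sum(1 for i in range(n - r + 1)
                     if all(p[i + k] < p[i + k + 1] for k in range(r - 1)))
    return total / factorial(n)

# both equal (6 - 3 + 1)/3! = 2/3, up to floating point
print(expected_blocks(6, 3), (6 - 3 + 1) / factorial(3))
```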
Lower bound: Here we need the second moment method. For $i \in \{1,\ldots,n-r\}$, let $Z_i$ be $1$ if $a_{i+1}<a_{i+2}<\ldots<a_{i+r}$ and $a_i>a_{i+1}$, and $0$ otherwise. (The added condition $a_i>a_{i+1}$ is a trick to reduce the positive correlation between nearby $Z_i$.) The probability that $a_{i+1}<a_{i+2}<\ldots<a_{i+r}$ is $1/r!$, and the probability that this holds while $a_i>a_{i+1}$ fails, i.e., that $a_i<a_{i+1}<\ldots<a_{i+r}$, is $1/(r+1)!$, so $E[Z_i]=1/r! - 1/(r+1)!$. Let $Z=\sum_{i=1}^{n-r} Z_i$, so
$$E[Z]=(n-r)(1/r! - 1/(r+1)!).$$
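This expectation can likewise be checked by exhaustive enumeration for small $n$ (a Python sketch with my own function name; indices are 0-based, so the indicator $Z_i$ becomes the window $p[i], \ldots, p[i+r]$):

```python
from itertools import permutations
from math import factorial

def expected_Z(n, r):
    """Exact E[Z], where Z counts indices i with a descent p[i] > p[i+1]
    followed by an increasing run p[i+1] < ... < p[i+r]."""
    total = 0
    for p in permutations(range(n)):
        total += sum(1 for i in range(n - r)
                     if p[i] > p[i + 1]
                     and all(p[i + 1 + k] < p[i + 2 + k] for k in range(r - 1)))
    return total / factorial(n)

n, r = 6, 3
# both equal (n - r)(1/r! - 1/(r+1)!) = 0.375, up to floating point
print(expected_Z(n, r), (n - r) * (1 / factorial(r) - 1 / factorial(r + 1)))
```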
Next we compute the second moment $E[Z^2]$ by expanding $Z^2$. If $i=j$, then $E[Z_i Z_j] = E[Z_i] = 1/r! - 1/(r+1)!$; summing over $i$ gives $(n-r)(1/r! - 1/(r+1)!)$, which is at most $n/r!$ with room to spare. If $0<|i-j|<r$, then $E[Z_i Z_j]=0$, since the descent required by one indicator is incompatible with the increasing run required by the other. If $|i-j|=r$, then $E[Z_i Z_j] \le 1/r!^2$ (the probability obtained by dropping the added descent conditions in the definitions of $Z_i$ and $Z_j$); there are fewer than $2n$ such pairs, and for $r \ge 4$ their total contribution $2n/r!^2$ fits inside the slack left in the diagonal bound $n/r!$ (for bounded $r$ the final bound $E[Z^2] \le (1+O(r!/n))\,E[Z]^2$ holds directly, since these extra terms are $O(n)$ while $E[Z]^2$ grows like $n^2/r!^2$). If $|i-j|>r$, then $Z_i$ and $Z_j$ involve disjoint sets of positions and hence are independent, so $E[Z_i Z_j] = (1/r! - 1/(r+1)!)^2$; these pairs contribute at most $E[Z]^2$ in total. Summing over all $i$ and $j$ shows that
$$E[Z^2] \le \frac{n}{r!} + E[Z]^2 \le \left(1 + O(r!/n) \right) E[Z]^2.$$
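The second-moment bound can also be sanity-checked exhaustively for small parameters (a Python sketch; names and parameter choices are mine):

```python
from itertools import permutations
from math import factorial

def moments_Z(n, r):
    """Exact first and second moments of Z over all n! permutations."""
    s1 = s2 = 0
    for p in permutations(range(n)):
        z = sum(1 for i in range(n - r)
                if p[i] > p[i + 1]
                and all(p[i + 1 + k] < p[i + 2 + k] for k in range(r - 1)))
        s1 += z
        s2 += z * z
    f = factorial(n)
    return s1 / f, s2 / f

n, r = 8, 3
ez, ez2 = moments_Z(n, r)
# checks E[Z^2] <= n/r! + E[Z]^2 for these parameters
print(ez2, n / factorial(r) + ez ** 2)
```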
The second moment method gives the second inequality in
$$P_r \ge \operatorname{Prob}(Z \ge 1) \ge \frac{E[Z]^2}{E[Z^2]} \ge 1 - O(r!/n).$$
If $r \le r_0(n)-2$, then $r! \le (r_0(n)-2)! = (r_0(n)-1)!/(r_0(n)-1)$, and since $(r_0(n)-1)! \le n$ by the minimality of $r_0(n)$, we get $r!/n \le 1/(r_0(n)-1)$, so $P_r \ge 1 - O(1/r_0(n))$. Thus
$$\sum_{r \ge 1} P_r \ge \sum_{r=1}^{r_0(n)-2} \left( 1 - O(1/r_0(n)) \right) = r_0(n) - O(1).$$
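To illustrate the theorem numerically (a hedged Python sketch; the helper names and the simulation parameters below are my own choices, not part of the proof):

```python
import random

def r0(n):
    """Smallest positive integer r with r! > n."""
    r, fact = 1, 1
    while fact <= n:
        r += 1
        fact *= r
    return r

def longest_increasing_block(a):
    """Length of the longest run of consecutive increases in a."""
    best = cur = 1
    for x, y in zip(a, a[1:]):
        cur = cur + 1 if x < y else 1
        best = max(best, cur)
    return best

random.seed(0)
n, trials = 10_000, 100
perms = (random.sample(range(n), n) for _ in range(trials))
avg = sum(longest_increasing_block(p) for p in perms) / trials
print(f"average longest block: {avg:.2f}, r0(n) = {r0(n)}")
```

With $n = 10{,}000$ the empirical average should land within an additive constant of $r_0(n) = 8$, as the theorem predicts.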
For RSK the answer is "well known". You can find the statements neatly arranged in an article by Christian Krattenthaler http://arxiv.org/abs/math/0510676.
I think the right framework for this question is Sergey Fomin's theory of dual graded graphs.
However, I don't think there are many other insertion algorithms where the Greene-Kleitman invariant is known. One is the insertion algorithm for shifted tableaux, and another, easy one is the pair (BinTree, BinWord).
In fact, whenever you have such a Greene-Kleitman invariant and whenever this invariant behaves well with respect to "promotion", you are in a good position to get a result parallel to http://arxiv.org/abs/math/0604140. For the pair (BinTree, BinWord) this is indeed the case (and interesting), but I never managed to write it up due to time constraints...
For Edelman-Greene the story is slightly different, I think. If I recall correctly, you can say at least a little bit about the shape of the word by staring long enough at the article by Christian Stump and Luis Serrano http://arxiv.org/abs/1009.4690 or at my own http://arxiv.org/abs/1009.3919.
EDIT:
The Greene-Kleitman invariants for some insertion algorithms (for the standard case, i.e., where the words are permutations) are described in Sergey Fomin's paper "Schensted algorithms for dual graded graphs":
1) Theorem 4.4.4: Young-Fibonacci insertion (due to Tom Roby and Sergey Fomin, perhaps the invariant for Janvier Nzeutchap's algorithm is different).
2) Just below Proposition 4.5.2: Shifted insertion (attributed to Worley and Bruce Sagan; see Richard Stanley's answer for the description in the semistandard case due to Luis Serrano).
3) Proposition 4.6.2: (BinTree, BinWord)-insertion (due independently to Xavier Viennot).
I'd be interested in learning about Greene-Kleitman invariants for other insertion algorithms. In particular, is it known for domino insertion (as described by Marc van Leeuwen; see also this paper by Thomas Lam)?
The Longest Increasing Subsequence has been used as a test statistic for non-parametric tests by García and González-López in
Independence tests for continuous random variables based on the longest increasing subsequence, Journal of Multivariate Analysis, 2014 (link)
and also in this (unpublished?) preprint. I don't know whether they were the first to consider this test statistic.