I think one reason this is so hard to answer is that R is so powerful and flexible that a real introduction to R programming goes well beyond what is normally needed in an introduction to statistics. The books that teach statistics using Minitab, JMP or SPSS are doing relatively straightforward things with the software that barely scratch the surface of what R is capable of when it comes to data manipulation, simulations, custom-built functions, etc.
Having said that, I think that Wilcox's Modern Statistics for the Social and Behavioral Sciences: A Practical Introduction (2012) is a brilliant new book. It assumes no statistical knowledge and takes you from scratch right through to a wide range of modern robust techniques, and it assumes little more R knowledge than the ability to open R and load a dataset. It also covers many of the classical techniques, including ANOVA (mentioned in the OP).
I would see this book as the equivalent of the books that introduce stats and a stats package like SPSS at the same time. However, it won't teach you to program in R - only how to do modern statistical analysis with it, with an emphasis on robust techniques that address the known problems with classical analysis that are sidelined by most other approaches to teaching statistics.
The three problems with classical methods that this book particularly addresses right from the beginning are sampling from heavy-tailed distributions; skewness; and heteroscedasticity.
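To see why heavy tails in particular matter, here is a minimal base-R sketch (my own, not taken from the book) comparing the ordinary mean with a 20% trimmed mean, one of the robust estimators Wilcox emphasizes, on a sample containing a single extreme value:

```r
# One extreme observation in an otherwise well-behaved sample
x <- c(2, 3, 3, 4, 4, 5, 5, 6, 100)

mean(x)              # ordinary mean: dragged toward the outlier (~14.7)
mean(x, trim = 0.2)  # 20% trimmed mean: drops the extremes (~4.3)
```

With `trim = 0.2`, R discards the smallest and largest 20% of the sorted observations before averaging, so the single extreme value no longer dominates the estimate.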
Wilcox uses R because "In terms of taking advantage of modern statistical techniques, R clearly dominates. When analyzing data, it is undoubtedly the most important software development during the last quarter of a century. And it is free. Although classic methods have fundamental flaws, it is not suggested that they be completely abandoned... Consequently, illustrations are provided on how to apply standard methods with R. Of particular importance here is that, in addition, illustrations are provided regarding how to apply modern methods using over 900 R functions written for this book."
This book is so excellent that after we bought a copy for work, I purchased my own copy for home.
The chapter headings are:
- numerical and graphical summaries of data;
- probability and related concepts;
- sampling distributions and confidence intervals;
- hypothesis testing;
- regression and correlation;
- bootstrap methods;
- comparing two independent groups;
- comparing two dependent groups;
- one-way ANOVA;
- two-way and three-way designs;
- comparing more than two dependent groups;
- multiple comparisons;
- some multivariate methods;
- robust regression and measures of association;
- basic methods for analyzing categorical data.
Further edit: having checked out the David Moore example of what you are looking for, I really think Wilcox's book meets the need.
Best Answer
I think of the Encyclopaedia Britannica of nonparametric statistics as being Hollander & Wolfe's Nonparametric Statistical Methods.
I'm not sure whether I would characterize this as introductory or advanced. Many of the sections are a bit terse, in my opinion, and are written with a good deal of mathematical notation, which will be intimidating or off-putting for people with some math anxiety. On the other hand, it isn't really deriving theorems; it just uses mathematical notation to express the ideas. There are problems at the end of each section, so you could definitely use the book to learn nonparametric statistics.
For a treatment that is much more introductory:
will be much less intimidating, I think. I have skimmed some portions of it, and it seems to be a gentle introduction for people who don't have a strong statistical background. It is very clear, but does not have anything like the depth or coverage of Hollander & Wolfe.
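For a flavour of the methods both books cover, here is a minimal base-R example (mine, not taken from either text) of the Wilcoxon rank-sum test, a standard nonparametric alternative to the two-sample t-test:

```r
# Two small independent samples
a <- c(1.8, 2.1, 2.4, 2.9, 3.3)
b <- c(3.0, 3.4, 3.8, 4.1, 4.6)

# Rank-based comparison of the two groups; no normality assumption needed
wilcox.test(a, b)
```

Because the test works on ranks rather than raw values, it is insensitive to the heavy tails and skewness that undermine the classical t-test.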