The short answer is to fit an AR(1) model & check it. If the residuals you're left with after that are pretty much white noise, you might well be safe to treat the series as AR(1) - if that's a reasonable model a priori, & depending on what it is you're wanting to do with them.
The ACF & PACF suggest, however, that there's perhaps more structure there than a simple AR(1) model. You shouldn't necessarily be bothered about the fourth lag in the PACF being just over the 5% significance level (assuming that's what the blue line is - you didn't say) - there's no correction for multiple testing, so over 20-odd lags you'd expect roughly one such exceedance by chance. But the wavy ACF could indicate you need either to difference or to put in at least an extra AR term. Given how slowly the ACF is decaying, most likely the former.
AIC is helpful, but if you're using it in an automatic fashion, you'll often find a number of models with not much difference in AIC (a difference of less than 2 is often taken as equivalent to "just as good").
In response to the comments:
(1) Is the series stationary or not? It's hard to tell for a short, highly autocorrelated series like this. Stationarity & unit root tests (KPSS & augmented Dickey-Fuller) might help (but in my experience they rarely tell you anything that isn't obvious from the correlograms & the time series plot itself). A random walk & an AR(1) model with a high AR parameter can both look plausible & pass any diagnostic tests you might perform. Only over the long term are you likely to be able to tell. NB You may have good a priori reasons to pick one or the other.
(2) If it's stationary, AR(1) or more complex model? The ACF hints at other possibilities that are worth testing, but doesn't rule out an AR(1) - remember that real ACFs from short series can look quite different from the theoretical ones. Most people would go for the simplest, at least for the time being, provided that it fits well enough (& see above about AICs). NB A priori considerations can be important here too.
The statement is related to the fact that the ACF of a stationary AR process of order p decays to zero at an exponential rate, while the PACF cuts off (is zero) after lag p. For an MA process of order q the theoretical ACF and PACF show the reverse behaviour: the ACF cuts off after lag q and the PACF decays to zero at an exponential rate.
These properties can be used as a guide to choose the orders of an ARMA model.
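The cutoff/decay pattern above can be seen directly in the theoretical ACF and PACF, which statsmodels can compute for any ARMA process (the particular coefficients here are illustrative):

```python
import numpy as np
from statsmodels.tsa.arima_process import arma_acf, arma_pacf

# AR(1) with phi = 0.6. The lag-polynomial convention used by
# statsmodels puts a leading 1 and negates the AR coefficients.
ar, ma = [1, -0.6], [1]
acf_ar = arma_acf(ar, ma, lags=6)    # decays geometrically: 0.6**k
pacf_ar = arma_pacf(ar, ma, lags=6)  # cuts off after lag 1

# MA(1) with theta = 0.5: the reverse pattern.
acf_ma = arma_acf([1], [1, 0.5], lags=6)    # zero beyond lag 1
pacf_ma = arma_pacf([1], [1, 0.5], lags=6)  # decays gradually
print(np.round(acf_ar, 3))
print(np.round(acf_ma, 3))
```

Bear in mind these are the theoretical functions; sample ACFs/PACFs from short series will be noisy versions of them.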
See, for instance, Chapter 3 of Time Series: Theory and Methods by Peter J. Brockwell and Richard A. Davis.
Best Answer
They are algebraically related (via the Yule-Walker equations: http://en.wikipedia.org/wiki/Autoregressive_model#Yule-Walker_equations), so there is no more information in one than in the other. What differs is what they describe: the ACF gives the unconditional correlation between observations at a given lag, while the PACF gives the correlation conditional on the intervening observations. If one has the ACF, one can compute the PACF, and vice versa.
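That correspondence can be made concrete: the Durbin-Levinson recursion recovers the PACF from the ACF alone. A minimal sketch, using the theoretical ACF of an AR(1) with phi = 0.7 (the function name and example are mine, not from the original answer):

```python
import numpy as np

phi = 0.7
acf = phi ** np.arange(6)  # theoretical ACF of AR(1): rho_k = phi**k

def pacf_from_acf(rho):
    """Durbin-Levinson recursion: partial autocorrelations from autocorrelations."""
    n = len(rho) - 1
    pacf = np.zeros(n + 1)
    pacf[0] = 1.0
    phi_prev = np.zeros(0)  # AR coefficients of the order-(k-1) predictor
    for k in range(1, n + 1):
        num = rho[k] - phi_prev @ rho[1:k][::-1]
        den = 1.0 - phi_prev @ rho[1:k]
        a = num / den  # phi_{k,k}, the lag-k partial autocorrelation
        pacf[k] = a
        phi_prev = np.concatenate([phi_prev - a * phi_prev[::-1], [a]])
    return pacf

# Lag-1 PACF equals phi; higher lags are zero, as expected for an AR(1).
print(np.round(pacf_from_acf(acf), 4))
```

Running the recursion the other way (building the ACF back up from the PACF) works too, which is the "vice versa" above.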