That is OK, and quite reasonable. It is referred to as the two-sample Kolmogorov-Smirnov test. Measuring the difference between two distribution functions by the sup norm is always sensible, but to do a formal test you want to know the distribution of the statistic under the hypothesis that the two samples are independent and each i.i.d. from the same underlying distribution. To rely on the usual asymptotic theory you will need continuity of the underlying common distribution (not of the empirical distributions). See the Wikipedia page linked to above for more details.
In R, you can use ks.test, which computes exact $p$-values for small sample sizes.
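In Python, the analogous function is scipy.stats.ks_2samp. A minimal sketch with made-up data (the two samples here are just illustrative draws from the same distribution):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=30)
y = rng.normal(size=40)

# Two-sample KS test: sup-norm distance between the two empirical CDFs.
# For small samples scipy's default method computes an exact p-value.
result = stats.ks_2samp(x, y)
print(result.statistic, result.pvalue)
```

Under the null hypothesis here (both samples come from the same normal distribution), the $p$-value should usually be large, but of course any single run can be unlucky.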
First, on the programming side, passing 'uniform' essentially passes scipy.stats.uniform.cdf() to kstest. So whatever you have in args= will be passed to scipy.stats.uniform.cdf() as parameters, and it only takes two parameters, location and scale (see the documentation for details). If you have more than two values in args=, the extras are simply ignored:
>>> a=np.random.random(10)
>>> stats.kstest(a, 'uniform', args=(0.5,1,3,4))
(0.303993262358352, 0.25725219759419549)
>>> stats.kstest(a, 'uniform', args=(0.5,1,300,400))  # note these two give the same result
(0.303993262358352, 0.25725219759419549)
Second, since you have already normalized the CDF of the photon arrival times, it makes sense to do a one-sample KS test against the standard uniform distribution. http://journals.ametsoc.org/doi/abs/10.1175/1520-0450%281975%29014%3C1600%3AANOTPM%3E2.0.CO%3B2 Basically, what that paper says is that if one or more parameters must be estimated from the sample, then $D$ no longer follows the Kolmogorov-Smirnov distribution, and if you still use the KS CDF to get a $P$ value from $D$, you will get a wrong $P$. Also, I don't think generating a uniformly distributed random variable and applying a two-sample KS test is the correct approach.
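As a sketch, with a hypothetical array standing in for your normalized arrival times (replace it with your own data), the one-sample test looks like:

```python
import numpy as np
from scipy import stats

# Hypothetical normalized arrival times in [0, 1]; substitute your own data.
rng = np.random.default_rng(42)
normalized_times = rng.uniform(size=50)

# One-sample KS test against the standard uniform distribution.
# loc=0, scale=1 are scipy's defaults for 'uniform', so no args= is needed.
d_stat, p_value = stats.kstest(normalized_times, "uniform")
print(d_stat, p_value)
```

No parameters are estimated from the sample here, so the caveat from the paper above does not apply.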
Third, the CDF of the Kolmogorov-Smirnov distribution is given by:
$\operatorname{Pr}(K\leq x)=1-2\sum_{k=1}^\infty (-1)^{k-1} e^{-2k^2 x^2}=\frac{\sqrt{2\pi}}{x}\sum_{k=1}^\infty e^{-(2k-1)^2\pi^2/(8x^2)}$
and this is how you can calculate $P$ from $D$. In scipy it is implemented not in pure Python but as a C extension.
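As a quick check, both series can be evaluated directly and compared against scipy.special.kolmogorov, which returns the survival function $1-\operatorname{Pr}(K\leq x)$:

```python
import math
from scipy.special import kolmogorov

def ks_cdf_alternating(x, terms=100):
    """Pr(K <= x) via the alternating series 1 - 2*sum (-1)^(k-1) exp(-2 k^2 x^2)."""
    s = sum((-1) ** (k - 1) * math.exp(-2 * k * k * x * x) for k in range(1, terms + 1))
    return 1.0 - 2.0 * s

def ks_cdf_theta(x, terms=100):
    """Pr(K <= x) via the theta-series sqrt(2*pi)/x * sum exp(-(2k-1)^2 pi^2 / (8 x^2))."""
    s = sum(math.exp(-((2 * k - 1) ** 2) * math.pi ** 2 / (8.0 * x * x))
            for k in range(1, terms + 1))
    return math.sqrt(2.0 * math.pi) / x * s

x = 1.0
print(ks_cdf_alternating(x))   # ~0.7300
print(ks_cdf_theta(x))         # ~0.7300
print(1.0 - kolmogorov(x))     # scipy's C implementation agrees
```

Both forms converge very fast, which is why only a handful of terms are needed in practice.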
I have found some information about how to modify the KS and AD tests for weighted samples in Numerical Methods of Statistics by Monahan (pg. 334 in the 1st edition, pg. 358 in the 2nd). This Google Books link may show the relevant page. I have also attached a screenshot of the relevant page.
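I don't have the book in front of me, so the following is only a sketch of the general idea (a weighted empirical CDF plugged into the KS statistic), not necessarily Monahan's exact formulation, and it gives the statistic only, not the modified null distribution needed for a $p$-value:

```python
import numpy as np
from scipy import stats

def weighted_ks_statistic(sample, weights, cdf):
    """Sup-norm distance between a weighted empirical CDF and a reference CDF.

    Weights are normalized to sum to 1; the ECDF jumps by each point's
    normalized weight instead of 1/n.
    """
    order = np.argsort(sample)
    x = np.asarray(sample, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    w /= w.sum()
    ecdf = np.cumsum(w)        # weighted ECDF just after each jump
    f = cdf(x)                 # reference CDF at the sample points
    # Check both sides of each jump, as in the ordinary KS statistic.
    return max(np.max(ecdf - f), np.max(f - (ecdf - w)))

rng = np.random.default_rng(1)
sample = rng.uniform(size=200)
weights = np.ones_like(sample)   # equal weights recover the ordinary statistic
d_weighted = weighted_ks_statistic(sample, weights, stats.uniform.cdf)
d_plain = stats.kstest(sample, "uniform").statistic
print(d_weighted, d_plain)
```

With equal weights this reduces exactly to the usual one-sample KS statistic, which is a useful sanity check before feeding in nontrivial weights.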