We measure pairwise distances between empirical cumulative distribution functions (ECDFs). For simplicity, only ecdf objects from the stats package are accepted as input.
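
For reference, each element of the input list must be an object of class "ecdf" as produced by stats::ecdf() from a numeric sample. A minimal sketch of preparing such inputs follows; the sample sizes and distributions are arbitrary choices for illustration.

x1 <- stats::rnorm(100)           # arbitrary normal sample
x2 <- stats::rexp(100, rate = 1)  # arbitrary exponential sample
F1 <- stats::ecdf(x1)             # an "ecdf" step function
F2 <- stats::ecdf(x2)
elist <- list(F1, F2)             # a length-2 list, valid input for ecdfdist()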

ecdfdist(elist, method = c("KS", "Lp", "Wasserstein"), p = 2, as.dist = FALSE)

Arguments

elist

a length \(N\) list of ecdf objects.

method

name of the distance/dissimilarity measure. Case insensitive.

p

exponent for Lp or Wasserstein distance.

as.dist

a logical; TRUE to return a dist object, FALSE to return an \((N\times N)\) symmetric matrix of pairwise distances.

Value

either a dist object or an \((N\times N)\) symmetric matrix of pairwise distances, depending on the as.dist argument.
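
As a sketch of how these arguments interact (assuming a prepared list elist of ecdf objects, as in the snippet above; method names may be written in any case since matching is case insensitive):

D1 <- ecdfdist(elist, method = "lp", p = 1)                          # L1 distance, returned as a symmetric matrix
D2 <- ecdfdist(elist, method = "wasserstein", p = 2, as.dist = TRUE) # 2-Wasserstein distance, returned as a dist object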

Examples

# \donttest{
## toy example : 10 ECDFs from a normal and 10 from a uniform distribution
mylist = list()
for (i in 1:10){
  mylist[[i]] = stats::ecdf(stats::rnorm(50, sd=2))
}
for (i in 11:20){
  mylist[[i]] = stats::ecdf(stats::runif(50, min=-5))
}

## compute Kolmogorov-Smirnov distance
dm = ecdfdist(mylist, method="KS")

## visualize
mks  = "KS distances of 2 Types"
opar = par(no.readonly=TRUE)
par(pty="s")
image(dm[,nrow(dm):1], axes=FALSE, main=mks)

par(opar)
# }
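
Because the pairwise distances can be returned as a dist object, the result plugs directly into standard tools such as stats::hclust. A brief sketch continuing the toy example above; the clustering step is our own illustration, not part of the package example.

## hierarchical clustering on the KS distances via the dist-object output
dks <- ecdfdist(mylist, method = "KS", as.dist = TRUE)
hc  <- stats::hclust(dks, method = "single")
plot(hc, main = "single linkage on KS distances")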