Time Series Classification by Class-Based Mahalanobis Distances
Computer Science – Learning
Scientific paper
2010-10-07
To classify time series by nearest neighbors, we need to specify or learn one or several distances. We consider variations of the Mahalanobis distance which rely on the inverse covariance matrix of the data. Unfortunately -- for time series data -- the covariance matrix often has low rank. To alleviate this problem, we can either use a pseudoinverse, covariance shrinking, or limit the matrix to its diagonal. We review these alternatives and benchmark them against competitive methods such as the related Large Margin Nearest Neighbor Classification (LMNN) and the Dynamic Time Warping (DTW) distance. As we expected, we find that DTW is superior, but the Mahalanobis distances are one to two orders of magnitude faster. To get the best results with Mahalanobis distances, we recommend learning one distance per class using either covariance shrinking or the diagonal approach.
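The abstract does not include an implementation, but a minimal sketch of the class-based idea can be written with NumPy alone. In the sketch below, all function and parameter names (class_inverse_covariances, classify_1nn, mode, alpha) are illustrative assumptions, not the authors' notation: one inverse covariance estimate is computed per class, using a pseudoinverse, shrinkage toward a scaled identity, or the diagonal restriction, and a test series is assigned the label of its nearest training series under the distance associated with that neighbor's class.

import numpy as np

def class_inverse_covariances(X, y, mode="shrinkage", alpha=0.1):
    # One inverse covariance matrix per class; 'mode' and 'alpha' are
    # illustrative parameters for this sketch, not the paper's notation.
    inv_covs = {}
    for label in np.unique(y):
        cov = np.cov(X[y == label], rowvar=False)
        if mode == "diagonal":
            # Keep only the diagonal: trivially invertible when variances are positive.
            inv_covs[label] = np.diag(1.0 / np.diag(cov))
        elif mode == "shrinkage":
            # Shrink toward a scaled identity so the estimate has full rank.
            target = (np.trace(cov) / cov.shape[0]) * np.eye(cov.shape[0])
            inv_covs[label] = np.linalg.inv((1.0 - alpha) * cov + alpha * target)
        else:
            # Moore-Penrose pseudoinverse of the possibly rank-deficient covariance.
            inv_covs[label] = np.linalg.pinv(cov)
    return inv_covs

def classify_1nn(X_train, y_train, x, inv_covs):
    # Nearest-neighbor rule: the distance to each training series uses the
    # inverse covariance learned for that series' class.
    best_label, best_dist = None, np.inf
    for xi, yi in zip(X_train, y_train):
        diff = x - xi
        dist = diff @ inv_covs[yi] @ diff  # squared Mahalanobis distance
        if dist < best_dist:
            best_label, best_dist = yi, dist
    return best_label

Because the per-class matrices are computed once, classification reduces to a plain nearest-neighbor scan, which is consistent with the speed advantage over DTW reported in the abstract.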
Daniel Lemire
Zoltán Prekopcsák