High-Dimensional Non-Linear Variable Selection through Hierarchical Kernel Learning
Computer Science – Learning
Scientific paper
2009-09-04
We consider the problem of high-dimensional non-linear variable selection for supervised learning. Our approach is based on performing linear selection among exponentially many appropriately defined positive definite kernels that characterize non-linear interactions between the original variables. To select efficiently from these many kernels, we use the natural hierarchical structure of the problem to extend the multiple kernel learning framework to kernels that can be embedded in a directed acyclic graph; we show that it is then possible to perform kernel selection through a graph-adapted sparsity-inducing norm, in polynomial time in the number of selected kernels. Moreover, we study the consistency of variable selection in high-dimensional settings, showing that under certain assumptions, our regularization framework allows a number of irrelevant variables which is exponential in the number of observations. Our simulations on synthetic datasets and datasets from the UCI repository show state-of-the-art predictive performance for non-linear regression problems.
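To make the abstract's central object more concrete, the following is a minimal LaTeX sketch of a graph-adapted sparsity-inducing norm of the kind described above, written in the standard hierarchical multiple kernel learning style. The symbols used here (the node set V of the kernel DAG, the descendant sets D(v), the per-node weights d_v, and the component functions f_v) are assumed notation for illustration, not definitions quoted from the paper.

% Minimal sketch of a hierarchical (graph-adapted) sparsity-inducing norm.
% Assumed notation: V is the node set of the kernel DAG, D(v) the set of
% descendants of node v (including v itself), d_v > 0 a weight per node,
% and f_v the component of the predictor lying in the RKHS of kernel k_v.
\[
  \Omega(f) \;=\; \sum_{v \in V} d_v \Bigl( \sum_{w \in D(v)} \lVert f_w \rVert_{\mathcal{H}_w}^2 \Bigr)^{1/2}
\]
% Regularized empirical risk minimization with this norm:
\[
  \min_{\{f_v\}_{v \in V}} \; \frac{1}{n} \sum_{i=1}^{n}
    \ell\Bigl( y_i,\; \sum_{v \in V} f_v(x_i) \Bigr) \;+\; \lambda\, \Omega(f)
\]
% Because each group D(v) contains all descendants of v, setting the group
% for v to zero also zeroes out every kernel below v, so the selected kernels
% always form a rooted subset of the DAG; this nesting is what allows
% selection among exponentially many kernels in time polynomial in the
% number of kernels actually selected.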