Efficient Learning of Sparse Conditional Random Fields for Supervised Sequence Labelling

Computer Science – Learning

Scientific paper

Details

Conditional Random Fields (CRFs) constitute a popular and efficient approach for supervised sequence labelling. CRFs can cope with large description spaces and can integrate some form of structural dependency between labels. In this contribution, we address the issue of efficient feature selection for CRFs based on imposing sparsity through an L1 penalty. We first show how sparsity of the parameter set can be exploited to significantly speed up training and labelling. We then introduce coordinate descent parameter update schemes for CRFs with L1 regularization. We finally provide empirical comparisons of the proposed approach with state-of-the-art CRF training strategies. In particular, it is shown that the proposed approach is able to exploit sparsity to speed up processing and hence can potentially handle higher-dimensional models.
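The key mechanism behind coordinate descent with an L1 penalty is that each one-dimensional subproblem has a closed-form solution via the soft-thresholding operator, which drives many parameters to exactly zero. The following is a minimal sketch of this idea on a plain least-squares objective (not the paper's CRF log-likelihood, which requires forward-backward inference); all function names and data here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the closed-form minimizer of the
    one-dimensional L1-penalized quadratic subproblem."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iters=100):
    """Coordinate descent for min_w 0.5*||y - Xw||^2 + lam*||w||_1.
    Each sweep updates one coordinate at a time in closed form;
    coordinates whose correlation with the residual stays below lam
    are set to exactly zero, yielding a sparse parameter vector."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)  # per-coordinate curvature
    for _ in range(n_iters):
        for j in range(d):
            # Partial residual that excludes coordinate j's contribution
            r = y - X @ w + X[:, j] * w[j]
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return w

# Synthetic noiseless example: only 3 of 10 features are relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
true_w = np.zeros(10)
true_w[:3] = [2.0, -1.5, 1.0]
y = X @ true_w
w = lasso_coordinate_descent(X, y, lam=5.0)
print(w)  # irrelevant coordinates are (near-)exactly zero
```

The exact zeros produced by soft-thresholding are what make the speed-ups described above possible: features with zero weight can simply be skipped when scoring sequences, which is the analogue, in this simplified setting, of the sparse training and labelling discussed in the abstract.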


Profile ID: LFWR-SCP-O-296435
