Search
3 results for "distillation"
- ai · arxiv/cs.LG · 4 min
Efficient Rationale Retrieval via Student-Teacher Distillation
Rabtriever reduces the computational cost of LLM-based document ranking by distilling cross-encoder knowledge into independent query-document encoders.
Apr 28, 2026
- ai · arxiv/cs.LG · 8 min
Dataset Distillation Fails Without Hard Labels
Soft labels mask poor dataset quality in distillation methods, making random subsets nearly as effective as curated ones.
Apr 22, 2026
- ai · arxiv/cs.AI · 8 min
Token Importance in On-Policy Distillation: Entropy and Disagreement
Research identifies two regions of high-value tokens in knowledge distillation: high-entropy positions, and low-entropy positions where the student and teacher disagree, enabling 50–80% token reduction.
Apr 17, 2026