Cited By
- Rao J., Meng X., Ding L., Qi S., Liu X., Zhang M., Tao D. (2024). Parameter-Efficient and Student-Friendly Knowledge Distillation. IEEE Transactions on Multimedia, 26, 4230-4241. DOI: 10.1109/TMM.2023.3321480. Online publication date: 1-Jan-2024.
- Zheng Y., Wang C., Tao C., Lin S., Qian J., Wu J. (2024). Restructuring the Teacher and Student in Self-Distillation. IEEE Transactions on Image Processing, 33, 5551-5563. DOI: 10.1109/TIP.2024.3463421. Online publication date: 1-Jan-2024.
- Meng X., Rao J., Qi S., Wang L., Xiao J., Wang X. (2024). Harnessing the Power of Prompt Experts: Efficient Knowledge Distillation for Enhanced Language Understanding. Machine Learning and Knowledge Discovery in Databases: Research Track and Demo Track, 218-234. DOI: 10.1007/978-3-031-70371-3_13. Online publication date: 8-Sep-2024.