SCoTTi: Save Computation at Training Time with an Adaptive Framework
Abstract
On-device training is an emerging approach in machine learning where models are trained on edge devices, aiming to enhance privacy protection and real-time performance. However, edge devices typically possess restricted computational power and resources, making it challenging to perform computationally intensive model training tasks. Consequently, reducing resource consumption during training has become a pressing concern in this field. To this end, we propose SCoTTi (Save Computation at Training Time), an adaptive framework that addresses this challenge. It leverages an optimizable threshold parameter to reduce the number of neuron updates during training, which in turn lowers the memory and computation footprint. Our proposed approach achieves greater computational-resource savings than state-of-the-art methods on commonly used benchmarks and popular architectures, including ResNets, MobileNet, and Swin-T.
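The core mechanism described above, gating per-neuron updates behind a threshold, can be illustrated with a minimal sketch. This is not the authors' implementation: the per-neuron gradient-magnitude score, the fixed `threshold` value, and the `masked_update` helper are assumptions made here for clarity, whereas in SCoTTi the threshold itself is an optimizable parameter learned during training.

```python
import torch
import torch.nn as nn

def masked_update(param: torch.Tensor, grad: torch.Tensor,
                  threshold: torch.Tensor, lr: float = 0.1) -> int:
    """Skip updates for output neurons whose per-neuron gradient
    magnitude falls below a threshold; returns the number updated."""
    # One score per output neuron (row of the weight tensor).
    scores = grad.abs().mean(dim=tuple(range(1, grad.dim())))
    mask = scores >= threshold          # neurons selected for update
    # Plain SGD step applied only to the selected rows; skipped rows
    # cost no update computation or optimizer-state writes.
    param.data[mask] -= lr * grad[mask]
    return int(mask.sum())

# Toy usage on a single linear layer (hypothetical values).
layer = nn.Linear(64, 32)
x, y = torch.randn(8, 64), torch.randn(8, 32)
loss = nn.functional.mse_loss(layer(x), y)
loss.backward()
threshold = torch.tensor(1e-3)          # fixed here; learnable in SCoTTi
updated = masked_update(layer.weight, layer.weight.grad, threshold)
print(f"updated {updated}/32 neurons")
```

A higher threshold skips more neuron updates and saves more computation, at the potential cost of slower convergence; making the threshold optimizable lets that trade-off be tuned during training rather than fixed by hand.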
Main file
Li_SCoTTi_Save_Computation_at_Training_Time_with_an_Adaptive_Framework_ICCVW_2023_paper.pdf (1.22 MB)
Origin: Publisher files authorized on an open archive