PolyThrottle: Energy-efficient Neural Network Inference on Edge Devices: Experiments


This paper investigates how the configuration of on-device hardware affects energy consumption for neural network inference with regular fine-tuning.

This paper is available on arXiv under the CC BY-NC-ND 4.0 DEED license. Authors: Minghao Yan (myan@cs.wisc.edu), University of Wisconsin-Madison; Hongyi Wang, Carnegie Mellon University; Shivaram Venkataraman, University of Wisconsin-Madison. Table of Links: Abstract & Introduction; Motivation; Opportunities; Architecture Overview; Problem Formulation: Two-Phase Tuning; Modeling Workload Interference; Experiments; Conclusion & References; A. Hardware Details; B. Experimental Results; C. Arithmetic Intensity; D.

The overhead of performing CBO (constrained Bayesian optimization) is also minimal. As shown in Figure 5, CBO needs only around 15 samples to find a near-optimal solution, and the optimization procedure completes in a few minutes. When a new model is deployed, only a few minutes of overhead are needed to find its optimal configuration.
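To make that tuning loop concrete, the sketch below shows a CBO-style search over hardware configurations. It is a minimal sketch, not the paper's implementation: scikit-optimize stands in for whatever optimizer the authors used, the latency constraint is folded in as a penalty term, and measure_energy_and_latency() is a synthetic, hypothetical stand-in for running inference on the device and reading its power sensor. The knob names and ranges are likewise illustrative.

```python
from skopt import gp_minimize
from skopt.space import Integer

LATENCY_SLO_MS = 50.0   # hypothetical per-request latency budget
PENALTY = 1e6           # penalty added when a configuration violates the budget

def measure_energy_and_latency(gpu_level, mem_level, batch_size):
    # Placeholder for deploying the configuration, running inference, and
    # reading the board's power sensor; a synthetic model keeps the sketch
    # runnable end to end.
    energy = 10.0 / (1 + gpu_level) + 0.2 * batch_size + 1.0 / (1 + mem_level)
    latency = 80.0 / (1 + gpu_level) + 2.0 * batch_size
    return energy, latency

def objective(x):
    gpu_level, mem_level, batch_size = x
    energy, latency = measure_energy_and_latency(gpu_level, mem_level, batch_size)
    # Fold the latency constraint into the objective as a penalty so the
    # surrogate model steers away from infeasible configurations.
    return energy + (PENALTY if latency > LATENCY_SLO_MS else 0.0)

# Hypothetical discrete search space: indices into the device's supported
# GPU/memory frequency tables plus the inference batch size.
space = [
    Integer(0, 12, name="gpu_freq_level"),
    Integer(0, 4, name="mem_freq_level"),
    Integer(1, 16, name="batch_size"),
]

# Roughly 15 measured samples, matching the sample budget reported in Figure 5.
result = gp_minimize(objective, space, n_calls=15, random_state=0)
print("best configuration:", result.x, "energy estimate:", result.fun)
```

The penalty form is only one way to respect the latency budget; a dedicated constrained acquisition function, as the name CBO suggests, would model the constraint separately rather than merging it into the objective.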
