Applied Sciences, Vol. 12, Issue 18 (2022)

Fractional Derivative Gradient-Based Optimizers for Neural Networks and Human Activity Recognition

Oscar Herrera-Alcántara    

Abstract

In this paper, fractional calculus principles are applied to implement fractional derivative gradient optimizers for the TensorFlow backend. The performance of these fractional derivative optimizers is compared with that of other well-known ones. Our experiments consider several human activity recognition (HAR) datasets, and the results show a subtle difference between the performance of the proposed method and that of existing ones. The main conclusion is that fractional derivative gradient descent optimizers could help to improve the performance of training and validation tasks, and they open the possibility of incorporating more fractional calculus concepts into neural networks applied to HAR.
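The abstract does not detail the update rule, but a common fractional-order variant of gradient descent (based on a Caputo-type derivative of the local quadratic approximation) replaces the plain gradient step with one scaled by the previous displacement raised to the power 1 − α, divided by Γ(2 − α). The sketch below is an illustration of that generic formulation, not necessarily the exact optimizer implemented in the paper; the function and parameter names are hypothetical.

```python
import math

def fractional_gd(grad, w0, alpha=0.9, lr=0.1, steps=100):
    """Sketch of a Caputo-style fractional gradient descent on a scalar.

    Update rule (one common fractional-order formulation; the paper's
    TensorFlow optimizers may differ):
        w_{k+1} = w_k - lr * grad(w_k) * |w_k - w_{k-1}|^(1 - alpha) / Gamma(2 - alpha)
    With alpha = 1 this reduces to ordinary gradient descent.
    """
    w_prev = w0
    w = w0 - lr * grad(w0)            # plain first step (no history yet)
    c = 1.0 / math.gamma(2.0 - alpha)  # constant from the Caputo derivative
    for _ in range(steps):
        step = lr * grad(w) * c * abs(w - w_prev) ** (1.0 - alpha)
        w_prev, w = w, w - step
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2(w - 3).
w_star = fractional_gd(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

The factor |w_k − w_{k−1}|^(1 − α) makes the effective step size history-dependent, which is the behavior fractional-derivative optimizers exploit to modulate convergence.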
