Item Information

Full metadata record
DC Field | Value | Language
dc.contributor.author | Mohamed, Reyad | -
dc.contributor.author | Amany M., Sarhan | -
dc.contributor.author | M., Arafa | -
dc.date.accessioned | 2023-04-27T01:47:28Z | -
dc.date.available | 2023-04-27T01:47:28Z | -
dc.date.issued | 2023 | -
dc.identifier.uri | https://link.springer.com/article/10.1007/s00521-023-08568-z | -
dc.identifier.uri | https://dlib.phenikaa-uni.edu.vn/handle/PNK/8343 | -
dc.description | CC BY | vi
dc.description.abstract | Deep Neural Networks (DNNs) are widely regarded as the most effective learning tool for dealing with large datasets, and they have been successfully used in thousands of applications across a variety of fields. They are trained on these large datasets to learn the relationships between various variables. The adaptive moment estimation (Adam) algorithm, a highly efficient adaptive optimization algorithm, is widely used for training DNN models in many fields. However, its generalization performance needs improvement, especially when training on large-scale datasets. Therefore, in this paper, we propose HN_Adam, a modified version of the Adam algorithm, to improve its accuracy and convergence speed. HN_Adam modifies Adam by automatically adjusting the step size of the parameter updates over the training epochs. | vi
dc.language.iso | en | vi
dc.publisher | Springer | vi
dc.subject | DNNs | vi
dc.subject | HN Adam | vi
dc.title | A modified Adam algorithm for deep neural network optimization | vi
dc.type | Book | vi
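
The abstract above describes HN_Adam as Adam with a step size that is adjusted automatically over the training epochs. The Python sketch below only illustrates that general idea under stated assumptions: it implements the standard Adam update and substitutes a hypothetical epoch-dependent scaling for the step size. The function name hn_adam_like_update and the 1/sqrt(1 + epoch) schedule are illustrative assumptions, not the paper's published HN_Adam rule.

import numpy as np

def hn_adam_like_update(params, grads, state, epoch,
                        lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam-style parameter update with a hypothetical epoch-dependent
    step-size factor standing in for HN_Adam's automatic adjustment
    (the paper's exact rule is not given in this record)."""
    m, v, t = state["m"], state["v"], state["t"] + 1

    # Standard Adam first- and second-moment estimates.
    m = beta1 * m + (1 - beta1) * grads
    v = beta2 * v + (1 - beta2) * grads ** 2

    # Bias correction, as in the original Adam algorithm.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)

    # Hypothetical epoch-based step-size adjustment (assumption, not the
    # published HN_Adam schedule): shrink the step size as epochs progress.
    step_size = lr / np.sqrt(1 + epoch)

    params = params - step_size * m_hat / (np.sqrt(v_hat) + eps)

    state.update(m=m, v=v, t=t)
    return params, state

# Example usage with illustrative values:
# state = {"m": np.zeros(3), "v": np.zeros(3), "t": 0}
# w = np.array([0.5, -0.3, 0.1])
# g = np.array([0.1, -0.2, 0.05])
# w, state = hn_adam_like_update(w, g, state, epoch=0)
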
Appears in Collections:
OER - Công nghệ thông tin (Information Technology)
