Document information
Record metadata
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Reyad, Mohamed | - |
| dc.contributor.author | Sarhan, Amany M. | - |
| dc.contributor.author | Arafa, M. | - |
| dc.date.accessioned | 2023-04-27T01:47:28Z | - |
| dc.date.available | 2023-04-27T01:47:28Z | - |
| dc.date.issued | 2023 | - |
| dc.identifier.uri | https://link.springer.com/article/10.1007/s00521-023-08568-z | - |
| dc.identifier.uri | https://dlib.phenikaa-uni.edu.vn/handle/PNK/8343 | - |
| dc.description | CC BY | vi |
| dc.description.abstract | Deep Neural Networks (DNNs) are widely regarded as the most effective learning tool for dealing with large datasets, and they have been successfully applied in thousands of applications across a variety of fields. They are trained on these large datasets to learn the relationships between the variables involved. The adaptive moment estimation (Adam) algorithm, a highly efficient adaptive optimization algorithm, is widely used for training DNN models. However, its generalization performance needs improvement, especially when training on large-scale datasets. Therefore, in this paper, we propose HN_Adam, a modified version of the Adam algorithm, to improve its accuracy and convergence speed. HN_Adam modifies Adam by automatically adjusting the step size of the parameter updates over the training epochs. (A minimal sketch of the baseline Adam update follows this table.) | vi |
| dc.language.iso | en | vi |
| dc.publisher | Springer | vi |
| dc.subject | DNNs | vi |
| dc.subject | HN Adam | vi |
| dc.title | A modified Adam algorithm for deep neural network optimization | vi |
| dc.type | Article | vi |
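
The abstract describes HN_Adam only at a high level; this record does not reproduce its update rule. For orientation, below is a minimal Python sketch of the baseline Adam update that HN_Adam modifies. The `step_size_scale` argument is a hypothetical placeholder marking where a per-epoch step-size adjustment of the kind the abstract describes could be hooked in; it is an assumption for illustration, not the paper's actual formula.

```python
import numpy as np

def adam_update(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
                eps=1e-8, step_size_scale=1.0):
    """One step of the standard Adam update for a single parameter array.

    `step_size_scale` is a hypothetical hook marking where a per-epoch
    step-size adjustment (the kind of modification the abstract attributes
    to HN_Adam) could be applied; it is NOT the paper's actual rule.
    """
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected moment estimates
    v_hat = v / (1 - beta2 ** t)
    param = param - step_size_scale * lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 201):
    w, m, v = adam_update(w, 2.0 * w, m, v, t, lr=0.1)
print(w)  # w moves toward [0, 0]
```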
Collections:
OER - Information Technology
Attached files:

