Item Information

Full metadata record
DC Field: Value

dc.contributor.author: Wei, Dai
dc.contributor.author: Chuanfeng, Ning
dc.contributor.author: Shiyu, Pei
dc.date.accessioned: 2023-03-31T09:08:06Z
dc.date.available: 2023-03-31T09:08:06Z
dc.date.issued: 2023
dc.identifier.uri: https://link.springer.com/article/10.1007/s44244-023-00004-4
dc.identifier.uri: https://dlib.phenikaa-uni.edu.vn/handle/PNK/7402
dc.description: CC BY
dc.description.abstract: As a randomized learner model, stochastic configuration networks (SCNs) are notable in that their random weights and biases are assigned through a supervisory mechanism, ensuring universal approximation and fast learning. However, this randomness makes SCNs prone to generating approximately linearly correlated hidden nodes, which are redundant and of low quality, and which result in a non-compact network structure. In light of a fundamental principle of machine learning, namely that a model with fewer parameters tends to generalize better, this paper proposes the orthogonal SCN (OSCN), which filters out low-quality hidden nodes to reduce the network structure by incorporating the Gram–Schmidt orthogonalization technique.
dc.language.iso: en
dc.publisher: Springer
dc.subject: SCNs
dc.subject: OSCN
dc.title: Orthogonal stochastic configuration networks with adaptive construction parameter for data analytics
dc.type: Book
Appears in Collections: OER - Công nghệ thông tin (Information Technology)
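The filtering idea described in the abstract, removing candidate hidden nodes whose outputs are nearly linear combinations of already-accepted nodes, can be sketched with a greedy Gram–Schmidt pass. This is a minimal illustration of the general technique, not the authors' OSCN implementation; the matrix `H`, the function name, and the threshold `tol` are all hypothetical.

```python
import numpy as np

def filter_redundant_nodes(H, tol=1e-2):
    """Greedy Gram-Schmidt filter over candidate hidden-node outputs.

    H: (n_samples, n_nodes) matrix; column j holds the output of candidate
    node j on the training data. A node whose output is (almost) a linear
    combination of already-kept nodes adds little representational capacity,
    so it is discarded. `tol` is a hypothetical relative-residual threshold.
    """
    kept, basis = [], []
    for j in range(H.shape[1]):
        v = H[:, j].astype(float)
        r = v.copy()
        for q in basis:              # subtract projections onto kept directions
            r -= (q @ v) * q
        if np.linalg.norm(r) > tol * np.linalg.norm(v):
            basis.append(r / np.linalg.norm(r))  # orthonormalize and keep
            kept.append(j)
        # else: nearly linearly dependent with kept nodes -> redundant, drop
    return kept

# Toy example: the third column equals the sum of the first two,
# so it is linearly dependent and gets filtered out.
H = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [0.0, 0.0, 0.0]])
print(filter_redundant_nodes(H))  # -> [0, 1]
```

Keeping an orthonormal basis of the accepted node outputs makes each redundancy check a handful of dot products, which is why Gram–Schmidt is a natural fit for incremental node-by-node construction.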
