Shalamov Vyacheslav V. (Postgraduate student, ITMO University,
Russia, St. Petersburg)
Efimova Valeria Aleksandrovna (Postgraduate student, ITMO University,
Russia, St. Petersburg)
Filchenkov Andrey Aleksandrovich (Ph.D., ITMO University,
Russia, St. Petersburg)
In recent years, deep learning models, in particular neural networks, have been applied widely. Many neural network architectures are designed manually, which rarely yields optimal results. Neural architecture search algorithms automate the process of finding and constructing network architectures, and recent studies have shown that using a vector representation of a neural network can make this search more efficient. We propose an algorithm that converts a neural network into a vector representation of its graph and back while minimizing the information lost in the conversion. Three compression models were considered: an auto-encoder, a variational auto-encoder, and a sequence-to-sequence model. Variants of the proposed model were compared with each other and with the known variational auto-encoder-based models D-VAE and DVAE-EMB in terms of compression loss and Kullback-Leibler divergence. Neural networks compressed and decompressed with the proposed method perform no worse than those processed by existing methods, while the proposed solution achieves a lower-dimensional latent space by exploiting the graph structure and encoding attributes separately from the graph's vertices.
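
The abstract does not include the model itself, so the following is only a minimal sketch of the general idea in PyTorch: a variational auto-encoder that compresses an attributed graph (a fixed-size adjacency matrix plus a per-node attribute matrix) into a low-dimensional vector, with the structure and the attributes passing through separate encoders, mirroring the abstract's claim that attributes are encoded separately from the vertices. All class names, layer sizes, and the latent dimension are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class GraphVAE(nn.Module):
    def __init__(self, n_nodes: int, attr_dim: int, latent_dim: int = 16):
        super().__init__()
        adj_in = n_nodes * n_nodes      # flattened adjacency matrix
        attr_in = n_nodes * attr_dim    # flattened per-node attributes
        # separate encoders for graph structure and node attributes
        self.enc_adj = nn.Sequential(nn.Linear(adj_in, 64), nn.ReLU())
        self.enc_attr = nn.Sequential(nn.Linear(attr_in, 64), nn.ReLU())
        # joint head producing the parameters of q(z | G)
        self.mu = nn.Linear(128, latent_dim)
        self.logvar = nn.Linear(128, latent_dim)
        # decoder reconstructs both structure and attributes from z
        self.dec_adj = nn.Linear(latent_dim, adj_in)
        self.dec_attr = nn.Linear(latent_dim, attr_in)

    def forward(self, adj, attrs):
        h = torch.cat([self.enc_adj(adj.flatten(1)),
                       self.enc_attr(attrs.flatten(1))], dim=1)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.dec_adj(z), self.dec_attr(z), mu, logvar

def vae_loss(adj, attrs, adj_hat, attr_hat, mu, logvar):
    # reconstruction loss plus Kullback-Leibler divergence to a standard normal:
    # the two criteria by which the abstract compares the compression models
    recon = nn.functional.binary_cross_entropy_with_logits(
        adj_hat, adj.flatten(1)) + nn.functional.mse_loss(attr_hat, attrs.flatten(1))
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

Under these assumptions, the mean vector mu of a trained model serves as the compressed representation of an architecture, and the decoder maps a latent vector back to an adjacency matrix and attribute matrix, giving the round trip the abstract describes.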
Keywords: neural network architecture search; vector representation; variational auto-encoder; graphs with attributes
Citation link: Shalamov V. V., Efimova V. A., Filchenkov A. A. Translation of a neural network into a vector representation // Современная наука: актуальные проблемы теории и практики. Серия: Естественные и Технические Науки [Modern Science: Actual Problems of Theory and Practice. Series: Natural and Technical Sciences]. 2022. No. 10. P. 159-162. DOI: 10.37882/2223-2966.2022.10.38