Neural networks with low-resolution parameters

dc.contributor.author: CABRAL, EDUARDO L.L.
dc.contributor.author: DRIEMEIER, LARISSA
dc.coverage: International
dc.date.accessioned: 2026-03-13T11:54:07Z
dc.date.available: 2026-03-13T11:54:07Z
dc.date.issued: 2025
dc.description.abstract: The expanding scale of large neural network models introduces significant challenges, driving efforts to reduce memory usage and enhance computational efficiency. Such measures are crucial to ensure the practical implementation and effective application of these sophisticated models across a wide array of use cases. This study examines the impact of parameter bit precision on model performance compared to standard 32-bit models, with a focus on multiclass object classification in images. The models analyzed include those with fully connected layers, convolutional layers, and transformer blocks, with model weight resolution ranging from 1 bit to 4.08 bits. The findings indicate that models with lower parameter bit precision achieve results comparable to 32-bit models, showing promise for use in memory-constrained devices. While low-resolution models with a small number of parameters require more training epochs to achieve accuracy comparable to 32-bit models, those with a large number of parameters achieve similar performance within the same number of epochs. Additionally, data augmentation can destabilize training in low-resolution models, but including zero as a potential value in the weight parameters helps maintain stability and prevents performance degradation. Overall, 2.32-bit weights offer the optimal balance of memory reduction, performance, and efficiency. However, further research should explore other dataset types and more complex and larger models. These findings suggest a potential new era for optimized neural network models with reduced memory requirements and improved computational efficiency, though advancements in dedicated hardware are necessary to fully realize this potential.
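The fractional bit counts in the abstract arise from the size of the weight level set: a set of n allowed values costs log2(n) bits per weight, so 5 levels give log2(5) ≈ 2.32 bits and 17 levels give log2(17) ≈ 4.08 bits. The sketch below is illustrative only (not the paper's training method): it rounds weights to a small symmetric level set that includes zero, the property the abstract credits with stabilizing training; the level values themselves are assumptions.

```python
import numpy as np

# Hypothetical symmetric 5-level set including zero:
# log2(5) ~ 2.32 bits per weight, the resolution the abstract highlights.
levels = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])

def quantize(weights, levels):
    """Map each weight to the nearest value in `levels`."""
    idx = np.abs(weights[..., None] - levels).argmin(axis=-1)
    return levels[idx]

w = np.array([0.73, -0.26, 0.05, -0.9])   # toy full-precision weights
wq = quantize(w, levels)                  # -> [0.5, -0.5, 0.0, -1.0]
bits = np.log2(len(levels))               # ~ 2.32 bits per parameter
```

In practice, quantization-aware training keeps a full-precision shadow copy of the weights and applies a rounding step like this in the forward pass; the sketch shows only the rounding itself.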
dc.format.extent: 1-19
dc.identifier.citation: CABRAL, EDUARDO L.L.; DRIEMEIER, LARISSA. Neural networks with low-resolution parameters. Neural Networks, v. 191, p. 1-19, 2025. DOI: 10.1016/j.neunet.2025.107763. Available at: https://repositorio.ipen.br/handle/123456789/49428.
dc.identifier.doi: 10.1016/j.neunet.2025.107763
dc.identifier.issn: 0893-6080
dc.identifier.percentilfi: 85.1
dc.identifier.percentilfiCiteScore: 88.50
dc.identifier.uri: https://repositorio.ipen.br/handle/123456789/49428
dc.identifier.vol: 191
dc.language.iso: eng
dc.relation.ispartof: Neural Networks
dc.rights: openAccess
dc.title: Neural networks with low-resolution parameters
dc.type: Journal article
dspace.entity.type: Publication
ipen.autor: EDUARDO LOBO LUSTOSA CABRAL
ipen.codigoautor: 496
ipen.contributor.ipenauthor: EDUARDO LOBO LUSTOSA CABRAL
ipen.identifier.fi: 6.3
ipen.identifier.fiCiteScore: 10.6
ipen.identifier.ipendoc: 31505
ipen.identifier.iwos: WoS
ipen.range.fi: 6.000 or more
ipen.range.percentilfi: 75.00 - 100.00
ipen.type.genre: Article
relation.isAuthorOfPublication: de87f375-d22e-4af7-82ec-48091108be70
relation.isAuthorOfPublication.latestForDiscovery: de87f375-d22e-4af7-82ec-48091108be70
sigepi.autor.atividade: EDUARDO LOBO LUSTOSA CABRAL:496:420:S

Original Bundle

Name: 31505.pdf
Size: 7.6 MB
Format: Adobe Portable Document Format

Package License

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission