On Randomized Classification Layers and Their Implications in Natural Language Generation

Gal Lev Shalev, Gabi Shalev, Joseph Keshet

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

In natural language generation tasks, a neural language model is used to generate a sequence of words forming a sentence. The topmost weight matrix of the language model, known as the classification layer, can be viewed as a set of vectors, each representing a target word from the target dictionary. The target word vectors, along with the rest of the model parameters, are learned and updated during training. In this paper, we analyze the properties encoded in the target vectors and question the necessity of learning these vectors. We suggest randomly drawing the target vectors and keeping them fixed, so that no weight updates are made to them during training. We show that by excluding these vectors from the optimization, the number of parameters drastically decreases with only a marginal effect on performance. We demonstrate the effectiveness of our method on image captioning and machine translation.
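The idea described in the abstract — a classification layer whose target-word vectors are drawn once at random and then frozen — can be sketched in PyTorch. This is a minimal illustration under assumed dimensions, not the authors' implementation; registering the random matrix as a buffer keeps it out of the optimizer's parameter list:

```python
import torch
import torch.nn as nn

class FixedRandomClassifier(nn.Module):
    """Output projection whose target-word vectors are randomly drawn
    once and never updated (illustrative sketch, dimensions arbitrary)."""

    def __init__(self, hidden_dim: int, vocab_size: int):
        super().__init__()
        # Draw the target-word vectors at random, one row per word.
        weight = torch.randn(vocab_size, hidden_dim)
        # A buffer is saved with the model but excluded from parameters(),
        # so the optimizer never updates it.
        self.register_buffer("weight", weight)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # Logits are dot products between hidden states and the fixed vectors.
        return hidden @ self.weight.t()

layer = FixedRandomClassifier(hidden_dim=512, vocab_size=10_000)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)  # 0 — the classification layer contributes no trainable parameters
```

With a large vocabulary, this layer would otherwise account for `vocab_size × hidden_dim` trainable weights (5.12M here), which is the parameter saving the abstract refers to.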

Original language: English
Title of host publication: Multimodal Artificial Intelligence, MAI Workshop 2021 - Proceedings of the 3rd Workshop
Editors: Amir Zadeh, Louis-Philippe Morency, Paul Pu Liang, Candace Ross, Ruslan Salakhutdinov, Soujanya Poria, Erik Cambria, Kelly Shi
Publisher: Association for Computational Linguistics (ACL)
Pages: 6-11
Number of pages: 6
ISBN (Electronic): 9781954085251
DOIs
State: Published - 2021
Event: 3rd NAACL Workshop on Multimodal Artificial Intelligence, MAI Workshop 2021 - Mexico City, Mexico
Duration: 6 Jun 2021 → …

Publication series

Name: Multimodal Artificial Intelligence, MAI Workshop 2021 - Proceedings of the 3rd Workshop

Conference

Conference: 3rd NAACL Workshop on Multimodal Artificial Intelligence, MAI Workshop 2021
Country/Territory: Mexico
City: Mexico City
Period: 6/06/21 → …

Bibliographical note

Publisher Copyright:
© 2021 Association for Computational Linguistics
