TY - JOUR
T1 - Text prediction recurrent neural networks using long short-term memory-dropout
AU - Iparraguirre-Villanueva, Orlando
AU - Guevara-Ponce, Victor
AU - Ruiz-Alvarado, Daniel
AU - Beltozar-Clemente, Saul
AU - Sierra-Liñan, Fernando
AU - Zapata-Paulini, Joselyn
AU - Cabanillas-Carbonell, Michael
N1 - Publisher Copyright:
© 2023 Institute of Advanced Engineering and Science. All rights reserved.
PY - 2023/3
Y1 - 2023/3
N2 - Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and/or prediction tasks, question answering, and classification systems due to their ability to learn long-term dependencies. The present research integrates an LSTM network with the dropout technique to generate text from a corpus given as input; a model is developed to find the best way to extract words from the context. For training the model, the novel "La Ciudad y los perros", which comprises 128,600 words, is used as input data. The text was divided into two data sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested in two variants: word importance and context. The results were evaluated in terms of the semantic proximity of the generated text to the given context.
AB - Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and/or prediction tasks, question answering, and classification systems due to their ability to learn long-term dependencies. The present research integrates an LSTM network with the dropout technique to generate text from a corpus given as input; a model is developed to find the best way to extract words from the context. For training the model, the novel "La Ciudad y los perros", which comprises 128,600 words, is used as input data. The text was divided into two data sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested in two variants: word importance and context. The results were evaluated in terms of the semantic proximity of the generated text to the given context.
KW - Dropout
KW - Prediction
KW - Recurrent neural network
KW - Text
KW - Long short-term memory
UR - http://www.scopus.com/inward/record.url?scp=85144397373&partnerID=8YFLogxK
U2 - 10.11591/ijeecs.v29.i3.pp1758-1768
DO - 10.11591/ijeecs.v29.i3.pp1758-1768
M3 - Original Article
AN - SCOPUS:85144397373
SN - 2502-4752
VL - 29
SP - 1758
EP - 1768
JO - Indonesian Journal of Electrical Engineering and Computer Science
JF - Indonesian Journal of Electrical Engineering and Computer Science
IS - 3
ER -