
Article Published in Vol. 9 (May-June 2021)

Modified Long Short-Term Memory and Its Use in Building a Sequential Model

Pages: 207-211, DOI:

Author: Hozan K. Hamarashid


Recurrent neural networks (RNNs) are a robust class of neural network built to handle sequence-dependency problems. The Long Short-Term Memory (LSTM) network is a variant of the RNN widely used in deep learning because large architectures can be trained effectively with it. LSTM was explicitly designed to avoid the long-term dependency problem: retaining information over long periods of time is its default behaviour. Modified Long Short-Term Memory (MLSTM) is an adaptation of the LSTM model in which one of the LSTM parameters is altered; a threshold value is introduced to control the input value. This paper presents deep-learning classification with the modified LSTM. Particular attention is given to the shapes and design of arrays, since this is one of the most frequently encountered and misunderstood difficulties. Accordingly, the paper covers one-hot encoding; the selection of input and output dimensions in the layers; training the modified LSTM; evaluation of the model from the perspectives of training and testing; and, finally, the use of MLSTM to build a sequential model.
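The workflow the abstract outlines, one-hot encoding the inputs, matching layer input/output dimensions, and stepping an LSTM cell over the sequence, can be sketched in plain NumPy. The paper does not publish the exact formula of its modification, so the `threshold` argument below is a hypothetical stand-in that zeroes input-gate activations falling below a cutoff; it is an illustration of where a threshold on the input could act, not the author's method.

```python
import numpy as np

def one_hot(indices, vocab_size):
    """Encode integer token ids as rows of a one-hot matrix."""
    out = np.zeros((len(indices), vocab_size))
    out[np.arange(len(indices)), indices] = 1.0
    return out

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b, threshold=None):
    """One LSTM time step.

    W: (4H, D), U: (4H, H), b: (4H,); gates ordered
    [input, forget, output, candidate]. If `threshold` is given,
    input-gate activations below it are zeroed -- an assumed,
    illustrative version of thresholding the input.
    """
    z = W @ x + U @ h + b
    H = h.shape[0]
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    o = sigmoid(z[2*H:3*H])      # output gate
    g = np.tanh(z[3*H:4*H])      # candidate cell state
    if threshold is not None:
        i = np.where(i >= threshold, i, 0.0)  # hypothetical modification
    c_new = f * c + i * g        # keep long-term memory by default
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy sequence: input dimension = vocabulary size, output = hidden size.
vocab, hidden = 5, 4
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4 * hidden, vocab))
U = rng.normal(scale=0.1, size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)

x_seq = one_hot([0, 3, 1], vocab)
h, c = np.zeros(hidden), np.zeros(hidden)
for x in x_seq:
    h, c = lstm_step(x, h, c, W, U, b, threshold=0.4)
print(h.shape)  # (4,)
```

Framework layers (e.g. Keras `LSTM`) hide these shapes, which is why mismatched input/output dimensions are such a common source of errors; the explicit cell above makes the array shapes the abstract emphasizes visible at each step.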

Keywords: modified LSTM, MLSTM, Long Short-Term Memory, LSTM, RNN model.
