Academy & Industry Research Collaboration Center (AIRCC)

Volume 11, Number 15, September 2021

Arabic Poems Generation using LSTM, Markov-LSTM and Pre-Trained GPT-2 Models

  Authors

Asmaa Hakami, Raneem Alqarni, Mahila Almutairi and Areej Alhothali, King Abdulaziz University, Saudi Arabia

  Abstract

Nowadays, artificial intelligence applications are increasingly integrated into every aspect of our lives. One of the newest applications of artificial intelligence and natural language processing is text generation, which has received considerable attention in recent years due to advancements in deep learning and language modeling techniques. Text generation has been investigated in different domains, such as generating essays and books. Writing poetry is a highly complex intellectual process for humans that requires creativity and strong linguistic capability. Several researchers have examined automatic poem generation using deep learning techniques, but only a few attempts have looked into Arabic poetry, and evaluating the coherence of the generated poems in terms of meaning and themes still requires further investigation. In this paper, we examined character-based LSTM, Markov-LSTM, and pre-trained GPT-2 models for generating Arabic praise poems. The outputs of all models were evaluated using BLEU scores and human evaluation. Both the BLEU scores and the human evaluation show that the Markov-LSTM model outperformed the LSTM and GPT-2 models, while the character-based LSTM model scored lowest in terms of meaning due to its tendency to produce unknown words.
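For readers unfamiliar with the automatic metric mentioned above, the sketch below illustrates how a generated verse can be scored against reference verses with BLEU. It is a minimal illustration using NLTK and hypothetical placeholder tokens, not the authors' actual evaluation pipeline.

```python
# Minimal BLEU evaluation sketch (assumption: not the paper's exact pipeline).
# Scores one generated verse against reference verses using NLTK's sentence BLEU.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Hypothetical whitespace-tokenized reference verses and one generated verse.
references = [
    "generous lord of the people you are the source of bounty".split(),
    "you are the source of bounty and honour in every state".split(),
]
candidate = "you are the source of bounty and honour".split()

# Smoothing avoids zero scores when higher-order n-grams have no overlap,
# which is common for short poetic lines.
smooth = SmoothingFunction().method1
score = sentence_bleu(references, candidate, smoothing_function=smooth)
print(f"BLEU: {score:.3f}")
```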

  Keywords

Arabic Poems, Markov, GPT-2, Deep Neural Networks, Natural Language Processing.