
Arabic Automatic Question Generation Using Transformer Model

EasyChair Preprint no. 8588

12 pages · Date: August 3, 2022


Questions play a vital role in the educational assessment process and enhance learning outcomes for students of all ages. Preparing exam questions is a challenging and time-consuming task that requires a thorough comprehension of the topic and the ability to construct the questions, and it becomes more difficult as the text grows longer. Automatic question generation (AQG) is the task of generating natural, relevant questions from diverse text inputs, optionally paired with answers. Only a few contributions have addressed this task for the Arabic language. Many previous works rely on manually constructed question templates using rule-based methods, with input text drawn from children's books, stories, or textbooks. These models are limited in linguistic diversity, and the task becomes more complex and challenging as the input text grows. The Transformer is one of the most adaptable deep-learning architectures and has been successfully applied to various Natural Language Processing (NLP) tasks. In this paper, we propose an end-to-end Arabic automatic question generation (AAQG) model based on the Transformer architecture that generates N interrogative questions for educational content from a single document of unbounded length.
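The abstract mentions TextRank-based sentence extraction as the front end of the pipeline: salient sentences are selected from a long document and then passed to the Transformer to generate questions. As a rough illustration of that extraction step only (not the authors' actual implementation; the similarity measure, damping factor, and function names here are assumptions), a minimal TextRank over word-overlap similarity might look like:

```python
import math
import re

def textrank_sentences(document, top_n=3, damping=0.85, iters=50):
    """Rank sentences with a simple TextRank and return the top_n.

    Illustrative sketch only: word-overlap similarity scored with
    power-iteration PageRank, not the paper's implementation.
    """
    # Naive sentence splitting on terminal punctuation.
    sentences = [s.strip() for s in re.split(r"[.!?]\s*", document) if s.strip()]
    words = [set(s.lower().split()) for s in sentences]
    n = len(sentences)

    # Similarity: word overlap normalized by sentence lengths (log-scaled).
    def sim(a, b):
        overlap = len(words[a] & words[b])
        denom = math.log(len(words[a]) + 1) + math.log(len(words[b]) + 1)
        return overlap / denom if denom else 0.0

    # Power iteration of the PageRank recurrence over the similarity graph.
    scores = [1.0 / n] * n
    for _ in range(iters):
        new_scores = []
        for i in range(n):
            rank = sum(
                sim(i, j) * scores[j]
                / (sum(sim(j, k) for k in range(n) if k != j) or 1.0)
                for j in range(n) if j != i
            )
            new_scores.append((1 - damping) / n + damping * rank)
        scores = new_scores

    ranked = sorted(range(n), key=lambda i: scores[i], reverse=True)
    return [sentences[i] for i in ranked[:top_n]]
```

In a full AAQG pipeline, each extracted sentence would then be fed to a Transformer encoder-decoder as the source sequence, with the interrogative question as the generation target.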

Keyphrases: Arabic Question Generation, NLP, Sentence Extraction, TextRank, Transformer

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
  @booklet{EasyChair:8588,
  author = {Saleh Alhashedi and Norhaida Mohd Suaib and Aryati Bakri},
  title = {Arabic Automatic Question Generation Using Transformer Model},
  howpublished = {EasyChair Preprint no. 8588},
  year = {EasyChair, 2022}}