A Generative AI Framework for Marathi Grammar Learning
Dr. Gauri Deshpande1, Prof. Vandana Sharma2, Prof. Gayatri Dharap3
1,2,3 CSE-DS, Saraswati College of Engineering, Kharghar, India
Abstract - Marathi is one of the oldest and most widely spoken languages of India and exhibits considerable syntactic and morphological complexity. While its basic rules for nouns, verbs, and compound words can be learned readily, more effective tools are needed to master higher-level grammar, particularly the variation introduced by regional dialects. In recent years, substantial progress in Artificial Intelligence (AI) and Natural Language Processing (NLP) has simplified many complex language-related tasks. Pre-trained generative models such as BERT (Bidirectional Encoder Representations from Transformers), GPT-3 (Generative Pre-trained Transformer), and T5 (Text-to-Text Transfer Transformer) hold much promise for language applications including text generation, translation, and grammar checking. These models can generate, interpret, and modify linguistic patterns, which makes them useful for challenges specific to Marathi such as noun declensions and verb conjugations. Models built on the transformer architecture can be adapted to several linguistic problems in Marathi, including syntax analysis and error detection. This paper presents a generative AI model for learning Marathi grammar, organized in two parts: the first part is devoted to elementary grammar training, and the second addresses the intermediate and advanced levels. Using generative AI, the proposed model achieves higher accuracy than rule-based systems and offers an effective approach to modern grammar instruction. The model also contributes improved tools to computational linguistics for regional languages and supports language education.
Keywords: Marathi language, generative AI, BERT, GPT-3, T5, computational linguistics
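To make the grammar-checking idea concrete, the sketch below scores candidate Marathi sentences against a tiny corpus of grammatical examples, so that an agreement error (a feminine verb form with a masculine subject) receives a lower score than the correct form. This is only a toy smoothed-trigram illustration of likelihood-based error detection; the corpus, sentences, and function names are illustrative assumptions, and the paper's actual approach relies on fine-tuned transformer models, not n-gram counts.

```python
from collections import Counter
from itertools import chain
import math

# Toy corpus of grammatical Marathi sentences (illustrative only).
CORPUS = [
    "मी पुस्तक वाचतो",   # I read a book
    "तो पुस्तक वाचतो",   # he reads a book
    "ती पुस्तक वाचते",   # she reads a book
    "तो शाळेत जातो",     # he goes to school
    "ती शाळेत जाते",     # she goes to school
]

def trigrams(tokens):
    """Word trigrams with start/end padding."""
    padded = ["<s>", "<s>"] + tokens + ["</s>"]
    return list(zip(padded, padded[1:], padded[2:]))

# Collect trigram counts over the corpus.
tri_counts = Counter(chain.from_iterable(trigrams(s.split()) for s in CORPUS))
vocab = {w for s in CORPUS for w in s.split()} | {"<s>", "</s>"}

def score(sentence):
    """Log pseudo-likelihood under add-one-smoothed trigram counts:
    higher means closer to the grammatical patterns in the corpus."""
    total = sum(tri_counts.values())
    denom = total + len(vocab) ** 3
    return sum(math.log((tri_counts[t] + 1) / denom)
               for t in trigrams(sentence.split()))

correct = "तो शाळेत जातो"   # masculine verb agrees with masculine subject
wrong = "तो शाळेत जाते"     # agreement error: feminine verb form
assert score(correct) > score(wrong)
```

A transformer-based checker generalizes this scoring idea: instead of counting trigrams, it estimates how probable each token is in context, so it can flag agreement and declension errors far beyond what fixed n-grams capture.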