Improving Pantun Generator Performance with Fine Tuning Generative Pre-Trained Transformers

Authors

  • Achmat Sodikkun, Universitas Negeri Semarang
  • Kholiq Budiman, Universitas Negeri Semarang

DOI:

https://doi.org/10.15294/ge6xey51

Keywords:

Pantun, Generative, Pre-trained, Transformers, Fine-tuning

Abstract

Purpose: The study aims to address the challenges of generating high-quality pantun, an important element of Indonesian cultural heritage. Traditional methods struggle with limited vocabulary, low variation, and inconsistent rhyme patterns. This research seeks to enhance the performance of a pantun generator by fine-tuning a Generative Pre-trained Transformers (GPT) model, combined with post-processing and validation by linguistic experts.

Methods/Study design/approach: The research involves fine-tuning the GPT model using a dataset of Indonesian pantun. The methodology includes dataset collection, data pre-processing for cleaning and adjustment, and hyperparameter optimization. The effectiveness of the model is evaluated using perplexity and rhyme accuracy metrics. The study also incorporates post-processing to refine the generated pantun further.
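The perplexity metric mentioned above is conventionally computed as the exponential of the mean negative log-likelihood per token on held-out text. The sketch below illustrates that standard definition only; the per-token losses are hypothetical values, not the study's actual evaluation data.

```python
import math

def perplexity(nll_per_token):
    """Perplexity = exp(mean negative log-likelihood per token).
    Lower values mean the model assigns higher probability to the text."""
    return math.exp(sum(nll_per_token) / len(nll_per_token))

# Hypothetical per-token losses from an evaluation pass (nats per token)
losses = [2.9, 2.6, 2.8, 2.4]
print(round(perplexity(losses), 2))
```

A mean loss of about 2.68 nats per token corresponds to a perplexity in the mid-teens, the same order of magnitude as the 14.64 the study reports.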

Result/Findings: The study achieved a best perplexity value of 14.64, indicating a strong predictive performance by the model. Post-processing significantly improved the rhyme accuracy of the generated pantun to 89%, a substantial improvement over previous studies by Siallagan and Alfina, which only achieved 50%. These results demonstrate that fine-tuning the GPT model, supported by appropriate hyperparameter settings and post-processing techniques, effectively enhances the quality of generated pantun.
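A pantun follows an a-b-a-b end-rhyme scheme: line 1 rhymes with line 3, and line 2 with line 4. The paper does not publish its rhyme-accuracy procedure here, so the sketch below is only one crude proxy, assuming rhyme can be approximated by matching the final letters of each line's last word.

```python
def rhyme_ending(line, n=2):
    """Crude rhyme proxy: the last n letters of the line's final word."""
    word = line.strip().split()[-1].lower().rstrip(".,!?")
    return word[-n:]

def follows_abab(lines):
    """True if four lines satisfy the a-b-a-b scheme (1~3, 2~4)."""
    a1, b1, a2, b2 = (rhyme_ending(l) for l in lines)
    return a1 == a2 and b1 == b2

def rhyme_accuracy(pantuns):
    """Fraction of generated pantun that satisfy the a-b-a-b scheme."""
    return sum(follows_abab(p) for p in pantuns) / len(pantuns)

# Classic pantun: hulu/dahulu (a) and tepian/kemudian (b) rhyme.
classic = [
    "Berakit-rakit ke hulu",
    "Berenang-renang ke tepian",
    "Bersakit-sakit dahulu",
    "Bersenang-senang kemudian",
]
print(follows_abab(classic))  # → True
```

A reported rhyme accuracy of 89% would mean roughly nine in ten generated pantun pass a check of this kind after post-processing.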

Novelty/Originality/Value: This research contributes to the development of generative applications in Indonesian, particularly in the context of cultural preservation. The findings highlight the potential of fine-tuning GPT models to improve language generation tasks and provide valuable insights for creative and educational applications. Validation by experts ensures that the generated pantun adhere to established writing standards.

Published

2025-10-17

Article ID

13048

How to Cite

Improving Pantun Generator Performance with Fine Tuning Generative Pre-Trained Transformers. (2025). Recursive Journal of Informatics, 3(2), 122-134. https://doi.org/10.15294/ge6xey51