Kumru LLM: Turkish 7.4B Parameter Large Language Model

Kumru is a 7.4B parameter decoder-only Turkish LLM pre-trained from scratch on 500GB of Turkish corpora, surpassing significantly larger multilingual models on the Cetvel benchmark.

September 29, 2025 · 1 min · M. Erdi ARI

VBART: The Turkish LLM

VBART is the first family of Turkish sequence-to-sequence LLMs pre-trained from scratch on a large corpus, surpassing the prior state of the art on summarization, paraphrasing, QA, and more.

March 2, 2024 · 1 min · M. Erdi ARI