Kumru: A 7.4B Parameter Turkish Large Language Model
Kumru is a 7.4B parameter decoder-only Turkish LLM pre-trained from scratch on 500GB of Turkish corpora, surpassing significantly larger multilingual models on the Cetvel benchmark.