🏆 LLM Compression Leaderboard (Base Model)
Welcome to the Uncheatable Eval LLM Compression Leaderboard, where fancy fine-tuning and cheating won't work 🚫; only compute 💻, data 📊, and real innovation 🔥 can prevail!
| Name | Params (B) | Average (lower = better) | GitHub C++ | GitHub JavaScript | GitHub Python | GitHub Markdown | GitHub Other | arXiv Math | arXiv Physics | arXiv CS | arXiv Other | bioRxiv All | Wikipedia English | BBC News | AO3 English | Wikipedia Non-English |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Qwen3.5-35B-A3B-Base | 34.661 | 5.653 | 2.763 | 2.724 | 2.891 | 5.235 | 2.649 | 5.611 | 6.234 | 6.170 | 5.929 | 6.355 | 7.605 | 8.420 | 9.643 | 6.919 |
| gemma-3-27b-pt | 27.432 | 5.755 | 2.913 | 3.125 | 3.283 | 5.930 | 2.998 | 5.868 | 6.401 | 6.496 | 6.187 | 6.515 | 7.152 | 7.771 | 9.475 | 6.460 |
| Mistral-Small-3.1-24B-Base | 24.011 | 5.757 | 2.983 | 3.112 | 3.225 | 5.766 | 2.943 | 5.858 | 6.296 | 6.401 | 6.105 | 6.353 | 7.342 | 8.016 | 9.289 | 6.914 |
| NVIDIA-Nemotron-3-Nano-30B | 31.578 | 5.826 | 2.874 | 3.049 | 3.057 | 5.642 | 2.927 | 5.976 | 6.444 | 6.396 | 6.179 | 6.489 | 7.329 | 8.102 | 9.607 | 7.490 |
| JoyAI-LLM-Flash-Base | 48.943 | 5.846 | 2.666 | 2.665 | 2.717 | 5.370 | 2.511 | 5.733 | 6.281 | 6.275 | 5.993 | 6.295 | 7.504 | 8.468 | 9.981 | 9.381 |
| Seed-OSS-36B-Base | 36.151 | 5.847 | 2.790 | 2.884 | 2.937 | 5.407 | 2.688 | 5.772 | 6.526 | 6.437 | 6.163 | 6.435 | 7.483 | 8.195 | 9.533 | 8.602 |
| rwkv7-g1e-13.3b-20260309 | 13.269 | 5.881 | 3.139 | 3.258 | 3.236 | 5.709 | 3.058 | 6.056 | 6.613 | 6.493 | 6.296 | 6.691 | 7.519 | 8.167 | 9.198 | 6.901 |
| Ministral-3-14B-Base-2512 | 13.945 | 5.895 | 3.072 | 3.273 | 3.334 | 5.901 | 3.065 | 5.812 | 6.314 | 6.378 | 6.109 | 6.349 | 7.640 | 8.389 | 9.780 | 7.109 |
| rwkv7-g1d-13.3b-20260131 | 13.269 | 5.902 | 3.161 | 3.291 | 3.283 | 5.755 | 3.088 | 6.065 | 6.629 | 6.515 | 6.312 | 6.700 | 7.538 | 8.174 | 9.196 | 6.914 |
| Qwen3-30B-A3B-Base | 30.532 | 5.921 | 2.860 | 2.960 | 3.089 | 5.894 | 2.876 | 5.674 | 6.491 | 6.574 | 6.227 | 6.577 | 7.858 | 8.476 | 9.753 | 7.589 |
| Qwen2.5-32B | 32.764 | 5.923 | 2.964 | 3.174 | 3.207 | 5.934 | 2.944 | 5.651 | 6.486 | 6.580 | 6.204 | 6.530 | 7.744 | 8.238 | 9.640 | 7.624 |
| rwkv7-g1c-13.3b-20251231 | 13.269 | 5.925 | 3.196 | 3.327 | 3.327 | 5.797 | 3.112 | 6.084 | 6.655 | 6.547 | 6.336 | 6.723 | 7.545 | 8.172 | 9.194 | 6.930 |
| Qwen3-14B-Base | 14.768 | 5.945 | 2.898 | 3.018 | 3.099 | 5.826 | 2.902 | 5.718 | 6.509 | 6.535 | 6.216 | 6.582 | 7.897 | 8.516 | 9.824 | 7.693 |
| Qwen3.5-9B-Base | 8.954 | 5.964 | 3.066 | 3.023 | 3.140 | 5.545 | 2.897 | 5.878 | 6.488 | 6.376 | 6.150 | 6.581 | 8.022 | 8.780 | 10.058 | 7.493 |
| gemma-3-12b-pt | 12.187 | 6.006 | 3.133 | 3.348 | 3.472 | 6.244 | 3.193 | 6.140 | 6.624 | 6.695 | 6.401 | 6.739 | 7.453 | 7.953 | 9.758 | 6.929 |
| granite-4.0-h-small-base | 32.207 | 6.012 | 3.128 | 3.324 | 3.458 | 6.035 | 3.138 | 5.938 | 6.512 | 6.575 | 6.280 | 6.570 | 7.688 | 8.371 | 9.899 | 7.253 |
| Mistral-Nemo-Base-2407 | 12.248 | 6.034 | 3.305 | 3.466 | 3.578 | 6.362 | 3.279 | 6.119 | 6.614 | 6.715 | 6.365 | 6.506 | 7.509 | 8.069 | 9.293 | 7.296 |
| Qwen2.5-14B | 14.770 | 6.072 | 3.120 | 3.325 | 3.348 | 6.173 | 3.092 | 5.817 | 6.656 | 6.733 | 6.354 | 6.666 | 7.841 | 8.308 | 9.764 | 7.806 |
| Ministral-3-8B-Base-2512 | 8.918 | 6.115 | 3.244 | 3.446 | 3.486 | 6.162 | 3.221 | 6.028 | 6.526 | 6.575 | 6.309 | 6.535 | 7.901 | 8.586 | 10.133 | 7.452 |
| Qwen3-8B-Base | 8.191 | 6.188 | 3.089 | 3.224 | 3.269 | 6.115 | 3.072 | 5.922 | 6.736 | 6.736 | 6.421 | 6.801 | 8.194 | 8.780 | 10.135 | 8.142 |
| Olmo-3-1125-32B | 32.234 | 6.218 | 3.606 | 3.838 | 3.564 | 6.089 | 3.569 | 5.978 | 6.559 | 6.460 | 6.196 | 6.492 | 7.715 | 8.223 | 9.521 | 9.241 |
| Meta-Llama-3.1-8B | 8.030 | 6.240 | 3.483 | 3.715 | 3.794 | 6.695 | 3.424 | 6.149 | 6.729 | 6.925 | 6.517 | 6.734 | 7.655 | 8.173 | 9.815 | 7.550 |
| rwkv7-g1e-7.2b-20260301 | 7.199 | 6.241 | 3.425 | 3.588 | 3.533 | 6.203 | 3.380 | 6.457 | 6.969 | 6.826 | 6.625 | 6.999 | 7.877 | 8.470 | 9.542 | 7.477 |
| rwkv7-g1d-7.2b-20260131 | 7.199 | 6.264 | 3.446 | 3.626 | 3.580 | 6.253 | 3.409 | 6.483 | 7.001 | 6.852 | 6.656 | 7.026 | 7.889 | 8.469 | 9.530 | 7.481 |
| Motif-2-12.7B-Base | 12.704 | 6.266 | 3.519 | 3.736 | 3.747 | 6.260 | 3.451 | 6.245 | 6.743 | 6.648 | 6.405 | 6.791 | 7.874 | 8.375 | 9.915 | 8.015 |
| rwkv7-g1c-7.2b-20251231 | 7.199 | 6.294 | 3.504 | 3.667 | 3.648 | 6.316 | 3.439 | 6.506 | 7.030 | 6.895 | 6.688 | 7.057 | 7.886 | 8.462 | 9.519 | 7.497 |
| marin-32b-base | 32.520 | 6.313 | 3.668 | 3.905 | 3.809 | 6.447 | 3.593 | 6.112 | 6.685 | 6.818 | 6.410 | 6.645 | 7.395 | 7.859 | 9.406 | 9.636 |
| Qwen3.5-4B-Base | 4.206 | 6.327 | 3.379 | 3.318 | 3.401 | 5.930 | 3.153 | 6.206 | 6.801 | 6.648 | 6.429 | 6.888 | 8.486 | 9.229 | 10.558 | 8.147 |
| Falcon-H1-7B-Base | 7.586 | 6.376 | 3.486 | 3.643 | 3.780 | 6.616 | 3.452 | 5.966 | 6.664 | 6.788 | 6.425 | 6.775 | 8.003 | 8.509 | 10.089 | 9.068 |
| Trinity-Mini-Base | 26.124 | 6.391 | 3.306 | 3.612 | 3.690 | 6.365 | 3.324 | 6.439 | 6.922 | 6.866 | 6.566 | 6.835 | 7.935 | 8.295 | 9.898 | 9.416 |
| Qwen2.5-7B | 7.616 | 6.457 | 3.361 | 3.610 | 3.558 | 6.586 | 3.329 | 6.139 | 7.035 | 7.057 | 6.676 | 7.051 | 8.334 | 8.722 | 10.291 | 8.652 |
| Mistral-7B-v0.1 | 7.242 | 6.482 | 3.715 | 3.918 | 3.996 | 6.768 | 3.555 | 6.364 | 6.926 | 7.041 | 6.674 | 6.838 | 7.777 | 8.187 | 9.804 | 9.187 |
| Qwen3-4B-Base | 4.022 | 6.515 | 3.319 | 3.469 | 3.466 | 6.461 | 3.277 | 6.164 | 7.036 | 6.997 | 6.683 | 7.101 | 8.630 | 9.201 | 10.587 | 8.817 |
| Hunyuan-7B-Pretrain | 7.505 | 6.539 | 3.500 | 3.591 | 3.522 | 6.331 | 3.294 | 6.563 | 7.244 | 7.059 | 6.831 | 7.205 | 8.350 | 8.976 | 10.614 | 8.461 |
| Ministral-3-3B-Base-2512 | 3.849 | 6.549 | 3.572 | 3.802 | 3.803 | 6.663 | 3.543 | 6.435 | 6.913 | 6.936 | 6.673 | 6.859 | 8.407 | 9.055 | 10.825 | 8.203 |
| marin-8b-base | 8.030 | 6.567 | 3.868 | 4.119 | 4.118 | 6.815 | 3.778 | 6.206 | 6.693 | 6.772 | 6.503 | 6.788 | 7.858 | 8.187 | 9.850 | 10.382 |
| rwkv7-g1e-2.9b-20260312 | 2.948 | 6.573 | 3.744 | 3.898 | 3.832 | 6.655 | 3.664 | 6.815 | 7.293 | 7.119 | 6.922 | 7.262 | 8.240 | 8.781 | 9.846 | 7.958 |
| rwkv7-g1d-2.9b-20260131 | 2.948 | 6.594 | 3.755 | 3.920 | 3.874 | 6.696 | 3.685 | 6.841 | 7.330 | 7.157 | 6.964 | 7.304 | 8.240 | 8.774 | 9.828 | 7.948 |
| gemma-3-4b-pt | 4.300 | 6.599 | 3.660 | 3.844 | 3.933 | 6.996 | 3.663 | 6.729 | 7.200 | 7.203 | 6.923 | 7.293 | 8.137 | 8.442 | 10.492 | 7.879 |
| rwkv7-g1c-2.9b-20251231 | 2.948 | 6.614 | 3.775 | 3.947 | 3.921 | 6.739 | 3.707 | 6.868 | 7.360 | 7.196 | 6.991 | 7.325 | 8.240 | 8.764 | 9.813 | 7.953 |
| Llama-2-13b-hf | 13.016 | 6.657 | 3.901 | 4.170 | 4.328 | 7.289 | 3.871 | 6.823 | 7.256 | 7.333 | 6.988 | 7.150 | 7.711 | 8.149 | 9.784 | 8.437 |
| Minitron-8B-Base | 8.272 | 6.749 | 3.677 | 4.101 | 4.118 | 7.297 | 3.892 | 7.137 | 7.418 | 7.300 | 7.033 | 7.106 | 8.150 | 8.555 | 10.056 | 8.640 |
| Llama-3.2-3B | 3.213 | 6.781 | 3.925 | 4.197 | 4.194 | 7.370 | 3.861 | 6.722 | 7.279 | 7.409 | 7.022 | 7.256 | 8.250 | 8.632 | 10.378 | 8.437 |
| granite-4.0-h-tiny-base | 6.939 | 6.812 | 3.748 | 3.949 | 3.962 | 6.842 | 3.665 | 6.659 | 7.400 | 7.294 | 7.002 | 7.418 | 8.620 | 9.010 | 10.804 | 8.994 |
| Zamba2-7B | 7.357 | 6.818 | 3.934 | 4.212 | 4.231 | 7.112 | 3.846 | 6.733 | 7.263 | 7.206 | 6.895 | 7.091 | 7.870 | 8.502 | 9.875 | 10.682 |
| Nanbeige4-3B-Base | 3.934 | 6.859 | 3.667 | 3.820 | 3.703 | 6.479 | 3.605 | 6.522 | 7.085 | 6.978 | 6.731 | 7.124 | 8.783 | 9.363 | 11.362 | 10.803 |
| falcon-mamba-7b | 7.273 | 6.869 | 3.756 | 4.028 | 4.192 | 7.226 | 3.751 | 6.543 | 7.082 | 7.249 | 6.861 | 7.148 | 8.290 | 8.644 | 9.986 | 11.413 |
| Olmo-3-1025-7B | 7.298 | 6.898 | 4.147 | 4.429 | 3.998 | 6.932 | 4.140 | 6.537 | 7.143 | 6.946 | 6.713 | 7.013 | 8.389 | 8.835 | 10.335 | 11.016 |
| Qwen2.5-3B | 3.086 | 6.899 | 3.703 | 3.960 | 3.884 | 7.161 | 3.665 | 6.560 | 7.483 | 7.467 | 7.081 | 7.460 | 8.805 | 9.162 | 10.788 | 9.411 |
| Falcon-H1-3B-Base | 3.149 | 6.942 | 3.999 | 4.134 | 4.239 | 7.277 | 3.954 | 6.397 | 7.144 | 7.199 | 6.844 | 7.299 | 8.647 | 9.061 | 10.774 | 10.215 |
| Falcon3-7B-Base | 7.456 | 6.959 | 3.719 | 3.968 | 4.119 | 7.276 | 3.728 | 6.298 | 6.915 | 7.004 | 6.634 | 6.952 | 8.669 | 9.143 | 10.568 | 12.438 |
| granite-4.0-h-micro-base | 3.191 | 6.960 | 3.851 | 4.092 | 4.037 | 6.931 | 3.789 | 6.820 | 7.602 | 7.344 | 7.096 | 7.570 | 8.731 | 8.992 | 10.743 | 9.841 |
| Llama-2-7b-hf | 6.738 | 6.989 | 4.187 | 4.442 | 4.596 | 7.729 | 4.133 | 7.200 | 7.605 | 7.630 | 7.297 | 7.454 | 8.035 | 8.391 | 10.110 | 9.042 |
| Yi-1.5-6B | 6.061 | 6.991 | 3.789 | 3.947 | 4.069 | 7.416 | 3.907 | 6.747 | 7.448 | 7.480 | 7.113 | 7.418 | 8.380 | 8.766 | 10.319 | 11.081 |
| Qwen3.5-2B-Base | 1.882 | 7.023 | 4.056 | 3.915 | 3.956 | 6.768 | 3.698 | 6.843 | 7.436 | 7.230 | 7.013 | 7.475 | 9.277 | 10.027 | 11.430 | 9.201 |
| SmolLM3-3B-Base | 3.075 | 7.038 | 3.848 | 4.273 | 4.190 | 7.492 | 4.063 | 7.299 | 7.839 | 7.603 | 7.337 | 7.552 | 8.510 | 8.946 | 10.408 | 9.170 |
| rwkv7-g1e-1.5b-20260309 | 1.527 | 7.050 | 4.193 | 4.323 | 4.216 | 7.267 | 4.071 | 7.288 | 7.731 | 7.514 | 7.324 | 7.677 | 8.754 | 9.242 | 10.361 | 8.741 |
| Qwen3-1.7B-Base | 1.721 | 7.069 | 3.738 | 3.912 | 3.876 | 7.155 | 3.695 | 6.682 | 7.576 | 7.480 | 7.165 | 7.613 | 9.237 | 9.774 | 11.255 | 9.808 |
| rwkv7-g1d-1.5b-20260212 | 1.527 | 7.073 | 4.209 | 4.346 | 4.249 | 7.309 | 4.094 | 7.321 | 7.785 | 7.567 | 7.381 | 7.736 | 8.746 | 9.229 | 10.334 | 8.720 |
| gemma-2-2b | 2.614 | 7.074 | 4.033 | 4.302 | 4.461 | 7.818 | 4.123 | 7.254 | 7.676 | 7.722 | 7.355 | 7.467 | 8.587 | 8.705 | 10.686 | 8.852 |
| Trinity-Nano-Base | 6.120 | 7.082 | 3.759 | 4.129 | 4.220 | 7.314 | 3.804 | 7.179 | 7.624 | 7.510 | 7.219 | 7.476 | 8.594 | 8.836 | 10.582 | 10.902 |
| Llama-3.1-Minitron-4B-Width | 4.513 | 7.094 | 3.858 | 4.278 | 4.272 | 7.578 | 4.058 | 7.354 | 7.678 | 7.546 | 7.276 | 7.359 | 8.387 | 8.834 | 10.413 | 10.421 |
| rwkv7-g1c-1.5b-20260110 | 1.527 | 7.094 | 4.221 | 4.383 | 4.304 | 7.358 | 4.121 | 7.342 | 7.812 | 7.610 | 7.412 | 7.764 | 8.741 | 9.219 | 10.314 | 8.719 |
| Minitron-4B-Base | 4.191 | 7.213 | 3.988 | 4.488 | 4.453 | 7.877 | 4.286 | 7.699 | 7.911 | 7.691 | 7.445 | 7.482 | 8.591 | 8.947 | 10.609 | 9.509 |
| stablelm-3b-4e1t | 2.795 | 7.248 | 4.093 | 4.486 | 4.593 | 8.183 | 4.451 | 7.435 | 7.812 | 7.841 | 7.480 | 7.538 | 8.377 | 8.728 | 10.395 | 10.062 |
| Qwen2.5-1.5B | 1.544 | 7.317 | 3.977 | 4.261 | 4.155 | 7.652 | 3.944 | 6.941 | 7.893 | 7.826 | 7.441 | 7.862 | 9.267 | 9.571 | 11.299 | 10.351 |
| Youtu-LLM-2B-Base | 1.962 | 7.337 | 3.836 | 4.000 | 3.695 | 6.640 | 3.610 | 7.137 | 7.656 | 7.359 | 7.202 | 7.669 | 9.116 | 9.980 | 12.069 | 12.749 |
| Index-1.9B | 2.173 | 7.391 | 4.488 | 5.137 | 4.898 | 8.567 | 4.765 | 6.848 | 7.645 | 7.783 | 7.399 | 7.718 | 8.964 | 9.373 | 10.985 | 8.901 |
| Falcon-H1-1.5B-Deep-Base | 1.555 | 7.469 | 4.477 | 4.662 | 4.644 | 7.979 | 4.467 | 6.840 | 7.725 | 7.638 | 7.250 | 7.734 | 9.179 | 9.542 | 11.199 | 11.236 |
| mamba2attn-2.7b | 2.698 | 7.475 | 4.484 | 4.859 | 5.087 | 8.419 | 4.541 | 6.946 | 7.532 | 7.971 | 7.461 | 7.448 | 8.849 | 9.234 | 10.711 | 11.104 |
| Llama-3.1-Minitron-4B-Depth | 4.540 | 7.477 | 4.077 | 4.553 | 4.523 | 8.025 | 4.338 | 7.765 | 8.079 | 7.886 | 7.630 | 7.717 | 8.767 | 9.186 | 10.769 | 11.361 |
| Llama-3.2-1B | 1.236 | 7.498 | 4.519 | 4.824 | 4.753 | 8.340 | 4.466 | 7.442 | 7.986 | 8.081 | 7.680 | 7.916 | 8.976 | 9.266 | 11.187 | 9.540 |
| granite-4.0-h-1b-base | 1.462 | 7.516 | 4.232 | 4.525 | 4.415 | 7.554 | 4.197 | 7.323 | 8.172 | 7.828 | 7.588 | 8.130 | 9.323 | 9.501 | 11.372 | 11.067 |
| Falcon-H1-1.5B-Base | 1.555 | 7.631 | 4.628 | 4.820 | 4.809 | 8.205 | 4.627 | 7.006 | 7.871 | 7.778 | 7.389 | 7.874 | 9.326 | 9.666 | 11.365 | 11.468 |
| SmolLM2-1.7B | 1.711 | 7.716 | 4.247 | 4.568 | 4.602 | 8.333 | 4.401 | 7.542 | 8.217 | 8.045 | 7.683 | 7.956 | 9.149 | 9.301 | 10.755 | 13.220 |
| stablelm-2-1_6b | 1.645 | 7.745 | 4.832 | 5.026 | 5.041 | 8.599 | 4.668 | 7.800 | 8.323 | 8.327 | 7.951 | 8.070 | 8.891 | 9.179 | 10.995 | 10.729 |
| Zamba2-2.7B | 2.662 | 7.760 | 5.473 | 5.714 | 5.442 | 8.544 | 5.441 | 7.442 | 7.799 | 7.660 | 7.399 | 7.506 | 8.343 | 8.848 | 10.488 | 12.544 |
| Qwen3.5-0.8B-Base | 0.752 | 7.834 | 4.763 | 4.611 | 4.598 | 7.713 | 4.342 | 7.531 | 8.131 | 7.898 | 7.668 | 8.201 | 10.225 | 11.021 | 12.443 | 10.528 |
| Qwen3-0.6B-Base | 0.596 | 7.975 | 4.359 | 4.601 | 4.545 | 8.260 | 4.367 | 7.424 | 8.402 | 8.230 | 7.907 | 8.445 | 10.297 | 10.837 | 12.350 | 11.622 |
| rwkv7-g1d-0.4b-20260210 | 0.451 | 8.167 | 5.146 | 5.283 | 5.098 | 8.678 | 5.042 | 8.383 | 8.849 | 8.510 | 8.349 | 8.753 | 9.901 | 10.338 | 11.522 | 10.492 |
| gemma-3-1b-pt | 1.000 | 8.192 | 5.713 | 5.772 | 5.479 | 9.268 | 5.746 | 8.723 | 8.840 | 8.586 | 8.351 | 8.598 | 9.297 | 9.329 | 11.759 | 9.233 |
| rwkv7-g1a-0.4b-20250905 | 0.451 | 8.313 | 5.289 | 5.600 | 5.585 | 9.212 | 5.270 | 8.421 | 8.918 | 8.714 | 8.471 | 8.845 | 9.849 | 10.234 | 11.396 | 10.570 |
| Qwen2.5-0.5B | 0.494 | 8.439 | 4.768 | 5.089 | 4.896 | 8.961 | 4.725 | 7.968 | 9.027 | 8.846 | 8.457 | 9.013 | 10.530 | 10.751 | 12.588 | 12.521 |
| rwkv7a-g1d-0.1b-20260212 | 1.007 | 8.895 | 5.682 | 5.815 | 5.632 | 9.591 | 5.636 | 9.148 | 9.567 | 9.235 | 9.065 | 9.486 | 10.668 | 11.162 | 12.427 | 11.410 |
| ERNIE-4.5-0.3B-Base-PT | 0.361 | 8.951 | 5.481 | 5.682 | 5.622 | 9.342 | 5.353 | 8.610 | 9.214 | 9.028 | 8.745 | 9.084 | 10.961 | 11.366 | 13.167 | 13.665 |
| SmolLM2-360M | 0.362 | 8.997 | 5.275 | 5.567 | 5.551 | 9.968 | 5.467 | 8.611 | 9.295 | 9.066 | 8.684 | 9.020 | 10.401 | 10.526 | 12.238 | 16.287 |
| Falcon-H1-0.5B-Base | 0.521 | 9.084 | 5.475 | 5.808 | 5.677 | 9.640 | 5.600 | 7.880 | 8.755 | 8.572 | 8.162 | 8.782 | 10.531 | 10.545 | 12.924 | 18.830 |
| rwkv7-g1d-0.1b-20260129 | 0.191 | 9.386 | 6.206 | 6.381 | 6.094 | 10.188 | 6.158 | 9.564 | 10.007 | 9.572 | 9.426 | 9.891 | 11.187 | 11.542 | 12.721 | 12.463 |
| rwkv7-g1a-0.1b-20250728 | 0.191 | 9.585 | 6.369 | 6.721 | 6.706 | 10.796 | 6.398 | 9.750 | 10.182 | 9.885 | 9.645 | 10.038 | 11.132 | 11.351 | 12.496 | 12.715 |
| gemma-3-270m | 0.268 | 9.858 | 7.467 | 7.754 | 6.799 | 11.579 | 7.857 | 9.828 | 10.149 | 9.970 | 9.684 | 10.076 | 10.957 | 10.750 | 13.424 | 11.719 |
| SmolLM2-135M | 0.135 | 10.299 | 6.141 | 6.738 | 6.854 | 11.299 | 7.440 | 10.137 | 10.502 | 10.190 | 9.862 | 10.114 | 11.443 | 11.604 | 13.338 | 18.522 |
This page presents the average per-position byte-level compression rate, which reflects the models' long-context capabilities. Because the metric is calculated at the byte level, models with different tokenizers can be compared directly. The benchmarks use the UncheatableEval-Long dataset series, with sequence lengths of up to 32k characters.
Compression Ratio Scaling Law
Explore how compression ratio scales with model parameters across different datasets.
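For a quick offline look at this trend, one can plot a few rows copied verbatim from the table above (an illustrative sketch, not the page's own plotting code):

```python
# Rough scaling view: average compression rate vs. parameter count,
# using a handful of (params in B, average) pairs from the table above.
import matplotlib.pyplot as plt

points = {
    "SmolLM2-135M": (0.135, 10.299),
    "Qwen2.5-0.5B": (0.494, 8.439),
    "Qwen3-1.7B-Base": (1.721, 7.069),
    "Qwen3-8B-Base": (8.191, 6.188),
    "Qwen2.5-32B": (32.764, 5.923),
}
for name, (params, avg) in points.items():
    plt.scatter(params, avg)
    plt.annotate(name, (params, avg), fontsize=7)
plt.xscale("log")  # compression rate tends to fall roughly linearly in log(params)
plt.xlabel("Params (B, log scale)")
plt.ylabel("Average compression rate (lower = better)")
plt.title("Compression rate vs. model size")
plt.show()
```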
Uncheatable Eval
GitHub page: https://github.com/Jellyfish042/uncheatable_eval
Dataset page: https://huggingface.co/collections/Jellyfish042/uncheatableeval
Introduction
Traditional LLM benchmarks are easily compromised by unintentional or intentional data leakage, which makes many of them unreliable and unable to truly reflect the capabilities of LLMs.
Uncheatable Eval addresses this issue by testing LLMs on real-time, newly generated data from the internet, ensuring that the evaluation is immune to data leaks and cannot be gamed.
How?
Uncheatable Eval assesses the language modeling capabilities of LLMs on new data from various sources, such as recent papers on arXiv, new projects on GitHub, news articles, and more. Since this data is brand new (e.g., from the past one to two weeks), it cannot have been included in the training sets of publicly released models, which avoids the impact of unintentional or intentional data leaks.
Specifically, we calculate the sum of negative log probabilities of the models on these texts. In other words, models that are more likely to generate these texts are considered better.
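As a concrete illustration, here is a minimal sketch of this computation using the Hugging Face transformers API (the model name is just an example from the table above; the repository's own scripts additionally handle batching, chunking, and dataset loading):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-0.5B"  # example; any base model from the table works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def sum_neg_log_prob(text: str) -> float:
    """Sum of -log p(token_t | tokens_<t) over the whole text, in nats."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # Position t predicts token t+1: shift logits left and labels right.
    log_probs = torch.log_softmax(logits[:, :-1], dim=-1)
    token_lp = log_probs.gather(-1, ids[:, 1:].unsqueeze(-1)).squeeze(-1)
    return -token_lp.sum().item()
```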
Note: Uncheatable Eval only tests base models.
Q&A
Why Calculate the Sum of Negative Log Probabilities?
First, the goal of language models, at least today's language models, is to generate text that is as realistic as possible, maximizing the probability of real text. They are trained and designed to do exactly this. Calculating the sum of negative log probabilities on real text is the most direct way to test this capability.
Second, from the perspective of "compression is intelligence," a good way to test a language model is to pair it with an entropy coding algorithm and measure the resulting compression rate [1][2]. A model achieving a lower compression rate is considered better. Taking a language model plus arithmetic coding as an example, it is easy to prove that a model's ability to compress a piece of text is proportional to the sum of its negative log probabilities on that text (see proof).
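For reference, the standard arithmetic-coding bound makes this concrete: encoding a text $x = (x_1, \dots, x_T)$ under the model's predictive distribution costs

$$
\ell(x) \;\le\; -\log_2 P(x) + 2 \;=\; \sum_{t=1}^{T} -\log_2 p(x_t \mid x_{<t}) + 2 \;\text{bits},
$$

so, up to at most two bits of overhead, the compressed size is exactly the sum of negative log probabilities expressed in bits.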
Therefore, a model's compression rate can be calculated directly from the sum of negative log probabilities; the method is provided in show_results_v2.ipynb.
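The exact bookkeeping lives in show_results_v2.ipynb; as a rough sketch of the conversion, assuming the summed negative log probability is in nats (as in the snippet above) and that the compression rate is expressed as bits per UTF-8 byte:

```python
import math

def bits_per_byte(nll_nats: float, text: str) -> float:
    """Convert a summed negative log probability (nats) into a byte-level
    compression rate: the average number of bits per UTF-8 byte of the text."""
    n_bytes = len(text.encode("utf-8"))
    return nll_nats / math.log(2) / n_bytes

# Usage with the earlier sketch:
# print(bits_per_byte(sum_neg_log_prob(text), text))
```

Dividing by the text's byte length, rather than its token count, is what makes the numbers comparable across tokenizers, which is also the point of the next question.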
Can Models Using Different Tokenizers Be Directly Compared?
Yes. When calculating the sum of negative log probabilities, we essentially treat the model + tokenizer as a single system: as long as this system assigns a high probability to real text, we consider it better. From the compression perspective, the tokenizer is simply part of the compressor, so you may choose any tokenizer you like; all we care about is whether the overall system compresses the text more effectively.
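To make the invariance concrete: the metric's denominator depends only on the text, never on the tokenizer, so systems with very different vocabularies land on the same scale (model names below are just examples):

```python
from transformers import AutoTokenizer

text = "Compression is prediction."
n_bytes = len(text.encode("utf-8"))  # a fixed property of the text itself

for name in ["Qwen/Qwen2.5-0.5B", "HuggingFaceTB/SmolLM2-360M"]:
    tok = AutoTokenizer.from_pretrained(name)
    n_tokens = len(tok(text).input_ids)
    # Token counts differ between vocabularies, but the metric always divides
    # total bits by n_bytes, so both systems are scored on the same scale.
    print(f"{name}: {n_tokens} tokens over {n_bytes} bytes")
```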