1. Intro:
LLM-Compressive evaluates LLMs via data compression on data collected every month from 2017 to 2024.
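To illustrate the idea, here is a minimal sketch (not the repository's actual evaluation code) of compression-based evaluation: under arithmetic coding, a language model compresses text to roughly its negative log2-likelihood, so a model can be scored by bits per byte on each monthly text slice. The model name, chunking scheme, and helper function below are assumptions for illustration only.

```python
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def bits_per_byte(text: str, model, tokenizer, context_len: int = 2048) -> float:
    """Approximate compressed size of `text` in bits per UTF-8 byte."""
    ids = tokenizer(text, return_tensors="pt").input_ids[0]
    total_nll = 0.0  # accumulated negative log-likelihood in nats
    for start in range(0, ids.size(0) - 1, context_len):
        # Each chunk predicts up to `context_len` next tokens.
        chunk = ids[start : start + context_len + 1].unsqueeze(0)
        with torch.no_grad():
            out = model(chunk[:, :-1])
        logp = torch.log_softmax(out.logits, dim=-1)
        targets = chunk[:, 1:]
        total_nll += -logp.gather(-1, targets.unsqueeze(-1)).sum().item()
    n_bytes = len(text.encode("utf-8"))
    return total_nll / math.log(2) / n_bytes  # nats -> bits, per byte


if __name__ == "__main__":
    name = "gpt2"  # placeholder model; the benchmark itself covers many LLMs
    tok = AutoTokenizer.from_pretrained(name)
    lm = AutoModelForCausalLM.from_pretrained(name).eval()
    print(bits_per_byte("Example text from one monthly slice.", lm, tok))
```

Lower bits per byte means better compression, i.e. the model assigns higher likelihood to that month's data.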
2. Issues:
If you have problems or want to request results for a new model, please head to our project page and open an issue.
3. Benchmark Performance:
4. Context Length Performance: