Our work is always open-source*. You can find our published work on this page and on the GPT-Research Hub.
*Within reason
CamelGPT-Mini: A 56k Parameter Language Model
10/12/2023
Converters Makes Model Inference Dead Simple
9/28/2023
100X Gains With Eager Precached Dynamic Pruning
9/3/2023
Open-Sourcing Our Model Distribution Platform
6/22/2023