Our work is always open-source*. Our published work is available on this page and on the GPT-Research Hub.

*Within reason

  • CamelGPT-Mini: A 56k Parameter Language Model

    10/12/2023

    View Paper

  • Converters Makes Model Inferencing Dead Simple

    9/28/2023

    View Paper

  • 100X Gains With Eager Precached Dynamic Pruning

    9/3/2023

    View Paper

  • Open-Sourcing Our Model Distribution Platform

    6/22/2023

    View Paper
