Repositories

  • exllamav3 (Public)

    An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs

    Python · 211 stars · MIT license · 11 forks · Updated Apr 10, 2025
  • exllamav2 (Public)

    A fast inference library for running LLMs locally on modern consumer-class GPUs

    Python · 4,106 stars · MIT license · 308 forks · Updated Mar 15, 2025
  • exui (Public)

    Web UI for ExLlamaV2

    JavaScript · 491 stars · MIT license · 46 forks · Updated Feb 5, 2025
