Open Source llama.cpp Backend Integration with ipex-llm #13234

Open
@Davidqian123

Description

Hi ipex-llm team,

Thanks for the great work on this project. I’m wondering if there are any plans to open source your customized llama.cpp implementation — particularly the parts related to the SYCL backend for GPU and the OpenVINO backend for NPU. Having access to this source code would greatly help the community better understand your integration, contribute improvements, and enable broader support for more models.

Looking forward to your thoughts!
