
Commit 2bf2f39

Merge pull request #57 from eli5-org/oss-llm-references

Add more references to open source libraries for LLM explainability

2 parents 089ac0f + 84082f7, commit 2bf2f39

File tree

3 files changed: +28 −0 lines changed


docs/source/_notebooks/explain_llm_logprobs.rst

Lines changed: 8 additions & 0 deletions

@@ -669,3 +669,11 @@ Then you can explain predictions with a custom client:



+
+Consider checking other libraries which support explaining predictions
+of open source LLMs:
+
+- `optillm <https://github.com/codelion/optillm>`__, e.g. see
+  `codelion/optillm#168 <https://github.com/codelion/optillm/discussions/168#discussioncomment-12399569>`__
+- you can also visualize the outputs using the `logprobs
+  visualizer <https://huggingface.co/spaces/codelion/LogProbsVisualizer>`__

docs/source/libraries/openai.rst

Lines changed: 9 additions & 0 deletions

@@ -57,11 +57,20 @@ you may call :func:`eli5.explain_prediction` with
 See the :ref:`tutorial <explain-llm-logprobs-tutorial>` for a more detailed usage
 example.

+Consider also checking other libraries which support explaining predictions of open source LLMs:
+
+- `optillm <https://github.com/codelion/optillm>`_, e.g. see
+  `codelion/optillm#168 <https://github.com/codelion/optillm/discussions/168#discussioncomment-12399569>`_
+- you can also visualize the outputs using the
+  `logprobs visualizer <https://huggingface.co/spaces/codelion/LogProbsVisualizer>`_
+
 .. note::
    While token probabilities reflect model uncertainty in many cases,
    they are not always indicative,
    e.g. in case of `Chain of Thought <https://arxiv.org/abs/2201.11903>`_
    preceding the final response.
+   See the :ref:`tutorial's limitations section <explain-llm-logprobs-tutorial>`
+   for an example of that.


 .. note::
    Top-level :func:`eli5.explain_prediction` calls are dispatched
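The documentation changes above lean on token log-probabilities as an uncertainty signal. As a minimal, library-free sketch of that idea (the token/logprob pairs below are invented illustrations, not real model output, and `flag_uncertain_tokens` is a hypothetical helper, not part of eli5 or optillm), converting logprobs to probabilities and flagging low-confidence tokens could look like:

```python
import math

def flag_uncertain_tokens(token_logprobs, threshold=0.5):
    """Convert (token, logprob) pairs to probabilities and flag tokens
    whose probability falls below the threshold."""
    flagged = []
    for token, logprob in token_logprobs:
        prob = math.exp(logprob)  # logprob -> probability in [0, 1]
        flagged.append((token, round(prob, 3), prob < threshold))
    return flagged

# Hypothetical per-token logprobs, shaped like what an
# OpenAI-compatible completion with logprobs enabled might yield.
example = [("The", -0.01), ("answer", -0.05), ("is", -0.02), ("42", -1.6)]
for token, prob, uncertain in flag_uncertain_tokens(example):
    print(token, prob, "LOW" if uncertain else "ok")
```

Tools such as the logprobs visualizer linked above render the same exp-of-logprob quantity as a per-token heat map rather than a printed list.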

notebooks/explain_llm_logprobs.ipynb

Lines changed: 11 additions & 0 deletions

@@ -751,6 +751,17 @@
     " model=\"mlx-community/Mistral-7B-Instruct-v0.3-4bit\",\n",
     ")"
    ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "cb6fce33-edb1-42ab-ac93-7bd4472075ca",
+   "metadata": {},
+   "source": [
+    "Consider checking other libraries which support explaining predictions of open source LLMs:\n",
+    "\n",
+    "- [optillm](https://github.com/codelion/optillm), e.g. see [codelion/optillm#168](https://github.com/codelion/optillm/discussions/168#discussioncomment-12399569)\n",
+    "- you can also visualize the outputs using the [logprobs visualizer](https://huggingface.co/spaces/codelion/LogProbsVisualizer)"
+   ]
   }
  ],
  "metadata": {
