[29] Logging in json format #68
Conversation
# TODO: performance: we repeatedly open the file for each call. Better for multiprocessing
# but we can probably do better and rely for example on the logging module.
with open(os.path.join(self.path_run, "metrics.json"), "ab") as f:
I suggest that we start with this simple version; we can always improve performance if it turns out to be a bottleneck.
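For reference, a minimal sketch of the simple version under discussion (the log_metrics helper and record argument are hypothetical; only the append-mode open of metrics.json comes from the diff):

import json
import os

def log_metrics(path_run: str, record: dict) -> None:
    # Hypothetical helper: reopen the file on every call and append one
    # JSON line. Reopening is slow, but each append is a single small
    # write, which keeps concurrent writers from corrupting each other's
    # records in practice.
    with open(os.path.join(path_run, "metrics.json"), "ab") as f:
        f.write((json.dumps(record) + "\n").encode("utf-8"))

Usage would be something like log_metrics(self.path_run, {"step": 10, "loss": 0.42}).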
pyproject.toml
[tool.uv.sources]
flash-attn = { url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp312-cp312-linux_x86_64.whl" }
I had to make this change to use uv on the hpc2020 cluster. I am not sure whether this will be a breaking change for people. @clessig, do we assume that different HPCs can use different versions of CUDA? That sounds like a nightmare.
We do not assume it, we know it ;) One can write a script that detects the available CUDA version (and the Python version, if that is a variable) and then assembles the string that defines the wheel to be downloaded. @tjhunter: to what extent could one integrate this into pyproject.toml?
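A sketch of what such a script could look like (the flash_attn_wheel_url helper is hypothetical; it assumes nvcc is on PATH and that a wheel exists for the detected combination, with the URL layout copied from the pin above):

import re
import subprocess
import sys

def flash_attn_wheel_url(version: str = "2.7.4.post1", torch_ver: str = "2.6") -> str:
    # Detect the CUDA major version from nvcc output, e.g.
    # "Cuda compilation tools, release 12.4, ..." -> "12".
    # (Raises if nvcc is missing or the output does not match; a real
    # script would handle that case.)
    out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout
    cuda_major = re.search(r"release (\d+)\.", out).group(1)
    # Detect the running Python, e.g. (3, 12) -> "cp312".
    py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
    return (
        "https://github.com/Dao-AILab/flash-attention/releases/download/"
        f"v{version}/flash_attn-{version}+cu{cuda_major}torch{torch_ver}"
        f"cxx11abiFALSE-{py_tag}-{py_tag}-linux_x86_64.whl"
    )

The assembled URL could then be written into [tool.uv.sources], though uv itself has no hook to run such a script at resolution time as far as I know.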
And could we open an issue to track this? :)
I have the script in a branch of the private repo, but it is not committed yet:
#57
As discussed, this will be followed up by #90.
Closes #29
First part: prototyping the new format.