Those who use the interactive chat mode: how do you jump to the beginning of the answer? Is it possible to stop the terminal window from scrolling to the bottom of the answer while it is streaming, so that I can scroll down from the beginning of the answer at my own pace?
For me, if the answer is really long and I need to scroll up, it takes extra effort to find the spot where my prompt is, because it's indistinguishable from the responses.
For example, is it possible to change the color of my prompt so it would be easy to spot? It would be nice if there were an option (maybe a command-line flag like `--color-prompt blue` or an environment variable) to print the user's submitted prompt line in a specific color using ANSI escape codes before displaying the LLM response.
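Roughly, I imagine something like the sketch below. The `COLORS` table, the `print_colored_prompt` helper, and the flag name are all hypothetical, nothing like this exists in llm today; it just shows how cheap the ANSI-coloring part would be:

```python
# Hypothetical sketch of a --color-prompt option: echo the user's submitted
# prompt line in a chosen ANSI color before the response starts streaming.
COLORS = {"blue": "34", "green": "32", "cyan": "36"}  # foreground color codes

def print_colored_prompt(prompt: str, color: str = "blue") -> str:
    """Wrap the prompt line in ANSI escape codes and print it."""
    code = COLORS.get(color, "34")  # fall back to blue for unknown names
    line = f"\033[{code}m> {prompt}\033[0m"  # \033[0m resets the color
    print(line)
    return line
```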
Edit: I realized that if I don't pipe llm chat through sd.py ( https://github.com/kristopolous/Streamdown/tree/main ), the prompt is preceded by `>`. I guess that's something, but not good enough: the response can also have lines beginning with the `>` character, etc.
Since I already pipe things through sd.py, I wonder if it could modify the appearance of the prompt lines?
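As a stopgap, a tiny stdin filter along these lines could colorize the `>`-prefixed lines before they reach sd.py. This is just a sketch of the idea, and as noted above the heuristic would misfire on response lines that also start with `>` (e.g. Markdown blockquotes):

```python
import sys

PROMPT_COLOR = "\033[36m"  # cyan; pick any ANSI foreground code
RESET = "\033[0m"

def colorize(lines):
    """Yield lines unchanged, except wrap '> '-prefixed ones in a color."""
    for line in lines:
        if line.startswith("> "):
            yield f"{PROMPT_COLOR}{line.rstrip()}{RESET}\n"
        else:
            yield line

if __name__ == "__main__":
    # Usage (hypothetical pipeline): llm chat | python colorize.py | sd.py
    sys.stdout.writelines(colorize(sys.stdin))
```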