- Built a UI to interact with LLMs
- Added a feature to save the model and load it later
- Loaded the thread names sorted by their modification datetime
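Sorting threads by modification time might look like this sketch; the helper name `list_threads` and the one-JSON-file-per-thread layout are assumptions, not the project's actual code:

```python
from pathlib import Path

def list_threads(threads_dir: Path) -> list[str]:
    """Return thread names (file stems), most recently modified first."""
    files = threads_dir.glob("*.json")  # assumed: one JSON file per thread
    ordered = sorted(files, key=lambda f: f.stat().st_mtime, reverse=True)
    return [f.stem for f in ordered]
```

The sidebar can then render the returned names in this order, so the most recently touched thread appears on top.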
- Added a button to delete threads
- Added a feature to save the last-used LLM with each thread and select that model automatically when the thread is opened
- Changed UI
- Added a feature to check and stop running models
- Made the codebase modular
- Added image support (Llama Vision and LLaVA)
- Fully modularized the thread-handling code
- Completely refactored the image logic
- Save images to a local folder and store the new name (path) in the thread JSON
- Copy the parsed image into the local folder via a subprocess
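The two image steps above could be sketched as follows. This version uses `shutil.copy2` rather than spawning a subprocess, since an in-process copy is portable; the `thread_images/` folder name, the UUID naming scheme, and the `images` key are all assumptions:

```python
import shutil
import uuid
from pathlib import Path

def store_image(src: Path, images_dir: Path, thread: dict) -> Path:
    """Copy a parsed/uploaded image into the local images folder and
    record its new path in the thread's JSON-serializable dict."""
    images_dir.mkdir(parents=True, exist_ok=True)
    dest = images_dir / f"{uuid.uuid4().hex}{src.suffix}"  # collision-free name
    shutil.copy2(src, dest)
    thread.setdefault("images", []).append(str(dest))
    return dest
```

Storing only the path in the thread JSON keeps the thread files small while the binaries live next to them on disk.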
- Updated requirements
- Fixed compatibility with latest versions of ollama and streamlit
- Add a try/except around write_stream so at least the partial response is saved if an error occurs
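The try/except idea can be sketched as a small wrapper: tee every chunk into a buffer while the stream renders, and fall back to the buffered text on failure. `stream_with_fallback` is an illustrative name; in the app, `render` would be `st.write_stream`:

```python
def stream_with_fallback(stream, render):
    """Render a chunk stream via `render` (e.g. st.write_stream).
    If the stream dies mid-way, return whatever arrived so far."""
    buf = []

    def tee():
        for chunk in stream:
            buf.append(chunk)
            yield chunk

    try:
        return render(tee())   # full response on success
    except Exception:
        return "".join(buf)    # at least the partial response
```

Usage would look like `response = stream_with_fallback(ollama_stream, st.write_stream)`, and the returned text is saved to the thread either way.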
- Update the docstrings of the functions in the app_threads file
- Move the LLM model sidebar out of app.py
- Integrate LangChain into the app
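LangChain's chat models accept plain `(role, content)` tuples, so stored thread messages can be mapped directly. A sketch assuming the `langchain-ollama` package and a `{"role", "content"}` message layout; the helper names are hypothetical, and the import is lazy so the pure converter has no hard dependency:

```python
def to_lc_messages(messages: list[dict]) -> list[tuple[str, str]]:
    """Map stored thread messages onto LangChain's (role, content) tuples."""
    roles = {"user": "human", "assistant": "ai", "system": "system"}
    return [(roles[m["role"]], m["content"]) for m in messages]

def chat(messages: list[dict], model: str) -> str:
    # imported lazily; requires `pip install langchain-ollama`
    from langchain_ollama import ChatOllama
    llm = ChatOllama(model=model)
    return llm.invoke(to_lc_messages(messages)).content
```

Keeping the conversion in one helper means the on-disk thread format stays unchanged when swapping the raw ollama client for LangChain.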