- A ChatGPT clone that delivers the same functionality using open-source LLMs, offline and locally.
- Provides an interface, built with Streamlit, to interact with LLMs installed locally via Ollama.
To provide an interface for interacting with locally installed LLMs, fully offline, and to save threads and the models used with them for later reuse.
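As a rough illustration of the "save threads and models" part, one plausible shape for a saved thread is sketched below; every field name here is hypothetical and only meant to convey the idea, not the app's actual storage format:

```python
# Hypothetical thread record (illustrative only, not the real schema).
thread = {
    'title': 'Trip planning',
    'last_model': 'llama3.2',  # remembered and auto-loaded on reopen
    'last_used': '2025-01-01T12:00:00',
    'messages': [
        {'role': 'user', 'content': 'Suggest a weekend itinerary.'},
        {'role': 'assistant', 'content': '...'},
    ],
}
```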
- Runs locally with Streamlit
- Any model available through Ollama can be used
- Supports streaming responses with a live preview (see the sketch after this list)
- Supports attachments such as images for vision models like Llama Vision and LLaVA
- Supports threads (chat archives)
- Threads can be created, renamed, and deleted
- Threads are listed by most recent use
- Remembers the last model used in each thread and auto-loads it the next time
- Models can be switched within the same thread
- Running models can be checked and stopped
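To make the streaming and attachment features concrete, here is a minimal sketch using the `ollama` Python package; the model names and file path are examples, and the app's actual code may differ:

```python
import ollama

# Stream a reply chunk by chunk, so the UI can show a live preview.
# 'llama3.2' is an example; any model pulled via Ollama works.
messages = [{'role': 'user', 'content': 'Explain chat threads in one sentence.'}]
for chunk in ollama.chat(model='llama3.2', messages=messages, stream=True):
    print(chunk['message']['content'], end='', flush=True)

# Vision models (e.g., LLaVA) accept image attachments with the prompt.
response = ollama.chat(
    model='llava',
    messages=[{
        'role': 'user',
        'content': 'Describe this image.',
        'images': ['photo.jpg'],  # illustrative path
    }],
)
print(response['message']['content'])
```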
- Python
- Streamlit
- Ollama
- Clone the repository:

  ```bash
  git clone https://github.com/Bbs1412/LocalGPT
  ```

- Navigate to the project directory:

  ```bash
  cd LocalGPT
  ```

- Install the required packages:

  ```bash
  pip install -r requirements.txt
  ```

- Run the app:

  ```bash
  streamlit run app.py
  ```

- Open the link in the browser: `http://localhost:8501`
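Note that these steps assume a local Ollama server is already installed and running, with at least one model pulled. A quick way to verify that, and to inspect or unload loaded models (the "checked and stopped" feature above), is via the `ollama` Python package; the model name below is an example:

```python
import ollama

# Models installed locally.
print([m['model'] for m in ollama.list()['models']])

# Models currently loaded in memory.
print([m['model'] for m in ollama.ps()['models']])

# Requesting a completion with keep_alive=0 unloads the model right
# after the call, which is one way to stop a running model.
ollama.generate(model='llama3.2', keep_alive=0)
```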
Any contributions or suggestions are welcome!
- This project is licensed under the MIT License; see the LICENSE file for details.
- You can use the code with proper credit to the author.
- Email - [email protected]
- LinkedIn - /bhushan-songire