This project is a chatbot powered by FastAPI, LangChain, and OpenAI's GPT model. It includes a simple frontend for interacting with the chatbot.
Features:
- Backend: FastAPI with LangChain for OpenAI GPT-3 integration (a rough sketch follows this list)
- Frontend: Basic HTML/JS to interact with the chatbot
- Containerized using Docker
- LangSmith logging for conversations
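
The real backend lives in `app/app.py`; the sketch below is only a rough illustration of how FastAPI, LangChain, and OpenAI fit together in a setup like this. The endpoint path, request shape, and model name are assumptions, not the project's documented API.

```python
# Illustrative sketch only -- the actual implementation is in app/app.py and may differ.
from fastapi import FastAPI
from pydantic import BaseModel
from langchain_openai import ChatOpenAI  # reads OPENAI_API_KEY from the environment

app = FastAPI()
llm = ChatOpenAI(model="gpt-3.5-turbo")  # model name is an assumption

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")  # endpoint path is an assumption
def chat(req: ChatRequest) -> dict:
    # When LANGCHAIN_TRACING_V2 and LANGCHAIN_API_KEY are set,
    # LangSmith traces this LLM call automatically.
    reply = llm.invoke(req.message)
    return {"response": reply.content}
```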
Prerequisites:
- Docker (for containerized deployment)
- Python 3.9+ (for running locally without Docker)
- OpenAI API Key (for GPT-4 interaction)
- LangSmith API Key (for conversation logging; see the key-loading sketch below)
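
Both keys are expected as environment variables. A minimal way to load and check them in Python is sketched below; the variable names follow OpenAI's and LangSmith's documented conventions, and the use of a `.env` file is an assumption about this project.

```python
# Hypothetical key loading -- the project may wire configuration differently.
import os

from dotenv import load_dotenv  # pip install python-dotenv (assumed dependency)

load_dotenv()  # pulls OPENAI_API_KEY and LANGCHAIN_API_KEY from a local .env file, if present

os.environ.setdefault("LANGCHAIN_TRACING_V2", "true")  # turns on LangSmith tracing
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY is required for GPT calls"
assert os.getenv("LANGCHAIN_API_KEY"), "LANGCHAIN_API_KEY is required for LangSmith logging"
```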
- Clone the repository:
  `git clone https://github.com/dheerajreddy2020/langchain-wonders.git`
  `cd langchain-wonders`
To build and run the chatbot using Docker, follow these steps:
- Build the Docker image:
  `docker build -t mychatbot .`
- Run the Docker image:
  `docker run -p 3000:3000 -p 8000:8000 mychatbot`
- Access the Application:
  - Frontend: Open your web browser and go to http://localhost:3000 to view the frontend.
  - Backend API: The FastAPI application will be available at http://localhost:8000.
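
Once the container is running, you can sanity-check the backend from Python. The `/chat` path and JSON payload below are the same assumptions used in the earlier backend sketch, not the project's documented API.

```python
# Quick smoke test -- endpoint path and payload shape are assumptions.
import requests

resp = requests.post(
    "http://localhost:8000/chat",
    json={"message": "Hello, chatbot!"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```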
To test the application locally without Docker, follow these steps:
- Install Python Dependencies:
  `pip install -r requirements.txt`
- Start the FastAPI Application:
  `python ./app/app.py`
  (A typical entry point for this command is sketched after these steps.)
- Serve the Frontend Files:
  Open a new terminal window and navigate to the `frontend` directory:
  `cd ./frontend`
  Then, start a simple HTTP server to serve the static files:
  `python -m http.server 3000`
- Access the Application:
  - Frontend: Open your web browser and go to http://localhost:3000 to view the frontend.
  - Backend API: The FastAPI application will be available at http://localhost:8000.
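
For `python ./app/app.py` to start a server directly, `app/app.py` typically ends with a small entry point like the one below. This is an assumption about the file's structure, with the port chosen to match the 8000 exposed elsewhere in this README.

```python
# Hypothetical tail of app/app.py -- the real file may start the server differently.
import uvicorn
from fastapi import FastAPI

app = FastAPI()  # in the real file, this is the app with the chat routes attached

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)  # port matches the one used in this README
```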