An interactive AI chatbot built using Streamlit and a locally hosted language model via Ollama. The application supports conversational memory and runs completely offline without relying on paid cloud APIs.

Features:
- Real-time conversational chatbot
- Chat memory using session state
- Clean chat-style UI
- Local LLM integration (offline)
- No API keys or cloud dependency
- Exportable and reproducible project setup

Tech stack:
- Python
- Streamlit
- Ollama (Phi model)
Project structure:

```
chatbot/
│
├── app.py
├── requirements.txt
└── README.md
```
Install the Python dependencies:

```
pip install -r requirements.txt
```
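The contents of `requirements.txt` are not shown in this repo listing; a minimal version for this setup would likely contain the two packages below (the `ollama` Python client is an assumption — the app could instead call the local Ollama REST API directly):

```
streamlit
ollama
```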
Download and install Ollama from the official Ollama website, then pull the Phi model:

```
ollama pull phi
```
Start the app:

```
streamlit run app.py
```
The chatbot will open in your browser.

How it works:
The chatbot sends user messages to a locally running language model through Ollama. Conversation history is stored using Streamlit session state to maintain chat memory.
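Independent of the UI, the chat-memory mechanism described above boils down to keeping the conversation as a list of `{"role", "content"}` dicts and sending the whole list to the model on every turn. The sketch below illustrates that pattern only — it is not the actual `app.py`. In the real app the `history` list would live in `st.session_state`, and `generate` would wrap a call such as `ollama.chat(model="phi", messages=history)`; the `echo_model` stand-in is used here so the flow can run without an Ollama server.

```python
def chat_turn(history, user_text, generate):
    """Record the user message, query the model with the full
    history (this is what gives the bot memory), record the
    reply, and return it."""
    history.append({"role": "user", "content": user_text})
    reply = generate(history)
    history.append({"role": "assistant", "content": reply})
    return reply

def echo_model(messages):
    # Offline stand-in for the Phi model: just echoes the
    # latest user message back.
    return f"You said: {messages[-1]['content']}"

history = []  # in app.py this list is kept in st.session_state
chat_turn(history, "Hello", echo_model)
```

Because the entire `history` is passed to the model each turn, earlier exchanges remain visible to it; the trade-off is that the prompt grows with conversation length, so long sessions may eventually need truncation.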
Planned improvements:

- Response streaming
- Multiple model support
- Conversation export
- UI theming
- Voice input/output
This project is for educational and portfolio purposes.

