A beautiful chat interface for interacting with Ollama's API. Built with React, TypeScript, and Material-UI.

Features:
- Modern, responsive design
- Dark/Light mode support
- Real-time chat interface
- Support for multiline messages
- Loading indicators
- Error handling
- Keyboard shortcuts (Enter to send, Shift+Enter for new line)
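The Enter/Shift+Enter behaviour can be captured in a small predicate. This is an illustrative sketch, not the app's actual code: `shouldSendOnKey` is a hypothetical helper name, and the real component may wire the key handling differently.

```typescript
// Sketch of the Enter-to-send shortcut logic (hypothetical helper; the real
// component in this app may implement this differently).
function shouldSendOnKey(key: string, shiftKey: boolean): boolean {
  // Plain Enter sends the message; Shift+Enter falls through and inserts a newline.
  return key === "Enter" && !shiftKey;
}
```

In a Material-UI `TextField`, a handler like this would typically be called from `onKeyDown`, calling `event.preventDefault()` before sending so the newline is not inserted.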
To run the app locally you will need:
- Node.js (v14 or higher)
- npm or yarn
- Ollama running locally (default port: 11434)
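To confirm the last prerequisite, you can check whether the Ollama server responds before starting the app. This is a sketch assuming the default Ollama port (11434); `ollamaIsRunning` is a hypothetical helper, not part of this repository.

```typescript
// Hypothetical health check: returns true if an Ollama server answers at baseUrl.
// Assumes the default port 11434; adjust if your Ollama runs elsewhere.
const OLLAMA_URL = "http://localhost:11434";

async function ollamaIsRunning(baseUrl: string = OLLAMA_URL): Promise<boolean> {
  try {
    // /api/tags lists locally available models; any OK response means the server is up.
    const res = await fetch(`${baseUrl}/api/tags`);
    return res.ok;
  } catch {
    // Connection refused or network error: Ollama is not reachable.
    return false;
  }
}
```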
To get started:
- Clone this repository
- Install dependencies: `npm install` (or `yarn install`)
- Start the development server: `npm start` (or `yarn start`)
- Open http://localhost:3000 in your browser
You can also run the application using Docker:
- Build the image: `docker build -t flufy-app .`
- Run the container: `docker run -p 80:80 -e REACT_APP_API_URL=your_api_url flufy-app`

The container serves the application on port 80. Make sure to replace `your_api_url` with your actual API URL.
Usage:
- Make sure Ollama is running locally
- Type your message in the input field
- Press Enter or click the send button to send your message
- The AI will respond in the chat window
- Toggle between dark and light mode using the theme button in the top right
The application is configured to use the `llama2` model by default. You can change this by modifying the `model` parameter in the `handleSend` function in `src/App.tsx`.
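The relevant request might look like the following sketch. This is not the actual code from `src/App.tsx`: `buildGeneratePayload` is an illustrative helper, and the real `handleSend` may differ in shape, but Ollama's `/api/generate` endpoint does take the `model` name in its JSON body, which is the parameter the note above refers to.

```typescript
// Illustrative sketch of sending a prompt to Ollama; the app's real handleSend
// in src/App.tsx may be structured differently.
const DEFAULT_MODEL = "llama2"; // change this to switch models

interface GeneratePayload {
  model: string;
  prompt: string;
  stream: boolean;
}

// Builds the JSON body for Ollama's /api/generate endpoint.
function buildGeneratePayload(
  prompt: string,
  model: string = DEFAULT_MODEL
): GeneratePayload {
  return { model, prompt, stream: false };
}

async function handleSend(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGeneratePayload(prompt)),
  });
  const data = await res.json();
  // With stream: false, /api/generate returns the full text in `response`.
  return data.response;
}
```

Swapping `DEFAULT_MODEL` for another locally pulled model (e.g. one listed by `ollama list`) is all the change described above amounts to.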
- Built with React 18
- TypeScript for type safety
- Material-UI for components
- Emotion for styling
MIT