# Geeky Ghost Writer

A powerful book generation application that leverages Ollama's local LLMs to create complete books through an intuitive Gradio interface. This tool helps writers outline, generate, and edit full-length books with minimal effort.

## Features
- **Complete Book Generation**: Create entire books from a single prompt with chapter-by-chapter generation
- **Local LLM Integration**: Powered by Ollama for private, local AI inference
- **User-Friendly Interface**: Intuitive Gradio UI with tabs for each stage of book creation
- **Project Management**: Create, save, and load book projects with automatic UI synchronization
- **World-Building Tools**: Define and manage world elements for consistent storytelling
- **Character Management**: Create and track characters throughout your narrative
- **Advanced Editing**: Edit individual chapters and perform find-and-replace across your book
- **Outline Generation**: Create detailed chapter outlines with key events, character developments, settings, and tone
- **Progress Tracking**: Monitor book generation progress with time estimates
- **Export Options**: Combine chapters and export your book in text format
## Requirements

- Python 3.8 or higher
- Ollama installed and running locally
- At least one model loaded in Ollama (e.g., mistral, llama2, mixtral)
- Windows, macOS, or Linux operating system
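Before launching, you can confirm that a local Ollama server is reachable and see which models it already has. The sketch below uses Ollama's REST endpoint `/api/tags` on the default port 11434; the helper names are illustrative, not part of this app.

```python
# Sketch: check that a local Ollama server is reachable and list its models.
# Assumes Ollama's default port (11434); helper names are illustrative.
import requests

OLLAMA_URL = "http://localhost:11434"

def parse_model_names(payload: dict) -> list:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list:
    """Return the names of models already pulled into Ollama."""
    resp = requests.get(f"{base_url}/api/tags", timeout=5)
    resp.raise_for_status()
    return parse_model_names(resp.json())

if __name__ == "__main__":
    try:
        print("Installed models:", list_local_models())
    except requests.exceptions.ConnectionError:
        print("Ollama is not running - start it with `ollama serve`.")
```

If the model you want is missing, pull it first with `ollama pull <model>`.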
## Quick Start (Windows)

- Download or clone this repository
- Run `simple_setup.bat`, which will:
  - Check for Python and Ollama installation
  - Create a virtual environment
  - Install required dependencies
  - Start the application
## Manual Installation

1. Ensure Python 3.8+ and Ollama are installed

2. Clone the repository:

   ```bash
   git clone https://github.com/GeekyGhost/Geeky-Ghost-Writer.git
   cd Geeky-Ghost-Writer
   ```

3. Create and activate a virtual environment:

   ```bash
   # Windows
   python -m venv venv
   venv\Scripts\activate

   # macOS/Linux
   python -m venv venv
   source venv/bin/activate
   ```

4. Install dependencies (quote the version specifiers so the shell does not treat `>` as redirection):

   ```bash
   pip install "gradio>=4.0.0" "requests>=2.25.0" "markdown>=3.3.0" "pyyaml>=6.0"
   ```

5. Run the application:

   ```bash
   python simple_ollama_app.py
   ```
## Usage

The application interface is divided into several tabs:
### Project Tab

- Create new book projects with titles and writing style descriptions
- Load and manage existing projects
- View current project information
### World & Characters Tab

- Create detailed world elements (geography, magic systems, technology, etc.)
- Add and manage character profiles
- Edit or delete world-building elements and characters
### Generation Tab

- Enter your story prompt
- Select the number of chapters and Ollama model to use
- Generate a detailed chapter outline
- Generate the full book chapter by chapter
- Monitor generation progress
- Combine chapters into a complete book
### Editing Tab

- Load and edit individual chapters
- Perform find-and-replace operations across the entire book
- Save chapter changes
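Under the hood, a book-wide find-and-replace only needs to walk the chapter files on disk. A minimal sketch, assuming the `chapter_*.txt` layout described in the output structure (the function name is illustrative, not the app's actual API):

```python
# Minimal sketch of book-wide find-and-replace over chapter files.
# Assumes the chapter_*.txt layout used by the app; names are illustrative.
from pathlib import Path

def find_and_replace(project_dir, find, replace):
    """Replace every occurrence of `find` in each chapter file.

    Returns the total number of substitutions made across the book.
    """
    total = 0
    for path in sorted(Path(project_dir).glob("chapter_*.txt")):
        text = path.read_text(encoding="utf-8")
        count = text.count(find)
        if count:
            path.write_text(text.replace(find, replace), encoding="utf-8")
            total += count
    return total
```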
### Files & Export Tab

- Browse and view generated files
- Export your book in text format (with more formats planned for future updates)
## Recommended Models

For best results, use one of these Ollama models:

- `mistral` (balanced performance and quality)
- `mixtral` (highest quality but slower)
- `llama2` (good all-around performance)
- `phi` (faster, lighter model)
- `gemma` (Google's efficient model)
> **Model Management Tip**: For an easy way to download, manage, and configure your Ollama models, check out Little Geeky's Learning UI, a companion tool that provides a user-friendly interface for Ollama model management.
## Output Structure

Generated content is saved in the `book_output` directory with this structure:

```
book_output/
└── YourBookTitle_UUID_Timestamp/
    ├── book_metadata.json
    ├── outline.txt
    ├── chapter_01.txt
    ├── chapter_02.txt
    └── full_book.txt
```
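The `full_book.txt` file is simply the chapters concatenated in order. A minimal sketch of that combining step, assuming the chapter naming shown above (function names are illustrative, not the app's actual API):

```python
# Sketch: combine chapter files into full_book.txt, assuming the
# chapter_NN.txt naming above. Function names are illustrative.
from pathlib import Path

def combine_chapters(project_dir):
    """Concatenate chapter_*.txt files in numeric order into one string."""
    chapters = sorted(Path(project_dir).glob("chapter_*.txt"))
    return "\n\n".join(ch.read_text(encoding="utf-8") for ch in chapters)

def export_full_book(project_dir):
    """Write the combined text to full_book.txt and return its path."""
    out = Path(project_dir) / "full_book.txt"
    out.write_text(combine_chapters(project_dir), encoding="utf-8")
    return out
```

Zero-padded names (`chapter_01`, `chapter_02`, …) are what keep the lexicographic sort in chapter order.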
## How It Works

The application uses a multi-step process:
1. **Project Creation**: Set up book details, world, and characters
2. **Outline Generation**: Create chapter outlines with key events
3. **Chapter Generation**: Generate each chapter based on the outline
4. **Editing**: Refine and improve content
5. **Export**: Combine chapters and export the book
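The chapter-generation step comes down to one request per chapter against Ollama's `/api/generate` endpoint. A hedged sketch (the prompt template and function names are illustrative assumptions, not the app's actual wording):

```python
# Sketch of a single chapter-generation call via Ollama's /api/generate.
# The prompt template and function names are illustrative assumptions.
import requests

def build_chapter_prompt(outline, chapter_num, style):
    """Assemble a simple per-chapter prompt from the outline and style."""
    return (
        f"Write chapter {chapter_num} of a book in this style: {style}.\n"
        f"Follow this outline:\n{outline}"
    )

def generate_chapter(model, prompt, base_url="http://localhost:11434"):
    """Send a non-streaming generation request and return the text."""
    resp = requests.post(
        f"{base_url}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=600,  # large models can take minutes per chapter
    )
    resp.raise_for_status()
    return resp.json()["response"]
```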
## Architecture

The application consists of:
- Gradio-based UI with multiple interactive tabs
- Direct integration with the Ollama API for text generation
- Project management with metadata persistence
- Thread-based chapter generation to prevent UI freezing
- Progress tracking and estimation
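The thread-based generation mentioned above can be sketched as a worker thread that pushes finished chapters onto a queue the UI polls, so the Gradio event loop never blocks. Names here are illustrative, not the app's actual API.

```python
# Sketch of thread-based generation: a worker thread produces chapters
# onto a queue the UI can poll, keeping the interface responsive.
# Names are illustrative, not the app's actual API.
import queue
import threading

def generate_in_background(chapter_ids, generate_fn):
    """Run generate_fn(chapter_id) for each id on a daemon thread.

    Results arrive on the returned queue as (chapter_id, text) tuples,
    followed by a None sentinel when generation is complete.
    """
    results = queue.Queue()

    def worker():
        for cid in chapter_ids:
            results.put((cid, generate_fn(cid)))
        results.put(None)  # sentinel: all chapters done

    threading.Thread(target=worker, daemon=True).start()
    return results
```

Because results carry their chapter id, the UI can also derive progress and time estimates from how many items have arrived so far.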
## Limitations

- Generation time increases with chapter count
- Quality depends on the Ollama model used
- May require significant system resources for larger models
- Currently only exports to text format
## Planned Features

- Additional export formats (PDF, EPUB)
- Text-to-speech integration
- Illustration generation
- Enhanced editing tools
- Custom writing styles and templates
## License

This project is licensed under the MIT License; see the LICENSE file for details.
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
## Related Projects

- **Little Geeky's Learning UI**: A companion tool for managing your Ollama models through a user-friendly interface
*Geeky Ghost Writer - Let AI help craft your next literary masterpiece*