This guide covers detailed installation instructions for DeepWiki-Open, including system requirements, dependencies, and various setup options.

System Requirements

  • OS: Linux, macOS, or Windows 10+
  • Python: 3.8 or higher
  • Node.js: 16.0 or higher
  • Memory: 4GB RAM minimum
  • Storage: 2GB free space
  • Network: Internet connection for AI API calls
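The version minimums above can be checked mechanically. A minimal sketch, assuming a POSIX shell with GNU `sort -V` available (the helper name `version_ge` is illustrative, not part of DeepWiki-Open):

```shell
# version_ge ACTUAL MINIMUM — succeed when ACTUAL >= MINIMUM (version-aware compare)
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Example: compare the installed Python against the 3.8 minimum
py=$(python3 --version 2>&1 | awk '{print $2}')
if version_ge "$py" 3.8; then
  echo "Python $py meets the 3.8+ requirement"
else
  echo "Python $py is too old"
fi
```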

Prerequisites

Before installing DeepWiki-Open, ensure you have:
1. Python Installation

# Using Homebrew
brew install python@3.10

# Or download from python.org
# https://www.python.org/downloads/
Verify Python installation: python --version should show 3.8+
2. Node.js Installation

# Using Homebrew
brew install node

# Or using Node Version Manager
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
nvm install 18
nvm use 18
Verify Node.js installation: node --version should show 16.0+
3. Git Installation

# Git comes with Xcode Command Line Tools
xcode-select --install

# Or using Homebrew
brew install git
Verify Git installation: git --version
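All three prerequisites can be confirmed in one pass. A small illustrative helper (not part of DeepWiki-Open itself):

```shell
# check_prereqs CMD... — print each command's version, or NOT FOUND
check_prereqs() {
  for cmd in "$@"; do
    if command -v "$cmd" >/dev/null 2>&1; then
      printf '%s: %s\n' "$cmd" "$("$cmd" --version 2>&1 | head -n1)"
    else
      printf '%s: NOT FOUND\n' "$cmd"
    fi
  done
}

check_prereqs python3 node git
```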

Installation Methods

Choose the installation method that best fits your needs.

Verification

After installation, verify that everything is working correctly:
1. Check Backend Health

# Test backend API
curl http://localhost:8001/health

# Or visit in browser
open http://localhost:8001/docs
Expected response:
{"status": "healthy", "version": "0.1.0"}
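If you want to script this check (in CI, for instance), the documented response shape can be inspected with standard tools; a sketch using `grep` on the JSON body:

```shell
# check_health BODY — succeed when the JSON body reports status "healthy"
check_health() {
  printf '%s' "$1" | grep -q '"status"[[:space:]]*:[[:space:]]*"healthy"'
}

# Against a running backend:
# check_health "$(curl -s http://localhost:8001/health)" && echo "backend OK"
```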
2. Check Frontend

Visit http://localhost:3000 in your browser. You should see:
  • DeepWiki-Open interface
  • Repository URL input field
  • Model selection dropdown
  • Generate Wiki button
3. Test Full Workflow

  1. Enter a small public repository (e.g., https://github.com/octocat/Hello-World)
  2. Select an AI model provider
  3. Click “Generate Wiki”
  4. Verify wiki generation completes successfully
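Before running the workflow above, it can help to confirm both services are actually accepting connections. A minimal polling helper — a sketch that assumes bash, since it relies on bash's `/dev/tcp` redirection:

```shell
# wait_for_port HOST PORT TIMEOUT_SECONDS — poll until the TCP port accepts connections
wait_for_port() {
  host=$1; port=$2; timeout=${3:-30}; elapsed=0
  while [ "$elapsed" -lt "$timeout" ]; do
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      return 0
    fi
    sleep 1; elapsed=$((elapsed + 1))
  done
  return 1
}

# wait_for_port localhost 8001 30 && wait_for_port localhost 3000 30 && echo "both services up"
```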

Optional Dependencies

To run AI models locally using Ollama:
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Or using Homebrew
brew install ollama

# Start Ollama service
ollama serve

# Pull a model
ollama pull llama3:8b
Update your .env:
OLLAMA_HOST=http://localhost:11434
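To confirm the value is picked up, a dotenv-style file can be read with standard tools. This helper is purely illustrative — DeepWiki-Open loads `.env` through its own configuration code:

```shell
# read_env_var FILE KEY — print KEY's value from a dotenv-style file
read_env_var() {
  grep -E "^$2=" "$1" | head -n1 | cut -d= -f2-
}

# read_env_var .env OLLAMA_HOST
# prints http://localhost:11434 given the line above
```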
For improved performance in production, install Redis:
brew install redis
brew services start redis
Add to .env:
REDIS_URL=redis://localhost:6379
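A quick reachability check: `redis-cli ping` should answer PONG. If other tooling needs the host and port separately, shell parameter expansion is enough; an illustrative helper for the simple `redis://host:port` form (no auth or database suffix):

```shell
# redis_host_port URL — split "redis://host:port" into "host port"
redis_host_port() {
  rest=${1#redis://}
  printf '%s %s\n' "${rest%%:*}" "${rest##*:}"
}

# redis_host_port redis://localhost:6379   # -> localhost 6379
```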

Troubleshooting Installation

Problem: python: command not found
Solutions:
# Check if python3 is available
python3 --version

# Create alias (add to ~/.bashrc or ~/.zshrc)
alias python=python3

# Or install Python properly
sudo apt install python-is-python3  # Ubuntu/Debian
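Scripts that must work on both layouts can fall back at runtime instead of relying on the alias; a small portable sketch:

```shell
# Prefer `python`, fall back to `python3`; fail loudly if neither exists
PY=$(command -v python || command -v python3) || {
  echo "No Python interpreter found" >&2
  exit 1
}
"$PY" --version
```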
Problem: npm ERR! EACCES: permission denied
Solutions:
# Use Node Version Manager (recommended)
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
nvm install 18
nvm use 18

# Or fix npm permissions
sudo chown -R $(whoami) ~/.npm
Problem: Cannot connect to Docker daemon
Solutions:
# Start Docker service
sudo systemctl start docker

# Add user to docker group
sudo usermod -aG docker $USER
# Then log out and back in

# Or run with sudo (not recommended for development)
sudo docker-compose up
Problem: Port already in use
Solutions:
# Find what's using the port
lsof -i :3000  # or :8001

# Kill the process
kill <PID>  # use kill -9 <PID> only if the process won't exit

# Or use different ports in .env
PORT=8002
# Update frontend to use new backend port
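The find-and-kill steps can be combined into a hypothetical convenience function (requires `lsof`; the name `free_port` is this guide's own, not a DeepWiki-Open command):

```shell
# free_port PORT — terminate the process(es) listening on PORT, if any
free_port() {
  pids=$(lsof -ti ":$1" 2>/dev/null)
  [ -n "$pids" ] || return 1
  kill $pids  # intentionally unquoted: lsof prints one PID per line
}

# free_port 3000 && echo "port 3000 freed"
```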

Next Steps

Environment Configuration

Configure API keys and environment settings

Model Providers Setup

Set up AI model providers for documentation generation

Generate Your First Wiki

Create your first repository wiki

Production Deployment

Deploy DeepWiki for production use