A modern, high-performance file search system with semantic understanding, real-time indexing, and a native desktop interface.
- Go 1.21+: Backend development
- Node.js 16+: Frontend development
- Wails v2: Desktop application framework
- Podman (preferred) or Docker: Database container
```bash
# 1. Install Wails
go install github.com/wailsapp/wails/v2/cmd/wails@latest

# 2. Clone and setup
git clone <repository-url>
cd file-search

# 3. Start everything
make install   # Install dependencies
make run-all   # Start database, backend, and desktop app
```

The desktop application will open automatically and connect to the backend service running on `localhost:8080`.
```
┌──────────────────────┐      HTTP API      ┌──────────────────────────┐
│                      │                    │                          │
│   Desktop App        │ ─────────────────► │   Backend Service        │
│   (Wails + React)    │                    │   (Go + PostgreSQL)      │
│                      │                    │                          │
└──────────────────────┘                    └──────────────────────────┘
       Native UI                              API Server + Database
```
- Location: `file-search-system/`
- Technology: Go + PostgreSQL + pgVector
- Features: File indexing, semantic search, real-time monitoring
- API: REST endpoints on `localhost:8080`
- Location: `file-search-desktop/`
- Technology: Wails + React + TypeScript
- Features: Native UI, cross-platform, real-time updates
- Connection: HTTP client to backend API
```bash
make install        # Install all dependencies
make run-all        # Start database, backend, and desktop app
make stop-all       # Stop all services
make clean-all      # Clean all build artifacts
make status         # Show status of all services
```

```bash
make run-backend    # Start backend service (with database)
make stop-backend   # Stop backend service and database
make build-backend  # Build backend binary
make logs-backend   # Show backend logs
```

```bash
make run-frontend   # Build and run desktop app
make dev-frontend   # Run desktop app in development mode
make build-frontend # Build desktop app for production
make clean-frontend # Clean frontend build artifacts
```

```bash
make db-start       # Start PostgreSQL database
make db-stop        # Stop database
make db-init        # Initialize database schema
make db-reset       # Reset database (WARNING: destroys data)
```

```bash
make ollama-install # Install Ollama if not already installed
make ollama-start   # Start Ollama service
make ollama-models  # Pull all required models (nomic-embed-text ~274MB)
make ollama-status  # Check Ollama service and models status
make ollama-list    # List installed models
make ollama-stop    # Stop Ollama service
```

```bash
make dev-all        # Start all services in development mode
make test           # Run all tests
make lint           # Run linters
make format         # Format code
```

If you prefer to set up components individually:
```bash
# Start PostgreSQL with pgVector
cd file-search-system
podman-compose up -d   # or docker-compose up -d

# Initialize database schema
go run cmd/server/main.go -init-db
```

```bash
# Configure environment
cd file-search-system
cp .env.example .env
# Edit .env with your settings

# Start backend service
go run cmd/server/main.go
# Backend API available at http://localhost:8080
```

```bash
# Build and run desktop app
cd file-search-desktop
wails build
open build/bin/file-search-desktop.app   # macOS
```

- Hybrid Search: Vector similarity + full-text search
- AI-Powered: Uses Ollama with nomic-embed-text for semantic understanding
- Smart Filters: File type, date, size, path patterns
- Real-time Results: Instant search with suggestions
- Contextual Search: Understands meaning, not just keywords
- Real-time Monitoring: Automatic file system watching
- Incremental Updates: Only processes changed files
- Multi-format Support: 25+ programming languages and document types
- Smart Chunking: Document structure-aware processing
- Cross-platform: Native apps for macOS, Windows, Linux
- Modern UI: React with Material Design components
- Live Dashboard: Real-time system metrics and controls
- Offline Capable: Graceful degradation when backend unavailable
- Resource Monitoring: CPU, memory, disk usage tracking
- Auto-throttling: Adaptive rate limiting based on system load
- Configuration: Comprehensive settings management
- Logging: Structured logging with multiple levels
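One way the auto-throttling described above could work is sketched below. The threshold and base rate mirror `PERFORMANCE_CPU_THRESHOLD` and `PERFORMANCE_FILES_PER_MINUTE` from the configuration section, but the policy itself (proportional scale-down above the threshold) is an assumption, not the backend's actual algorithm.

```go
package main

import "fmt"

// throttledRate reduces the indexing rate when CPU load exceeds the
// configured threshold. Illustrative policy: above the threshold, the
// rate is scaled by the remaining CPU headroom.
func throttledRate(baseFilesPerMinute int, cpuPercent, cpuThreshold float64) int {
	if cpuPercent <= cpuThreshold {
		return baseFilesPerMinute
	}
	// Fraction of the over-threshold band that is still free.
	headroom := (100 - cpuPercent) / (100 - cpuThreshold)
	if headroom < 0 {
		headroom = 0
	}
	return int(float64(baseFilesPerMinute) * headroom)
}

func main() {
	// Defaults from .env: 60 files/min, throttle above 90% CPU.
	fmt.Println(throttledRate(60, 50, 90)) // under threshold: full rate
	fmt.Println(throttledRate(60, 95, 90)) // half the headroom is gone
}
```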
```
file-search/
├── README.md                    # This file
├── Makefile                     # Build and run automation
├── ARCHITECTURE.md              # Detailed architecture documentation
├── IMPLEMENTATION_STATUS.md     # Development progress
│
├── file-search-system/          # Backend API service
│   ├── cmd/server/              # Application entry point
│   ├── internal/                # Core business logic
│   │   ├── api/                 # REST API handlers
│   │   ├── search/              # Hybrid search engine
│   │   ├── service/             # Background services
│   │   └── ...
│   ├── pkg/                     # Reusable packages
│   ├── scripts/                 # Database schema
│   ├── docker-compose.yml       # Database container
│   └── .env.example             # Configuration template
│
└── file-search-desktop/         # Desktop application
    ├── app.go                   # Wails app backend
    ├── api_client.go            # HTTP client for backend
    ├── frontend/                # React frontend
    ├── build/                   # Built application
    └── wails.json               # Wails configuration
```
Edit `file-search-system/.env`:

```bash
# Database
DATABASE_URL=postgresql://username:password@localhost:5432/database_name?sslmode=disable

# Indexing
INDEXING_PATHS=~/Documents,~/Downloads
INDEXING_IGNORE_PATTERNS=*.tmp,node_modules/**,.git/**

# Performance
PERFORMANCE_CPU_THRESHOLD=90
PERFORMANCE_FILES_PER_MINUTE=60

# Search Weights
SEARCH_VECTOR_WEIGHT=0.6
SEARCH_BM25_WEIGHT=0.3

# Ollama (for embeddings)
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=nomic-embed-text
```

The desktop app automatically connects to `localhost:8080`. To change this, modify `file-search-desktop/app.go`:
```go
func NewApp() *App {
    return &App{
        apiClient: NewAPIClient("http://your-backend-url:8080"),
    }
}
```

```bash
# Check prerequisites
go version       # Should be 1.21+
node --version   # Should be 16+
wails doctor     # Check Wails installation

# Check services
make status      # Show status of all services
```

```bash
# Check database
make db-start      # Ensure database is running
make db-init       # Reinitialize if needed

# Check backend logs
make logs-backend  # View backend service logs

# Test API
curl http://localhost:8080/api/v1/status
```

```bash
# Rebuild desktop app
make clean-frontend
make build-frontend

# Check backend connection
make status        # Ensure backend is running
```

```bash
# Check Ollama service and models
make ollama-status # Check if Ollama and models are available
make ollama-models # Install required embedding models
make ollama-logs   # Check Ollama logs for errors

# Test search manually
curl -X POST http://localhost:8080/api/v1/search \
  -H "Content-Type: application/json" \
  -d '{"query":"test","limit":5}'
```

```bash
# Check system resources
make status

# Adjust performance settings in file-search-system/.env:
PERFORMANCE_CPU_THRESHOLD=70
PERFORMANCE_FILES_PER_MINUTE=30

# Restart services
make restart-all
```

```bash
# Reset database (WARNING: destroys all data)
make db-reset
make db-init

# Check database connection
make db-status
```

- OS: macOS 10.13+, Windows 10+, or modern Linux
- CPU: 2 cores, 2.0 GHz
- RAM: 4GB (8GB recommended)
- Disk: 2GB free space + storage for indexed files
- Network: Internet connection for initial setup
- OS: Latest macOS, Windows 11, or Ubuntu 22.04+
- CPU: 4+ cores, 3.0 GHz+
- RAM: 8GB+
- Disk: SSD with 10GB+ free space
- Network: High-speed internet for Ollama model downloads
```bash
# Install development dependencies
make dev-deps

# Start development environment
make dev-all        # All services in development mode

# Run tests
make test           # All tests
make test-backend   # Backend tests only
make test-frontend  # Frontend tests only

# Code quality
make lint           # Run all linters
make format         # Format all code
```

- Backend Changes: Modify files in `file-search-system/internal/`
- Frontend Changes: Modify files in `file-search-desktop/frontend/src/`
- API Changes: Update both backend handlers and frontend API client
- Database Changes: Add migrations to `file-search-system/scripts/`
MIT License - see LICENSE file for details.
- pgVector: PostgreSQL extension for vector similarity search
- Ollama: Local LLM inference for embeddings
- Wails: Native cross-platform desktop framework
- Go: Efficient backend development language
- React: Modern frontend framework
Happy searching!