A distributed system for processing and analyzing large log files in real-time, built with Go and Next.js.
This project consists of multiple microservices working together to handle log file processing, analysis, and visualization:
- Backend Service: REST API and WebSocket server for file management and real-time updates
- Frontend Service: Next.js web application for file upload and result visualization
- Log Generator Service: Utility service for generating test log files
- Log Processor Service: Service for processing and analyzing log files
- Redis: For real-time communication via Pub/Sub and caching
- PostgreSQL: For storing processing results and metadata
- Docker and Docker Compose
- Go 1.21 or later
- Node.js 18 or later
- pnpm (for frontend development)
```
.
├── backend-service/          # Go backend API service
├── frontend-service/         # Next.js frontend application
├── log-generator-service/    # Log file generator utility
├── log-processor-service/    # Log processing service
├── uploads/                  # Shared volume for log files
└── docker-compose.yml        # Docker Compose configuration
```

The backend service provides:
- REST API endpoints for file management
- WebSocket server for real-time updates
- JWT-based authentication via Supabase
- File upload handling
- Integration with Redis and PostgreSQL
- Communication between services via Redis Pub/Sub
- CORS configuration for frontend communication
The frontend service features:
- Modern UI built with Next.js
- Real-time updates via WebSocket
- File upload interface
- Results visualization
- Supabase Authentication:
- Email/Password authentication
- GitHub OAuth
- Protected routes and API endpoints
- User session management
A utility service that generates test log files with:
- Random log levels (DEBUG, INFO, WARN, ERROR, FATAL)
- Timestamped entries
- Optional JSON payloads
- Configurable file sizes
Processes log files and provides:
- Log analysis
- Pattern detection
- Real-time processing
- Result storage in PostgreSQL
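A minimal form of the analysis step is tallying log levels per file. The sketch below assumes lines carry a bracketed level (matching the generator's `DEBUG`–`FATAL` set); the regex and line format are illustrative assumptions, not the processor's actual parser.

```go
package main

import (
	"fmt"
	"regexp"
)

// levelRe matches a bracketed log level; assumes lines like
// "2024-01-01T00:00:00Z [ERROR] connection refused".
var levelRe = regexp.MustCompile(`\[(DEBUG|INFO|WARN|ERROR|FATAL)\]`)

// countLevels tallies how often each log level appears; lines
// without a recognizable level are skipped.
func countLevels(lines []string) map[string]int {
	counts := make(map[string]int)
	for _, line := range lines {
		if m := levelRe.FindStringSubmatch(line); m != nil {
			counts[m[1]]++
		}
	}
	return counts
}

func main() {
	lines := []string{
		"2024-01-01T00:00:00Z [INFO] service started",
		"2024-01-01T00:00:01Z [ERROR] connection refused",
		"2024-01-01T00:00:02Z [ERROR] connection refused",
	}
	fmt.Println(countLevels(lines))
}
```

The resulting per-level counts are the kind of summary that would be persisted to PostgreSQL and pushed to the frontend over WebSocket.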
- Clone the repository:

```bash
git clone https://github.com/ijasmoopan/log-file-processor.git
cd intucloud-task
```

- Set up environment variables:

```bash
cp .env.example .env
cp backend-service/.env.local.example backend-service/.env.local
cp frontend-service/.env.local.example frontend-service/.env.local
cp log-processor-service/.env.local.example log-processor-service/.env.local
```

- Start the services using Docker Compose:

```bash
docker-compose up -d
```

- Access the application:
- Frontend: http://localhost:3000
- Backend API: http://localhost:8080
- PostgreSQL: localhost:5432
- Redis: localhost:6379
- Backend Service:

```bash
cd backend-service
go run main.go
```

- Frontend Service:

```bash
cd frontend-service
pnpm install
pnpm dev
```

- Log Processor Service:

```bash
cd log-processor-service
go run cmd/processor/main.go
```

- Log Generator Service:

```bash
cd log-generator-service
go run main.go
```

Key environment variables needed:
Backend service:

- `DB_HOST`: PostgreSQL host
- `DB_PORT`: PostgreSQL port
- `DB_USER`: Database user
- `DB_PASSWORD`: Database password
- `DB_NAME`: Database name
- `REDIS_HOST`: Redis host
- `REDIS_PORT`: Redis port
- `SUPABASE_URL`: Supabase URL
- `SUPABASE_KEY`: Supabase API key

Frontend service:

- `NEXT_PUBLIC_BACKEND_URL`: Backend service URL
- `NEXT_PUBLIC_SUPABASE_URL`: Supabase URL
- `NEXT_PUBLIC_SUPABASE_ANON_KEY`: Supabase anonymous key

Log processor service:

- `REDIS_HOST`: Redis host
- `REDIS_PORT`: Redis port
- `POST /api/v1/upload`: Upload log files
- `GET /api/v1/files`: List uploaded files
- `POST /api/v1/process`: Process uploaded files
- `GET /api/v1/ws`: WebSocket endpoint for real-time updates
- `GET /api/v1/results`: Get processing results
- `GET /api/v1/results/:id`: Get result by ID
- `GET /api/v1/results/filename/:filename`: Get result by filename
The project includes Docker configurations for all services:
- Each service has its own `Dockerfile`
- `docker-compose.yml` orchestrates all services
- Shared volumes for file storage and database persistence
- Network isolation using Docker networks
- Fork the repository
- Create your feature branch
- Commit your changes
- Push to the branch
- Create a new Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.