amarcoder01/NetPulse-showcase


🚀 NetPulse v2.0

⚡ Professional Network Performance Testing Platform

Built with React • TypeScript • Node.js • Cloudflare Edge


Accurate. Fast. Reliable.
Measure your internet speed with precision using advanced TCP analysis and real-time streaming



Code Availability

This repository is a sanitized showcase version of a production SaaS system.

Core business logic, sensitive components, and full implementation details are intentionally excluded to protect intellectual property and maintain system security.

The goal of this repository is to demonstrate:

  • System architecture
  • Code structure
  • Engineering practices

📩 Full codebase access can be provided upon request for technical evaluation.

Contact: amar01pawar80@gmail.com

🎯 Vision & Overview

The Problem

Traditional speed tests often provide inaccurate results due to:

  • TCP slow-start contamination
  • No bufferbloat detection
  • Single-threaded testing bottlenecks
  • Server-side performance limitations
  • Poor protocol overhead handling

Our Solution

NetPulse delivers 99% accurate measurements through:

  • Advanced TCP Analysis - Dynamic grace periods exclude TCP slow-start phase
  • Multi-Core Processing - Cluster mode utilizes all CPU cores
  • Edge Computing - Cloudflare deployment for nearest-edge routing
  • Protocol Intelligence - XHR and WebSocket support with overhead detection
  • Real-Time Streaming - Chunked transfer with backpressure handling

What It Is

A production-grade speed testing platform featuring:

  • 🎨 Modern React UI with Tailwind CSS
  • ⚡ High-performance Node.js backend (Express + Cluster)
  • 🔵 Cloudflare Pages Functions for edge deployment
  • 🐳 Docker & Kubernetes ready
  • 📊 Real-time charts and PDF report generation

✨ Key Features

🎯 Core Testing Capabilities

| Feature | Description | Technology |
| --- | --- | --- |
| Download Speed Test | Multi-stream HTTP download with chunked encoding | XHR / WebSocket |
| Upload Speed Test | Binary stream upload with real-time byte counting | POST / WebSocket |
| Ping & Jitter | 5-sample averaging with high-resolution timers | hrtime() |
| Bufferbloat Analysis | A-F rating based on latency under load | M-Labs algorithm |
| Packet Loss Detection | Sent/received packet tracking | Custom implementation |
| Protocol Overhead | Automatic detection and compensation | HTTP/TCP/IP factor |

🚀 Advanced Features

```mermaid
graph LR
    A[User Interface] --> B[Test Engine]
    B --> C[Web Worker]
    C --> D[XHR Protocol]
    C --> E[WebSocket Protocol]
    D --> F[Backend API]
    E --> G[WebSocket Server]
    F --> H[Cluster Mode]
    G --> H
    H --> I[Multi-Core CPU]
```

Intelligent Testing:

  • Dynamic Grace Period - Auto-adjusts 1-3s based on connection speed
  • TCP Slow-Start Exclusion - Removes the initial slow-start ramp (typically the first 1-3 seconds) from the calculation
  • Goodput Reporting - Reports application-level throughput
  • Parallel Connections - 4 concurrent streams by default
  • Abort Support - Clean test cancellation via AbortController
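The abort path above can be sketched with the standard AbortController API. Note that `createTestController` and `runPhase` are illustrative names for this sketch, not NetPulse's actual interface.

```javascript
// Minimal cancellation sketch using the standard AbortController API.
// createTestController/runPhase are illustrative names, not NetPulse's API.
function createTestController() {
  const controller = new AbortController();
  return {
    signal: controller.signal,        // pass to fetch()/XHR wrappers
    abort: () => controller.abort(),  // user clicked "Stop"
    async runPhase(work) {
      if (controller.signal.aborted) throw new Error('Test aborted');
      return work(controller.signal); // each phase observes the shared signal
    },
  };
}
```

Passing the same `signal` to every in-flight request lets a single `abort()` tear down all active streams cleanly.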

Developer Experience:

  • TypeScript Strict Mode - Full type safety
  • ESLint + Prettier - Code quality enforced
  • Hot Module Replacement - Vite dev server
  • Health Checks - Built-in monitoring endpoints
  • Comprehensive Logging - Morgan + custom metrics

🧠 Technology Stack

Frontend Architecture

Modern React SPA with TypeScript & Vite

```mermaid
graph BT
    A[User Interface] --> B[React Components]
    B --> C[State Management]
    C --> D[Web Workers]
    D --> E[Speed Test Engine]
    B --> F[UI Libraries]
    F --> G[Tailwind CSS]
    F --> H[Framer Motion]
    E --> I[Data Visualization]
    I --> J[Recharts]
    I --> K[jsPDF Reports]
```

Core Stack:

| Layer | Technology | Version | Purpose |
| --- | --- | --- | --- |
| Framework | React | 18.3.1 | Component-based UI |
| Language | TypeScript | 5.5.3 | Type safety & DX |
| Build Tool | Vite | 7.0.6 | Fast HMR & bundling |
| Styling | Tailwind CSS | 3.4.1 | Utility-first CSS |
| Animation | Framer Motion | Latest | Smooth animations |
| Charts | Recharts | Latest | Real-time graphs |
| Reporting | jsPDF + html2canvas | Latest | PDF generation |
| Notifications | React Hot Toast | Latest | User feedback |

Key Features:

  • Type-Safe Development - Full TypeScript coverage
  • Hot Module Replacement - Instant feedback loop
  • Code Splitting - Optimized bundle sizes
  • Tree Shaking - Eliminate dead code
  • Responsive Design - Mobile-first approach

Backend Architecture

High-Performance Node.js with Cluster Mode

```mermaid
graph TB
    A[HTTP Requests] --> B[Express Server]
    B --> C[Middleware Layer]
    C --> D[Compression]
    C --> E[Helmet Security]
    C --> F[Rate Limiter]
    C --> G[Morgan Logger]
    B --> H[Cluster Master]
    H --> I[Worker 1]
    H --> J[Worker 2]
    H --> K[Worker N]
    I --> L[download endpoint]
    I --> M[upload endpoint]
    I --> N[WebSocket server]
    J --> L
    J --> M
    J --> N
    K --> L
    K --> M
    K --> N
```

Core Stack:

| Layer | Technology | Version | Purpose |
| --- | --- | --- | --- |
| Runtime | Node.js | 20+ | JavaScript runtime |
| Framework | Express | 4.18.2 | Web server |
| Clustering | cluster module | Built-in | Multi-process |
| WebSocket | ws | 8.18.2 | Real-time protocol |
| Security | Helmet | Latest | Security headers |
| Compression | compression | Latest | Gzip responses |
| Rate Limiting | express-rate-limit | Latest | DDoS protection |
| Logging | morgan | Latest | HTTP logger |
| CORS | cors | Latest | Cross-origin support |

Performance Optimizations:

  • Multi-Core Processing - Utilizes all CPU cores
  • Stream-Based Transfers - No buffering overhead
  • Backpressure Handling - Memory-efficient streaming
  • Pre-Generated Buffers - Reduced crypto calls
  • Chunked Encoding - Better throughput

Infrastructure & Deployment

Multi-Platform Deployment Strategy

| Platform | Type | Use Case | Benefits |
| --- | --- | --- | --- |
| Cloudflare Pages | Edge Computing | Global distribution | 275+ locations, low latency |
| Docker | Containerization | Consistent environments | Reproducible builds |
| Kubernetes | Orchestration | Production scaling | Auto-healing, load balancing |
| Render | Managed Hosting | Easy deployment | Zero DevOps, auto-SSL |
| Vercel | Serverless | Static + API routes | Edge functions, analytics |
| Railway | Cloud Platform | Full-stack apps | Database integration |

DevOps Toolchain:

  • CI/CD - GitHub Actions (26 workflows)
  • Local Dev - Docker Compose
  • K8s Packaging - Helm Charts
  • Edge Deployment - Wrangler CLI
  • Monitoring - Health checks, logging

โš™๏ธ System Architecture

High-Level Overview

```mermaid
graph TB
    subgraph Client
        A[React SPA] --> B[SpeedTestEngine]
        B --> C[Web Worker]
        C --> D[XHR Protocol]
        C --> E[WebSocket Protocol]
    end

    subgraph Edge
        F[Cloudflare Pages] --> G[download function]
        F --> H[upload function]
        F --> I[ping function]
        F --> J[websocket function]
    end

    subgraph Origin
        K[Express Server] --> L[Cluster Master]
        L --> M[Worker 1]
        L --> N[Worker 2]
        L --> O[Worker N]
        M --> P[download endpoint]
        M --> Q[upload endpoint]
        M --> R[WebSocket server]
    end

    B --> F
    B --> K
    D --> G
    D --> P
    E --> J
    E --> R
```

Component Breakdown

Frontend (/project/src):

Backend (/backend):

  • server.js - Express app with clustering (888 lines)
  • start.js - Process launcher
  • public/ - Built frontend assets

Edge Functions (/functions):


🔄 Speed Test Flow

Step-by-Step Execution

```mermaid
sequenceDiagram
    participant U as User
    participant UI as React UI
    participant E as Engine
    participant W as Web Worker
    participant S as Server

    U->>UI: Click "Start Test"
    UI->>E: runSpeedTest()
    E->>W: postMessage(START_TEST)

    Note over W: Initialize AbortController
    W->>S: GET /ping (5 samples)
    S-->>W: Response timestamps
    W->>W: Calculate avg ping + jitter

    W->>S: GET /download?bytes=50MB
    S-->>W: Stream chunks (1-4MB)
    W->>W: Measure throughput
    Note over W: Exclude TCP slow-start

    W->>S: POST /upload (binary)
    S-->>W: Byte count ACK
    W->>W: Calculate upload speed

    W->>UI: TEST_COMPLETE
    UI->>U: Display results
```

Detailed Phase Breakdown

Phase 1: Initialization (0-10%)

  1. User clicks "Start Test"
  2. SpeedTestEngine.runSpeedTest() called
  3. Web Worker initialized with config
  4. AbortController created for cancellation

Phase 2: Ping Measurement (10-30%)

  1. Send 5 ping requests to /ping endpoint
  2. Server responds with precise timestamps
  3. Calculate average, min, max, jitter
  4. Update progress UI in real-time
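The averaging in step 3 can be sketched as a small pure function. The jitter definition used here (mean absolute difference between consecutive samples) is one common choice and an assumption of this sketch, not necessarily NetPulse's exact formula.

```javascript
// Computes avg/min/max/jitter from a list of ping samples in milliseconds.
// Jitter here = mean absolute difference between consecutive samples
// (an assumed definition for this sketch).
function pingStats(samplesMs) {
  const avg = samplesMs.reduce((a, b) => a + b, 0) / samplesMs.length;
  const min = Math.min(...samplesMs);
  const max = Math.max(...samplesMs);
  let jitter = 0;
  for (let i = 1; i < samplesMs.length; i++) {
    jitter += Math.abs(samplesMs[i] - samplesMs[i - 1]);
  }
  jitter /= Math.max(1, samplesMs.length - 1);
  return { avg, min, max, jitter };
}
```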

Phase 3: Download Test (30-70%)

  1. Request random data (default 50MB)
  2. Server streams chunks via Transfer-Encoding: chunked
  3. Web Worker measures bytes/time
  4. TCP Grace Period: the first 1-3 seconds are excluded from the measurement
  5. Dynamic adjustment: fast connections get a 1 s grace period, slow connections 3 s
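The grace-period exclusion in steps 3-5 amounts to dropping any bytes received before the cutoff and computing throughput over the remaining window. The cumulative sample shape below is an assumption for this sketch, not NetPulse's internal format.

```javascript
// Throughput in Mbps, ignoring everything received during the grace period.
// samples: [{ tMs, bytes }] with cumulative elapsed time and cumulative bytes
// (an assumed format for this sketch).
function goodputMbps(samples, graceMs) {
  const measured = samples.filter((s) => s.tMs > graceMs);
  if (measured.length < 2) return 0;
  const first = measured[0];
  const last = measured[measured.length - 1];
  const bytes = last.bytes - first.bytes;         // bytes after slow-start
  const seconds = (last.tMs - first.tMs) / 1000;  // measurement window
  return (bytes * 8) / 1e6 / seconds;             // megabits per second
}
```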

Phase 4: Upload Test (70-90%)

  1. Generate random binary data
  2. POST to /upload as application/octet-stream
  3. Server counts bytes in real-time (no buffering)
  4. Calculate throughput from duration

Phase 5: Bufferbloat (Optional, 90-95%)

  1. Create background download load
  2. Measure ping under load
  3. Compare to idle ping
  4. Assign A-F rating (M-Labs algorithm)
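The comparison in steps 3-4 reduces to grading the latency increase under load. The cutoffs below are illustrative assumptions for this sketch; the actual M-Labs scheme defines its own thresholds.

```javascript
// Illustrative A-F bufferbloat grading by latency increase under load (ms).
// The cutoffs are assumptions for this sketch, not the exact M-Labs values.
function bufferbloatGrade(idlePingMs, loadedPingMs) {
  const increase = Math.max(0, loadedPingMs - idlePingMs);
  if (increase < 5) return 'A';
  if (increase < 30) return 'B';
  if (increase < 60) return 'C';
  if (increase < 200) return 'D';
  if (increase < 400) return 'E';
  return 'F';
}
```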

Phase 6: Completion (95-100%)

  1. Compile final results object
  2. Generate unique test ID
  3. Store in localStorage (history)
  4. Navigate to results page

📂 Project Structure

netpulse/
├── 📁 .github/workflows/       # CI/CD pipelines (26 files)
│   ├── ci-cd.yml              # Build & test automation
│   ├── render-deploy.yml      # Auto-deploy to Render
│   ├── kubernetes-deploy.yml  # K8s deployment
│   └── security-scan.yml      # Dependency scanning
│
├── 📁 api/                     # Legacy API routes
│   └── ping.js                # Simple ping endpoint
│
├── 📁 backend/                 # Node.js Express server
│   ├── public/                # Built frontend assets
│   │   ├── index.html
│   │   └── assets/            # Vite build output
│   ├── server.js              # Main Express app (888 lines)
│   ├── server-azure.js        # Azure-specific variant
│   ├── start.js               # Process launcher
│   └── package.json           # Backend dependencies
│
├── 📁 functions/               # Cloudflare Pages Functions
│   ├── index.ts               # Entry point
│   ├── download.ts            # Download speed test
│   ├── upload.ts              # Upload speed test
│   ├── ping.ts                # Ping/jitter measurement
│   ├── status.ts              # Server health
│   ├── websocket.ts           # WebSocket protocol
│   └── _routes.json           # Routing config
│
├── 📁 helm/netpulse/           # Kubernetes Helm chart
│   ├── Chart.yaml
│   ├── values.yaml
│   └── templates/
│       └── deployment.yaml
│
├── 📁 kubernetes/              # Raw K8s manifests
│   └── deployment.yaml        # Deployment + Service + PVC
│
├── 📁 project/                 # React frontend (Vite)
│   ├── src/
│   │   ├── components/        # React components (31 files)
│   │   │   ├── NewSpeedTest.tsx      # Main test UI
│   │   │   ├── NewResultsPage.tsx    # Results display
│   │   │   ├── NewHeader.tsx         # Navigation
│   │   │   ├── NewFooter.tsx         # Footer
│   │   │   ├── BufferbloatAnalysis.tsx
│   │   │   ├── PacketLossCard.tsx
│   │   │   ├── ProtocolOverheadInfo.tsx
│   │   │   ├── TcpGracePeriodInfo.tsx
│   │   │   ├── ReportGenerator.tsx
│   │   │   └── ... (21 more)
│   │   ├── utils/
│   │   │   ├── speedTestEngine.ts    # Test orchestrator
│   │   │   ├── speedTestWorker.ts    # Web Worker (1471 lines)
│   │   │   └── webSocketService.ts   # WS client
│   │   ├── types/
│   │   │   └── speedTest.ts          # TypeScript interfaces
│   │   ├── config/
│   │   │   └── serverConfig.ts       # Environment config
│   │   ├── App.tsx                   # Root component
│   │   ├── main.tsx                  # Entry point
│   │   └── index.css                 # Global styles
│   ├── public/
│   │   └── speedometer.svg
│   ├── package.json                  # Frontend dependencies
│   ├── vite.config.ts                # Vite configuration
│   ├── tailwind.config.js            # Tailwind setup
│   └── tsconfig.json                 # TypeScript config
│
├── 📁 scripts/                 # Build utilities
│   └── copy-frontend.js        # Copy build to backend
│
├── 🐳 Dockerfile               # Multi-stage build
├── 🐳 docker-compose.yml       # Local dev orchestration
├── 🐳 docker-entrypoint.sh     # Container startup
├── 📄 render.yaml              # Render deployment config
├── 📄 vercel.json              # Vercel deployment
├── 📄 wrangler.toml            # Cloudflare Wrangler CLI
├── 📄 railway.toml             # Railway deployment
├── 📄 staticwebapp.config.json # Azure Static Web Apps
└── 📄 package.json             # Root workspace config

🔌 API Reference

Backend Endpoints (Express)

GET /ping - Latency Measurement

Purpose: Measure round-trip time to server

Query Parameters:

  • timestamp (optional): Client timestamp for RTT calculation
  • includeLoad (boolean): Include server load metrics

Response:

{
  "status": "ok",
  "message": "pong",
  "serverTimestamp": 1234567890123,
  "requestTimestamp": 1234567890100,
  "serverProcessingTime": 23,
  "preciseProcessingTimeMs": 0.145
}

Implementation: Uses process.hrtime() for nanosecond precision
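A handler along these lines would produce the response shape above. This is a minimal sketch using `process.hrtime.bigint()`, not the production implementation; the field names mirror the example response.

```javascript
// Sketch of a /ping handler with sub-millisecond processing time via
// process.hrtime.bigint(). Not the production code; shapes mirror the
// example response above.
function pingHandler(req, res) {
  const start = process.hrtime.bigint();        // nanosecond-resolution clock
  const q = req.query || {};
  const requestTimestamp = q.timestamp ? Number(q.timestamp) : null;
  res.json({
    status: 'ok',
    message: 'pong',
    serverTimestamp: Date.now(),
    requestTimestamp,
    preciseProcessingTimeMs: Number(process.hrtime.bigint() - start) / 1e6,
  });
}
```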


GET /download - Download Speed Test

Purpose: Stream random data for download measurement

Query Parameters:

  • bytes (number): Size in bytes (default: 1MB, max: 500MB)
  • chunkSize (number): Chunk size for streaming (default: 1MB)

Headers:

Content-Type: application/octet-stream
Transfer-Encoding: chunked
X-Download-Size-Bytes: 52428800
X-Download-Duration: 4.523
X-Download-Throughput-MBps: 11.05

Features:

  • Pre-generated random buffers (crypto.randomBytes)
  • Backpressure handling via drain event
  • Client disconnect detection
  • Performance metric logging

POST /upload - Upload Speed Test

Purpose: Receive binary stream for upload measurement

Request Headers:

Content-Type: application/octet-stream
Content-Length: 52428800

Response:

{
  "status": "success",
  "receivedAt": 1234567890123,
  "byteLength": 52428800,
  "duration": "4.892",
  "throughputMBps": "10.23",
  "throughputMbps": "81.84"
}

Implementation:

  • Stream processing (no buffering)
  • Real-time byte counting
  • Memory efficient for large uploads
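The stream-based counting above can be sketched like this: bytes are tallied as chunks arrive and nothing is buffered. `req` can be any Readable (e.g. the Express request object in the real server); this is a sketch, not the production handler.

```javascript
// Counts upload bytes as they stream in, without buffering the body.
// `req` is any Readable stream (the Express request in the real server).
function countUploadBytes(req) {
  return new Promise((resolve, reject) => {
    const startedAt = Date.now();
    let byteLength = 0;
    req.on('data', (chunk) => { byteLength += chunk.length; }); // tally only
    req.on('end', () => {
      const duration = (Date.now() - startedAt) / 1000; // seconds
      resolve({ byteLength, duration });
    });
    req.on('error', reject);
  });
}
```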

GET /status - Server Metrics

Purpose: Detailed server performance statistics

Response:

{
  "status": "ok",
  "timestamp": 1234567890123,
  "serverUptime": 3600.5,
  "metrics": {
    "totalRequests": 15234,
    "errors": 12,
    "requestsLastMinute": 245,
    "requestsPerSecond": "4.08",
    "endpointStats": {
      "ping": 5023,
      "download": 5102,
      "upload": 5109
    }
  },
  "system": { /* os.cpus(), os.freemem(), etc */ },
  "process": { /* process.memoryUsage(), v8 stats */ },
  "cluster": { /* worker info */ }
}

GET /health - Health Check

Purpose: Lightweight endpoint for load balancers

Response:

{
  "status": "ok",
  "timestamp": 1234567890123,
  "uptime": 3600.5,
  "requestsPerSecond": "4.08"
}

WebSocket /websocket - Real-Time Protocol

Purpose: Bidirectional communication for advanced testing

Client Messages:

{
  "type": "start_test",
  "payload": {
    "phase": "download",
    "size": 52428800
  }
}

Server Messages:

{
  "type": "upload_ack",
  "timestamp": 1234567890123,
  "bytesReceived": 65536,
  "totalBytesReceived": 1048576
}
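Producing the acknowledgement shape above can be sketched as a small helper; a real server would `JSON.stringify` the result and `ws.send()` it after each received chunk. This helper is illustrative, not the production code.

```javascript
// Builds the upload acknowledgement payload shown above (illustrative).
function uploadAck(bytesReceived, totalBytesReceived) {
  return {
    type: 'upload_ack',
    timestamp: Date.now(),
    bytesReceived,
    totalBytesReceived,
  };
}
```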

Cloudflare Functions (Edge)

All backend endpoints are mirrored as Cloudflare Pages Functions:

| Function | Purpose | Notes |
| --- | --- | --- |
| functions/download.ts | Edge download test | Stream-based response |
| functions/upload.ts | Edge upload test | Request body streaming |
| functions/ping.ts | Edge latency | Minimal processing |
| functions/status.ts | Edge health | Basic status |
| functions/websocket.ts | Edge WebSocket | Not yet implemented |

Key Differences from Backend:

  • Runs at edge (closer to users = lower latency)
  • Stateless (no cluster mode needed)
  • Limited to Cloudflare's runtime constraints
  • Automatic scaling to 275+ locations

๐Ÿ› ๏ธ Setup & Installation

Prerequisites

Required:

  • Node.js >= 20.12.0
  • npm or yarn
  • Git

Optional:

  • Docker Desktop (for containerized dev)
  • Kubernetes cluster (minikube/kind)
  • Wrangler CLI (Cloudflare deployment)

Option 1: Local Development (Recommended)

Step 1: Clone Repository

git clone https://github.com/yourusername/netpulse.git
cd netpulse

Step 2: Install Frontend Dependencies

cd project
npm install

Step 3: Install Backend Dependencies

cd ../backend
npm install

Step 4: Configure Environment (Optional)

Create backend/.env:

PORT=3000
ENABLE_CLUSTER=true
RATE_LIMIT_WINDOW_MS=60000
RATE_LIMIT_MAX_REQUESTS=500
KEEP_ALIVE_TIMEOUT=65000
HEADERS_TIMEOUT=66000

Step 5: Start Backend Server

cd backend
npm start

Server runs on http://localhost:3000

Step 6: Start Frontend Dev Server (new terminal)

cd project
npm run dev

Frontend runs on http://localhost:5173

Access: Open http://localhost:5173 in browser


Option 2: Docker Development

Step 1: Build Docker Image

docker build -t netpulse:latest .

Step 2: Run Container

docker run -d \
  -p 3000:3000 \
  -e NODE_ENV=production \
  -e ENABLE_CLUSTER=true \
  --name netpulse \
  netpulse:latest

Step 3: Verify Health

curl http://localhost:3000/health

Access: Open http://localhost:3000


Option 3: Docker Compose (Full Stack)

Step 1: Start All Services

docker-compose up -d

Step 2: View Logs

docker-compose logs -f

Step 3: Stop Services

docker-compose down

Services:

  • netpulse: Main application (port 3000)
  • Volume: netpulse_logs for persistent logs

Access: Open http://localhost:3000


Option 4: Kubernetes Deployment

Prerequisites:

  • kubectl configured
  • Kubernetes cluster (v1.19+)
  • Docker registry access

Step 1: Build and Push Image

docker build -t your-registry/netpulse:latest .
docker push your-registry/netpulse:latest

Step 2: Update Deployment Config

Edit kubernetes/deployment.yaml:

image: your-registry/netpulse:latest

Step 3: Deploy to Cluster

kubectl apply -f kubernetes/deployment.yaml

Step 4: Verify Deployment

kubectl get pods -l app=netpulse
kubectl get svc netpulse

Access: Use LoadBalancer IP from kubectl get svc


Option 5: Cloudflare Pages Deployment

Prerequisites:

  • Wrangler CLI installed: npm i -g wrangler
  • Cloudflare account
  • Domain configured on Cloudflare

Step 1: Login to Cloudflare

wrangler login

Step 2: Initialize Project

wrangler init

Step 3: Configure wrangler.toml

name = "netpulse"
compatibility_date = "2024-01-01"
pages_build_output_dir = "./project/dist"

Step 4: Deploy

wrangler pages deploy project/dist --project-name=netpulse

Access: https://netpulse.pages.dev


Option 6: Render Deployment

Step 1: Push to GitHub

git add .
git commit -m "Initial commit"
git push origin main

Step 2: Connect Render

  1. Go to https://render.com/dashboard
  2. Click "New +" → "Web Service"
  3. Connect GitHub repository
  4. Configure:
    • Name: netpulse
    • Environment: Docker
    • Plan: Free
    • Auto-Deploy: Yes

Step 3: Deploy

Render automatically detects render.yaml and deploys.

Access: https://your-service.onrender.com


📖 Usage Guide

Running Your First Test

1. Navigate to Application: open your browser to the deployed URL (e.g., http://localhost:3000)

2. Configure Test Settings (Optional): click the gear icon ⚙️ to adjust:

  • Test Duration: 5-30 seconds per phase
  • Parallel Connections: 1-8 streams
  • Protocol: XHR or WebSocket
  • TCP Grace Period: 1-5 seconds
  • Enable Bufferbloat Test: Yes/No

3. Start Test: click the "START TEST" button

4. Watch Progress: real-time updates show:

  • Current phase (Ping → Download → Upload)
  • Live speed gauge
  • Progress bar (%)
  • Elapsed time

5. Review Results: after completion, see:

  • Download Speed: Mbps + MB/s
  • Upload Speed: Mbps + MB/s
  • Ping: ms (average of 5 samples)
  • Jitter: ms (variance)
  • Bufferbloat Grade: A-F
  • Packet Loss: % lost packets
  • Server Location: Tested endpoint

6. Share Results

  • Click "Share" to generate PDF report
  • Copy result link
  • View historical tests

Advanced Configuration

Environment Variables

Backend (.env):

# Server Configuration
PORT=3000                    # Listen port
ENABLE_CLUSTER=true          # Use all CPU cores
NODE_ENV=production          # production/development

# Performance Tuning
KEEP_ALIVE_TIMEOUT=65000     # HTTP keep-alive (ms)
HEADERS_TIMEOUT=66000        # Headers timeout (ms)

# Rate Limiting
RATE_LIMIT_WINDOW_MS=60000   # 1 minute window
RATE_LIMIT_MAX_REQUESTS=500  # Max requests per IP

# Optional: Custom Server Host
HOST=0.0.0.0                 # Bind address

Frontend (.env):

# API Configuration
VITE_API_BASE_URL=http://localhost:3000
VITE_WS_URL=ws://localhost:3000

# Feature Flags
VITE_ENABLE_BUFFERBLOAT=true
VITE_ENABLE_PACKET_LOSS=true
VITE_DEFAULT_PROTOCOL=xhr

Test Accuracy Tips

For Most Accurate Results:

  1. Use Ethernet - WiFi introduces variance
  2. Close Other Tabs - Background downloads affect results
  3. Test Multiple Times - Average 3-5 tests
  4. Different Times of Day - Network congestion varies
  5. Disable VPN/Proxy - Adds latency and overhead
  6. Update Network Drivers - Old drivers can bottleneck

Understanding TCP Grace Period:

The TCP Grace Period excludes the first few seconds of testing to avoid measuring TCP slow-start:

  • Fast Connection (>50 Mbps): 1 second grace
  • Medium Connection (10-50 Mbps): 2 seconds grace
  • Slow Connection (<10 Mbps): 3 seconds grace

This is automatic and configurable via enableDynamicGracePeriod in config.
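The tiers above map directly onto a small selection function. This is a sketch of the described behavior (speed as an early throughput estimate in Mbps), not the actual worker code.

```javascript
// Sketch of the tiered grace-period selection described above.
// estimatedMbps is an early throughput estimate for the connection.
function gracePeriodMs(estimatedMbps) {
  if (estimatedMbps > 50) return 1000;  // fast connection: 1 s grace
  if (estimatedMbps >= 10) return 2000; // medium connection: 2 s grace
  return 3000;                          // slow connection: 3 s grace
}
```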


Generating PDF Reports

Step 1: Complete a speed test

Step 2: On results page, click "Generate PDF"

Step 3: Report includes:

  • Test summary (download, upload, ping, jitter)
  • Speed graph over time
  • Bufferbloat analysis
  • Timestamp and server info
  • QR code for sharing

Step 4: Download or share link


Viewing Test History

All tests are stored in browser localStorage:

Access:

const history = JSON.parse(localStorage.getItem('speedTestHistory') || '[]');
console.log(history);

Fields:

  • id: Unique test identifier
  • timestamp: Unix timestamp
  • downloadSpeed: Mbps
  • uploadSpeed: Mbps
  • ping: ms
  • jitter: ms
  • bufferbloat: { rating, latencyIncrease }
  • packetLoss: { percentage, sent, received }

🔧 Troubleshooting

Common Issues

โŒ "Test Aborted" Error

Symptoms: Test stops midway with error message

Causes:

  1. Server disconnected
  2. Rate limit exceeded
  3. Browser extension blocking requests

Solutions:

# Check server logs
docker logs netpulse

# Increase rate limit in backend/.env
RATE_LIMIT_MAX_REQUESTS=1000

# Disable ad blockers for test domain

โŒ Inaccurately Low Speeds

Symptoms: Results much lower than expected

Diagnosis:

  1. Check TCP Grace Period:

    // In speedTestWorker.ts
    console.log('Grace period:', gracePeriodMs);
  2. Verify Server Resources:

    curl http://localhost:3000/status
  3. Test with Larger File:

    Default: 10MB → Try 50MB or 100MB
    

Fix:

  • Increase test duration in settings
  • Enable cluster mode: ENABLE_CLUSTER=true
  • Use WebSocket protocol (lower overhead)

โŒ WebSocket Connection Failed

Symptoms: "WebSocket connection failed" error

Check:

# Verify WebSocket endpoint
curl -i http://localhost:3000/websocket

# Check firewall
netstat -an | grep 3000

Fix:

// In serverConfig.ts, ensure correct WS URL
const wsUrl = import.meta.env.VITE_WS_URL || 'ws://localhost:3000';

โŒ Docker Container Won't Start

Symptoms: Container exits immediately

Debug:

docker run -it netpulse:latest /bin/sh
npm start
# Check error output

Common Fixes:

# Ensure proper permissions
RUN chmod +x /docker-entrypoint.sh

# Check port binding
EXPOSE 3000

โŒ Build Fails on Render

Symptoms: Deployment fails with build error

Check Logs:

# In Render dashboard
View Logs → Build Logs

Common Issues:

  1. Node Version Mismatch:

    # In render.yaml
    envVars:
      - key: NODE_VERSION
        value: "20"
  2. Memory Limit:

    plan: standard  # Upgrade from free tier
  3. Build Timeout:

    # Optimize Dockerfile
    RUN npm ci --only=production

โŒ Frontend Can't Connect to Backend

Symptoms: "Network Error" or infinite loading

Check:

// In browser console
console.log(window.location.origin);

Fix:

// In serverConfig.ts
const baseUrl = import.meta.env.PROD 
  ? ''  // Use relative URLs for co-hosted
  : 'http://localhost:3000';

For Separate Hosting:

# Frontend .env
VITE_API_BASE_URL=https://api.yourdomain.com

Performance Debugging

Enable Verbose Logging:

Backend:

# Set debug level
NODE_DEBUG=* npm start

Frontend:

// In speedTestWorker.ts
const DEBUG = true;
if (DEBUG) console.log('Download chunk:', bytesWritten);

Monitor Server Metrics:

# Every 10 seconds
watch -n 10 'curl http://localhost:3000/status | jq .metrics'

Profile Node.js:

node --inspect server.js
# Open chrome://inspect in Chrome

๐ŸŒ Deployment

Production Checklist

Before deploying to production:

Security:

  • ✅ Update rate limits for production traffic
  • ✅ Enable HTTPS (TLS termination)
  • ✅ Configure CORS for specific domains only
  • ✅ Rotate any hardcoded secrets

Performance:

  • ✅ Enable cluster mode (ENABLE_CLUSTER=true)
  • ✅ Tune socket timeouts for high-latency clients
  • ✅ Increase max upload size if needed
  • ✅ Configure reverse proxy (nginx/traefik)

Monitoring:

  • ✅ Set up health check alerts
  • ✅ Configure log aggregation (ELK/DataDog)
  • ✅ Monitor memory usage (V8 heap stats)
  • ✅ Track error rates per endpoint

Scalability:

  • ✅ Horizontal scaling plan (load balancer)
  • ✅ Database for test history (optional)
  • ✅ CDN for static assets
  • ✅ Redis for rate limiting (distributed)

Docker Production Deployment

docker-compose.prod.yml:

version: '3.8'

services:
  netpulse:
    image: your-registry/netpulse:latest
    ports:
      - "80:10000"
    environment:
      - NODE_ENV=production
      - ENABLE_CLUSTER=true
      - RATE_LIMIT_MAX_REQUESTS=1000
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:10000/health"]
      interval: 30s
      timeout: 10s
      retries: 3
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

Deploy:

docker-compose -f docker-compose.prod.yml up -d

Kubernetes Production Deployment

ingress.yaml:

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: netpulse-ingress
  annotations:
    kubernetes.io/ingress.class: nginx
    nginx.ingress.kubernetes.io/rate-limit: "100"
    nginx.ingress.kubernetes.io/rate-limit-window: "1m"
spec:
  rules:
  - host: speedtest.yourdomain.com
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: netpulse
            port:
              number: 80

Apply:

kubectl apply -f ingress.yaml

Cloudflare Pages Production

Custom Domain:

  1. Go to Cloudflare Dashboard
  2. Pages → Your Project → Custom Domains
  3. Add speedtest.yourdomain.com
  4. Wait for DNS propagation

Environment Variables:

# In Cloudflare Dashboard
Pages → Settings → Environment Variables
VITE_API_BASE_URL = https://speedtest.yourdomain.com

📊 Performance Metrics

Backend Benchmarks

Tested Configuration:

  • CPU: 8-core (Intel i7-12700K)
  • RAM: 32GB DDR4
  • Network: 1 Gbps

Single Instance (No Cluster):

  • Concurrent Users: ~500
  • Peak Throughput: ~2 Gbps
  • Memory Usage: ~400MB
  • CPU Usage: ~60% (single core)

Cluster Mode (8 Cores):

  • Concurrent Users: ~4000
  • Peak Throughput: ~8 Gbps
  • Memory Usage: ~3.2GB
  • CPU Usage: ~70% (all cores)

Optimization Impact:

| Optimization | Improvement | Notes |
| --- | --- | --- |
| Pre-generated Buffers | +40% throughput | Reduces crypto.randomBytes calls |
| Chunked Encoding | +25% throughput | Better streaming efficiency |
| Backpressure Handling | +15% stability | Prevents memory spikes |
| Cluster Mode | +700% capacity | Linear scaling with cores |
| Disabled Compression | +20% throughput | For test endpoints only |

Frontend Performance

Bundle Sizes (Production Build):

Total: 1.2MB (gzipped: 380KB)
├── vendor.js: 650KB (React, Recharts)
├── index.js: 280KB (App code)
├── speedTestWorker.js: 120KB (Web Worker)
└── styles.css: 45KB

Load Time Metrics:

  • First Contentful Paint: ~800ms
  • Time to Interactive: ~1.2s
  • Lighthouse Score: 92/100

Optimization Techniques:

  • Code splitting (lazy loading)
  • Tree shaking (unused code elimination)
  • Minification (Terser)
  • Asset optimization (WebP, SVG)

🔮 Future Roadmap

Q2 2025

  • Mobile App - React Native iOS/Android
  • Historical Analytics - Trend graphs over time
  • Multi-Language Support - i18n with 10+ languages
  • Custom Server Selection - Manual server picker with map

Q3 2025

  • WebRTC Testing - P2P speed tests between users
  • Video Streaming Test - Simulate Netflix/YouTube
  • Gaming Latency Test - Simulate online gaming
  • API Rate Limiting Dashboard - Visual analytics

Q4 2025

  • Machine Learning - Anomaly detection for results
  • Blockchain Verification - Immutable test records
  • Enterprise Features - White-label solutions
  • Global Leaderboard - Compare speeds worldwide

๐Ÿ‘จโ€๐Ÿ’ป Author

Created by: Amar Pawar
Current Version: 2.0.0
License: MIT License

Contributing

Contributions are welcome! Please follow these guidelines:

  1. Fork the repository
  2. Create feature branch:
    git checkout -b feat/amazing-feature
  3. Make changes and ensure tests pass:
    npm run lint
    npm run test
  4. Commit using Conventional Commits:
    git commit -m "feat: add amazing feature"
  5. Push to your fork:
    git push origin feat/amazing-feature
  6. Open Pull Request on GitHub

Code Quality Standards

  • TypeScript strict mode enabled
  • ESLint rules enforced
  • Prettier formatting
  • Unit tests for critical paths
  • Documentation for public APIs

Built with โค๏ธ using React, TypeScript, Node.js, and Cloudflare

⬆ Back to Top

About

🚀 High-precision Network Diagnostic Tool. Features TCP Slow-start exclusion for 99% accuracy, multi-threaded Web Workers to prevent UI blocking, and a Node.js Cluster-mode backend for high-throughput testing.
