
feat: initial project setup #1

Merged
geethac2l merged 13 commits into main from feat/initial-project-setup on Mar 5, 2026

Conversation

@arpannookala-12 (Collaborator) commented Mar 5, 2026

Summary

  • Adds .gitignore covering Python, Node, IDE, env secrets, mypy cache, bandit outputs, and local working files
  • Adds .env.example documenting all configuration options for both remote and Ollama inference
  • Adds FastAPI backend (api/) with dual inference paths, PDF extraction, and health check
  • Adds React + Vite + Tailwind frontend (ui/) with dark mode, pill language selectors, and PDF upload
  • Adds Docker Compose orchestration wiring transpiler-api and transpiler-ui via Nginx
  • Adds README rewritten for CodeTrans with full multi-provider setup guide
  • Adds TROUBLESHOOTING guide

Test plan

  • cp .env.example .env, fill in inference credentials, run docker compose up --build
  • Verify UI at http://localhost:3000 and API health at http://localhost:5001/health
  • Test code translation (remote path) across at least two language pairs
  • Test Ollama path: ollama pull codellama:7b, set INFERENCE_PROVIDER=ollama, retranslate
  • Test PDF upload and code extraction

Documents all required and optional env vars for inference endpoint configuration, LLM settings, CORS, and SSL verification.

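The shape of such a file might look like the fragment below. Only INFERENCE_PROVIDER appears verbatim in this PR's test plan; the remaining variable names are illustrative placeholders, not the actual keys in .env.example:

```env
# Inference provider: "remote" (OpenAI-compatible API) or "ollama" (local)
INFERENCE_PROVIDER=remote

# Hypothetical remote-path settings (names are illustrative)
REMOTE_ENDPOINT=https://gateway.example.com/v1
REMOTE_API_TOKEN=changeme
SSL_VERIFY=true

# Hypothetical local-path setting; host.docker.internal reaches the
# host-native Ollama server from inside the container
OLLAMA_HOST=http://host.docker.internal:11434
```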
Implements code translation across Java, C, C++, Python, Rust, and Go via CodeLlama inference endpoints. Includes PDF code extraction, token-based auth for GenAI/APISIX gateways, input validation, and health check.

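The dual inference paths imply a provider-selection step in the backend. A minimal sketch of that dispatch, assuming environment-style settings (INFERENCE_PROVIDER comes from the test plan; OLLAMA_HOST and REMOTE_ENDPOINT are hypothetical names, not necessarily the ones the API uses):

```python
def resolve_endpoint(env: dict) -> str:
    """Pick the inference base URL from environment-style settings.

    Sketch only: variable names other than INFERENCE_PROVIDER are
    illustrative assumptions about the backend's configuration.
    """
    provider = env.get("INFERENCE_PROVIDER", "remote").lower()
    if provider == "ollama":
        # Inside Docker, the host-native Ollama server is reachable
        # via host.docker.internal (per the compose setup in this PR).
        return env.get("OLLAMA_HOST", "http://host.docker.internal:11434")
    if provider == "remote":
        endpoint = env.get("REMOTE_ENDPOINT")
        if not endpoint:
            raise ValueError("REMOTE_ENDPOINT must be set for remote inference")
        return endpoint
    raise ValueError(f"Unknown INFERENCE_PROVIDER: {provider}")
```

Keeping this choice in one function lets the rest of the API stay provider-agnostic; switching paths (as the test plan does with INFERENCE_PROVIDER=ollama) then needs no code change.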
Side-by-side code editor with language pill selectors (6 languages), PDF drag-and-drop upload, real-time character counter, dark mode with localStorage persistence, and copy-to-clipboard. Built with Vite, Tailwind CSS, and served via Nginx.

Wires transpiler-api (port 5001) and transpiler-ui (port 3000→8080) on a shared network. Nginx proxies /api/ to the backend. Supports both remote inference and local Ollama via host.docker.internal.

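A compose file with this wiring might look like the sketch below. The service names and port mappings come from this PR; the build paths, env_file usage, and extra_hosts entry are assumptions about how the repository is laid out:

```yaml
# Sketch only: service names and ports are from the PR description;
# everything else is illustrative.
services:
  transpiler-api:
    build: ./api
    ports:
      - "5001:5001"
    env_file: .env
    extra_hosts:
      # lets the container reach host-native Ollama on Linux hosts;
      # Docker Desktop provides host.docker.internal automatically
      - "host.docker.internal:host-gateway"

  transpiler-ui:
    build: ./ui
    ports:
      - "3000:8080"   # host 3000 → Nginx listening on 8080 in the container
    depends_on:
      - transpiler-api
```

Running Ollama on the host rather than in a container matters on macOS: only a host-native process gets Metal GPU acceleration, which is why the UI/API containers reach it through host.docker.internal instead of a third service.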
README covers architecture, prerequisites, quick-start deployment, environment configuration, and UI usage. TROUBLESHOOTING covers common Docker, inference endpoint, and CORS issues.
…port

- Rename project title to CodeTrans and update clone URL to cld2labs/CodeTrans
- Document dual inference paths: remote OpenAI-compatible APIs and local Ollama
- Explain Ollama host-native requirement for Metal GPU acceleration on macOS
- Update architecture diagram and service names (transpiler-api, transpiler-ui)
- Replace duplicate .env blocks with a single cp .env.example .env quick-start
- Add key settings reference table and Ollama model pull commands
- Update log command service names and validated models table
- Remove stale opea-project references and broken image embed
- Add .github/workflows/code-scans.yaml (Trivy + Bandit SDLE scans)
- Add CONTRIBUTING, DISCLAIMER, LICENSE, TERMS_AND_CONDITIONS docs
- Add docs/assets company header image
- Extend .gitignore with testing artifacts, Audify local dir, and pytest/coverage entries
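The /api/ proxying mentioned above can be sketched as an Nginx location block. The upstream host and port match the transpiler-api service from the compose setup; the header directives and everything else are illustrative, not the repository's actual nginx.conf:

```nginx
# Sketch: forward browser requests under /api/ to the backend service
# on the shared compose network. Details are assumptions.
location /api/ {
    proxy_pass http://transpiler-api:5001/;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```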
@arpannookala-12 force-pushed the feat/initial-project-setup branch from 6511db7 to a88fc01 on March 5, 2026 19:55
@geethac2l merged commit 308b0cf into main on Mar 5, 2026
2 checks passed
@arpannookala-12 deleted the feat/initial-project-setup branch on March 13, 2026 20:32
