Type: refactor
Severity: medium
Area: nmapui/workflows.py — generate_report_task()

Description
generate_report_task spans roughly 350 lines and handles input validation, customer resolution, scan rules, folder creation, chunked nmap scanning with recursive subnet splitting, XML merging, HTML conversion (two variants), PDF generation, desktop file copy, metadata persistence, statistics extraction, Google Drive upload orchestration, and result emission.
This makes it extremely difficult to:
- Test individual stages in isolation
- Retry a specific failed stage (e.g., just re-upload to Google Drive)
- Add new output formats or destinations
- Reason about error recovery
Additionally, _scan_subnets_with_fallback can recurse to depth 4; for a large subnet with persistent timeouts this can produce up to 4096 sequential nmap invocations, with no overall scan-count budget.
Proposed Fix
Extract into a pipeline of smaller functions:

```python
def generate_report_task(context, sid, data):
    params = _resolve_report_params(context, sid, data)        # validation, customer, rules
    scan_result = _execute_scan_phase(context, sid, params)    # nmap + merge
    report_result = _generate_report_artifacts(context, sid, scan_result)  # HTML + PDF
    _persist_and_distribute(context, sid, report_result)       # metadata, Drive, desktop copy
```

Add a global scan count budget to _scan_subnets_with_fallback (e.g., max 100 total invocations).
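The budget could look something like the following minimal sketch. It is not the real implementation: the function signature, the `run_scan` stand-in for the nmap invocation, and the mutable `budget` dict are all hypothetical, but it shows how a single counter shared across every recursion branch caps total invocations regardless of depth.

```python
import ipaddress

def scan_subnets_with_fallback(subnets, run_scan, budget, max_depth=4, _depth=0):
    """Split-and-retry scanning with a shared invocation budget (sketch).

    run_scan(subnet) -> bool stands in for the real nmap call; `budget` is a
    mutable dict ({"used": n, "max": m}) so the count is shared across all
    recursion branches, not reset per subtree.
    """
    results = []
    for subnet in subnets:
        if budget["used"] >= budget["max"]:
            # Global cap reached: stop scanning instead of recursing further.
            results.append((str(subnet), "skipped: budget exhausted"))
            continue
        budget["used"] += 1
        if run_scan(subnet):
            results.append((str(subnet), "ok"))
        elif _depth < max_depth and subnet.prefixlen < subnet.max_prefixlen:
            # Timeout: split into two halves and retry one level deeper.
            halves = list(subnet.subnets(prefixlen_diff=1))
            results.extend(
                scan_subnets_with_fallback(halves, run_scan, budget, max_depth, _depth + 1)
            )
        else:
            results.append((str(subnet), "failed"))
    return results
```

With a scan callback that always times out and `budget = {"used": 0, "max": 10}`, the recursion stops after exactly 10 invocations and the remaining subnets come back marked as skipped, whereas the current code would keep splitting until the depth limit on every branch.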
Related Issues
#145 (Break reporting.py into rendering, artifact, naming, and diff services)
#166 (Reporting module modularization)