Comparative Analysis of CNN and QNN-Hybrid Architectures for Electrical Resistivity Prediction

This repository presents a comparative study between classical Convolutional Neural Networks (CNNs) and Hybrid Quantum Neural Networks (QNNs) for predicting electrical resistivity of materials using synthetically generated Electrical Resistivity Tomography (ERT) data.

The project explores whether quantum-enhanced machine learning can match or outperform classical deep learning models in solving inverse problems commonly encountered in geophysics and material science.


📌 Overview

Electrical resistivity prediction is a challenging inverse problem due to:

  • High-dimensional dependencies
  • Noise sensitivity
  • Non-linear spatial relationships

Traditional numerical inversion techniques are computationally expensive and scale poorly. This project investigates:

  • A spatial-attention-based CNN
  • A hybrid Quantum–Classical Neural Network (QNN)

Both models are trained and evaluated on the same synthetically generated ERT dataset, enabling a fair and controlled comparison.
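As a flavour of the generation step, here is a minimal NumPy sketch of a layered synthetic resistivity model with an embedded conductive anomaly and multiplicative noise. The grid size, layer resistivities, and noise level are illustrative placeholders, not the settings used in the repository's data_gen.py, which simulates full dipole–dipole ERT surveys.

```python
import numpy as np

def make_synthetic_resistivity(nx=64, nz=32, noise_level=0.03, seed=0):
    """Toy layered resistivity model (ohm·m) with one conductive anomaly.

    Hypothetical values for illustration only.
    """
    rng = np.random.default_rng(seed)
    model = np.full((nz, nx), 100.0)      # background layer
    model[nz // 2:, :] = 300.0            # deeper, more resistive layer
    zc, xc, r = nz // 3, nx // 2, 5       # anomaly centre and radius
    zz, xx = np.ogrid[:nz, :nx]
    model[(zz - zc) ** 2 + (xx - xc) ** 2 <= r ** 2] = 10.0  # conductive body
    # Multiplicative Gaussian noise mimics measurement error
    noisy = model * (1.0 + noise_level * rng.standard_normal(model.shape))
    return model, noisy

model, noisy = make_synthetic_resistivity()
```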


🧠 Problem Statement

Can hybrid quantum neural networks provide comparable or improved performance over classical CNNs for resistivity prediction — even when executed on quantum simulators?


πŸ—οΈ System Architecture

1️⃣ Data Generation Pipeline

(figure: data generation pipeline diagram)

2️⃣ CNN Architecture (Classical)

(figure: classical CNN architecture diagram)

3️⃣ Quantum Neural Network Architecture

(figure: quantum neural network architecture diagram)

4️⃣ Hybrid QNN Architecture

(figure: hybrid QNN architecture diagram)

✨ Key Features

🔹 Physics-inspired synthetic ERT data generation
🔹 Polynomial feature engineering for spatial relations
🔹 Log-scaled resistivity modeling (geophysical best practice)
🔹 Spatial attention mechanism in CNN
🔹 Hybrid quantum-classical learning pipeline
🔹 Ensemble learning for robustness
🔹 Detailed evaluation using regression metrics
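
The log-scaling mentioned above is a one-line transform: training on log10(resistivity) compresses targets that span decades, and the inverse transform recovers ohm·m values before computing metrics. A hedged sketch with arbitrary example values:

```python
import numpy as np

rho = np.array([10.0, 100.0, 1000.0])  # resistivities spanning three decades
y = np.log10(rho)                      # compressed, near-uniform training targets
rho_back = 10.0 ** y                   # inverse transform before evaluation
```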


βš™οΈ Functionalities

  • Generate realistic multi-anomaly ERT datasets
  • Train CNN and QNN models on identical data
  • Perform fair metric-based comparison
  • Visualize:
    • Training vs validation loss
    • Predicted vs actual resistivity
    • Residual distributions
  • Export reproducible results
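
The metric-based comparison relies on standard scikit-learn calls; the arrays below are made-up illustrations, not model outputs:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Hypothetical ground-truth and predicted resistivities (ohm·m)
y_true = np.array([50.0, 120.0, 300.0, 80.0])
y_pred = np.array([55.0, 110.0, 290.0, 85.0])

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))             # derive RMSE from MSE
mae = mean_absolute_error(y_true, y_pred)
r2 = r2_score(y_true, y_pred)
```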

🧰 Tech Stack

Programming & ML

  • Python 3.10+
  • PyTorch
  • Scikit-learn
  • NumPy
  • Pandas
  • Matplotlib

Quantum Computing

  • PennyLane
  • Qiskit (hardware-compatible)

Geophysical Simulation

  • PyGIMLi (ERT forward modeling)

📂 Repository Structure

├── data_gen.py   # Synthetic ERT data generation
├── cnn.py        # Spatial Attention CNN model
├── qnn.py        # Hybrid Quantum Neural Network
└── README.md


🔬 How We Built It

  1. Synthetic Data Creation

    • Designed layered subsurface with embedded anomalies
    • Simulated dipole–dipole ERT surveys
    • Added realistic noise and filtering
    (figure: example synthetic ERT survey data)
  2. Feature Engineering

    • Electrode positions & spacings
    • Pseudo-depth & geometric factors
    • Polynomial interaction features
  3. Model Design

    • CNN: Attention + dilation + residual learning
    • Hybrid-QNN: CNN feature extractor + quantum circuit
  4. Training Strategy

    • Log-transformed targets
    • Huber loss for robustness
    • One-cycle learning rate scheduling
    • Early stopping
  5. Evaluation

    • RMSE, MAE, MSE, R²
    • Visual diagnostics
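
The training strategy above can be sketched in PyTorch. The tiny stand-in model and random data are placeholders for the repo's CNN/QNN and the ERT dataset, and a real run would monitor validation loss for early stopping rather than training loss:

```python
import torch
from torch import nn

torch.manual_seed(0)
# Hypothetical stand-in model and data; the repo's CNN/QNN slot in here
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
X, y = torch.randn(256, 8), torch.randn(256, 1)

loss_fn = nn.HuberLoss(delta=1.0)  # robust to outlier residuals
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
epochs = 30
sched = torch.optim.lr_scheduler.OneCycleLR(opt, max_lr=1e-2,
                                            total_steps=epochs)

best, patience, bad = float("inf"), 5, 0
for epoch in range(epochs):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    sched.step()                    # one-cycle LR schedule
    if loss.item() < best - 1e-4:   # early stopping on loss plateau
        best, bad = loss.item(), 0
    else:
        bad += 1
        if bad >= patience:
            break
```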

📊 Results & Comparison

| Metric   | CNN    | QNN    |
|----------|--------|--------|
| R² Score | 0.8910 | 0.8862 |
| RMSE     | 7.5094 | 7.6745 |
| MSE      | 56.39  | 58.89  |
| MAE      | 3.8923 | 3.8785 |

Observations

  • CNN shows slightly better global accuracy
  • QNN achieves lower MAE, indicating better local anomaly handling
  • Performance gap is minimal despite quantum simulation overhead

Note: QNNs were executed on quantum simulators due to limited access to real hardware.


📈 Result Visualizations

  • Training vs Validation Loss
  • Predicted vs Actual Resistivity
  • Residual Error Analysis

CNN

image

Hybrid QNN

image

These plots confirm:

  • Stable convergence
  • Minimal bias
  • Strong generalization

🧠 Key Learnings

  • CNNs remain strong baselines for spatial inverse problems
  • Quantum layers can integrate meaningfully with classical models
  • Hybrid QNNs are already competitive despite hardware limitations
  • Data preprocessing is as critical as model architecture
  • Scientific ML benefits from physics-informed data generation

📚 Literature & References

  • Liu et al., ERS-InvNet, IEEE TGRS (2020)
  • Vu & Jardani, CNN-3D-ERT, GJI (2021)
  • Li et al., VD-Net, IEEE TIM (2021)
  • Aleardi et al., CNN for ERT, Politecnico di Milano (2024)
  • Schuld & Petruccione, Machine Learning with Quantum Computers

(Full reference list available in project report)


🚀 Future Work

  • Deploy QNNs on real quantum hardware
  • Expand to 3D resistivity inversion
  • Add uncertainty quantification
  • Explore deeper quantum circuits
  • Apply framework to other material properties

👥 Team

Developed collaboratively by an 8-member interdisciplinary team as part of an academic physics research project.

  • Team Leader: Yalluru Purushotham Reddy — Hybrid CNN–QNN architecture design, integration, and team coordination
  • Nipun Saxena — CNN model design, training, and comparative analysis
  • Saswata Bastia — Data preprocessing and feature engineering
  • Caleb Kurian George — ERT physics modeling and synthetic data simulation
  • Abhinav Saikumar — Implementation support and debugging
  • Pranav — General assistance
  • Atharva — General assistance

Under the guidance of Dr. Korlepara Divya Bharathi, Assistant Professor Grade II, VIT Chennai.


📜 License

This project is intended for academic and research use.
