Technical Implementations
This document explains the methods and calculations used in impedance analysis.
Data Quality Assessment
Lin-KK Analysis
The Lin-KK validation uses the impedance.py package, implementing the method from Schönleber et al. [1]. This implementation:
Uses a Kramers-Kronig-consistent circuit model: an ohmic resistor in series with RC elements
Automatically selects the number of RC elements
Analyzes the fit residuals to assess data quality
Checks whether measurements satisfy the Kramers-Kronig relations (linearity, causality, stability)
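The idea can be illustrated with a minimal, self-contained sketch (not the impedance.py implementation itself): fix a log-spaced grid of time constants spanning the measured frequency range, fit the element resistances by linear least squares, and inspect the relative residuals.

```python
import numpy as np

def lin_kk_residuals(freq, Z, M=20):
    """Fit a Kramers-Kronig-consistent circuit (ohmic resistor + M RC
    elements with fixed, log-spaced time constants) by linear least
    squares and return the relative residuals of the fit."""
    w = 2 * np.pi * freq
    # Time constants spanning the inverse of the measured frequency range
    taus = np.logspace(np.log10(1 / w.max()), np.log10(1 / w.min()), M)
    # Design matrix: ohmic resistor plus M RC impedances 1/(1 + j*w*tau)
    A = np.column_stack([np.ones_like(w, dtype=complex)] +
                        [1 / (1 + 1j * w * tau) for tau in taus])
    # Stack real and imaginary parts so the fit is a real-valued linear LS
    A_ri = np.vstack([A.real, A.imag])
    b_ri = np.concatenate([Z.real, Z.imag])
    coef, *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)
    Z_fit = A @ coef
    res = (Z - Z_fit) / np.abs(Z)   # relative residuals
    return res.real, res.imag

# Synthetic single-RC spectrum: a KK-consistent spectrum fits with tiny residuals
f = np.logspace(-1, 4, 40)
Z = 10 + 50 / (1 + 2j * np.pi * f * 1e-2)
rr, ri = lin_kk_residuals(f, Z)
```

Large or structured residuals on real data would indicate drift, nonlinearity, or noise rather than a KK-consistent system.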
Equivalent Circuit Model (ECM) Fitting
Parameter Estimation Process
The fitting process has these steps:
Parameter Transformation
For bounded optimization:
\[ \begin{align}\begin{aligned}p_{\text{int}} &= \log_{10}\left(\frac{p - lb}{1 - p/ub}\right)\\p_{\text{ext}} &= \frac{lb + 10^{p_{\text{int}}}}{1 + 10^{p_{\text{int}}}/ub}\end{aligned}\end{align} \]

Objective Function
Using weighted residuals:
\[\text{WRSS} = \sum_{i=1}^N \frac{|Z_{\text{exp},i} - Z_{\text{model},i}|^2}{\sigma_i^2}\]

Optimization
Uses the BFGS algorithm on the weighted objective, with bounds enforced through the parameter transformation above.
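The three steps above can be sketched together in a few lines. This is a minimal illustration, not the package's code: the two-parameter \(R_0 + (R_1 \parallel C_1)\) circuit, the synthetic data, and the bounds are all hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def to_internal(p, lb, ub):
    """Map a bounded parameter p in (lb, ub) to an unconstrained value."""
    return np.log10((p - lb) / (1 - p / ub))

def to_external(p_int, lb, ub):
    """Inverse map: unconstrained value back into (lb, ub)."""
    x = 10.0 ** p_int
    return (lb + x) / (1 + x / ub)

# Hypothetical toy circuit: Z = R0 + R1 / (1 + j*w*R1*C1), C1 held fixed
f = np.logspace(0, 4, 30)
w = 2 * np.pi * f
C1 = 1e-4

def model(params):
    R0, R1 = params
    return R0 + R1 / (1 + 1j * w * R1 * C1)

Z_exp = model([5.0, 20.0])                       # synthetic "data"
lb, ub = np.array([0.1, 0.1]), np.array([100.0, 100.0])

def wrss(p_int):
    # Unit-weighted residual sum of squares, evaluated in internal coordinates
    r = Z_exp - model(to_external(p_int, lb, ub))
    return np.sum(r.real**2 + r.imag**2)

p0 = to_internal(np.array([1.0, 1.0]), lb, ub)
p_fit = to_external(minimize(wrss, p0, method="BFGS").x, lb, ub)
```

Because BFGS itself is unconstrained, the transformation is what keeps every trial point inside `(lb, ub)`.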
Weighting Schemes
We offer three weighting options: unit weighting, modulus/proportional weighting, and sigma weighting.
Parameter Uncertainties
Calculated using QR decomposition of the weighted Jacobian:

\[\sigma_{p_j} = \text{WRMS}\,\sqrt{\left[(R^T R)^{-1}\right]_{jj}}\]

where:
- R comes from the QR decomposition of the weighted Jacobian
- WRMS is the weighted root-mean-square error
- Jacobian elements: \(J_{ij} = \frac{\partial Z_i}{\partial p_j}\)
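A sketch of this computation with NumPy, checked on a hypothetical straight-line fit where the Jacobian is exact:

```python
import numpy as np

def parameter_uncertainties(J, residuals, weights):
    """Standard errors from the QR decomposition of the weighted Jacobian:
    sigma_j = WRMS * sqrt([(R^T R)^{-1}]_{jj})."""
    W = np.sqrt(weights)
    Jw = W[:, None] * J                 # weighted Jacobian
    rw = W * residuals                  # weighted residuals
    N, k = Jw.shape
    wrms = np.sqrt(np.sum(rw**2) / (N - k))
    _, R = np.linalg.qr(Jw)
    cov = np.linalg.inv(R.T @ R)        # equals (Jw^T Jw)^{-1}
    return wrms * np.sqrt(np.diag(cov))

# Check on a straight-line fit y = a + b*x with unit weights
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 + 3.0 * x + 0.1 * rng.standard_normal(50)
J = np.column_stack([np.ones_like(x), x])   # Jacobian of a + b*x
coef, *_ = np.linalg.lstsq(J, y, rcond=None)
sig = parameter_uncertainties(J, y - J @ coef, np.ones_like(x))
```

Using QR rather than forming \(J^T J\) directly avoids squaring the condition number before the inversion.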
Correlation Analysis
Using the Hessian of the objective function, whose inverse is proportional to the parameter covariance:

\[r_{ij} = \frac{H^{-1}_{ij}}{\sqrt{H^{-1}_{ii}\,H^{-1}_{jj}}}\]

Understanding the values:
- \(|r| > 0.9\): strong correlation
- \(0.7 < |r| < 0.9\): moderate correlation
- \(|r| < 0.7\): weak correlation
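A sketch, assuming the inverse Hessian is used as the covariance estimate; the 2×2 Hessian below is a made-up example of strongly coupled parameters:

```python
import numpy as np

def correlation_matrix(H):
    """Correlation matrix from the Hessian of the objective: H^{-1} is
    proportional to the covariance C, so r_ij = C_ij / sqrt(C_ii * C_jj)."""
    C = np.linalg.inv(H)
    d = np.sqrt(np.diag(C))
    return C / np.outer(d, d)

# Two parameters whose curvature directions nearly coincide
H = np.array([[2.0, 1.9],
              [1.9, 2.0]])
r = correlation_matrix(H)   # |r[0, 1]| = 0.95 -> strong correlation
```

An off-diagonal magnitude this close to 1 means the data constrain only a combination of the two parameters, not each one individually.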
Fit Quality Metrics
Vector Difference Analysis
Measures point-by-point agreement as the mean relative deviation between model and data:

\[\Delta_{\text{vec}} = \frac{1}{N}\sum_{i=1}^{N} \frac{|Z_{\text{exp},i} - Z_{\text{model},i}|}{|Z_{\text{exp},i}|}\]

Quality guides:
- Excellent: < 0.05 (5% average deviation)
- Good: < 0.10 (10% average deviation)
- Poor: > 0.10
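Assuming the metric is the mean relative point-by-point deviation (consistent with the percentage guides above), a minimal helper:

```python
import numpy as np

def vector_difference(Z_exp, Z_model):
    """Mean relative point-by-point deviation between data and model
    (assumed form of the metric)."""
    return np.mean(np.abs(Z_exp - Z_model) / np.abs(Z_exp))

def grade(d):
    """Apply the quality thresholds from the text."""
    return "excellent" if d < 0.05 else "good" if d < 0.10 else "poor"

# A uniform 2% magnitude error gives a deviation of exactly 0.02
Z = np.array([10 + 5j, 8 + 3j, 6 + 1j])
d = vector_difference(Z, 1.02 * Z)
```

Note that this metric penalizes any amplitude error, even when the model traces the correct shape in the complex plane.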
Path Following Analysis
Checks whether the model traces the same trajectory as the data in the complex plane:

Quality guides:
- Excellent: < 0.05 (5% path deviation)
- Good: < 0.10 (10% path deviation)
- Poor: > 0.10 (indicates a model structure problem)
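The exact formula is implementation-specific; one plausible sketch (an assumption, not the actual metric) compares the normalized segment directions of the two trajectories in the complex plane, which makes it insensitive to pure magnitude errors, in contrast to the vector difference:

```python
import numpy as np

def path_deviation(Z_exp, Z_model):
    """Illustrative (assumed) form of a path-following metric: compare
    unit segment vectors of the two trajectories; 0 means the model
    traces the same path shape as the data. Assumes no zero-length
    segments."""
    def unit_segments(Z):
        d = np.diff(Z)
        return d / np.abs(d)
    t_exp, t_mod = unit_segments(Z_exp), unit_segments(Z_model)
    return np.mean(np.abs(t_exp - t_mod)) / 2   # scaled into [0, 1]

# A pure magnitude error leaves the path shape (direction sequence) unchanged
Z = np.array([0 + 0j, 1 + 1j, 2 + 1.5j, 3 + 1j])
same_shape = path_deviation(Z, 3 * Z)   # 0: identical direction sequence
```

This separation is what makes path following useful for diagnosing model *structure*: a wrong circuit topology distorts the trajectory shape even when magnitudes are roughly right.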
Model Selection Metrics
Akaike Information Criterion (AIC)
For different weighting schemes [2]:
Unit weighting:

\[\text{AIC} = 2k + N\ln\left(\frac{\text{WRSS}}{N}\right)\]

Modulus/proportional weighting:

\[\text{AIC} = 2k + N\ln\left(\frac{\text{WRSS}}{N}\right) + 2\sum_{i=1}^{N}\ln|Z_i|\]

Sigma weighting:

\[\text{AIC} = 2k + N\ln\left(\frac{\text{WRSS}}{N}\right) + 2\sum_{i=1}^{N}\ln\sigma_i\]

where:
- N is the number of data points
- k is the number of model parameters
- WRSS is the weighted residual sum of squares

Constant terms common to all candidate models are omitted, so only differences in AIC between models fitted to the same data are meaningful.
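For the unit-weighting case, a minimal sketch showing how AIC trades fit quality against parameter count (the WRSS values are invented for illustration):

```python
import numpy as np

def aic_unit(wrss, N, k):
    """AIC for unit weighting, constant terms dropped: lower is better,
    and only differences between models on the same data are meaningful."""
    return N * np.log(wrss / N) + 2 * k

# Two extra parameters that barely reduce WRSS are not worth their cost:
a_simple  = aic_unit(wrss=1.00, N=50, k=3)
a_complex = aic_unit(wrss=0.98, N=50, k=5)   # tiny improvement, 2 more params
```

Here the simpler model wins (`a_simple < a_complex`): the 2% drop in WRSS does not offset the `2k` penalty.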
Distribution of Relaxation Times (DRT)
We use Kulikovsky’s method [3] for DRT analysis, which:
Combines Tikhonov regularization with projected gradient method
Handles the ill-posed nature of DRT calculations
Ensures physically meaningful results (non-negative distribution)
Provides fast calculations
The objective function is:

\[\min_{\gamma \ge 0}\; \left\lVert Z_{\text{exp}} - Z_{\text{model}}(\gamma)\right\rVert^2 + \lambda \left\lVert L\gamma \right\rVert^2\]

where:
- λ is the regularization parameter
- L is the regularization operator
- γ is the distribution of relaxation times
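A self-contained sketch of such a Tikhonov-plus-projected-gradient scheme, assuming a discretized kernel matrix `A`, data vector `b`, and an identity regularization operator; the real implementation's discretization of the DRT kernel will differ:

```python
import numpy as np

def drt_projected_gradient(A, b, L, lam, n_iter=5000):
    """Minimize ||A g - b||^2 + lam * ||L g||^2 subject to g >= 0
    by projected gradient descent."""
    g = np.zeros(A.shape[1])
    M = A.T @ A + lam * (L.T @ L)
    step = 1.0 / np.linalg.norm(M, 2)          # inverse Lipschitz constant
    Atb = A.T @ b
    for _ in range(n_iter):
        grad = M @ g - Atb                     # gradient of the objective
        g = np.maximum(g - step * grad, 0.0)   # project onto g >= 0
    return g

# Synthetic test: recover a sparse non-negative distribution exactly in range(A)
rng = np.random.default_rng(1)
A = rng.random((30, 10))
g_true = np.zeros(10)
g_true[[2, 7]] = [1.0, 0.5]
b = A @ g_true
g = drt_projected_gradient(A, b, L=np.eye(10), lam=1e-6)
```

The projection step `np.maximum(..., 0.0)` is what enforces the physically meaningful non-negativity of the distribution.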
Implementation Details
Optimization Algorithm
BFGS (Broyden-Fletcher-Goldfarb-Shanno) algorithm
Bounded optimization through parameter transformation
Automatic differentiation for gradients
Numerical Stability
SVD for correlation matrix calculation
QR decomposition for uncertainty estimation
DRT regularization
Parameter scaling
LLM Integration
The analysis workflow integrates these metrics with the LLM to:
- Evaluate model validity based on path following
- Guide model structure modifications
- Interpret parameter correlations
- Provide physically meaningful recommendations
The LLM system is structured to prioritize analysis based on:
1. Path following assessment (primary metric for ECM fits)
2. Data quality assessment (primary focus for non-ECM analysis)
References
Notes
JAX provides automatic differentiation and fast, JIT-compiled computation
Error estimates assume normally distributed residuals
CPE and Warburg elements often show strong parameter correlations and need special attention
DRT requires careful selection of the regularization parameter