---
language:
- en
---
|
# Chaos Classifier: Logistic Map Regime Detection via 1D CNN
|
This model classifies time series sequences generated by the logistic map into one of three dynamical regimes: |
|
|
|
- 0 → Stable (converges to a fixed point)
- 1 → Periodic (oscillates between repeating values)
- 2 → Chaotic (irregular, non-repeating behavior)
|
The goal is to simulate financial market regimes using a controlled chaotic system and train a model to learn phase transitions directly from raw sequences. |
|
|
|
## Motivation
|
Financial systems often exhibit regime shifts: stable growth, cyclical trends, and chaotic crashes. |
|
This model uses the logistic map as a proxy to simulate such transitions and demonstrates how a neural network can classify them. |
|
|
|
## Data Generation
|
Sequences are generated from the logistic map equation: |
|
|
|
$$ x_{n+1} = r \cdot x_n \cdot (1 - x_n) $$
|
|
|
Where: |
|
|
|
- x₀ ∈ (0.1, 0.9) is the initial condition
- r ∈ [2.5, 4.0] controls the behavior
|
Label assignment: |
|
|
|
- r < 3.0 → Stable (label = 0)
- 3.0 ≤ r < 3.57 → Periodic (label = 1)
- r ≥ 3.57 → Chaotic (label = 2)
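A minimal sketch of how such a dataset could be generated is shown below. The function names (`generate_logistic_sequence`, `label_from_r`, `make_dataset`) and the uniform sampling of `r` and `x₀` are illustrative assumptions rather than the released training script; only the map itself, the parameter ranges, and the label thresholds come from the description above.

```python
import numpy as np

def generate_logistic_sequence(r, x0, length=100):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) from x0 and return the trajectory."""
    x = np.empty(length, dtype=np.float64)
    x[0] = x0
    for n in range(length - 1):
        x[n + 1] = r * x[n] * (1 - x[n])
    return x

def label_from_r(r):
    """Map the control parameter r to a regime label using the thresholds above."""
    if r < 3.0:
        return 0  # Stable
    if r < 3.57:
        return 1  # Periodic
    return 2      # Chaotic

def make_dataset(n_sequences=500, length=100, seed=0):
    """Sample (r, x0) pairs and build (sequences, labels) arrays."""
    rng = np.random.default_rng(seed)
    rs = rng.uniform(2.5, 4.0, n_sequences)
    x0s = rng.uniform(0.1, 0.9, n_sequences)
    X = np.stack([generate_logistic_sequence(r, x0, length) for r, x0 in zip(rs, x0s)])
    y = np.array([label_from_r(r) for r in rs], dtype=np.int64)
    return X.astype(np.float32), y
```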
|
## Model Architecture
|
A 1D Convolutional Neural Network (CNN) was used: |
|
|
|
- (Conv1D → BatchNorm → ReLU) × 2
- GlobalAvgPool1D
- Linear → Softmax (applied via CrossEntropyLoss)
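A PyTorch sketch of this layer ordering follows. The channel count, kernel size, and class name `RegimeCNN` are assumptions; only the block structure is taken from the list above.

```python
import torch.nn as nn

class RegimeCNN(nn.Module):
    """Two Conv1D -> BatchNorm -> ReLU blocks, global average pooling, linear classifier."""

    def __init__(self, n_classes=3, channels=32, kernel_size=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size, padding=kernel_size // 2),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=kernel_size // 2),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
        )
        self.head = nn.Linear(channels, n_classes)

    def forward(self, x):
        # x has shape (batch, 1, sequence_length)
        h = self.features(x)
        h = h.mean(dim=-1)   # GlobalAvgPool1D over the time axis
        return self.head(h)  # raw logits; softmax is applied implicitly by CrossEntropyLoss
```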
|
Advantages of 1D CNN: |
|
|
|
- Captures local temporal patterns
- Learns waveform shapes and local jitter
- More parameter-efficient than an MLP on raw sequences
|
## Performance
|
Trained on 500 synthetic sequences (length = 100), the model achieved:
|
|
|
- 98–99% test accuracy
- Smooth convergence during training
- Robust generalization
- A confusion matrix with near-perfect detection of the stable regime and strong separation between the periodic and chaotic regimes
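For context, a training loop consistent with that setup might look like the sketch below, which reuses the `make_dataset` and `RegimeCNN` sketches above. The optimizer, learning rate, batch size, epoch count, and choice of `StandardScaler` are assumptions (some scaler is implied by the inference signature below); only the dataset size and sequence length come from the description, and a train/test split is omitted for brevity.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from sklearn.preprocessing import StandardScaler

# Build 500 sequences of length 100, as described above.
X, y = make_dataset(n_sequences=500, length=100)

# Fit a scaler on the sequences; a scaler is also passed to predict_regime below.
scaler = StandardScaler()
X = scaler.fit_transform(X).astype("float32")

loader = DataLoader(
    TensorDataset(torch.from_numpy(X).unsqueeze(1), torch.from_numpy(y)),
    batch_size=32,
    shuffle=True,
)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = RegimeCNN().to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()

model.train()
for epoch in range(30):  # epoch count is an assumption
    for xb, yb in loader:
        xb, yb = xb.to(device), yb.to(device)
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()
```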
|
## Inference Example
|
You can generate a prediction by passing an r value: |
|
|
|
```python
predict_regime(3.95, model, scaler, device)
# Output: Predicted Regime: Chaotic (Class 2)
```
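The released `predict_regime` helper is not reproduced in this card. Under the sketches above, it could look roughly like the following; the default initial condition, the sequence length, and the exact output formatting are assumptions.

```python
import torch

REGIME_NAMES = {0: "Stable", 1: "Periodic", 2: "Chaotic"}

def predict_regime(r, model, scaler, device, x0=0.5, length=100):
    """Generate a logistic-map sequence for r, scale it, and classify its regime."""
    seq = generate_logistic_sequence(r, x0, length)               # data-generation sketch above
    seq = scaler.transform(seq.reshape(1, -1)).astype("float32")  # shape (1, length)
    x = torch.from_numpy(seq).unsqueeze(1).to(device)             # shape (1, 1, length)
    model.eval()
    with torch.no_grad():
        pred = model(x).argmax(dim=1).item()
    print(f"Predicted Regime: {REGIME_NAMES[pred]} (Class {pred})")
    return pred
```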