🎯 Model Accuracy

Transparency about our predictions: we show when we're right and when we're wrong.

Overall Performance

No projections have been validated yet.

As simulation cycles complete, we'll compare our predictions to actual outcomes.

📚 How We Measure Accuracy

Projection Recording

When a proposal is run through the RIPPLE simulation, we record the model's predicted values for each affected variable, along with the time period over which the prediction applies.
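
As an illustration, a recorded projection might look like the minimal sketch below. The names (`Projection`, `proposal_id`, `horizon_days`, and so on) are hypothetical, not RIPPLE's actual schema.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Projection:
    proposal_id: str        # proposal the simulation was run for (hypothetical field)
    variable: str           # affected variable, e.g. "median_rent"
    predicted_value: float  # value the model expects the variable to reach
    recorded_on: date       # when the projection was made
    horizon_days: int       # how far ahead the prediction looks

    @property
    def due_date(self) -> date:
        """Date after which the projection can be validated."""
        return self.recorded_on + timedelta(days=self.horizon_days)

# Example: predict that median rent reaches 1450 one year out.
p = Projection("prop-42", "median_rent", 1450.0, date(2025, 1, 1), 365)
print(p.due_date)  # 2026-01-01
```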

Validation

After the predicted time period passes, we compare our projection to the actual value from official sources; the error is the percentage difference between the two.
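
In code, the comparison reduces to a percentage error. A minimal sketch, assuming error is measured relative to the actual (observed) value:

```python
def percent_error(predicted: float, actual: float) -> float:
    """Absolute prediction error as a percentage of the actual value."""
    if actual == 0:
        raise ValueError("actual value must be nonzero")
    return abs(predicted - actual) / abs(actual) * 100

# Example: we predicted 1450, official sources report 1390.
print(round(percent_error(1450.0, 1390.0), 2))  # 4.32 -> an A on the scale below
```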

Grading

A: <5% error
B: 5-10% error
C: 10-20% error
F: >20% error
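
The letter grades map directly to error thresholds. A sketch of that mapping, assuming boundary values (exactly 5%, 10%, 20%) fall into the higher grade, a detail the scale above leaves open:

```python
def grade(error_pct: float) -> str:
    """Map a percentage error to a letter grade using the scale above."""
    if error_pct < 5:
        return "A"
    if error_pct < 10:
        return "B"
    if error_pct < 20:
        return "C"
    return "F"

assert grade(4.32) == "A"
assert grade(12.0) == "C"
```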

Model Improvement

We use validation results to adjust the strengths of RIPPLE's causal chains, making future predictions more accurate.
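
RIPPLE's exact update rule isn't documented here. As one plausible sketch under stated assumptions, a causal link's strength could be nudged toward the value that would have produced the observed outcome; the rule below is hypothetical, not RIPPLE's actual method.

```python
def adjust_strength(strength: float, predicted: float, actual: float,
                    learning_rate: float = 0.1) -> float:
    """Nudge a causal link's strength toward the value implied by the
    observed outcome. Hypothetical rule, not RIPPLE's actual update.

    Assumes the predicted effect scales linearly with strength, so the
    'ideal' strength is strength * (actual / predicted).
    """
    if predicted == 0:
        return strength  # no information to learn from
    implied = strength * (actual / predicted)
    return strength + learning_rate * (implied - strength)

# Example: a link with strength 0.8 overshot (predicted 1450, actual 1390),
# so its strength is pulled slightly downward.
print(adjust_strength(0.8, 1450.0, 1390.0))  # ~0.797
```

The learning rate keeps any single validation cycle from swinging a link's strength too far, so the model improves gradually as more projections resolve.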