Every prediction in Kunkafa ends with a reminder: "Both outcomes remain possible." This isn't a disclaimer - it's the most important part of the prediction. Here's why.
The 78% Trap
Imagine you see a prediction: "78% probability of going up." Your brain likely reads this as "it's going up." This is natural - we're wired to convert probabilities into certainties.
But 78% isn't certainty. Let's make it concrete:
- If you see 100 predictions at 78%, roughly 22 will be wrong
- That's more than 1 in 5
- If you bet everything on a 78% prediction, you'll be wrong roughly 1 time in 5
This is why we always show both probabilities. "78% up / 22% down" forces your brain to acknowledge the other outcome exists.
Probability Distribution:

UP   ████████████████████░░░░░  78%
DOWN █████░░░░░░░░░░░░░░░░░░░░  22%

Both outcomes remain possible

The visual bar makes it clear: DOWN still has a meaningful probability mass.
A 78% probability means roughly 1 in 5 predictions will be wrong. That's not an error rate - it's the expected outcome of an accurate probabilistic model.
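The "1 in 5" arithmetic is easy to verify with a quick simulation. This is a minimal sketch, not part of any prediction system: it just draws many independent events that each succeed with 78% probability and counts the misses.

```python
import random

random.seed(42)

# Simulate 10,000 independent events, each with a 78% chance of being "right".
# Over many trials, the miss rate converges toward the remaining 22%.
N = 10_000
wrong = sum(1 for _ in range(N) if random.random() >= 0.78)
print(f"wrong: {wrong} of {N} ({wrong / N:.1%})")
```

The printed miss rate lands near 22% - wrong on roughly 1 prediction in 5, exactly as an accurate 78% model should be.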
What Is Uncertainty Quantification?
"Uncertainty quantification" sounds technical, but the concept is simple: it's the model telling you how sure it is.
Some predictions come with high confidence. The model has seen similar patterns many times, and they've been consistent. Other predictions come with uncertainty - the patterns are ambiguous, or the model is in unfamiliar territory.
A good prediction model knows both:
# High confidence prediction
{
  "direction": "UP",
  "probability": 0.85,
  "uncertainty": "LOW",
  "confidence_interval": [0.78, 0.92]
}

# Uncertain prediction
{
  "direction": "UP",
  "probability": 0.58,
  "uncertainty": "HIGH",
  "confidence_interval": [0.42, 0.74]
}
Notice the second prediction. The point estimate is 58% up, but the confidence interval spans from 42% to 74%. The model is essentially saying "I think up, but I'm not sure."
This information is valuable. A 58% prediction with high uncertainty means something different than a 58% prediction with low uncertainty.
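One simple way to derive an uncertainty label from the fields above is the width of the confidence interval: a narrow interval means the model has pinned down its estimate, a wide one means it hasn't. The function and thresholds below are illustrative assumptions, not Kunkafa's actual logic:

```python
def uncertainty_label(ci_low: float, ci_high: float) -> str:
    """Label uncertainty by confidence-interval width.

    The 0.15 / 0.25 cutoffs are illustrative, not a published spec.
    """
    width = ci_high - ci_low
    if width <= 0.15:
        return "LOW"
    if width <= 0.25:
        return "MODERATE"
    return "HIGH"

print(uncertainty_label(0.78, 0.92))  # narrow interval from the first example
print(uncertainty_label(0.42, 0.74))  # wide interval from the second example
```

Applied to the two predictions above, the narrow [0.78, 0.92] interval maps to LOW and the wide [0.42, 0.74] interval maps to HIGH, matching their `uncertainty` fields.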
Why Knowing What You Don't Know Matters
Consider two scenarios:
Scenario A: Confident Wrong - Model predicts 85% up, low uncertainty. Market goes down. Result: Surprised by outcome, potentially overexposed.
Scenario B: Uncertain Wrong - Model predicts 58% up, high uncertainty. Market goes down. Result: Not surprised, position sized appropriately.
In both cases, the prediction was "wrong" in the sense that the market went down. But in Scenario B, the user knew the model was uncertain. They could factor that into their decision - smaller position, tighter stops, or no trade at all.
A model that's wrong but told you it was uncertain is more useful than a model that's wrong but projected confidence.
Practical Examples
Example 1: Position Sizing
Suppose you're considering a trade. Two predictions:
- Prediction A: 75% up, LOW uncertainty
- Prediction B: 75% up, HIGH uncertainty
Both show 75% probability, but they're not equivalent. For Prediction A, you might size your position normally. For Prediction B, you might reduce size because the model itself isn't confident in that 75%.
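The sizing logic above can be sketched as a small function. Everything here - the edge scaling and the uncertainty discounts - is an illustrative assumption, not a recommendation or Kunkafa's method:

```python
def position_size(base_size: float, probability: float, uncertainty: str) -> float:
    """Scale a base position by signal strength, then discount for uncertainty.

    The edge scaling and discount multipliers are illustrative assumptions.
    """
    edge = max(probability - 0.5, 0.0)  # excess over a coin flip
    discount = {"LOW": 1.0, "MODERATE": 0.6, "HIGH": 0.3}[uncertainty]
    return base_size * (edge / 0.5) * discount

# Same 75% probability, very different sizes:
print(position_size(1000, 0.75, "LOW"))
print(position_size(1000, 0.75, "HIGH"))
```

With a 1000-unit base, the low-uncertainty 75% signal sizes several times larger than the high-uncertainty one, even though the headline probability is identical.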
Example 2: Trade/No-Trade Decision
A 55% probability with high uncertainty is essentially the model saying "I don't know." You might treat this as a no-signal condition and skip the trade entirely.
Contrast with a 55% probability with low uncertainty - the model is confident that it's a close call. This might warrant a small position with appropriate risk management.
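A trade/no-trade filter following this reasoning might demand a larger edge when the model itself is unsure. The thresholds below are illustrative assumptions:

```python
def should_trade(probability: float, uncertainty: str) -> bool:
    """Skip near-coin-flip signals unless the model is confident in the call.

    The 0.15 / 0.05 edge thresholds are illustrative assumptions.
    """
    edge = abs(probability - 0.5)
    if uncertainty == "HIGH":
        return edge >= 0.15  # demand a bigger edge from an unsure model
    return edge >= 0.05

print(should_trade(0.55, "HIGH"))  # the "I don't know" case: skip
print(should_trade(0.55, "LOW"))   # a confident close call: tradeable
```

The same 55% signal is filtered out when uncertainty is high but passes when the model is confident that it's a close call.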
Example 3: Timeframe Selection
Different timeframes often have different uncertainty levels. A model might show:
- 15-minute: 70% up, moderate uncertainty
- 1-hour: 65% up, low uncertainty
- 4-hour: 72% up, high uncertainty
The 1-hour prediction, despite having lower probability, might be more actionable because the model is more confident.
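Choosing among the three timeframes above could be as simple as ranking by uncertainty first and probability second. The ranking rule is an assumption for illustration:

```python
# Prefer the timeframe with the lowest uncertainty, breaking ties by probability.
# The data mirror the example above; the ranking rule is an assumption.
predictions = [
    {"timeframe": "15m", "probability": 0.70, "uncertainty": "MODERATE"},
    {"timeframe": "1h",  "probability": 0.65, "uncertainty": "LOW"},
    {"timeframe": "4h",  "probability": 0.72, "uncertainty": "HIGH"},
]
rank = {"LOW": 0, "MODERATE": 1, "HIGH": 2}
best = min(predictions, key=lambda p: (rank[p["uncertainty"]], -p["probability"]))
print(best["timeframe"])  # the 1-hour signal wins despite its lower probability
```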
Calibration: Is 70% Really 70%?
A well-calibrated model is one whose probabilities match reality. If the model says 70%, it should be right about 70% of the time - not 85%, not 55%, but actually 70%.
This is harder than it sounds. Many models are overconfident (saying 80% when they should say 65%) or underconfident (saying 55% when they should say 70%).
# Calibration check across 1000 predictions
70% predictions (n=150):
  Actual accuracy: 68.7% - Well calibrated
80% predictions (n=200):
  Actual accuracy: 78.5% - Well calibrated
90% predictions (n=50):
  Actual accuracy: 86.0% - Slightly overconfident
At Kunkafa, we continuously monitor calibration. When you see 70%, we've validated that it actually means 70%.
The Intellectual Humility Principle
"Both outcomes remain possible" reflects a deeper principle: intellectual humility about markets.
Financial markets are complex adaptive systems. They're influenced by millions of participants, news events, algorithmic trading, central bank decisions, and factors we can't even measure. Any model - no matter how sophisticated - is working with incomplete information.
The honest response to this complexity isn't false confidence. It's careful probability estimation with transparent uncertainty.
Markets can always surprise you. No model, including ours, can predict with certainty. Intellectual humility isn't weakness - it's honesty.
Using Probabilities in Practice
Here's a framework for thinking about predictions:
| Probability | Interpretation | Typical Response |
|---|---|---|
| 50-55% | No clear signal | Usually skip |
| 55-65% | Weak signal | Small position if low uncertainty |
| 65-75% | Moderate signal | Standard position, adjust for uncertainty |
| 75-85% | Strong signal | Consider larger position if low uncertainty |
| 85%+ | Very strong signal | Still hedge - up to 15% wrong rate |
Remember: even at 85% probability, you're wrong about 1 in 7 times. Risk management still applies.
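The framework in the table can be expressed as a lookup. The bands come straight from the table; treating a 22% "up" as a 78%-strength "down" signal is an added assumption:

```python
def interpret(probability: float) -> str:
    """Map a stated probability onto the response framework in the table.

    Takes the stronger side of the forecast, so 0.22 "up" reads as a
    78%-strength "down" signal (an illustrative assumption).
    """
    p = max(probability, 1 - probability)
    if p < 0.55:
        return "no clear signal - usually skip"
    if p < 0.65:
        return "weak signal - small position if low uncertainty"
    if p < 0.75:
        return "moderate signal - standard position, adjust for uncertainty"
    if p < 0.85:
        return "strong signal - consider larger position if low uncertainty"
    return "very strong signal - still hedge"

print(interpret(0.78))  # the 78% example from the opening
```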
The Goal: Better Decisions
Probabilistic thinking takes practice. Your brain wants certainty. Markets don't offer it.
The goal isn't to be right every time - that's impossible. The goal is to make better decisions over many iterations. To size positions appropriately. To know when to act and when to wait. To avoid being surprised when the less likely outcome occurs.
"Both outcomes remain possible" is a reminder to think this way. It's the most important line in every prediction.