Variance Analysis - Pain Points & Challenges

Pain Points

Process Gaps

  • No formal variance analysis process exists - Teams are building this capability entirely from scratch
  • Source: ON (2025-11-11)

  • No market solution for variance analysis - Competitors at the Mannheim conference said "we haven't started" when asked about variance tools
  • Source: ON (2025-05-27) - Yulia: "There is no tool or solution which can help identify the differences what was forecasted and what was actual"

  • Data validation consumes time before variance analysis can start - Hours are spent cleaning data before the actual analysis begins
  • Source: Personio (2024-10-03) - "Way too much of my time is spent on data validation"

Data Gaps

  • AP data not yet integrated - Limits variance analysis completeness; can't see full picture of what drove variances
  • Source: ON (2025-11-11)

Verification Challenges

  • Need to manually verify treasury-initiated payments - Must check if payments from Kadiba went through; no one else is tracking these
  • Source: ON (2025-11-11)

Explainability Challenges

  • Customer inflows variance is a "black box" - Many factors impact collections; hard to explain why variances occur
  • Source: Personio (2024-10-03)

  • Lack of enriched data to explain variances - Need customer name, segment, and type to understand why variances occurred
  • Source: Personio (2024-10-03)

  • Understanding the "why" requires manual investigation - Need to reach out to other teams (payroll, FP&A) to explain variances
  • Source: Sonder (2024-10-03) - "We have to reach out to the team that actually manages that function to understand"

  • Holiday schedules across countries affect cash timing unpredictably - Have to Google local holidays to explain zero collections
  • Source: Sonder (2024-10-03) - "We sometimes just Google it, right? If there's no activity, we check to see if it was a holiday"

KPI Framework Gaps

  • No defined KPI framework for tracking forecast accuracy - ON's validation deck revealed they should have defined accuracy KPIs from the start, not just ad-hoc WMAPE checks
  • Source: ON (2026-03-10) - FC Validation deck slide 16: "We could have been more consistent in defining, tracking, and monitoring KPIs throughout the project"
  • Proposed KPIs: Minimum Cash Level, 4-Week Rolling Average WMAPE, Global Total WMAPE (quarterly reduction target), Forecast Bias %, WMAPE by Category/Entity

  • No bias tracking - No visibility into whether forecasts systematically over- or under-predict; need to understand directional tendency
  • Source: ON (2026-03-10) - FC Validation deck lists "Forecast Bias % (systematic over/under forecasting)" as desired KPI
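The WMAPE and Forecast Bias % KPIs proposed above can be computed with standard formulas. The sketch below is illustrative only, under the common definitions (WMAPE = sum of absolute errors over sum of absolute actuals; bias = net over/under-forecast relative to actuals); function names and the sample numbers are not from any source deck or tool.

```python
def wmape(actual, forecast):
    """Weighted Mean Absolute Percentage Error:
    sum of absolute errors divided by sum of absolute actuals."""
    denom = sum(abs(a) for a in actual)
    if denom == 0:
        raise ValueError("WMAPE is undefined when total actuals are zero")
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / denom

def forecast_bias_pct(actual, forecast):
    """Forecast Bias %: positive means systematic over-forecasting,
    negative means systematic under-forecasting."""
    denom = sum(abs(a) for a in actual)
    if denom == 0:
        raise ValueError("Bias is undefined when total actuals are zero")
    return (sum(forecast) - sum(actual)) / denom

# Hypothetical weekly cash collections vs. forecast
actual = [100.0, 120.0, 80.0, 150.0]
forecast = [110.0, 115.0, 90.0, 140.0]
print(f"WMAPE: {wmape(actual, forecast):.1%}")            # → 7.8%
print(f"Bias:  {forecast_bias_pct(actual, forecast):.1%}")  # → 1.1%
```

A 4-week rolling WMAPE, as proposed, would apply `wmape` over a sliding four-element window of these series rather than the full history.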