One-Pager

E0059: Business Cycle Sensitivity Framework

A decision-ready template derived from the framework.

Name variants

English
E0059: Business Cycle Sensitivity Framework
Kanji
景気循環感応度枠組

Quality / Updated / Source / COI

Quality: Reviewed
Updated: not recorded
COI: none

Context

Context: Portfolio or budget planning before a downturn creates recurring decisions in which stakeholders interpret output gap sensitivity, revenue cyclicality, and lead indicators differently. The organization needs a standard way to compare options using macro indicators, historical performance, and scenario shocks, so that debates do not restart each cycle. Without a common frame, the trade-off between stability and growth exposure is decided implicitly and accountability weakens. A shared decision log also helps teams learn which assumptions held and which broke under stress.
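Revenue cyclicality, one of the quantities named above, is often summarized as the sensitivity (beta) of revenue growth to the macro cycle. A minimal sketch, using fabricated numbers and a hand-rolled OLS slope rather than any specific tool the framework prescribes:

```python
# Hypothetical illustration: estimate revenue cyclicality as the OLS slope
# (beta) of year-over-year revenue growth on GDP growth.
# All figures below are invented for the sketch.

def cyclicality_beta(revenue_growth, gdp_growth):
    """OLS slope of revenue growth on GDP growth (both as decimal rates)."""
    n = len(gdp_growth)
    mean_x = sum(gdp_growth) / n
    mean_y = sum(revenue_growth) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(gdp_growth, revenue_growth))
    var = sum((x - mean_x) ** 2 for x in gdp_growth)
    return cov / var

# Fabricated sample in which revenue swings roughly 2x the GDP cycle.
gdp = [0.03, 0.02, -0.01, 0.01, 0.04]
rev = [0.06, 0.05, -0.03, 0.02, 0.08]
beta = cyclicality_beta(rev, gdp)
```

A beta well above 1 would mark a business line as highly cyclical, which is the kind of shared, inspectable number the framework asks stakeholders to agree on before debating options.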

Options

  • Option A: Preserve the current approach to minimize short-term disruption, accepting limited upside.
  • Option B: Run a phased change, validate results against agreed metrics, and scale only after thresholds are met.
  • Option C: Redesign the approach end-to-end to pursue larger gains, with higher implementation effort and risk.

Decision

Decision: Choose Option B. Sequence the rollout so that early results validate the targets for output gap sensitivity, revenue cyclicality, and lead indicators, and stop or adjust if assumptions fail. Assign owners, document constraints, and schedule a review checkpoint to avoid drift.
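The "scale only after thresholds are met" gate in Option B can be sketched as a simple check of observed metrics against agreed limits. Metric names and threshold values here are hypothetical placeholders, not part of the framework:

```python
# Hypothetical gate for the phased rollout: scale only if no agreed
# threshold is breached; otherwise stop or adjust and report why.

THRESHOLDS = {
    "forecast_error": 0.05,      # max error vs. lead-indicator forecast (assumed)
    "revenue_beta_drift": 0.30,  # max drift in estimated cyclicality (assumed)
}

def rollout_decision(observed):
    """Return ('scale', []) or ('stop/adjust', [breached metric names])."""
    breaches = [m for m, limit in THRESHOLDS.items()
                if observed.get(m, float("inf")) > limit]
    return ("scale", []) if not breaches else ("stop/adjust", breaches)

decision, breached = rollout_decision(
    {"forecast_error": 0.08, "revenue_beta_drift": 0.10}
)
# forecast_error exceeds its limit, so this phase stops for review.
```

Making the stop condition executable, rather than a judgment call at review time, is what keeps accountability explicit as the rollout proceeds.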

Rationale

Rationale: Option B balances stability against growth exposure while preserving flexibility if market conditions move. It allows the team to test its assumptions about macro indicators, historical performance, and scenario shocks, and to protect against the main risk: over-reliance on past cycles that may not repeat. Phasing also improves organizational buy-in because progress is visible and accountability is explicit. The approach generates evidence that improves the next decision cycle.
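The scenario-shock test mentioned in the rationale can be sketched by pushing hypothetical GDP paths through an assumed cyclicality beta. Every number below (base revenue, beta, trend growth, scenario values) is an invented assumption for illustration:

```python
# Hedged sketch of a scenario shock: apply hypothetical GDP-growth scenarios
# to a revenue plan through an assumed cyclicality beta. All numbers invented.

BASE_REVENUE = 100.0   # current-year revenue, arbitrary units
ASSUMED_BETA = 2.0     # revenue growth moves ~2x GDP growth (assumption)
TREND_GROWTH = 0.02    # revenue growth when GDP grows at trend (assumption)
TREND_GDP = 0.02       # trend GDP growth (assumption)

SCENARIOS = {"baseline": 0.02, "mild_downturn": -0.01, "severe_downturn": -0.03}

def shocked_revenue(gdp_growth):
    """Revenue next year under a given GDP-growth scenario."""
    growth = TREND_GROWTH + ASSUMED_BETA * (gdp_growth - TREND_GDP)
    return BASE_REVENUE * (1 + growth)

results = {name: round(shocked_revenue(g), 1) for name, g in SCENARIOS.items()}
```

Comparing options against the same shock table is what lets the team see whether a plan survives the downside case, instead of arguing only over the baseline.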

Risks

  • Weak data quality can obscure changes in output gap sensitivity, revenue cyclicality, and lead indicators, making it hard to validate the decision.
  • Execution drag may delay learning and prolong the organization's exposure to the main risk, over-reliance on past cycles that may not repeat.

Next

Next: Confirm ownership, finalize the baseline for output gap sensitivity, revenue cyclicality, and lead indicators, and document the assumptions about macro indicators, historical performance, and scenario shocks in a shared log. Schedule the first review, define stop conditions, and communicate the plan to affected teams. Capture lessons learned so the framework improves with each cycle.
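The shared decision log called for above needs a fixed shape so entries stay comparable across cycles. One possible structure, with hypothetical field names and placeholder values to adapt to the team's tooling:

```python
# One way to structure a shared decision-log entry.
# Field names and sample values are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class DecisionLogEntry:
    decision: str
    owner: str
    assumptions: list        # macro indicators, historical performance, shocks
    stop_conditions: list    # thresholds that trigger stop/adjust
    review_date: str         # ISO date of the scheduled checkpoint
    outcome: str = "pending" # filled in at review to capture lessons learned

entry = DecisionLogEntry(
    decision="Option B: phased rollout",
    owner="planning-team",                        # placeholder owner
    assumptions=["revenue beta ~2.0 vs GDP growth"],
    stop_conditions=["forecast error > 5%"],
    review_date="2026-01-15",                     # placeholder date
)
```

Recording the outcome against the original assumptions at each review is what turns the log into the learning loop the framework describes.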