
BI Report

Design BI reports and analytics views — with metric definitions, data source mapping, filter logic, drill-down paths, and refresh schedules that enable self-service decision-making.

Tags: business-intelligence · reports · analytics · SQL · self-service

Works well with agents

BI Analyst Agent · Data Visualization Specialist Agent · Product Analyst Agent

Works well with skills

Dashboard Design · Metrics Framework
bi-report/
  • customer-retention-dashboard.md (4.7 KB)
  • SKILL.md (7.1 KB)

SKILL.md
# BI Report

## Before you start

Gather the following from the user before designing any report:

1. **What decision does this report support?** Name the specific business question. "How is revenue trending?" is a question. "Revenue dashboard" is not.
2. **Who is the audience?** Executives need summaries and trends. Analysts need drill-downs and raw data access. Operators need real-time alerts.
3. **What metrics matter?** List the 3-7 KPIs this report must track. Every metric needs a precise definition before you build anything.
4. **What data sources exist?** Tables, schemas, APIs, or third-party systems. Confirm access and freshness.
5. **How often is the data needed?** Real-time, hourly, daily, weekly — this drives refresh strategy and cost.
6. **What filters and breakdowns are required?** Date range, region, product line, customer segment — define the dimensions.

If the user says "build me a sales report," push back: "What specific questions should this report answer? Who will use it, and how often?"
## BI report design template

### 1. Report Purpose

Write 2-3 sentences covering:
- The business question this report answers
- The primary audience and how they will use it
- The cadence of use (daily standup, weekly review, monthly board meeting)
### 2. Metric Definitions

Define every metric precisely. Ambiguous metrics produce misleading reports.

| Metric | Definition | Formula | Unit | Grain |
|--------|------------|---------|------|-------|
| Monthly Recurring Revenue | Sum of active subscription values at month end | `SUM(subscription_amount) WHERE status = 'active' AND date = last_day_of_month` | USD | Monthly |
| Churn Rate | Percentage of customers who cancelled in the period | `cancelled_customers / start_of_period_customers * 100` | % | Monthly |
| Average Order Value | Mean revenue per completed order | `SUM(revenue) / COUNT(DISTINCT order_id) WHERE status = 'completed'` | USD | Daily |

For each metric, specify:
- **Includes/excludes**: Does revenue include refunds? Does churn count downgrades?
- **Null handling**: What happens when a dimension value is missing?
- **Historical comparability**: Has the definition changed? Document when and how.

### 3. Data Source Mapping

| Metric | Source Table(s) | Key Columns | Join Logic | Known Issues |
|--------|-----------------|-------------|------------|--------------|
| MRR | `billing.subscriptions` | `customer_id`, `amount`, `status`, `period_end` | None | Trial subscriptions have `amount = 0`; exclude them |
| Churn Rate | `billing.subscriptions`, `crm.customers` | `customer_id`, `cancelled_at` | Join on `customer_id` | Reactivated customers counted as new, not returning |

Document data freshness for each source — when was the last ETL run, and what is the typical lag?

### 4. Filters and Parameters

| Filter | Type | Default | Options | Behavior |
|--------|------|---------|---------|----------|
| Date Range | Date picker | Last 30 days | Any range, max 1 year | Applies to all metrics |
| Region | Multi-select | All | NA, EMEA, APAC, LATAM | Filters underlying data, not just display |
| Product Line | Single-select | All Products | Product A, B, C | Changes all metrics to selected product |

Specify filter interactions: Does selecting a region also filter the product dropdown to products available in that region?

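The defaults and "applies to all metrics" behavior above can be centralized so no chart silently ignores a filter. A minimal sketch (filter and column names are illustrative; a real implementation should use bound query parameters, not string interpolation):

```python
from datetime import date, timedelta

# Defaults mirror the filter table: last 30 days, all regions, all products.
DEFAULTS = {
    "date_range": (date.today() - timedelta(days=30), date.today()),
    "region": None,        # None means "All" -> no predicate emitted
    "product_line": None,
}

def build_predicates(filters: dict) -> list[str]:
    """Turn filter selections into SQL predicates shared by every metric
    query on the report, so all charts answer the same filtered question."""
    f = {**DEFAULTS, **filters}
    start, end = f["date_range"]
    preds = [f"event_date BETWEEN '{start}' AND '{end}'"]
    if f["region"]:
        regions = ", ".join(f"'{r}'" for r in f["region"])
        preds.append(f"region IN ({regions})")
    if f["product_line"]:
        preds.append(f"product_line = '{f['product_line']}'")
    return preds
```

Routing every query through one predicate builder is also where cross-filter interactions (region narrowing the product dropdown) get a single place to live.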
### 5. Layout and Visualizations

Describe each section of the report top-to-bottom:

1. **Summary bar** — KPI cards showing current value, period-over-period change (%), and trend arrow. Metrics: MRR, Churn Rate, New Customers, AOV.
2. **Trend chart** — Line chart of MRR over time with comparison period overlay. X-axis: date grain matching the filter. Y-axis: USD.
3. **Breakdown table** — Tabular view of all metrics broken down by the selected dimension (region, product, segment). Sortable on every column. Include sparklines for trend.
4. **Detail drill-down** — Clicking any row in the breakdown table opens a filtered view showing the individual records that compose that aggregate.

For each visualization, specify: chart type, axes, legend, color encoding, and what interaction (click, hover, filter) is supported.

### 6. Drill-Down Paths

Define how users navigate from summary to detail:

```
MRR Summary Card → MRR by Region (bar chart) → Region Detail Table → Individual Subscription Record
```

At each level, specify what filters carry forward and what new dimensions become available.

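Filter carry-forward is the core mechanic of a drill-down path: each level inherits everything above it and adds one dimension. A minimal sketch of that rule (the filter keys are illustrative):

```python
def drill_down(current_filters: dict, new_dimension: dict) -> dict:
    """Each drill level inherits the parent's filters and adds its own
    dimension, so the detail view always explains the aggregate above it."""
    return {**current_filters, **new_dimension}

# Summary card -> region bar chart -> region detail table:
level_1 = {"date_range": "last_30_days"}
level_2 = drill_down(level_1, {"region": "EMEA"})
level_3 = drill_down(level_2, {"customer_segment": "enterprise"})
```

If a level drops an inherited filter, the detail records no longer sum to the aggregate the user clicked, which is exactly the trust-destroying mismatch this rule prevents.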
### 7. Refresh Schedule

| Data Source | Refresh Frequency | Method | SLA | Failure Handling |
|-------------|-------------------|--------|-----|------------------|
| `billing.subscriptions` | Every 6 hours | Incremental ETL | Data available by :30 past each run hour | Retry 3x, alert #data-ops after final failure |
| `crm.customers` | Daily at 02:00 UTC | Full sync | Data available by 03:00 UTC | Stale-data badge shown on report |

Specify what the user sees when data is stale — a timestamp, a warning banner, or a fallback to the last successful refresh.

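The staleness indicator can be derived directly from the SLA column. A sketch, assuming per-source SLA thresholds mirroring the table above (thresholds and badge wording are illustrative):

```python
from datetime import datetime, timedelta, timezone

# Max acceptable age per source, derived from the refresh table above.
SLA = {
    "billing.subscriptions": timedelta(hours=6, minutes=30),
    "crm.customers": timedelta(hours=25),
}

def staleness_badge(source: str, last_refresh: datetime, now: datetime) -> str:
    """What the report header shows: a plain timestamp when data is within
    SLA, an explicit warning when the SLA is blown."""
    if now - last_refresh > SLA[source]:
        return f"STALE: last refreshed {last_refresh:%Y-%m-%d %H:%M} UTC"
    return f"Data as of {last_refresh:%Y-%m-%d %H:%M} UTC"

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
fresh = staleness_badge("billing.subscriptions",
                        datetime(2024, 1, 1, 11, 0, tzinfo=timezone.utc), now)
stale = staleness_badge("billing.subscriptions",
                        datetime(2023, 12, 31, 0, 0, tzinfo=timezone.utc), now)
```

Showing the timestamp even when data is fresh costs nothing and trains users to check it before making a decision.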
## Quality checklist

Before delivering the report design, verify:

- [ ] Every metric has a precise written definition with includes/excludes
- [ ] Formulas are unambiguous — another analyst could reproduce the number from the formula alone
- [ ] Data sources are identified with table names, join logic, and known data quality issues
- [ ] Filters specify defaults, options, and cross-filter behavior
- [ ] Drill-down paths are defined from summary to the most granular level
- [ ] Refresh schedule includes SLA, failure handling, and staleness indicators
- [ ] The report answers the stated business question — not adjacent interesting questions
- [ ] Null and edge-case handling is documented (zero-division, missing dimensions, partial periods)

## Common mistakes to avoid

- **Undefined metrics.** "Revenue" means different things to finance, sales, and product. Write the SQL-level definition. If two stakeholders disagree on the definition, that is a conversation to have before building the report, not after.
- **Too many metrics on one report.** A report with 20 KPIs answers no question well. Limit to 3-7 primary metrics that directly answer the stated business question. Link to secondary reports for everything else.
- **No drill-down path.** A summary number without the ability to investigate why it changed is useless. Every aggregate needs a path to the underlying records.
- **Ignoring data freshness.** Users will make decisions based on stale data if you do not make freshness visible. Always show the last-refreshed timestamp on the report.
- **Filters that mislead.** A date filter that only filters one chart but not another on the same report causes users to draw wrong conclusions. Document exactly what each filter affects.
