Why don't the EMBAC Dashboard and EMBAC Program Survey reports align?
Understanding the Difference Between Insights Displayed in the EMBAC Dashboard and the Program Survey Benchmark Reports
The data presented in the EMBAC Dashboard may not always exactly match the static benchmarking reports based on the EMBAC Membership Program Survey.
Do the Dashboard and Static Reports Reflect Different Data?
Both the Dashboard and the EMBAC Program Survey reports are derived from the same set of data. However, the way they present and refresh those data differs, which can lead to perceived discrepancies.
- Data Aggregation & Granularity: The Dashboard is built to provide a high-level overview of specific industry perspectives, and its visualizations aim to give quick insight into overall trends and performance metrics. Conversely, the static reports offer more detailed views based on all EMBA programs completing the survey for the specific benchmark (e.g., peers, region, cost band). This difference in granularity can often result in differing figures between the two views.
- Data Filtering: The EMBAC Dashboard applies filters to focus on specific data subsets, whereas the EMBAC Program Survey static reports are generated without these filters, which can cause variations in the data presented (see the illustrative sketch below).
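To see why the same survey data can yield different numbers, consider a minimal sketch in Python (pandas). The dataset, column names, and filter below are invented for illustration only and are not the actual EMBAC survey fields or processing logic; the point is simply that one metric computed over all responses differs from the same metric computed over a filtered or more granular cut.

```python
# Hypothetical illustration only: programs, regions, and tuition values are
# made up and do not reflect real EMBAC survey data or its data pipeline.
import pandas as pd

# A small mock survey dataset: one row per responding EMBA program.
survey = pd.DataFrame(
    {
        "program": ["A", "B", "C", "D", "E"],
        "region": ["NA", "NA", "EU", "EU", "APAC"],
        "tuition": [95000, 120000, 80000, 105000, 60000],
    }
)

# High-level view: one figure summarizing all responses.
overall_avg = survey["tuition"].mean()

# Filtered / more granular view: the same metric restricted to one
# benchmark group (here, a regional subset).
na_subset = survey[survey["region"] == "NA"]
subset_avg = na_subset["tuition"].mean()

print(f"All respondents:  {overall_avg:,.0f}")  # 92,000
print(f"Regional subset:  {subset_avg:,.0f}")   # 107,500
```

Neither figure is wrong; each summarizes a different slice of the same responses, which is exactly the situation when a Dashboard visualization is compared against a static benchmark report built at a different level of aggregation or without the same filters applied.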
While it can initially be disconcerting to see discrepancies between the EMBAC Dashboard visualizations and static reports, understanding the reasons behind the differences can help you better interpret the EMBA benchmark data. Always consider the purpose and functionality of each tool when comparing data across platforms.
Having a sound knowledge of both your Dashboard and static reports will make data analysis more effective and meaningful. If a discrepancy cannot be explained by the factors above, consider submitting a support ticket to Percept Research for further assistance.