Power BI and Microsoft Business Central: Efficient data integration for modern analytics
Typical decision paths in consulting practice – architecture, interface, modelling
Data & Analytics, Microsoft Power BI

Integrated data analytics is indispensable for companies today. The combination of Microsoft Business Central and Power BI offers a flexible, powerful solution for analysing central ERP data. Business Central covers core processes such as finance, purchasing and warehousing and is deeply integrated into Microsoft 365. This article highlights how a direct connection can be established, what advantages it offers and where the challenges lie.
Microsoft Business Central is a modern, cloud-based ERP system and one of the world's leading solutions for small and medium-sized businesses. It covers a wide range of business processes such as financial management, purchasing, sales, warehousing and project management, and offers a high level of integration within the Microsoft product world. Alongside established large-scale solutions such as SAP, Business Central has positioned itself as a strong alternative, particularly thanks to its flexibility, user-friendliness and seamless integration into the Microsoft 365 environment.
What decision-makers need to know about this topic
The integration of Business Central and Power BI is not an IT detail – it is a strategic lever for data-driven decisions.
In this article, we compare common architectural approaches and interfaces. For IT managers, the focus is on scalability, performance and maintainability – for decision-makers, this means:
- Faster access to reliable data
- Greater transparency across business processes
- Less dependence on IT resources
A well-designed data architecture directly improves the quality of your decisions – whether in controlling, sales or management.
Architectural considerations: Direct connection or classic DWH strategy?
Before starting a project, a key architectural question arises: Should you opt for a classic data warehouse architecture with a central DWH based on Microsoft SQL Server or an SQL database in Azure or Fabric, or should you choose a "pure" Power BI solution with a direct connection to Business Central? Our customers often face similar questions: How large is the expected data volume? Do we need historisation and complex data preparation? How important are central governance and scalability for future requirements?
A key question is whether a classic data warehouse (DWH) should sit between Business Central and Power BI:
- Direct connection Business Central → Power BI
  - Advantages: Fast implementation, close proximity to the business department, lower infrastructure costs.
  - Disadvantages: Limited options for data historisation, limited scalability for very large data volumes.
- Business Central → DWH → Power BI
  - Advantages: Clean history, consolidated data storage, improved governance.
  - Disadvantages: Higher implementation and maintenance costs.
A direct connection is often perfectly adequate, especially for smaller departments or decentralised solutions. For more complex requirements, we recommend setting up a DWH, for example based on Microsoft Azure or Microsoft Fabric.
A central DWH enables structured, historical and consolidated data storage and offers clear advantages in terms of governance and data quality. However, setting it up involves higher initial costs and requires additional infrastructure and maintenance.
The direct connection of Power BI to Business Central, on the other hand, is a genuine alternative. With modern features such as an in-memory engine, aggregations and flexible modelling options, Power BI is able to meet even demanding data analysis and reporting requirements. Our customers appreciate the ability to achieve results quickly without having to wait for centralised systems. At the same time, the architecture remains lean and tailored directly to the business area. This achieves a high level of agility, which is a clear competitive advantage, especially in dynamic business environments.
Connection options: OData or REST API?
The choice of interface forms the basis for the rest of the architecture and should be made carefully at the start of the project, taking individual requirements into account.
Direct connection of Power BI to Business Central
There are various options available for connecting Power BI to Business Central:
- OData interfaces: OData can be used to set up user-defined web services that are flexible and easy to maintain.
  - Advantages: Clear configuration, fast deployment, low complexity.
  - Disadvantages: Limitations with large data volumes (e.g. paging); performance may be restricted.
- REST API: A modern, high-performance alternative, especially for large data volumes and more complex use cases.
  - Advantages: Better performance, modern authentication mechanisms, scalability for growing data volumes.
  - Disadvantages: Higher initial effort, more complex implementation.
Both approaches, OData and REST API, have proven themselves many times in practice. OData remains widely used thanks to its simple configuration and high flexibility, especially in department-specific projects. If an OData web service is published on the basis of a Business Central page, it can be configured easily in the Business Central user interface. This enables quick implementation without in-depth technical intervention; the sketch below shows how such a service can be consumed, including paging.
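To illustrate the paging behaviour mentioned above, here is a minimal Python sketch of how such a published web service could be consumed outside Power BI. The tenant, environment, company and service names are placeholders, the field names in $select are invented, and a valid OAuth bearer token is assumed; the URL pattern follows the usual Business Central online OData scheme.

```python
import requests

# Placeholder values – substitute your own tenant, environment,
# company and published web service name.
TENANT = "<tenant-id>"
ENVIRONMENT = "Production"
COMPANY = "CRONUS"
SERVICE = "PowerBI_SalesLines"  # hypothetical web service name

BASE_URL = (
    "https://api.businesscentral.dynamics.com/v2.0/"
    f"{TENANT}/{ENVIRONMENT}/ODataV4/Company('{COMPANY}')/{SERVICE}"
)

def fetch_all(token: str) -> list[dict]:
    """Read the full entity set, following OData's server-side paging."""
    headers = {"Authorization": f"Bearer {token}"}
    url = BASE_URL + "?$select=documentNo,customerNo,amount"  # invented fields
    rows: list[dict] = []
    while url:
        response = requests.get(url, headers=headers, timeout=60)
        response.raise_for_status()
        payload = response.json()
        rows.extend(payload["value"])
        # Large result sets are split into pages; the next page, if any,
        # is advertised in @odata.nextLink – the paging limitation noted above.
        url = payload.get("@odata.nextLink")
    return rows
```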
Nevertheless, one thing is clear: the REST API is not only more powerful, but also the strategically future-proof choice. Microsoft explicitly recommends REST APIs for new developments and is continuously investing in the expansion of REST-based endpoints. REST APIs feature a modern authentication model, an optimised data structure and greater scalability, especially for large data volumes or real-time requirements.
Even though configuring individual REST endpoints requires more effort than with OData, the investment pays off in the long term. Like OData web services, REST APIs can be tailored to user-defined requirements. So if you need custom interfaces beyond Microsoft's extensive standard APIs, you should seriously consider investing in REST. REST is already the technologically superior option and offers significantly better performance than OData in many scenarios – switching is worthwhile today.
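For comparison, a sketch of the REST route: the snippet below acquires a token via the MSAL client-credentials flow and reads an entity set from the standard API v2.0. The app registration values are placeholders that must come from your own Azure AD setup; the companies and salesInvoices entity sets belong to Microsoft's standard API, while the environment name is an assumption.

```python
import msal
import requests

# Placeholder app registration – replace with your Azure AD details.
TENANT_ID = "<aad-tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-secret>"
ENVIRONMENT = "Production"  # assumed environment name

# Client-credentials flow: the app authenticates as itself.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
result = app.acquire_token_for_client(
    scopes=["https://api.businesscentral.dynamics.com/.default"]
)
headers = {"Authorization": f"Bearer {result['access_token']}"}

base = (
    "https://api.businesscentral.dynamics.com/v2.0/"
    f"{TENANT_ID}/{ENVIRONMENT}/api/v2.0"
)

# Standard API: enumerate companies, then read an entity set from one.
companies = requests.get(f"{base}/companies", headers=headers,
                         timeout=60).json()["value"]
company_id = companies[0]["id"]

invoices = requests.get(
    f"{base}/companies({company_id})/salesInvoices?$top=1000",
    headers=headers, timeout=60,
).json()["value"]
print(f"Fetched {len(invoices)} sales invoices")
```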
Data modelling: layer architecture and star schema as success factors
After importing the data, we rely on our proven noventum layer architecture (a brief sketch follows the list):
- Acquisition layer: Raw data from Business Central.
- Integration layer: Data cleansing and harmonisation.
- Propagation layer: Shaping of the final report model in the output layer for consumption in Power BI.
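As a rough illustration of what the three layers do, here is a hypothetical pandas sketch on an invented sales extract; in a real project these steps live in Power Query or the DWH rather than in a script, and all column names are assumptions.

```python
import pandas as pd

def acquisition(path: str) -> pd.DataFrame:
    """Acquisition layer: land the Business Central extract unchanged."""
    return pd.read_csv(path)

def integration(raw: pd.DataFrame) -> pd.DataFrame:
    """Integration layer: cleanse and harmonise the raw data."""
    df = raw.drop_duplicates().copy()
    df["postingDate"] = pd.to_datetime(df["postingDate"])
    df["currencyCode"] = df["currencyCode"].fillna("EUR")  # example rule
    return df

def propagation(clean: pd.DataFrame) -> pd.DataFrame:
    """Propagation layer: shape the final report model for Power BI."""
    return clean[["customerNo", "postingDate", "amount"]]
```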
A central element here is the star schema, which offers the following advantages:
- Optimised performance thanks to the VertiPaq in-memory engine – a highly developed column-based database technology from Microsoft that is optimised for analytical workloads. VertiPaq compresses data highly efficiently and processes it in the main memory, enabling extremely fast query times. Column-oriented storage means that only the attributes relevant for analysis are loaded, which minimises storage requirements and latency and thus enables even very large data volumes to be analysed with high performance.
- Improved user-friendliness thanks to an intuitive navigation structure – the clear separation of fact and dimension tables makes it easier for end users to understand the data models.
- Clear logical structure – the separation of dimensions and facts creates the basis for scalable, maintainable and easily expandable data models that support both self-service BI and professional BI scenarios.
Best practices such as the use of surrogate keys, dimension-table cardinality matched to the use case, consistent naming conventions and the minimisation of relationships across large tables contribute significantly to performance. We recommend consistent, topic-oriented modelling: key figures can then be clearly structured, consistently calculated and analysed in a performant manner. A clean separation of fact and dimension logic and topic-related fact tables ensure high modelling quality and scalability; the sketch below shows what this separation can look like.
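To make the separation of facts and dimensions concrete, the following pandas sketch derives a customer dimension with an integer surrogate key from a flat extract and keeps only keys and measures in the fact table. The column names are invented for illustration.

```python
import pandas as pd

def build_star(sales: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Split a flat sales extract into one dimension and one fact table."""
    # Dimension: one row per customer, identified by a compact integer
    # surrogate key instead of the wider natural key.
    dim_customer = (
        sales[["customerNo", "customerName"]]
        .drop_duplicates()
        .reset_index(drop=True)
    )
    dim_customer.insert(0, "customerKey", dim_customer.index + 1)

    # Fact: foreign keys and measures only – descriptive text stays in
    # the dimension, so the fact table compresses well in VertiPaq.
    fact_sales = sales.merge(
        dim_customer, on=["customerNo", "customerName"]
    )[["customerKey", "postingDate", "amount"]]
    return dim_customer, fact_sales
```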
Best practices for performance and model quality
To ensure high performance even with large amounts of data, we rely on the following mechanisms (a small illustration follows the list):
- Use of the VertiPaq in-memory engine described above: column-wise compression and memory-optimised processing ensure short response times and high user satisfaction, even for complex analyses.
- Removal of unnecessary columns and rows as early as the Power Query stage: this reduces the model size and shortens loading times.
- Minimisation of the number of relationships and use of single-direction relationships wherever possible.
- Use of dataflows or Microsoft Fabric solutions for scaling.
- Reduction of complexity in measures: Computationally intensive measures with many nested calculations have a direct impact on query speed. Optimised DAX formulas help here.
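The effect of early pruning can be shown in a few lines of pandas; in the model itself, the same step belongs in Power Query. Column names and the date example are illustrative assumptions.

```python
import pandas as pd

KEEP = ["customerKey", "postingDate", "amount"]

def slim(df: pd.DataFrame) -> pd.DataFrame:
    """Keep only the columns the model needs, as early as possible."""
    out = df[KEEP].copy()
    # A pure date compresses far better in a columnar store than a full
    # timestamp, because far fewer distinct values remain.
    out["postingDate"] = pd.to_datetime(out["postingDate"]).dt.normalize()
    return out

# Rough before/after size check in memory:
# print(df.memory_usage(deep=True).sum(), slim(df).memory_usage(deep=True).sum())
```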
These measures enable us to process large data models efficiently and minimise reporting latency.
Reporting standards: Intuitive, scalable analyses
Based on noventum information design standards, we create reports that:
- Are grid-based and clearly structured.
- Enable key information to be quickly identified.
- Support different levels of analysis – from management summaries down to document-level detail.
These standards guarantee a high level of user-friendliness and support quick decision-making. They form the basis for effective data-driven management, where decisions rest on sound data analysis rather than gut feeling.
Conclusion: Efficient reporting solutions with Business Central and Power BI
Power BI and Business Central enable powerful reporting without complex intermediate layers. REST APIs are the modern, high-performance standard for future-proof integration, offering clear advantages in scalability, maintainability and speed. With a proven modelling architecture, we support our customers on their way to data-driven decisions.

noventum consulting GmbH
Münsterstraße 111
48155 Münster