When Atlantic LNG came to us, the requirement was clear: executive and departmental dashboards pulling together operational and financial data from multiple systems. The challenge was equally clear: IT security policy did not permit direct connectivity from any BI tool to any production database.
In our experience, this is the point where most BI engagements either stall or produce a compromise nobody is satisfied with — a limited dashboard that only shows what's accessible, or a drawn-out negotiation with IT that delays the project by months.
We did neither. We designed an architecture that delivered everything the business needed without touching a single production database. Here's how we thought about it and what we built.
The constraint that looked like a blocker
The standard BI implementation approach is straightforward: connect Power BI (or Tableau, or Qlik) directly to the databases, build a data model, build dashboards. When direct connectivity isn't available, that path is closed.
What most teams miss is that the goal was never "connect Power BI to a database." The goal was "give leadership a reliable, unified view of the business." The database connection is just one way to achieve that — and when it's unavailable, the question becomes: what other ways exist to get clean, current data from these systems into a BI-readable form?
Each of these constraints could have stalled the project on its own. Together they forced us to think about the problem differently, and in retrospect that produced a better architecture than a straight database connection would have.
The reframe that unlocked the solution
The insight was simple once stated: the BI tool doesn't need to know where the data came from — it only needs the data to be clean, current, and in a format it can read.
That reframe separated the ingestion problem from the visualisation problem. Power BI doesn't care whether its data model was populated from a direct SQL connection or from a staged SharePoint file. If the data arrives correctly validated and structured, the dashboards work identically.
So the question became: how do we get clean, current data from each source system into a structured location that Power BI can read — without any of those systems needing to grant direct database access?
The answer was two ingestion streams, each designed for a different type of data source.
Two streams. One data model.
- Excel staging exports: data owners export structured files from the source systems on a schedule into a controlled SharePoint staging location that Power BI reads.
- PowerApps capture: low-code forms digitise data that has no existing digital source, writing it directly to Dataverse.
Both streams feed a single Power BI data model.
Why Excel staging is better than it sounds
When we describe this approach to clients, the initial reaction is sometimes scepticism. Excel exports sound like a workaround — less rigorous than a "proper" database connection. In practice, structured Excel staging through a controlled SharePoint location has several genuine advantages over direct database connectivity.
- Analytical load separated from transactional load. The source databases are never touched by analytical queries. They run at full transactional performance regardless of how many dashboards are open or how many users are refreshing simultaneously.
- Data quality validation at ingestion. Every file is validated before it enters the data model. A direct database connection takes whatever is in the database — a staging layer can reject data that doesn't meet format and quality standards before it reaches any dashboard.
- Historical data preserved automatically. Append-only staging means every load adds to the history. Even if the source system only keeps 90 days of data, the staging layer has everything since the pipeline started.
- Export process creates awareness. When data owners export on a schedule, they see the data they're responsible for. In several cases, this surface-level visibility caught data quality issues in the source system that had been invisible before.
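The first two properties above — validation at the gate and append-only history — can be sketched in a few lines. This is a minimal, hypothetical Python illustration, assuming rows have already been parsed out of an exported Excel file; the column names and schema are illustrative, not Atlantic LNG's actual export layout.

```python
from datetime import date

# Hypothetical staging schema; column names are illustrative only.
REQUIRED_COLUMNS = {"record_date": date, "unit": str, "volume": float}

def validate_rows(rows):
    """Check every row of an export against the schema; return all problems found."""
    errors = []
    for i, row in enumerate(rows):
        for col, expected in REQUIRED_COLUMNS.items():
            if col not in row:
                errors.append(f"row {i}: missing column '{col}'")
            elif not isinstance(row[col], expected):
                errors.append(f"row {i}: '{col}' is not {expected.__name__}")
    return errors

class AppendOnlyStaging:
    """Append-only staging layer: every accepted load adds to history and
    nothing is overwritten, so history outlives source-system retention limits."""
    def __init__(self):
        self._history = []

    def load(self, rows):
        errors = validate_rows(rows)
        if errors:
            # The whole export is rejected before it can reach any dashboard.
            raise ValueError("; ".join(errors))
        self._history.extend(rows)

    @property
    def history(self):
        return list(self._history)

staging = AppendOnlyStaging()
staging.load([{"record_date": date(2024, 1, 31), "unit": "Train 1", "volume": 120.5}])
try:
    # Wrong type for record_date: the load is rejected at the gate.
    staging.load([{"record_date": "31/01/2024", "unit": "Train 1", "volume": 118.0}])
except ValueError:
    pass
print(len(staging.history))  # → 1: only the valid load was appended
```

A direct database connection has no equivalent checkpoint: whatever is in the table flows straight into the model. Here, a malformed export fails loudly at ingestion instead of silently corrupting a dashboard.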
The PowerApps decision — digitising data that didn't exist yet
The second stream solved a different problem. Some of the data Atlantic LNG needed in their dashboards didn't exist in any digital system. It lived in the heads of operational staff, in paper forms, in email exchanges, in spreadsheets that sat on individual laptops and never fed into anything central.
The conventional response to this is to say "we can't dashboard data that doesn't exist" and move on. We built the mechanism to make it exist.
Microsoft PowerApps is a low-code form builder that writes directly to Microsoft Dataverse — a structured data store that Power BI connects to natively. We built PowerApps forms for each data category that had no digital source: structured, validated, required fields enforced, submission confirmation, timestamped records. Operational staff completed them on desktop or mobile. The data went immediately into Dataverse. Power BI read from Dataverse.
Three things made this work well in practice:
- The forms were genuinely simple. We designed each form around exactly the data the dashboard needed — not a comprehensive data capture exercise, just the specific fields required for the specific metrics leadership wanted to see. Short forms get completed. Comprehensive forms get abandoned.
- Mobile support mattered. Operational staff in an oil and gas environment are often not at a desk. A mobile-accessible form they could complete from their phone or tablet removed the friction that would have killed adoption.
- Validation was built in. Required fields, data type validation, dropdown menus for categorical choices — the forms prevented the kind of data entry inconsistency that makes manual data useless for analytics.
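The kind of rules those forms enforced can be expressed as a small declarative spec. This is a hypothetical sketch in Python rather than PowerApps itself; the field names, choices, and rules are invented for illustration, not taken from the actual Atlantic LNG forms.

```python
# Form rules as data: required fields, typed values, and dropdown choices.
# All names here are illustrative, not the real form definitions.
FORM_SPEC = {
    "shift": {"required": True, "choices": ["Day", "Night"]},
    "operator_id": {"required": True, "type": str},
    "reading": {"required": True, "type": float},
    "notes": {"required": False, "type": str},
}

def validate_submission(submission):
    """Return a list of problems; an empty list means the form can be submitted."""
    problems = []
    for field, spec in FORM_SPEC.items():
        value = submission.get(field)
        if value is None:
            if spec.get("required"):
                problems.append(f"'{field}' is required")
            continue
        if "choices" in spec and value not in spec["choices"]:
            problems.append(f"'{field}' must be one of {spec['choices']}")
        if "type" in spec and not isinstance(value, spec["type"]):
            problems.append(f"'{field}' must be a {spec['type'].__name__}")
    return problems

print(validate_submission({"shift": "Day", "operator_id": "OP-17", "reading": 41.2}))  # → []
# Free-text entry would let all three of these through; the form rules catch them.
print(validate_submission({"shift": "day", "reading": "41.2"}))
```

The point of the dropdown rule in particular: "day", "Day shift", and "D" all mean the same thing to a human and three different things to an aggregation query. Constraining the choice at capture time is what makes manually entered data usable for analytics.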
What we learned — and why this is now a blueprint
- The constraint shaped a better solution
- The BI tool's job is visualisation — not ingestion
- Non-digital data can be digitised at the point of collection
- IT security constraints often produce better architecture
Where this approach applies beyond Atlantic LNG
We've described this as the "Atlantic LNG approach" because that's where we refined it. But the same pattern applies well beyond the oil and gas industry.
Healthcare — clinical system access restrictions
Patient data systems often prohibit direct BI connectivity. Structured export pipelines with proper data governance deliver the analytics without the compliance risk.
Government and regulated industries
IT security policies in public sector organisations frequently restrict direct database access. The staging approach satisfies security requirements while delivering full analytical capability.
Manufacturing — legacy OT systems
Operational technology systems on factory floors often have no API and no safe way to connect directly. Scheduled exports from SCADA or MES systems to a staging layer are the standard approach.
Businesses using third-party SaaS
When your data lives in a SaaS platform that doesn't expose direct database access (only an API or export), a staging architecture built on the export mechanism is often more reliable than API polling.
Any environment where "no direct access" seems like a blocker
If a previous BI attempt stalled because of connectivity constraints, the staging approach is almost always the solution that was missing.
The honest summary
The assumption that BI requires direct database access is wrong. It requires clean, current, structured data in a place the BI tool can read. How that data gets there is an engineering problem with multiple solutions — direct connection being only one of them.
The Atlantic LNG project taught us to start every constrained BI engagement by asking: "What data exchange mechanisms do these systems support?" rather than "Can we connect directly?" The answer to the second question is sometimes no. The answer to the first question almost always reveals a path forward.
If your BI project has stalled because direct database access isn't available — or if IT security policy is blocking the conventional approach — we've built the alternative. It works. And in several ways, it's better.