At Digital Built World 2026, Cupix hosted a roundtable that went straight to one of the industry's most persistent problems: as-built BIM is widely delivered, but rarely relied upon once a building or infrastructure asset goes into operation. The conversation that followed was candid and wide-ranging, drawing on practitioners from construction, surveying, digital engineering and asset operations. By the end, it was clear the gap between what the industry produces at handover and what operations teams actually need is wider than most are willing to admit publicly.
Delivered but not used
The session opened on familiar ground. As-built BIM has become a standard contractual requirement, but the question of whether it delivers real operational value is a different matter entirely. One participant described a pattern they see repeatedly across both civil and vertical construction: models are assembled to meet client requirements at handover, then set aside. They are not used operationally during the build, and they are rarely consulted once an asset is handed over.
A Cupix LinkedIn poll conducted in the weeks before the event reinforced this. 75% of respondents said their as-built BIM is used rarely or not at all once an asset is operational. Only 8% said it is used frequently.
The reasons are not hard to find. In commercial construction, fit-out begins almost immediately after practical completion, making the as-built model redundant before operations even start. In civil infrastructure, reality capture and drone data are gathered for construction execution, not as a maintained record. As one participant put it, a model is a fictional representation of your asset unless it is actively connected to reality.

The accuracy problem starts at handover
Before any operational degradation even enters the picture, the roundtable tackled the accuracy of as-built information at the point of handover itself. A Cupix poll on this question found that 62% of respondents said installed services only sometimes match their documented locations in the as-built model at project completion. Just 15% said they are almost always accurate.
No one in the room was particularly surprised. Participants noted there is no clear, consistently enforced definition of what an as-built is actually required to achieve in terms of accuracy. Clients accept handover deliverables, often without validation, and then find out years later during a refurbishment that the documentation has little to do with what was actually built. At that point, the only option is to commission a fresh scan and start the record again from scratch.
This is not purely a technical shortcoming. It reflects a structural accountability gap in the handover process, and historically there have not been affordable, practical tools to fill it.
What operations actually needs
A further poll asked what matters most for operational digital asset data. Two thirds of respondents (67%) chose staying current over time, while 33% chose accuracy at handover.
This landed well with the operations voice in the room. Their world is built around maintenance management systems and the need to demonstrate ongoing inspections, testing and compliance. What they described needing was not a perfect model on day one. It was the ability to link spatial information to maintenance records, tag assets with a unique identifier, and access reliable, current documentation quickly when planning plant modifications or isolations. Getting that wrong is not just an operational inconvenience. As several participants noted, inaccurate or inaccessible records in a safety-critical environment create real exposure for the organisation.
One participant summed it up simply: the most powerful data is the metadata sitting around the BIM model, because that is what supports 50 years of operations. Keeping it current is the hard part.
Keeping data current at scale is the core challenge
A fourth poll asked what the hardest part of maintaining accurate digital asset information at scale actually is. Half of respondents (50%) said keeping data up to date. Cost and specialist skills came in at 21%, with usability for operations teams and integrating data sources both at 14%.
That result drove some of the most direct conversation of the session. Participants from large infrastructure programmes talked frankly about the difficulty of continuous capture, not just the cost but the mindset shift it requires. Civil construction in particular has less cultural affinity for BIM-led processes than vertical builds, and the prospect of regular scanning across a large portfolio can be hard to sell internally, let alone to clients.
But the cost of not capturing is also real, and it compounds. One participant described going back to a project 20 years later and having to track down the original superintendent because no usable record existed. Another raised a point that several in the room found confronting: as scanning becomes cheaper and more accessible, asset owners will increasingly be in a position to validate contractor as-builts themselves. The industry has assumed clients would not bother. That assumption may not hold much longer.

Where does the 3D model actually fit?
One of the more contested questions in the room was whether the 3D BIM model remains relevant at all once construction is complete and reality capture data is available. Some participants argued that a current point cloud or 360-degree photographic record is simply more trustworthy for operations than a model that drifts from reality the moment work begins.
The pushback was equally clear. 3D models carry metadata that reality capture on its own cannot provide: manufacturer details, maintenance schedules, component identifiers. That structured data is what allows operations teams to actually do their jobs. The view that settled in the room was that models stay important for design, coordination and future retrofitting, but they cannot and should not be expected to do the operational job alone. Reality, captured regularly and connected to asset data, is what makes the model useful rather than decorative.
The conversation also surfaced a language problem. Terms like digital twin and as-built mean different things depending on who is using them and what context they are working in. Several participants argued for moving away from model-centric language altogether and starting with the use case: what data is needed, by whom, for what purpose, and how critical is it if it is wrong?
Integration is where value is either realised or lost
The operations perspective kept returning to one practical requirement: the ability to link spatial information into enterprise asset management and maintenance systems via a common tag or identifier. Without that link, spatial data sits in a silo regardless of how accurate or current it is. Operational staff work inside their maintenance systems. If the spatial record cannot connect to that world, it will not get used.
The discussion also picked up on where AI agents fit into this picture. Several participants noted their organisations are already moving toward AI-assisted operational workflows, and the next step is giving those agents the ability to query spatial data: to answer questions about where something is, what condition it is in, and what the maintenance history looks like. That kind of capability requires a spatial record that is being updated consistently over time, not a model frozen at handover.
One participant put the precondition plainly: if you are not capturing today, you cannot interrogate the data tomorrow.
How Cupix can help
The problems the roundtable worked through are exactly what Cupix (www.cupix.com) is built to address.
The core offer is straightforward: Cupix makes continuous spatial capture practical and affordable, using 360-degree cameras rather than specialist laser scanning equipment. Project teams can walk a site in 20 minutes and have a complete, navigable visual record available to every team member. What they are interrogating is reality, not design intent.
During construction, that record can be compared directly against the coordinated model, making deviations visible in near real time. Assets can be tagged and documented progressively, which means the QA and handover process is built up continuously rather than assembled under pressure at the end of the job.
For operations, Cupix provides the spatial layer the roundtable kept coming back to: a continuously updated record of the asset as it actually exists, anchored in project coordinates, and connected to asset management systems, maintenance platforms and GIS tools. Operators get a living record rather than a static model that stopped reflecting reality on the day it was handed over.
The AI capabilities Cupix is developing extend this further, with automated asset identification, issue detection and the kind of agent-driven queries that participants in the roundtable described as the near-term direction of operational workflows. And because Cupix is a lifecycle platform, data captured by the contractor during construction can be transferred directly to the owner at handover, so the record continues rather than being rebuilt from scratch.
The roundtable made one thing clear: the industry knows what it needs. A spatial record that is current, connected and usable across the whole life of an asset. Cupix is built to deliver that.
Learn more at www.cupix.com
