There are many new and exciting applications for data analysis in the construction industry. The vast majority employ new data aggregation tools to quantify and analyze concrete, numerical aspects of the built environment, such as square footage totals or asset counts. But data tools also have important uses for more abstract attributes of the built environment, such as tracking, analyzing and evaluating risk.
This case study details a facilities assessment for a government client who needed to understand the range of risk types and impacts that could occur in their buildings. The analysis was triggered when a single pipe failure on the 11th floor of one facility unleashed a cascade of unforeseen consequences as water made its way to the ground floor, nearly paralyzing the facility for several days. The consequences rippled through the facility long after the event.
As a result, the client, a proactive owner-occupier, commissioned a comprehensive assessment of their three largest facilities to enumerate and understand similar vulnerabilities in each facility and avoid comparable exposures in the future.
Evaluating the Facilities
As a first step, the team visited the three facilities, totaling over two million SF, and surveyed each system's equipment and arrangement. Because each facility was large and had been built out over several decades, the equipment and systems encountered ranged from new to those at the end of their useful life. Each item was captured with a brief narrative and a mitigation strategy.
Once the list of infrastructure, equipment, layout and configurations of each facility was complete, the team assigned each identified issue a category, such as site work, building envelope, electrical equipment, mechanical systems and even security protocols. They further classified each item according to two decision-oriented factors: 1) the likelihood that a failure would occur, and 2) the impact of that failure if it did. They then assigned a rough dollar amount to address each work item.
The data had several data categories, or “dimensions.”
When completed, the final spreadsheet was a lengthy inventory of future work items arranged by category, cost, likelihood and severity. In data jargon, each of these traits (category, cost, severity, likelihood, etc.) would be called a "dimension," that is, an aspect of the item that helps to understand the problem. Each dimension represents a way to slice the data. Thus, with the right tool, an owner could start to drill into the hundreds of items, even with combinations of dimensions.
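To make the idea of "slicing by dimension" concrete, here is a minimal sketch in plain Python. The field names and dollar values are illustrative assumptions, not the team's actual schema, and real inventories of this kind typically live in a spreadsheet or BI tool rather than code:

```python
from collections import Counter

# Hypothetical work items; every key is a "dimension" the data can be sliced by.
work_items = [
    {"facility": "A", "category": "mechanical", "cost": 120_000,
     "likelihood": "high", "severity": "critical"},
    {"facility": "A", "category": "envelope", "cost": 45_000,
     "likelihood": "low", "severity": "moderate"},
    {"facility": "B", "category": "electrical", "cost": 80_000,
     "likelihood": "medium", "severity": "critical"},
]

# Slice by one dimension: how many items fall in each category?
by_category = Counter(item["category"] for item in work_items)

# Combine dimensions: total cost of the critical-severity items.
critical_cost = sum(i["cost"] for i in work_items if i["severity"] == "critical")
```

Any combination of dimensions can be filtered or totaled the same way, which is exactly the drilldown a data tool exposes interactively.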
Leveraging Data Tools
With over 1,000 items cataloged and several dimensions to evaluate simultaneously, the next task was how best to turn the data into information, that is, actionable guidance to help address the facility needs. In its raw form, the initial spreadsheet report was rich with information but cumbersome for the facility manager to navigate in everyday use.
In addition, with so many measures, the team needed the capability to drill down into the findings to be able to pose various questions:
• Which items were the most critical points of failure for service continuity?
• Were there certain categories of vulnerabilities that occurred frequently in each facility?
• What was the total capital cost by category, and how did it break down between critical and less critical items?
These were all valid questions that would take quite a while to deduce from a straight spreadsheet format. For the next step, the team imported the raw data into a data analysis application.
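The team used a dedicated data analysis application for this, but the aggregations behind those three questions can be sketched in plain Python. The records and field names below are hypothetical, and the real inventory had over 1,000 items:

```python
from collections import defaultdict

# Illustrative work items (the actual dataset was far larger).
items = [
    {"facility": "HQ", "category": "mechanical", "cost": 250_000, "severity": "critical"},
    {"facility": "HQ", "category": "envelope", "cost": 60_000, "severity": "minor"},
    {"facility": "Lab", "category": "mechanical", "cost": 90_000, "severity": "critical"},
]

# Q1: which items are critical points of failure?
critical = [i for i in items if i["severity"] == "critical"]

# Q2: how often does each category occur in each facility?
freq = defaultdict(int)
for i in items:
    freq[(i["facility"], i["category"])] += 1

# Q3: total capital cost, grouped by severity.
cost_by_severity = defaultdict(int)
for i in items:
    cost_by_severity[i["severity"]] += i["cost"]
```

Each of these is a one-line pivot or filter in a BI tool; the point is that the questions reduce to simple group-and-sum operations once the data is dimensioned.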
A collection of charts on a single page; each chart is interactive, so that clicking on it filters the surrounding charts.
By using the advanced pivoting and aggregation features of data analysis tools, the engineering team was able to scrutinize the data and display it graphically in a way the facility manager could more readily comprehend. A diverse collection of tables and column charts, organized and colored by severity or category, told a far more vivid story of the work items for each facility. For example, it was easier to see which items needed to be addressed immediately and which could be deferred over a longer timeframe.
3D charts with a color legend can display four dimensions at once.
Using the data tools, the team was even able to create three-dimensional chart formats that assessed each item by campus, capital cost, severity and likelihood of occurrence, all in a single comprehensive view. Hovering over an item displayed additional information about that line item. Thus, the new data tools allowed the reader to comprehend major trends and high points in a highly integrated way.
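The article does not name the charting tool, but the essence of a four-dimension view is a mapping of each item onto three axes plus a color. A minimal sketch of that mapping, with assumed ordinal scales and field names:

```python
# Map four dimensions of one work item onto a 3D scatter point:
# x = capital cost, y = likelihood, z = severity, color = campus.
# The scales and color choices below are illustrative assumptions.
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
SEVERITY = {"minor": 1, "moderate": 2, "critical": 3}
CAMPUS_COLOR = {"North": "blue", "South": "orange"}

def to_point(item):
    """Return the (x, y, z, color) tuple a 3D chart would plot for one item."""
    return (
        item["cost"],
        LIKELIHOOD[item["likelihood"]],
        SEVERITY[item["severity"]],
        CAMPUS_COLOR[item["campus"]],
    )

point = to_point({"cost": 120_000, "likelihood": "high",
                  "severity": "critical", "campus": "North"})
```

Any 3D-capable charting library can consume points shaped this way; the legend for the color axis is what lifts the chart from three dimensions to four.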
Even after the site survey was complete, there were further opportunities to "extend," or enhance, the original raw survey spreadsheet with data points gathered subsequent to the survey, such as an anticipated contract date for when each item might need to be performed. Extending the data in this way produced a date chart to gauge future expenditures for each year, as well as an average yearly spend rate for the facilities. The team could see, for example, that in about 15 years a large sum of funds would be needed to avoid disruptions to services. This chart could be further analyzed by severity or impact.
An anticipated contract date was added to each work item to forecast capital expenditures over the next 25 years.
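The yearly forecast described above amounts to grouping item costs by the newly added contract-year field. A sketch in plain Python, with hypothetical items and dates:

```python
from collections import defaultdict
from statistics import mean

# Work items extended with an anticipated contract year (illustrative values).
items = [
    {"cost": 200_000, "contract_year": 2025},
    {"cost": 150_000, "contract_year": 2025},
    {"cost": 500_000, "contract_year": 2033},
]

# Forecast: total expected spend per year.
spend_by_year = defaultdict(int)
for i in items:
    spend_by_year[i["contract_year"]] += i["cost"]

# Average yearly spend rate across the years with planned work.
avg_yearly_spend = mean(spend_by_year.values())
```

A large total in a distant year, like the 15-year spike the team found, shows up immediately as an outlier bar in the resulting date chart.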
Seeing the Forest AND the Tree
The risk analysis study illustrated how risk mitigation studies can be greatly enhanced with new data tools. The team likes to say that the power of data analytics is that the owner "can see the forest AND the tree." Where previous reports either summarized the findings (the forest) or itemized them in individual detail (the trees), these new tools display the whole forest of measures at once, and then, through drilldown and interactive graphics, enable custom aggregations as well as the isolation of individual trees. As such, data analysis tools represent a significant leap for facility managers and building professionals seeking to catalog, understand and strategize around facility conditions, and to plan future capital expenditures to address risks and vulnerabilities.
The uses of data in construction go far beyond mere floor areas and quantities; in this case, they extend to evaluating the risk posed to critical facilities. What is most valuable about these tools is how they make complex and abstract aspects of buildings more visible.
About the Author
John Tobin is the Albany, N.Y., Office Director for SMRT Architects and Engineers, headquartered in Portland, Maine. He has spent three decades spearheading new practice technologies, especially BIM and advanced delivery, and has served as an AE technology and operations leader. Most recently, Tobin has begun integrating new data analysis techniques with BIM processes.