Emerging Industry Trends in Computational Design
Computational design is irrevocably changing the building industry. Every aspect of the industry is being affected, and yet many people are neither prepared for nor aware of the scale of the change that is coming. Computational design represents a paradigm shift in the way we think and the way we work. The foundations of design and construction are shifting, and a new digitally powered world is emerging: one with comprehensive automation, enriched intelligent design processes and increasingly interconnected data that informs everything.
The extent of computational design’s impact is hard to conceptualize. It is something radical, and something we probably cannot fully comprehend at this time. In the webinar, “Emerging Industry Trends in Computational Design,” computational design strategist Anthony Zuefeldt refers to David Bowie’s famous interview with Jeremy Paxman on BBC Newsnight, in which Bowie astutely predicted how the emergence of the internet would reshape our world. It’s wise to recognize the parallels between the inception of the internet and the rise of computational design in the building industry and society at large.
While the consensus on the definition of computational design is evolving, Applied Software defines it as “an algorithmic problem-solving methodology that facilitates the holistic organizational configuration of people, processes and technology to define value propositions and to develop solutions made possible by the capabilities of digital technologies.” It is through the application of this problem-solving methodology that the building industry is beginning to solve some of its largest, most pressing challenges and transform existing processes.
There’s a multitude of trends occurring in computational design. As a reflection of the tech industry, the pace of change and innovation in this corner of the building industry is extraordinarily high. This webinar explores a few emerging trends that Anthony Zuefeldt believes to be important, but it certainly does not touch on all of them.
The following emerging industry trends were discussed:
- Moore’s Law
- Earlier design intelligence
- Increasing interoperability
- Surrogate modeling
For the entire presentation on trends in computational design, you can watch Anthony Zuefeldt’s Applied Software webinar on demand.
Moore’s Law
To understand why computational design is so transformational, it’s imperative to understand Moore’s Law. This prediction, laid out by Gordon Moore, co-founder of Intel, asserts that the number of transistors on a chip — and with it, effective processing power — will double roughly every two years. The prediction has largely held true for over 50 years and has truly been the engine behind the transformation of our society and economy into a digital world.
Underpinning all of this is the concept of exponential growth. For example, take an exponential growth sequence, doubling the number 10 ten times:
10, 20, 40, 80, 160, 320, 640, 1280, 2560, 5120
The grand total of all the values preceding the final doubling (5,110) is less than the final value itself (5,120). This is the exponential growth Moore’s Law represents, and the implication is staggering: each doubling adds more computing power than all previous doublings combined.
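The doubling sequence above can be checked in a few lines of code. This sketch simply reproduces the arithmetic from the text; the function name is our own:

```python
# Illustrate the doubling sequence from the text: each doubling step
# produces more growth than all previous steps combined.
def doubling_sequence(start, doublings):
    """Return [start, start*2, start*4, ...] with `doublings` doubling steps."""
    return [start * 2**i for i in range(doublings + 1)]

seq = doubling_sequence(10, 9)   # 10, 20, 40, ..., 5120
print(seq)

# The final value exceeds the sum of every value that came before it:
# 5120 > 10 + 20 + ... + 2560 = 5110
assert seq[-1] > sum(seq[:-1])
```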
There have been many proclamations over the years that Moore’s Law will eventually end, especially as we bump up against the limits of physics. And yet, according to Anthony Zuefeldt, the opposite is happening: Moore’s Law is accelerating. In the last several years, processing power has been doubling annually rather than every two years.
As processing power grows, so does our capacity to perform computationally intensive tasks. In computational design, this explosion of processing power has unlocked increasingly sophisticated computing methods, adding ever more powerful tools to the building industry’s arsenal.
Process innovation paired with Moore’s Law is fueling the current trends in computational design.
Earlier Design Intelligence
The first significant trend that Anthony Zuefeldt identified in the webinar is the expansion of sophisticated design intelligence into the earliest stages of the design process. Designers are leveraging tools that give them comprehensive and granular feedback in scoping exercises for projects. These tools allow designers to make smarter, better, faster design decisions at the beginning of a project, when those decisions are most impactful — a level of insight that a human alone cannot match.
The team at Applied Software is deeply involved in this space and encourages a continual dialog around innovation and computational design. If you want to join in, reach out to the experts at Applied Software to participate in this conversation.
The webinar covers multiple examples of applications that support this expansion of design intelligence, including the site feasibility tool TestFit and Gensler’s Blox design platform. Each of these leverages groundbreaking functionality that facilitates automation and expresses data in easily consumable formats that can be linked to BIM environments like Revit.
Increasing Interoperability
The second trend the webinar explores is the move toward greater interoperability. Among many common data platforms and tools, a patchwork of integrations is emerging that allows data to be shared seamlessly. Traditionally, tools like Revit have not communicated with other modeling environments.
Data is the key to communication among digital technologies, and the means of exchanging that data are rapidly improving. Rhino.Inside.Revit stands out as a recent development, connecting the 3D modeling application Rhino3D with Revit in a seamless, bi-directional workflow. Each new connection grows the network transitively: an application gains access not only to its new partner but to every application that partner already works with. Rhino3D connects to 50 applications; with Rhino.Inside.Revit, Revit connects to those same 50 applications, and vice versa.
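The network effect described above can be sketched as a simple reachability calculation. The application names and counts below are illustrative placeholders, not a real integration list:

```python
# Hedged sketch: a bridge between two hubs merges their reachable sets.
# The partner lists are made-up placeholders, not real integrations.
rhino_partners = {f"rhino_app_{i}" for i in range(50)}   # apps Rhino3D talks to
revit_partners = {f"revit_app_{i}" for i in range(20)}   # apps Revit talks to

# Before the bridge, Revit can reach only its own 20 partners.
# After Rhino.Inside.Revit links the two, Revit reaches the union:
# its own partners, Rhino3D itself, and everything Rhino3D talks to.
reachable_from_revit = revit_partners | {"Rhino3D"} | rhino_partners
print(len(reachable_from_revit))  # 71 = 20 + 1 + 50
```

A single integration thus multiplies reach: connecting one application adds its entire partner network, which is why each new bridge grows the ecosystem faster than linearly.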
Surrogate Modeling
The last trend the webinar highlights is the advent of machine learning and surrogate modeling. “A surrogate model, or an approximation model, is an engineering method used when an outcome of interest cannot be directly measured easily, so a model of the outcome is used instead.”
– Shuai Guo, An introduction to Surrogate modeling, Part II
Surrogate modeling offers a completely different approach to design. Where traditional design methods make design moves and then simulate the results, surrogate modeling gives instant feedback on the impact of each design move. In the context of something like energy analysis, a designer can make different geometric explorations and, as they go, get highly accurate data on how their design moves are impacting energy performance.
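The idea can be sketched in miniature: run an expensive simulation a few times offline, then answer design queries instantly from a cheap approximation. Everything here is a toy stand-in — the “energy” formula, the function names, and the use of linear interpolation as the surrogate are all our own assumptions, not the methods from the webinar:

```python
import math

def expensive_energy_simulation(orientation_deg):
    """Stand-in for a slow analysis engine (a real one might take minutes)."""
    return 100 + 20 * math.sin(math.radians(orientation_deg))

# Offline: run the slow simulation on a coarse grid of design options.
samples = [(x, expensive_energy_simulation(x)) for x in range(0, 361, 30)]

def surrogate(orientation_deg):
    """Cheap surrogate: linear interpolation between precomputed samples."""
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        if x0 <= orientation_deg <= x1:
            t = (orientation_deg - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("orientation outside sampled range")

# Online: the designer sweeps options and gets instant, approximate feedback.
estimate = surrogate(45)
truth = expensive_energy_simulation(45)
print(round(estimate, 2), round(truth, 2))
```

Real surrogate models are usually trained machine-learning regressors rather than interpolators, but the trade is the same: pay the simulation cost once up front, then query the approximation at interactive speed.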
As computational design evolves and the rate of its evolution continues to accelerate, the building industry is changing. This change is not hypothetical or a speculation about the future… transformation is happening today. Wholesale industry disruption is likely not far away. In his closing statements, Anthony encouraged the audience to begin seriously considering the implications of computational design and how they might prepare their companies for this movement.
Learn about the onslaught of change and digital transformation causing construction markets to shift … download the free new eBook from Applied Software: “Ultimate Construction Tech Stack for 2021.”