The Design Process: Design, Validation and the Optimization Paradigm
The production of any man-made object or system is the result of numerous iterations that refine the design from initial concept to final form. These iterations essentially consist of defining one or more performance goals and validating the design against them. For years, designs were validated the hard way: by constructing physical prototypes, then testing and refining them continuously until an optimal design was produced.
The Romans built thousands of bridges, buildings, roads, and aqueducts that still stand today. However, we’ll never know how many crumbled as soon as they were completed, or even in the middle of construction. Eventually, the Romans learned what makes a building stand, supporting its own weight and the weight of whatever it is designed to contain, all without Computer-Aided Engineering (CAE).
Scientists modeled the Romans’ empirical knowledge by discovering the laws of mechanics and the behavior of materials, and putting them into mathematical form. These mathematical formulations opened the possibility of “virtual” tests, avoiding the construction of physical buildings until the design was proven safe. Virtual testing was first performed on paper, and later on computers through the programs we now call “solvers,” since they solve the systems of equations that represent the physical behavior.
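To make the idea concrete, here is a deliberately tiny sketch of what a structural solver does at its core: reduce the physics to a system of equations and solve it. The two-spring system and all the numbers below are invented for illustration, not taken from any particular solver:

```python
import numpy as np

# Hypothetical minimal "solver": two springs in series, fixed at one end.
# The physics reduces to the linear system K u = f; running the model
# means solving this system for the nodal displacements u.
k1, k2 = 1000.0, 2000.0          # spring stiffnesses [N/m] (assumed)
K = np.array([[k1 + k2, -k2],
              [-k2,       k2]])  # assembled stiffness matrix
f = np.array([0.0, 10.0])        # 10 N pulling on the free end

u = np.linalg.solve(K, f)        # nodal displacements [m]
print(u)                         # -> [0.01  0.015]
```

A real solver assembles millions of such equations from a mesh of the geometry, but the principle is the same: the virtual test is the numerical solution of the governing equations.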
Once a design can be validated virtually against performance goals on the computer, engineers can quickly run sophisticated tests on complicated systems without constructing a physical product. This opens up the possibility of “optimizing” the design: minimizing undesired properties such as cost, weight, and size while still meeting the required performance.
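A minimal sketch of such an optimization, with every number assumed for illustration: sizing an axially loaded steel rod so that its mass is minimized while the stress stays under an allowable limit. Here the constraint can be solved in closed form, so the “optimizer” is a single line:

```python
# Hypothetical sizing problem: minimize the mass of an axially loaded rod
# subject to a stress constraint. All values are assumed for illustration.
F = 10_000.0            # axial load [N]
sigma_allow = 250e6     # allowable stress [Pa]
rho, L = 7850.0, 2.0    # steel density [kg/m^3] and rod length [m]

# Constraint: stress = F / A <= sigma_allow, so the smallest feasible
# cross-section area is A = F / sigma_allow; mass is minimized there.
A_opt = F / sigma_allow          # optimal cross-section area [m^2]
mass = rho * A_opt * L           # minimum mass meeting the goal [kg]
print(A_opt, mass)
```

Industrial optimization replaces the one-line closed form with iterative algorithms over thousands of design variables, each iteration calling a solver, but the structure of the problem, an objective to minimize under performance constraints, is the same.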
Nowadays, design decisions are largely based on the results of virtual testing, allowing for significant optimization. In fact, the effect is so significant that we can truly say many complex systems are entirely designed by CAE.
The Design Data Model
However, for CAE to be an effective driver of the design, it must be both fast and accurate. Unfortunately, these two qualities do not get along very well, and their difficult coexistence stems from the challenge of properly defining the model for the engineering task at hand.
The design of a complex system is often a multi-layered modeling activity: the same component can appear as a transfer function in one engineering domain, a heat source in another, and a rigid body in a third.
For example, in one engineering domain a joint constrains the degrees of freedom of an object’s motion relative to another object. In another domain, it is a coupling between the same two objects that can bend or buckle under sufficient force.
The challenge is that a single change in the design requires an adjustment to all the derived “data models” in the different engineering domains of interest.
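One way to picture this, sketched below in Python with invented names and values: a single design-level joint definition from which each engineering domain derives its own view, so that one change to the design object propagates to every derived data model:

```python
from dataclasses import dataclass

# Hypothetical design data model: one joint definition feeds two
# domain-specific derived models. All names and values are invented.
@dataclass
class Joint:
    body_a: str
    body_b: str
    free_dofs: tuple       # degrees of freedom left free between the bodies
    stiffness: float       # elastic coupling stiffness (structural view)

def kinematic_view(j: Joint) -> dict:
    # Multibody domain: the joint only constrains relative motion.
    return {"bodies": (j.body_a, j.body_b), "free_dofs": j.free_dofs}

def structural_view(j: Joint) -> dict:
    # Structural domain: the same joint is a flexible coupling
    # that can bend or buckle under sufficient load.
    return {"bodies": (j.body_a, j.body_b), "k": j.stiffness}

# A hinge between two bodies, free only to rotate about z.
hinge = Joint("bracket", "arm", free_dofs=("rz",), stiffness=5e4)
```

Editing `hinge` once updates both derived views; without such a shared design object, the same change must be repeated by hand in every domain’s data model.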
The Solver Data Model
When it comes to the virtual simulation of a model in a specific engineering domain, the challenge grows: the design data model must be translated into the solver data model, the “language” the solver understands. Accuracy depends greatly on how the system is described to the solver, taking into account what the solver does and how it does it. The skills required to accurately describe a non-trivial system across different disciplines are still so specialized that they are rarely found in the same engineer, or even in the same team of engineers. Over the years, with the widespread use of CAE in product development processes across many industries, it has become common practice to structure CAE into separate teams by discipline, each dedicating its skills to pushing the envelope of solver capabilities in order to obtain the best accuracy.
Moreover, solvers add yet another level of complexity: it is not unusual to find different data models describing the same physical behavior in two different solvers, or even between different versions of the same solver.
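As a sketch of what this translation step looks like, the snippet below exports one neutral material definition to two solver-specific “languages.” The keyword syntax is invented for illustration and does not correspond to any real solver’s input format:

```python
# Hypothetical translation layer: the same neutral material definition
# is written out for two different solvers. The keywords below are
# made up for illustration, not any real input-deck format.
material = {"name": "steel", "E": 210e9, "nu": 0.3, "rho": 7850.0}

def to_solver_a(m: dict) -> str:
    # Card-style deck: keyword lines followed by property values.
    return f"*MATERIAL, NAME={m['name']}\n*ELASTIC\n{m['E']}, {m['nu']}"

def to_solver_b(m: dict) -> str:
    # Field-style deck: one record with named fields.
    return f"MAT {m['name']} E={m['E']:.3e} NU={m['nu']} RHO={m['rho']}"
```

The same physics, one elastic material, ends up encoded twice, which is exactly why keeping translations consistent across solvers and solver versions is a maintenance burden.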
The Engineering Data Model
Unfortunately, a single data model understandable by everyone, including designers, is not possible today and won’t be for many years. Engineering is still an incredibly complicated business, and many solutions are not yet at the level of a commodity, due to the difficulty of making them accurate, fast, or both. However, not all virtual simulation tools expose all the complexity the solvers are capable of. Take solidThinking Inspire as an example: it is well-defined and simple, so simulation models can be set up without prior advanced CAE skills.
Can this “simplification” be applied to all pre-processors? At Altair, we believe it can. We can’t ignore that the industry is investing significantly in CAE, embedding its tools into more “simplified” CAD environments. Therefore, we are introducing an “engineering” approach to CAE model definition, starting with a new workflow for the modular definition of the system model, one that hides the unnecessary complexity arising from different solvers and analyses while still supporting and feeding all the necessary use cases and multiple domains.
Offering a simplified, solver-independent modeling approach is a challenging task for a product like HyperWorks, which is known for its accurate modeling of complex systems across multiple solvers and disciplines. Even so, we believe the new HyperWorks workflow will benefit many users without disturbing the consolidated processes based on traditional or custom solutions. As engineering concepts evolve across domains, we are excited to make our sophisticated software less dependent on solver complexities, and we would like you to be a part of this growth in the CAE space with us.