Over the past decades, processor performance has increased enormously, leading to a huge jump in computational speed. For example, one of the first computers, the ENIAC, could perform about 385 multiplications per second, whereas modern processors can perform more than 11 trillion operations per second, roughly 28 billion times faster. Computation methodologies have improved as well, further reducing solution times. As a result, the way finite element codes are developed and FEA models are solved has changed dramatically over time, and it is worth looking at how they evolved.
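As a quick sanity check on those figures (using the roughly 385 multiplications per second attributed to the ENIAC and the 11 trillion operations per second quoted above, both taken at face value), the speedup works out to tens of billions, not millions. A minimal sketch:

```python
# Rough speedup estimate: an assumed modern throughput vs. the ENIAC.
# Both rates are the figures quoted in the text; treat the result as an
# order-of-magnitude comparison only, not a rigorous benchmark.
eniac_rate = 385          # multiplications per second
modern_rate = 11e12       # operations per second (assumed figure)

speedup = modern_rate / eniac_rate
print(f"speedup factor: {speedup:.2e}")   # ~2.86e10, i.e. about 28.6 billion
```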
20th Century FEA Models
The first FEA calculations were performed around the middle of the 20th century. Researchers developed the underlying formulations and put them to use, and each subsequent decade brought at least a tenfold increase in computational speed.
1950
It all started around 1950, when models had fewer than 100 elements and were executed on computers such as the IBM 704. Bridge structures were idealized with 1D beam or 2D planar elements. You can imagine how simple these models were and how many assumptions were needed to make them work. It is not clear how long such a simulation took, but it certainly was not quick.
Clough, R. W. (1960). “The Finite Element Method in Plane Stress Analysis.” Proceedings of the ASCE 2nd Conference on Electronic Computation.
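To give a feel for how small those early models were, here is a minimal sketch in modern Python/NumPy of assembling and solving a chain of 1D axial bar (truss) elements, a simplified cousin of the beam idealizations mentioned above. None of this code existed in 1950, and the material properties, load, and mesh size are made up purely for illustration:

```python
import numpy as np

# A "1950-sized" model: ten 1D axial bar elements in a line, fixed at node 0,
# with a point load at the free end. All values below are assumed.
E, A, L_total = 210e9, 1e-4, 10.0          # Young's modulus [Pa], area [m^2], length [m]
n_el = 10                                   # number of elements
L_e = L_total / n_el                        # element length

# Stiffness matrix of a single 2-node axial bar element.
k_e = (E * A / L_e) * np.array([[ 1.0, -1.0],
                                [-1.0,  1.0]])

n_nodes = n_el + 1
K = np.zeros((n_nodes, n_nodes))
for e in range(n_el):                       # assemble element matrices into K
    K[e:e+2, e:e+2] += k_e

F = np.zeros(n_nodes)
F[-1] = 1000.0                              # 1 kN tensile load at the free end

# Apply the fixed support at node 0 and solve the reduced system K u = F.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])
print("tip displacement [m]:", u[-1])       # matches the analytical F*L/(E*A)
```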
1960 and 1970
During this period, computational power increased, which allowed for larger models. Software such as NASTRAN was developed, capable of solving models with up to 1,000 elements. Typical users of these tools were large organizations such as NASA. Universities such as Swansea and Berkeley also contributed by extending the finite element formulation to problems such as heat transfer.
https://pmc.ncbi.nlm.nih.gov/articles/PMC3278509/
1980
In the next decade, the scale increased further, and models with up to 10,000 elements could be solved. Even then, however, advanced microprocessors and workstations were required; it was not something you could run at home. By this time, these tools were in common use in automotive and civil engineering.
1990
By the 1990s, it was possible to simulate models with up to 100,000 elements, allowing detailed simulations of turbine blades and car crashes. By "detailed" I mean that there were enough elements per part to distinguish individual components, such as a wheel or a steering wheel (a made-up illustration, not an actual benchmark). These tools were powerful enough to simulate an entire aircraft with reasonable accuracy, which was a breakthrough at the time.
21st Century
This century has seen an even more significant increase in computing power. We now have supercomputers and server clusters that house tens of thousands of CPUs, and RAM capacity is no longer limited to the gigabyte scale; terabytes of RAM are common in supercomputers. Companies have found that by investing in FEA tools they can save substantial money and time by avoiding repeated physical experiments. The number of elements and the level of detail are now limited only by the scope of the study and the available computing power; in some cases, simulations run for months if needed.
Commercial Tools
Commercial tools such as Abaqus and ANSYS accelerate the process further by introducing new solution techniques that extract more useful work from each CPU cycle. Year over year, these vendors have reduced computation time by up to 20% in some cases. For example, in 2009 the Nastran team simulated a model with one hundred million grid points, comprising approximately 98 million shell and 49 million solid elements. This produced 500 million equations and more than 600 million global degrees of freedom, and required 1,069 minutes (about 17.8 hours) of elapsed time on an IBM Power 570 server. Since then, vendors have shifted their focus from simply increasing element counts to ensuring model accuracy through the right solution techniques.
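Those benchmark figures are easy to sanity-check: structural grid points in Nastran-style models typically carry six degrees of freedom each (three translations, three rotations), so one hundred million grid points gives roughly 600 million DOF before constraints are applied, and 1,069 minutes is indeed about 17.8 hours. A rough back-of-the-envelope check, using only the numbers quoted above:

```python
# Back-of-the-envelope check of the benchmark figures quoted in the text.
grids = 100e6                 # grid points reported for the model
dof_per_grid = 6              # 3 translations + 3 rotations per structural grid point
print("approx. global DOF:", grids * dof_per_grid)    # ~6.0e8, i.e. ~600 million

elapsed_min = 1069
print("elapsed hours:", round(elapsed_min / 60, 1))   # ~17.8 hours
```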
Now You
Now that you have read this far, you might be tempted to purchase a powerful computer to solve models faster. As users, however, we rarely need that much computational power; what we need is to learn how to develop an accurate model. To ensure that a solution is accurate, you need to validate your model against experimental data or against another simulation that has already been validated, and that process can be time-consuming. Here is one shortcut: visit engineeringdownloads.com and search for the paper you need to validate your model. That can save you far more time than purchasing a faster computer.
If you already have a model that you are willing to share to raise your citation count and earn money, get in touch with us; we have great deals for you. Even better, if you are working on research, we can help you secure reimbursement from your university for your expenses.













