Great story, @Beth! During my years at the Air Force Research Laboratory, we would "rapidly prototype" by iterating between CAD/CAM and wind-tunnel testing. My colleagues and I in the testing group would design new ways to measure all sorts of quantities on, above, through, and around all sorts of bodies. The CFD group would take our results, develop models, and then predict unique configurations that would validate their models or provide needed information. We would go back into the lab, figure out how to make the additional measurements, and complete the feedback loop. After the CFD models became reliable, the CFD team would provide us with expected parameter ranges for our own transducer models, which let us make new measurements that were previously extremely difficult (or at the very least would have taken months of trial and error). I'm delighted to see the CFD models have advanced enough to support experimentation through simulation and are contributing to the acceleration of technology development. Awesome.
Sounds like quite a process, William. What was the typical timeframe for that back and forth to occur? While the wind tunnel and CFD testing was no doubt an integral part of the design process, did the protracted test cycle have a negative impact on your project schedules?
It was an ongoing, continuous process. A nice symbiotic ecosystem among the scientists, engineers, computer scientists, and mathematicians. The Air Force Research Laboratory is a basic research facility at Wright-Patterson AFB (the home of the Wright Brothers). The Wright Brothers popularized the use of the wind tunnel in aerospace research, and the techniques have been advancing continuously since ca. 1900. Since entering academia in 1999, I've seen quite a few of our developments hit prime time, including many of the UAVs and drones, ramjet and scramjet engines, and a few new diagnostics based on laser spectroscopy and digital imaging. Exciting stuff! =]
To answer your question, the AFRL would publish and share its findings and developments with airframe and propulsion manufacturers and contractors. Once a research engine or airframe was developed, our teams would measure performance diagnostics of the proposed designs and the manufacturers would incorporate the results and down-select...
williamweaver, it is amazing how the fidelity of simulations has increased. This may be due to the ability, on fairly inexpensive hardware, to work with a very fine mesh. Of course, improvements in the mathematics and physics also contribute.
Naperlou, you make a good point about the fine mesh. I've commented on other articles about the reliance on FEA used improperly by unskilled people, but the ability to run a fine mesh over the entire model does hide a lot of sins.
Classic modeling (I guess you'd have to say old school) would use a coarse mesh in areas of low interest and fine mesh where one was most interested in the results. An unskilled user might miss an important area because of the judgement needed to decide what was of interest and what was not.
The raw power available now means a fine mesh all over is merely inefficient rather than impractical, but one would never know (or generally care anymore) that the time was wasted.
@TJ: That is definitely one of the concerns raised by the tighter integration of CFD and FEA tools with more mainstream CAD applications. Just because a CAD specialist or engineer can more easily create a mesh or simulation with these more accessible tools, doesn't mean they have the skill set or the background to understand the mathematics and physics behind the simulation. Therefore, there are definite cultural issues raised with CFD and other simulation experts not fully trusting the simulations performed by the non-specialists. I suppose in many cases, they have good reason.
TJ, you raise an interesting point. These CAE programs are not automatic. You need good understanding of the problem and of the physics involved. I have talked to people with PhDs that have used some of the primary CAE products and the interesting thing is that they tend to play down the CAE products. This is not to say that the CAE product was not useful, or actually essential, but they found that they still had to do a lot of work.
Many of the newer tools have what they call adaptive mesh that varies the mesh based on the physics or geometry. As you point out, though, you still need to be able to override the software based on your understanding of the problem or your particular requirement.
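To make the adaptive-mesh idea concrete, here is a toy 1D sketch in Python (the field function, threshold, and refinement rule are all hypothetical, not taken from any particular CAE product): cells are split wherever the solution changes rapidly, so the mesh stays coarse away from the feature of interest.

```python
# Toy 1D adaptive-refinement sketch (illustrative only).
# Split any cell where the field changes rapidly across it -- the
# "physics-based" refinement criterion, in miniature.

def field(x):
    # Hypothetical solution field with a sharp feature near x = 0.5
    return 1.0 / (1.0 + 100.0 * (x - 0.5) ** 2)

def refine(nodes, threshold=0.25, max_passes=5):
    for _ in range(max_passes):
        new_nodes = [nodes[0]]
        refined = False
        for a, b in zip(nodes, nodes[1:]):
            # Insert a midpoint where the jump across the cell is large
            if abs(field(b) - field(a)) > threshold:
                new_nodes.append((a + b) / 2.0)
                refined = True
            new_nodes.append(b)
        nodes = new_nodes
        if not refined:          # converged: no cell needed splitting
            break
    return nodes

coarse = [i / 10.0 for i in range(11)]   # uniform coarse mesh on [0, 1]
adapted = refine(coarse)
print(len(coarse), "->", len(adapted))   # extra nodes cluster near x = 0.5
```

Real solvers use error estimators rather than a raw gradient check, and of course 2D/3D geometry, but the tradeoff is the same: resolution where the physics demands it, economy everywhere else.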
Frank J. Sprague taught Edison the scientific method: to design on paper before spending a moment on experimentation. Edison changed his ways.
I used to work like Edison, but Solidworks, Electronics Workbench, SPICE, and writing software has taught me to design first, build later.
Simulation in engineering is necessary. It is 2012; combining simulation into all aspects of design is important. I want to see something like how Tony Stark designs his suits and devices in the Iron Man movies: all-inclusive and intuitive software.
Otherwise, the tried-and-true methods of today allow us to scrape along to the future.
In an age of globalization and rapid changes through scientific progress, two of our societies' (and economies') main concerns are to satisfy the needs and wishes of the individual and to save precious resources. Cloud computing caters to both of these.
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That's where the "smart" machine comes in. The smart machine is one that has some simple (or in some cases complex) processing capability, allowing it to adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what's possible with smart machines, and what tradeoffs need to be made to implement such a solution.