I second your motion! I was especially surprised to hear this coming from an ME. Both MEs and CEs should look up "Galloping Gertie," the Tacoma Narrows Bridge that failed rather spectacularly due to resonances "outside the range of interest." Also, students aren't taught about the limitations of the rote methods they learn anymore. A simple example: the Fourier transform (which underlies all frequency-analysis methods) carries a big qualifier: it exists ONLY for "steady-state systems," which implies two things: the signal NEVER CHANGES, and perfect linearity is assumed. These are the assumptions underlying the mathematics.
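To make that "steady-state" qualifier concrete, here is a minimal pure-Python sketch (the window length, tone frequencies, and single-bin DFT helper are all invented for illustration): a steady tone puts all of its energy in one DFT bin, but a tone that changes frequency mid-window violates the "never changes" assumption and its energy splits and smears.

```python
import math

# The DFT treats its input window as one period of a signal that
# repeats forever -- the "steady-state" assumption in action.
n = 1024  # analysis window length; bin k corresponds to k cycles per window

def bin_mag(signal, k):
    """Normalized magnitude of DFT bin k (amplitude of a pure tone at that bin)."""
    m = len(signal)
    re = sum(s * math.cos(2 * math.pi * k * i / m) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * k * i / m) for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / m

# A steady tone: 100 cycles across the window, never changes.
steady = [math.sin(2 * math.pi * 100 * i / n) for i in range(n)]

# The same tone, except it jumps to 200 cycles halfway through the window.
changing = [math.sin(2 * math.pi * (100 if i < n // 2 else 200) * i / n)
            for i in range(n)]

print(bin_mag(steady, 100))    # ~1.0: all the energy lands in bin 100
print(bin_mag(changing, 100))  # ~0.5: the energy is split and smeared
```

The changing tone's amplitude reads 0.5 in bin 100 and 0.5 in bin 200; since each half-amplitude bin carries only a quarter of the power, the remaining half of the signal's power is leaked across the rest of the spectrum.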
I would strongly disagree with the statement "Any real-world device or process will need to function properly for only a certain range of frequencies. Outside this range, we don't care what happens." I'd say that, in real-world systems, you'd better care about frequencies outside the "working range" or they will bite you - sometimes seriously. In audio design, this is especially true. One glaring example is RF interference, which often arrives right along with the system input signal.

Another thing this cavalier approach neglects is that ALL systems are non-linear to some degree. Now imagine the effect of two "out-of-band" signals (say, ultrasonic frequencies for an audio system) applied simultaneously. The non-linearity of the system intermodulates the two, and one of the results is a difference frequency that IS inside the audio range ... and audible as a non-harmonically-related "grunge" or "veil" in the audio quality. I've found that folks who approach engineering issues with only math equations (I often call them "math snobs") most frequently forget or ignore such issues.

If your system can't react "gracefully" to out-of-band inputs, then one of your tasks is to remove/attenuate those frequencies so that your system doesn't "see" them! Some of the most awful-sounding audio systems are due to a DC-to-daylight bandwidth design philosophy. Any system, audio or otherwise, should include filters at the inputs. Of course, be careful what kind of filter you choose so you don't degrade transient response (for audio, Bessel filters are the best choice because they approximate linear phase). - Bill Whitlock, president & chief engineer, Jensen Transformers www.jensen-transformers.com
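The intermodulation mechanism described above is easy to demonstrate numerically. In this hedged sketch (the sample rate, the 24 kHz and 25 kHz tones, and the 1% square-law nonlinearity are all assumed for illustration, not taken from any real device), two ultrasonic tones pass through a weakly nonlinear stage and produce a 1 kHz difference tone squarely inside the audio band:

```python
import math

# Two "out-of-band" ultrasonic tones through a weakly nonlinear system.
fs = 96_000                # sample rate, high enough to represent ultrasonics
f1, f2 = 24_000, 25_000    # both tones lie above a 20 kHz audio band
n = fs                     # one second of samples

def level_at(signal, f):
    """Amplitude of the component at frequency f (single-bin DFT)."""
    re = sum(s * math.cos(2 * math.pi * f * k / fs) for k, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f * k / fs) for k, s in enumerate(signal))
    return 2 * math.hypot(re, im) / len(signal)

# Input: the sum of the two ultrasonic tones -- nothing below 20 kHz at all.
x = [math.cos(2 * math.pi * f1 * k / fs) + math.cos(2 * math.pi * f2 * k / fs)
     for k in range(n)]

# A mildly nonlinear stage: y = x + 0.01 x^2 (1% second-order distortion).
y = [s + 0.01 * s * s for s in x]

print(level_at(x, f2 - f1))   # ~0.00: the input has no 1 kHz component
print(level_at(y, f2 - f1))   # ~0.01: distortion created an audible 1 kHz tone
```

The square law's cross term 2 cos(a) cos(b) expands to cos(a-b) + cos(a+b), so a 1 kHz difference tone appears even though neither input is anywhere near the audio band - exactly the "grunge" mechanism described above.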
I very much appreciate and agree with your approach, but the current crop of EE students has only a vague idea what you're talking about. Frequency-domain analysis leads to writing and solving equations using the Laplace transform, i.e., "transfer function" equations involving ratios of polynomials in s, and to talking about Routh-Hurwitz, gain and phase margin (in a servo system or circuit), etc. But the current crop of students can only analyze a system expressed in the z-domain, and they use an entirely different set of tools to attempt to determine the "order" of the system (an approach, by the way, that can quickly get you in trouble in the real world if you apply it to a system that is even marginally nonlinear).

It's also true that the "order" so obtained depends, in the real world, on the rate at which the system is sampled (and this order is constrained to be a rather small number if the system can be "solved" at all), so it offers very little insight into the principles underlying the system under analysis, and the whole process often requires highly complex mathematical techniques involving large matrices to make even the simplest system work properly. Nonetheless, anything having to do with "frequency domain analysis" is relabeled a "legacy engineering tool" and moved to the utterly inaccessible portion of the engineering library where the books written before 1980 are stored!

It strikes me that MEs are probably taught the frequency domain and EEs the time domain, and this could just be a scheme so professors from BOTH departments can get consulting assignments out of the disparity (!). Anyway, I'd like to hear your comments, because I believe BOTH methods are useful for solving real-world problems.
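For readers who have never seen the classical tools mentioned above, here is a minimal sketch (not anyone's production method) of frequency-domain stability analysis: sweep an open-loop transfer function L(s) along s = jw and read off the gain and phase margins. The plant L(s) = 4 / (s (s+1) (s+2)) is an assumed textbook example, and the crude crossing detection is purely illustrative.

```python
import cmath
import math

def loop(s):
    """Assumed textbook open-loop transfer function L(s) = 4 / (s (s+1) (s+2))."""
    return 4 / (s * (s + 1) * (s + 2))

def margins(L, w_lo=1e-3, w_hi=1e3, steps=100_000):
    """Crude log-spaced frequency sweep for the two classical margins."""
    gm = pm = None
    prev = None
    for i in range(steps + 1):
        w = w_lo * (w_hi / w_lo) ** (i / steps)   # log-spaced frequency, rad/s
        resp = L(1j * w)
        if prev is not None:
            # Phase crossover (-180 deg): L crosses the negative real axis,
            # i.e. Im goes from - to + while Re < 0.
            if gm is None and resp.real < 0 and prev.imag < 0 <= resp.imag:
                gm = 20 * math.log10(1 / abs(resp))        # gain margin, dB
            # Gain crossover (|L| = 1): phase distance above -180 deg.
            if pm is None and abs(prev) > 1 >= abs(resp):
                pm = 180 + math.degrees(cmath.phase(resp)) # phase margin, deg
        prev = resp
    return gm, pm

gm, pm = margins(loop)
print(f"gain margin  ~ {gm:.1f} dB")   # roughly 3.5 dB for this plant
print(f"phase margin ~ {pm:.1f} deg")  # roughly 11 deg for this plant
```

Small positive margins like these say the closed loop is stable but fragile - the kind of design insight the transfer-function view hands you almost for free, and which a black-box "order" fit in the z-domain does not.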
Thanks for the informative post. Frequency response plays a major role in audio design and development: it can be used to study how audio components respond to any given frequency.
Actually, in the real world, no frequencies should be exaggerated or reduced; a more accurate representation of the original sound is what is required. Any audio device should preserve the loudness relationships between the various instruments and voices and should not over- or under-emphasize any frequency. This is achieved when the frequency response of the device is flat. A flat frequency response means that the audio device is equally sensitive to all frequencies.
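In practice, "flat" is usually specified as a tolerance in decibels across the audio band. A tiny sketch of that check, using entirely hypothetical measured gains (the frequencies, amplitudes, and the 1 dB window are all assumed for illustration):

```python
import math

# Hypothetical measurement: frequency (Hz) -> output amplitude
# for a unit-amplitude input at that frequency.
response = {20: 0.98, 100: 1.00, 1_000: 1.01, 10_000: 0.99, 20_000: 0.97}

# Convert each gain to dB and measure the peak-to-peak spread.
gains_db = {f: 20 * math.log10(a) for f, a in response.items()}
spread = max(gains_db.values()) - min(gains_db.values())

# "Flat" here means the whole curve fits in a 1 dB window, 20 Hz - 20 kHz.
flat_within_1db = spread <= 1.0
print(f"spread = {spread:.2f} dB, flat: {flat_within_1db}")
```

A device passing this kind of test is equally sensitive (within the stated tolerance) to every frequency in the band, which is exactly the loudness-preserving property described above.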