Pneumatic technology is usually associated with machine control, but at the Medical Design & Manufacturing West Conference in Anaheim, Calif., it was about music.
Clippard Instrument Laboratory Inc., a maker of pneumatic components, demonstrated a guitar that employs 62 air cylinders and 62 pneumatic valves to play music. The brainchild of company namesake Rob Clippard, the guitar uses a combination of 5/16-inch and 5/32-inch air cylinders to strum its six strings. It also employs a half-inch-diameter cylinder to provide an "acoustic thump" for the music.
Clippard's air guitar uses a combination of 5/16-inch and 5/32-inch air cylinders to strum its six strings. A half-inch-diameter cylinder provides percussion. (Source: Clippard Instrument Laboratory Inc.)
"Rob's musical, and he grew up with Clippard technology, so he just combined the two," says Edward Ehrhardt, sales application engineer for Clippard.
The Air Guitar plays Rob Clippard's own original songs, which are encoded in MIDI (musical instrument digital interface) protocol files. The files are sent from an iPad to a microcontroller-based I/O board, which determines which valves to fire.
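The dispatch step described above can be sketched in a few lines: scan incoming MIDI messages for note-on events and map each note number to a valve index. This is a minimal illustrative sketch, not Clippard's actual firmware; the note-to-valve table and function names are assumptions.

```python
# Hypothetical mapping from MIDI note numbers to valve indices.
# The note numbers here are the standard-tuning open-string pitches
# of a guitar (E2 A2 D3 G3 B3 E4), chosen purely as an example.
NOTE_TO_VALVE = {40: 0, 45: 1, 50: 2, 55: 3, 59: 4, 64: 5}

def valves_to_fire(messages):
    """Given parsed 3-byte MIDI messages (status, note, velocity),
    return the list of valve indices that should be fired.

    A note-on message has status 0x9n (n = channel). Velocity 0 is
    conventionally treated as note-off, so it fires nothing.
    """
    fire = []
    for status, note, velocity in messages:
        if status & 0xF0 == 0x90 and velocity > 0:  # note-on event
            valve = NOTE_TO_VALVE.get(note)
            if valve is not None:  # ignore notes with no valve assigned
                fire.append(valve)
    return fire
```

In a real controller, each returned index would pulse the corresponding valve driver; here the mapping simply shows how a MIDI stream reduces to valve commands.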
At the show, the Air Guitar drew crowds as it played a running loop of Rob's music.
The Air Guitar isn't his first foray into pneumatic music. He previously designed a 6-foot-diameter "music tree," which employed air to play strings, whistles, and cowbells. The music tree is now housed at the Cincinnati Museum of Natural History.
Air Guitar Videos
Here's a video shot by the author at the MD&M conference:
Well, bronorb, it's time for your humble opinion to change. Mechanized instruments can and do sound amazing when the engineering is done right. Check out Zenph Studios. These folks have developed technology that uses DSP analysis of older piano recordings to generate control files that reproduce the original physical performance, playing it back on a real piano as though the late performer were there!
I see a human versus machine guitar "shredding" contest here. Just like the battle between Garry Kasparov and IBM's Deep Blue, how about pitting this contraption against some of the guitar greats and seeing what happens over the next few years as technology works on catching up....
bdcst, I don't think we're comparing apples to apples here. Why would I listen to an automated piano playing Gershwin (derived from an actual Gershwin performance) when I could just listen to the recording myself? Also, the Gershwin recording is ONE example of a performance by Gershwin of that piece of music. Do you think he played it exactly the same for every performance? Were other performances different? Better? Worse?
That is my point. The reason for recording music is to capture a single performance. If this is what the automated equipment is doing, it is no different from listening to a vinyl record, CD, tape, or mp3 file.
The nuance, expression, attitude, emotion and soul of the musician is what makes it special. You get none of that with a machine.
In the (Zenph Studios) reproduction instance you get all of the performer's nuances except for actually watching the performer, who is long dead. And the reason for reproducing the performance on a real instrument, voiced to sound like the correct period instrument, is to hear as close a reproduction of the original performance as possible. Nothing is more high fidelity than hearing an instrument with your own ears! Second best is a modern high-definition digital audio recording of that re-created performance. While you can listen for technique, timing, and interpretation, you can also hear far more nuanced tonality from a modern recording than from an old, scratchy acetate with low bandwidth, poor frequency stability, wow, and flutter.
This type of performance restoration and instrument automation is a new era for the long-sought technology of capturing the moment for posterity. The Kodak moment! And it does work, even if you doubt it.
Of course, short of being there you cannot recreate being in the live audience. But that too will happen someday as we learn to store infinitely more data and read it out at faster and faster rates with dirt cheap digital processors.
Yes, the performance is frozen in time and will not vary as it would if you attended multiple concerts of the same program. That's not the point. They're not trying to eliminate the human artist, at least not yet!