An artist's rendering of how DARPA aims to create an embedded cooling system for electronics components through its Intrachip/Interchip Enhanced Cooling (ICECool) program to improve thermal management. (Source: DARPA)
This effort seems like it could have far-reaching ramifications for a host of different gear, from military equipment all the way through consumer electronics. One thing that strikes me about these different DARPA efforts is how the agency has embraced a level of crowdsourcing, or at a minimum idea-seeking, from a broader constituency in its efforts to break boundaries on product design and innovation. It will be interesting to see what these efforts unearth by opening up the process to a wider audience.
DARPA is at it again in low-power energy R&D. The "chips getting smaller and hotter" phenomenon is a longstanding, ongoing industry problem, with cooling solutions always trailing technology-wise. In military apps, it tends to be even worse because of the very high-performance electronics used for apps like signal processing. The efficacy of older forced-air cooling and conduction-cooling techniques used at the system level began reaching practical limits a few years back, when liquid flow-through (LFT) and spray cooling came into use. Microchannel/microfluidic cooling techniques are a subset of LFT, and both board-level and chip-level research has been ongoing in universities and industry, among both military and commercial suppliers, for several years. So I wonder: why is DARPA only now funding this type of research?
Great idea to take cooling to the next level and embed it into the device up front. I am eager to see what the measured improvement will eventually be. Will it be a small incremental improvement or a major paradigm shift in technology? (I think it will eventually become the latter).
There's a basic principle that applies to ALL 'cooling' methods. To move a certain amount of heat, you must have area × temperature difference. If the device size is fixed, the only way to shed more heat is to run the radiator hotter. Pressurized ethylene glycol allowed smaller radiators because it could run much hotter than unpressurized water.
For cooler laptops on laps, you need some external part of the device much hotter than the part that is in contact with your lap.
A more efficient heatsink (lower K/W cooling system) for semiconductors runs HOTTER because its surface temperature sits closer to the temperature of the working part. That's how it moves more heat.
These fancy methods only move the heat (and temperature) closer to the external surfaces of the device. Applies even to forced air cooling as the ambient air is the ultimate 'external surface'.
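The principle this commenter is describing is the thermal "Ohm's law": heat flow Q equals temperature difference divided by thermal resistance, Q = ΔT / R_th. A quick sketch with illustrative numbers (not from the article) shows why a lower-K/W heatsink matters:

```python
# Thermal "Ohm's law": Q (W) = delta_T (K) / R_th (K/W), so for a fixed
# power dissipation, junction temperature rises linearly with R_th.
# Numbers below are illustrative, not from the article.

def junction_temp(power_w, r_th_k_per_w, ambient_c=25.0):
    """Steady-state junction temperature for a given dissipated power
    and junction-to-ambient thermal resistance."""
    return ambient_c + power_w * r_th_k_per_w

P = 50.0  # watts dissipated by the chip
print(junction_temp(P, 1.0))  # ordinary sink, 1.0 K/W -> 75.0 C
print(junction_temp(P, 0.4))  # better sink,   0.4 K/W -> 45.0 C
```

For the same 50 W, the lower-K/W sink keeps the junction 30 K cooler; equivalently, its external surface runs closer to the junction temperature, which is exactly the "more efficient heatsink runs hotter" point above.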
Not sure how applicable this is, but I seem to remember WAY back in the '60s that there was a white paper by Eimac (the transmitting-tube people) that talked of BOILING the water at the anode of a large power tube rather than simply circulating water past it. Boiling removed a lot more heat because of the phase change of the water (or whatever) to steam. Maybe Freon or some similar low-boiling-point liquid could be used to advantage.
Also I seem to remember that RCA filled some of their first transistors with toluene to help cool them. I think these were germanium devices.
> that talked of BOILING the water at the anode of a large power tube rather than simply circulating water past it. Boiling removed a lot more heat because of the phase change of the water (or whatever) to steam.
This is how a heat pipe works. Heat transfer is fast because the vapor inside moves at up to the speed of sound. You select your heat pipe so that the boiling temperature lies between the temperatures of the hot item and the ambient cooling mass.
The design problem with heat pipes is how to get the heat into them at one end and out at the other.
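The arithmetic behind "boiling removes a lot more heat" is the latent heat of vaporization, which for water dwarfs its sensible heat capacity. A rough sketch using textbook values for water at 100°C and 1 atm:

```python
# Latent vs. sensible heat for water (textbook values, ~100 C / 1 atm):
C_P = 4.19      # kJ/(kg*K), specific heat of liquid water
H_VAP = 2257.0  # kJ/kg, latent heat of vaporization

# Heat absorbed by 1 kg of circulating water that warms by 20 K:
sensible_kj = C_P * 20.0      # ~84 kJ

# Heat absorbed by boiling that same 1 kg of water:
latent_kj = H_VAP             # 2257 kJ

ratio = latent_kj / sensible_kj
print(round(ratio, 1))        # roughly 27x more heat per kg of coolant
```

That factor of roughly 27 per kilogram of coolant is why both the boiling-anode tubes and heat pipes exploit a phase change rather than just pumping hotter liquid around.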