Heh, you know this was discovered in the '80s, right? Thirty years later we're still not using it, so I wouldn't get your hopes up for any time in the foreseeable future.

The full-spectrum CCDs mentioned at the end are a much better discovery, IMO. Black silicon is nice, but it changes a lot about how we fab CCDs. Full-spectrum CCDs effectively remove Bayer interpolation altogether (effectively a threefold increase in color resolution, since each pixel no longer has to estimate two of its three channels from its neighbors), and they can be implemented without a drastic change in fab methods.

Black silicon increases sensitivity, but what does that really buy you? Increased sensitivity to noise as well. We already have transistor processes that could raise the gain of our readout electronics to dozens of times what it is now, but we don't use them because of the noise problem. With every miraculous discovery there are years of waiting while we figure out how to tame what it does and make it useful for our purposes.

Besides, silicon is reaching its peak. Spintronics, quantum computing, carbon nanotubes, graphene: these are the paths of the future.
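For anyone unfamiliar with why Bayer interpolation costs you resolution: each sensor pixel sits behind a single color filter, so two of its three channels have to be estimated from neighboring pixels. Here's a minimal pure-Python sketch (hypothetical, simple bilinear averaging, RGGB layout assumed) of that estimation step, the one a per-pixel full-spectrum sensor would make unnecessary:

```python
# Bilinear demosaicing sketch for an RGGB Bayer mosaic.
# Each pixel records ONE channel; the other two are interpolated.

def bayer_color(y, x):
    """Which color the RGGB filter passes at pixel (y, x)."""
    if y % 2 == 0:
        return 'R' if x % 2 == 0 else 'G'
    return 'G' if x % 2 == 0 else 'B'

def demosaic(mosaic):
    """Return an RGB image, averaging neighbors for the missing channels."""
    h, w = len(mosaic), len(mosaic[0])
    rgb = [[[0.0, 0.0, 0.0] for _ in range(w)] for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for ci, c in enumerate('RGB'):
                if bayer_color(y, x) == c:
                    rgb[y][x][ci] = float(mosaic[y][x])  # measured directly
                else:
                    # estimate from same-color pixels in the 3x3 neighborhood
                    vals = [mosaic[ny][nx]
                            for ny in range(max(0, y - 1), min(h, y + 2))
                            for nx in range(max(0, x - 1), min(w, x + 2))
                            if (ny, nx) != (y, x) and bayer_color(ny, nx) == c]
                    rgb[y][x][ci] = sum(vals) / len(vals)
    return rgb

# A flat gray scene: every reconstructed pixel comes out ~ (100, 100, 100),
# but any real edge or texture gets smeared by this averaging.
flat = [[100] * 4 for _ in range(4)]
print(demosaic(flat)[1][1])  # -> [100.0, 100.0, 100.0]
```

The smearing across neighbors is exactly why removing the interpolation step is an effective resolution gain, not just a convenience.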