Welcome to the very confusing field of optical units. An image does not have units of intensity; see the table at the bottom of this page: https://en.wikipedia.org/wiki/Candela I'm surprised you couldn't find anything, because there is tons of information out there. Unfortunately, it will make your head spin unless you have a Ph.D. in optics (sometimes even if you do, speaking from personal experience).

The units of an image are like joules, or in photometric units lux*m^2*seconds, which is lumens*seconds, which is candela*steradian*seconds (to get it into all SI base units). Either way, think of it as a measure of energy (luminous energy in the photometric case).

Let's use regular radiometric units instead of photometric (luminous) units, which are restricted to the human visual range and a lot more complicated. Say you have optical power hitting your sensor: 10 watts over an area of 1 cm by 1 cm. Now the CCD well integrates those photons. Each pixel might be 5 microns by 5 microns, so the irradiance (watts per area) multiplied by the pixel area gives the watts collected by that pixel. But the pixel only integrates for a certain number of seconds, and a watt is a joule per second, so you get watts*seconds = (joules/second)*seconds = joules. That's why I say it's like joules, or energy.
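If it helps, here is that arithmetic spelled out as a short Python sketch. The 10 W, 1 cm x 1 cm sensor, and 5 micron pixel come from the example above; the 10 ms integration time is my own assumed number, since the original doesn't pick one.

```python
# Pixel-energy arithmetic: irradiance * pixel area * integration time = joules.
# The 10 ms exposure below is an assumed value for illustration.

power_total = 10.0            # W of optical power hitting the sensor
sensor_area = 0.01 * 0.01     # 1 cm x 1 cm, in m^2
irradiance = power_total / sensor_area  # W/m^2

pixel_area = 5e-6 * 5e-6      # 5 um x 5 um pixel, in m^2
pixel_power = irradiance * pixel_area   # W collected by one pixel

exposure = 0.01               # assumed 10 ms integration time, in s
pixel_energy = pixel_power * exposure   # (J/s) * s = J

print(f"irradiance      = {irradiance:.3g} W/m^2")   # 1e+05 W/m^2
print(f"power per pixel = {pixel_power:.3g} W")      # 2.5e-06 W
print(f"energy per pixel= {pixel_energy:.3g} J")     # 2.5e-08 J
```

So each pixel ends up with a few tens of nanojoules per frame, and that energy (really, the photon count it corresponds to) is what the well actually records.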
OK, that's more than you wanted to know, so I won't even bother going into the "intensity" of a light source, which is even more complicated. Even the "experts" don't agree. For example, the American Institute of Physics says that the "intensity" of a light source is W/steradian, yet luminous intensity is an SI base quantity (alongside length, mass, time, electric current, thermodynamic temperature, and amount of substance) and its unit is the candela. My late optics professor, Jim Palmer of the College of Optics at the University of Arizona, got so worked up over the sloppy usage that he wrote a paper on it: "Getting intense about intensity", Metrologia, 1993, vol. 30, pp. 371-372. I attach a partial screenshot of his paper, for educational purposes.
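To see why W/sr and candela really are different things, here is a minimal sketch of the conversion. It relies only on the SI definition that 1 W of monochromatic 555 nm light corresponds to 683 lumens; the two V values fed in below are illustrative, not measured data.

```python
# Radiant intensity (W/sr) vs. luminous intensity (cd = lm/sr).
# The link is the photopic luminosity function V(lambda), scaled by
# K_m = 683 lm/W, the defined peak luminous efficacy at 555 nm.

K_M = 683.0  # lm/W, SI-defined maximum luminous efficacy

def luminous_intensity(radiant_intensity_w_per_sr, v_lambda):
    """Convert W/sr to candela: cd = K_m * V(lambda) * (W/sr)."""
    return K_M * v_lambda * radiant_intensity_w_per_sr

# A 1 W/sr monochromatic source at 555 nm, where V = 1.0:
print(luminous_intensity(1.0, 1.0))  # 683.0 cd
# The same 1 W/sr out in the deep red, where V is roughly 0.1:
print(luminous_intensity(1.0, 0.1))  # ~68 cd, the eye barely cares
```

Same radiometric intensity, wildly different photometric intensity, which is exactly why lumping both under one word "intensity" causes so much grief.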