A dark cloud can outshine a clear patch of sky, if you pick the wrong eyes. In visible light the nearby molecular cloud looks like a solid bruise against the Milky Way, yet instruments tuned to longer wavelengths record a crowded nursery of young stars burning inside.
The basic trick is brutal: dust grains act as both curtain and spotlight. Tiny silicate and carbon particles scatter and absorb optical photons from embedded protostars, a process described by extinction and radiative transfer, so optical telescopes record only a blank silhouette. The same grains re‑emit that stolen energy as infrared radiation once they warm slightly, turning the cloud into a faint heater that glows brightest at wavelengths beyond the visible, where space‑based infrared arrays can detect it.
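The warming‑and‑re‑emission step can be made concrete with Wien's displacement law, λ_peak = b/T: cold grains radiate most strongly in the far infrared, while stellar photospheres peak in the visible. A minimal sketch, assuming an illustrative dust temperature of about 15 K (that specific value is an assumption, not from the text; real clouds span very roughly 10 to 30 K):

```python
# Wien's displacement law: lambda_peak = b / T
WIEN_B = 2.8978e-3  # Wien displacement constant, metre * kelvin

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength of peak blackbody emission, in micrometres."""
    return WIEN_B / temp_k * 1e6

dust = peak_wavelength_um(15.0)    # illustrative cold-grain temperature
star = peak_wavelength_um(5800.0)  # roughly a Sun-like photosphere

print(f"15 K dust peaks near {dust:.0f} um (far infrared)")
print(f"5800 K star peaks near {star:.2f} um (visible light)")
```

The two outputs sit on opposite sides of the detector divide: the star's light peaks near half a micrometre, where our eyes work, while the warmed grains peak near 200 μm, which is why the cloud only "lights up" for infrared instruments.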
What looks like a contradiction is just wavelength bias. Shorter‑wavelength light is suppressed by the high column density of gas and dust, while longer‑wavelength infrared and submillimeter photons slip through the cloud with far fewer interactions, revealing accretion disks, outflows, and dense cores. To call the cloud dark is therefore a choice made by the human eye, not by the physics unfolding inside it.
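That wavelength bias can be put in numbers with the standard extinction relation, F_obs/F_emitted = 10^(−A_λ/2.5), where A_λ is the extinction in magnitudes at the observing wavelength. A hedged sketch, assuming an illustrative visual extinction of 30 magnitudes for a dense line of sight and the commonly quoted near‑infrared ratio A_K ≈ 0.11 A_V (both specific numbers are assumptions for illustration, not from the text):

```python
# Fraction of a background star's light that escapes the cloud:
# F_obs / F_emitted = 10 ** (-A / 2.5), with A in magnitudes.

def transmitted_fraction(a_mag: float) -> float:
    return 10.0 ** (-a_mag / 2.5)

A_V = 30.0        # illustrative visual extinction through a dense core
A_K = 0.11 * A_V  # commonly quoted K-band (2.2 um) scaling, ~3.3 mag

print(f"Visible  (A = {A_V:.0f} mag): {transmitted_fraction(A_V):.0e} transmitted")
print(f"Near-IR  (A = {A_K:.1f} mag): {transmitted_fraction(A_K):.0%} transmitted")
```

Under these assumed values the visible band loses the starlight to one part in a trillion, while a few percent still escapes at 2.2 μm, which is the quantitative sense in which infrared photons "slip through" the same column of dust.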