A butterfly’s compound eye, packed with thousands of ommatidia, tracks a flower with a spatial resolution that looks crude on paper yet proves highly effective in flight. Each tiny unit samples a slightly different angle, creating a mosaic image rich in motion cues and ultraviolet contrasts invisible to humans and most cameras.
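To put the mosaic in rough numbers: if the facets tile a hemispherical field of view evenly, the angular spacing between neighboring samples follows from simple geometry. A minimal back-of-the-envelope sketch, in which both the facet count and the field of view are illustrative assumptions rather than measurements of any particular species:

```python
import math

def interommatidial_angle_deg(n_ommatidia, field_sr=2 * math.pi):
    """Rough angular spacing between neighboring ommatidia,
    assuming they tile the field of view (default: a hemisphere,
    2*pi steradians) evenly. Each facet then covers
    field_sr / n_ommatidia steradians, and the center-to-center
    angle is about the square root of that solid angle."""
    return math.degrees(math.sqrt(field_sr / n_ommatidia))

# ~6,000 facets (a plausible order of magnitude for a butterfly eye)
# gives a sampling grid of roughly 1.9 degrees per facet.
print(f"{interommatidial_angle_deg(6000):.2f} deg")
```

A grid of a couple of degrees per sample is hopeless for reading fine print, yet dense enough for the wide-field motion estimates that actually steer flight.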
Instead of building a sharp picture, neural circuits downstream of the retina compute optical flow and extract edges to estimate distance, speed and direction in real time. The insect central nervous system runs this as a low-latency control loop, tightly coupled to wingbeat mechanics and proprioception, allowing rapid course corrections in gusty air and dense foliage without heavy processing hardware.
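One classic model of how insect circuits extract motion this cheaply is the Hassenstein–Reichardt correlator: multiply each photoreceptor signal by a delayed copy of its neighbor’s, and the mirror-symmetric difference reports direction. The sketch below is a minimal version of that idea, not butterfly physiology; the low-pass time constant and the step stimulus are illustrative assumptions:

```python
import numpy as np

def emd_response(left, right, dt=1e-3, tau=0.05):
    """Hassenstein-Reichardt elementary motion detector (one pair).

    Each photoreceptor signal is correlated with a low-pass-filtered
    (i.e. delayed) copy of its neighbor; the mirror-symmetric
    difference is positive for left-to-right motion and negative for
    the reverse. No image is ever assembled.
    """
    alpha = dt / (tau + dt)  # first-order low-pass coefficient

    def lowpass(x):
        y = np.zeros_like(x)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    return lowpass(left) * right - lowpass(right) * left

# A brightness edge drifting rightward: the right receptor sees the
# same step 20 ms after the left one.
t = np.arange(0.0, 0.5, 1e-3)
left = (t > 0.10).astype(float)
right = (t > 0.12).astype(float)
print(emd_response(left, right).sum())  # positive => rightward motion
```

Direction-selective motion falls out of a delay and a multiply, frame buffer not required, which is exactly the kind of computation a low-latency control loop can afford.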
Ultraviolet reflection patterns on petals act as high-contrast landing beacons, while polarization sensitivity filters out glare that would saturate typical image sensors. Many small drones, by contrast, rely on power-hungry cameras and inertial measurement units, then push the data through computationally expensive algorithms that struggle with motion blur, dynamic lighting and tight energy budgets, especially at the scale of a butterfly.
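The glare rejection has simple physics behind it. Specular reflections off water and waxy leaves are strongly polarized, so an analyzer crossed with that axis attenuates them by Malus’s law, I = I₀ cos²θ, while the largely unpolarized light from a petal loses only half its intensity. A toy calculation with illustrative intensities:

```python
import math

def transmitted(intensity, pol_angle_deg, analyzer_angle_deg):
    """Malus's law: a fully polarized beam passes an analyzer
    attenuated by cos^2 of the angle between their axes."""
    theta = math.radians(analyzer_angle_deg - pol_angle_deg)
    return intensity * math.cos(theta) ** 2

# Illustrative intensities: specular glare off a wet leaf is strongly
# polarized along one axis; the petal's diffuse light is essentially
# unpolarized, so an ideal analyzer passes half of it.
glare, petal = 5.0, 1.0
print(transmitted(glare, pol_angle_deg=0.0, analyzer_angle_deg=90.0))  # ~0
print(0.5 * petal)                                                     # 0.5
```

The same trick is available to a camera as a passive polarizing filter: it adds grams, not milliwatts or compute.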