Cameras imitate human vision. Color is purely a human response to light: it does not actually exist in nature. The portion of the electromagnetic spectrum we can see, the range we call light, is simply a product of our biology.
But eyes are not simply tiny cameras. Our vision is a complex system. One part of the retina (the rods) sees only luminance, that is, how bright things are. Another part (the cones) sees only color, or, perhaps more accurately, differences in color. These signals travel to the brain, which reconstructs an image of the world from them. But even in the brain, luminance and color are processed in entirely different areas. So while our experience is a unified view of a colored world (the middle image), the reality is that one part of our brain is processing luminance (the left image) and another color (the right image).
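The luminance/color split described above can be sketched in a few lines of code. This is my own illustration, not anything from the article: it decomposes an RGB pixel into a brightness value (like the left image) and a brightness-free color component (like the right image), using the standard Rec. 709 luminance weights as an assumption.

```python
# Sketch: decompose a pixel into luminance and color-only parts,
# loosely analogous to the rod (brightness) and cone (color) pathways.

def luminance(rgb):
    """Relative luminance of a linear RGB triple (Rec. 709 weights)."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def decompose(pixel):
    """Split one RGB pixel into (luminance, color-only) parts."""
    y = luminance(pixel)
    if y == 0:
        return 0.0, (0.0, 0.0, 0.0)
    # Dividing each channel by luminance removes brightness, leaving hue.
    return y, tuple(c / y for c in pixel)

orange = (1.0, 0.5, 0.0)
y, chroma = decompose(orange)
# Multiplying the parts back together recovers the original pixel,
# just as our brain fuses the two pathways into one unified view.
recombined = tuple(c * y for c in chroma)
```

Neither part alone reproduces the scene; only their combination does, which mirrors how the brain's separate luminance and color processing yields a single experience.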
We are really not very good at seeing color, which is a late evolutionary adaptation (many animals do not see in color at all). But it is important. The sky is rather dull in simple luminance: the orange sky at the horizon has the same brightness as the blue sky seen through the cloud; the two are equiluminant. The addition of color in our perception creates far more separation. But color alone lacks structure and detail. Color acuity is low in human vision, which is why it can be so hard to read red text on a green background when the two colors are equiluminant. Click on the image for a larger view.
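Equiluminance is easy to demonstrate numerically. The sketch below (my own illustration, assuming Rec. 709 luma weights) scales a green so that it has exactly the same computed brightness as pure red; in a luminance-only rendering, red text on that green background would simply vanish.

```python
# Sketch: two very different colors can be equiluminant, i.e. have the
# same perceived brightness even though their hues differ completely.

def luma(r, g, b):
    """Approximate perceived brightness of an RGB color (Rec. 709 weights)."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

red = (1.0, 0.0, 0.0)                 # pure red text
green = (0.0, 0.2126 / 0.7152, 0.0)  # green scaled to match red's luma

# Both colors map to the same gray value, so a luminance-only view
# (like the rods' response) cannot distinguish text from background.
difference = abs(luma(*red) - luma(*green))
```

Since only the low-acuity color pathway can separate the two, the edges of the letters are poorly defined, which is what makes equiluminant red-on-green text so hard to read.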