I posted a reddit comment about the picture of a dress that has been confusing people. Depending on viewing conditions it looks either white and gold, or black and blue. It’s a rather extraordinary demonstration of how malleable human perception is, and it shows that while we believe we are capable of observing an “objective” reality, our minds construct our perception of the world from only modest input from our senses. Given that nobody can reliably tell the color of a dress from a photo of it, the picture also speaks to how much we rely on things like human recollection of events in court cases that literally decide matters of life and death. A witness can be 100% sure that they saw one thing, but that doesn’t mean that what they saw had a strong correlation with the objective reality that is so important. The reddit comment was in response to a post that was later deleted, so I assume nobody will ever read it there, and I am reprinting it here.
Is it some aspect of my environment (ambient light, etc.) that is altering the way my brain guesses?
Absolutely. Most people never notice that daylight is bright blue (around 5600-6500 kelvins) and lightbulbs are bright orange (around 3200 kelvins). When you see a “white” object under varying conditions, you are actually getting a huge range of colors falling on your eyeballs which all “look” white because your brain is autocompensating for varying ambient lighting conditions. On a camera, this is called white balance, and if it is set wrong the picture will look absurdly blue/orange. Depending on the lighting conditions where you are viewing the image, and the color temperature of the monitor, the dress can look all sorts of different ways.

This is why a professional color grading suite is always blocked off from outside lights, and always uses dim lighting that matches the color temperature of the primary display in the suite. The primary display is generally calibrated to “D65” which displays white stuff as closely as possible to the color of a “blackbody” object heated to glowing “white hot” at 6504 kelvins. (Which is even bluer than most daylight.) Because your perception in the grading suite adjusts to “white = D65” you can make consistent adjustments to the color. Then when somebody at home is sitting in front of their TV watching several shows and lots of commercials in sequence, they will get used to whatever the whitepoint of their TV is in their lighting conditions and generally have a similar perception of color to what was intended, even though the photons hitting their eyes are completely different than the photons hitting your eyes while you are in the color grading suite, both in spectral distribution and quantity.
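The white balance idea above can be sketched in a few lines of code. This is a minimal, illustrative model of per-channel scaling (a von Kries-style correction); the RGB triples are made-up example values in a linear 0.0-1.0 range, not measurements from any real camera or from the dress photo:

```python
# Minimal sketch of white balance via per-channel scaling.
# All RGB values below are illustrative, linear, in the 0.0-1.0 range.

def white_balance(pixel, measured_white):
    """Scale each channel so measured_white maps to neutral (1, 1, 1)."""
    return tuple(min(c / w, 1.0) for c, w in zip(pixel, measured_white))

# A "white" card shot under orange tungsten light comes back with red
# lifted and blue crushed (example values):
tungsten_white = (1.0, 0.85, 0.60)

# Correcting against the right reference recovers neutral white:
corrected = white_balance(tungsten_white, tungsten_white)
print(corrected)  # (1.0, 1.0, 1.0)

# Correcting against the WRONG reference (camera expected bluish
# daylight) leaves the "white" card looking warm/orange:
daylight_white = (0.90, 0.95, 1.0)
miscorrected = white_balance(tungsten_white, daylight_white)
print(miscorrected)  # red stays high, blue stays low
```

Your visual system is doing a much more sophisticated version of that last miscorrected case when the dress illusion flips on you.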
Trying to make color grading decisions for film or television when the decision maker is in an unconstrained viewing environment (e.g. a producer checking it on a laptop or iPhone at home, at the airport, or at the office) is basically impossible because of these sorts of effects.
There are tons of fascinating color perception optical illusions, but this is now one of my favorites. I’ll probably wind up using it in a color training talk at some point. It’s certainly an extreme example.
Also, metamerism is important. Different people will see different colors from the same monitor, even when standing right next to each other and receiving the exact same photons as input, because of differences in the receptors inside their eyeballs. It’s weird, and it’s probably one of many factors that influence this particular illusion/trick, but I think it’s probably not as important as color temperature and white balance compensation. OTOH, anybody interested in color perception should look it up if they want to start crawling down a deep rabbit hole.
So basically: a white dress reflecting some blue daylight to a camera expecting orange lightbulb light makes a weird picture that, under some viewing conditions, your brain compensates for, overcompensates for, or fails to compensate for. Or maybe the opposite of what I just said; it’s hard to tell. I’d need to see the thing in person, or know more about the color settings used on the camera, to make any formal analysis of what the RGB values in the image actually represent. Anybody looking at the pixel values and drawing conclusions about the actual color of the material or of the lighting conditions is dealing with a problem that has too many variables to solve from the image alone. Anyway, the result is that color perception goes all to hell.
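The “too many variables” point can be made concrete with a toy model. To a rough approximation, each channel the sensor records is the product of surface reflectance and illuminant intensity, so very different scenes can produce identical pixels. The numbers below are invented for illustration, not derived from the actual photo:

```python
# Toy model: recorded pixel ~= reflectance * illuminant, per channel.
# Two completely different scenes can yield the same recorded RGB,
# which is why the pixels alone can't tell you the dress's real color.

def recorded_rgb(reflectance, illuminant):
    """Per-channel product, rounded to keep float noise out of the demo."""
    return tuple(round(r * i, 3) for r, i in zip(reflectance, illuminant))

# A bluish dress under neutral light... (illustrative values)
blue_dress    = (0.40, 0.45, 0.60)
neutral_light = (1.0, 1.0, 1.0)

# ...versus a whitish dress sitting in bluish shadow:
white_dress  = (0.80, 0.75, 0.75)
bluish_shade = (0.50, 0.60, 0.80)

print(recorded_rgb(blue_dress, neutral_light))   # (0.4, 0.45, 0.6)
print(recorded_rgb(white_dress, bluish_shade))   # (0.4, 0.45, 0.6)
```

Both scenes land on the exact same pixel values, so the inverse problem is underdetermined: your brain has to guess the illuminant, and different guesses give you different dresses.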