Maybe. I don't mean that the RGB values are wrong in relation to what the photo represents. I mean they're wrong in relation to the colours of the actual dress.
Then again, I'm not really sure what I actually mean when I say "actual colour".
Photography is *always* wrong when it comes to an object's "actual colors". By the very nature of the process, and the necessary involvement of the photographer, photographs are never true to the original moment. Perhaps partially true. An interpretation of the true. 50 shades of true.
That is why scientific imaging requires very precise instruments and thorough documentation of the signal processing.
Humans also perceive colors differently (that is, non-objectively), but as usual in human experience there's more common ground than there is deviation.
To add further comments to my previous response to nanook:
Another thing that could be involved in the image's mismatch with the deep blue original is shitty artificial light. Artificial lights don't have the same intensity across all the frequency bands that daylight does. Some have serious peaks in certain parts of the spectrum and gaps in others (common fluorescent lights, for example).
Imagine you have an object that is highly reflective to blue light, and then you shine on it a light with big gaps in the blue part of the spectrum. The object will never look good under this light, because there is a lack of blue light for it to reflect. Blues that fall in the gaps will never look right, while lucky objects whose blues fall outside the gaps might look OK. Greens would look yellowish, purples reddish.
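If it helps, here's a toy sketch of that idea in Python: the light that reaches your eye (or camera) is the lamp's spectrum multiplied, wavelength by wavelength, by the object's reflectance. The spectra below are made-up numbers for illustration, not real measurements.

```python
# Toy sketch of why a light with gaps in the blue region makes a blue
# object look off. All numbers are invented, not real measured spectra.
import numpy as np

wavelengths = np.arange(400, 701, 10)          # visible range, nm

# Daylight-ish illuminant: roughly flat power across the spectrum.
daylight = np.ones_like(wavelengths, dtype=float)

# Cheap fluorescent-ish illuminant: weak in the blue band (~440-490 nm),
# with a strong peak around 550 nm.
fluorescent = np.ones_like(wavelengths, dtype=float)
fluorescent[(wavelengths >= 440) & (wavelengths <= 490)] = 0.2
fluorescent[(wavelengths >= 540) & (wavelengths <= 560)] = 2.5

# Reflectance of a deep blue object: reflects mostly in the blue band.
reflectance = np.where((wavelengths >= 440) & (wavelengths <= 490), 0.8, 0.1)

# What actually reaches the camera/eye is illuminant * reflectance,
# wavelength by wavelength.
for name, light in [("daylight", daylight), ("fluorescent", fluorescent)]:
    reflected = light * reflectance
    blue = reflected[(wavelengths >= 440) & (wavelengths <= 490)].sum()
    print(f"{name}: blue share of reflected light = {blue / reflected.sum():.2f}")
```

Under the daylight-ish lamp most of the reflected energy is blue; under the gappy lamp it isn't, so the same object reads as a duller, shifted colour.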
Specialists working with color and lighting know there's a thing called the color rendering index (CRI, which is not the same as color temperature), used to qualify how faithfully a lamp renders colors. Architects, fashion designers, graphic designers, print shops, museographers, high-end retailers, etc. pay more to buy special high-CRI lamps so they can minimize unwanted color shifts.
Most lamps sold nowadays are around CRI 80, which is good enough for non-specialist use, but not too long ago it was very easy to find much, much lower-quality ones. Tri-phosphor fluorescents have also become more common and cheaper, though they're still pricier and less energy efficient than your regular color-poor fluorescent.
Further down the line, when a digital camera captures light, the signal is usually transformed and the image is not stored in a raw format but in a lossy compressed one like JPEG. This in-camera "pre-processing" throws away a lot of important information and applies modifications to make the image look "better". Cameras often have presets for this pre-processing ("neutral", "standard", "vivid", "landscape", "portrait" and so on) which alter the tone curve, saturation, sharpness, noise reduction, color balance, etc. These modifications are applied to the whole image irreversibly. During the JPEG conversion the white balance is also baked into the image, so careless people or bad auto-white-balance algorithms often produce terrible images.
Thus, with the same camera, the same exposure settings and the same lighting, it is possible to produce two images with different colors purely because of the pre-processing settings.
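A crude way to picture that, if you like code: pretend the numbers below are the raw, linear sensor values of one pixel of the dress, and run them through two made-up "presets" (white-balance gains plus a simple gamma tone curve). The gains and curves are invented for illustration, not what any real camera firmware does.

```python
# Toy model of in-camera processing: the same raw sensor values run through
# two different white-balance / tone-curve presets end up as visibly
# different 8-bit colors in the final JPEG.
import numpy as np

# Pretend this is the raw, linear RGB reading of one pixel of the dress.
raw_pixel = np.array([0.22, 0.27, 0.48])   # mostly blue

def develop(raw, wb_gains, gamma):
    """Apply white-balance gains, clip, then a simple gamma tone curve."""
    balanced = np.clip(raw * wb_gains, 0.0, 1.0)
    toned = balanced ** (1.0 / gamma)
    return np.round(toned * 255).astype(int)   # values baked into the JPEG

# Preset A: neutral-ish white balance, modest contrast.
print("preset A:", develop(raw_pixel, wb_gains=np.array([1.0, 1.0, 1.0]), gamma=2.2))

# Preset B: warm white balance (boost red, cut blue) and a punchier curve.
print("preset B:", develop(raw_pixel, wb_gains=np.array([1.6, 1.0, 0.7]), gamma=1.8))
```

Preset A keeps the pixel clearly bluish, preset B drags it towards a warm grey, and both came from identical raw data.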
Those who take their photography more seriously often prefer to shoot in a raw format, that is, without the in-camera pre-processing, and make the adjustments in post-processing on their computers instead. That means larger files and more time spent, but it gives photographers much more control and higher-quality end results. Of course, in the Instagram age of instant gratification, most people with a camera these days don't know shit about how it works, and give even less of a shit about the technical quality of the images they make.
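For the curious, here's roughly what that workflow can look like in Python, assuming the third-party rawpy library (a LibRaw wrapper) and imageio; the file name and the white-balance multipliers are placeholders I made up. Same raw file, two different developments, no quality lost to a baked-in JPEG.

```python
import rawpy
import imageio

# Development pass 1: trust the white balance the camera recorded.
with rawpy.imread("dress.NEF") as raw:
    camera_wb = raw.postprocess(use_camera_wb=True, no_auto_bright=True,
                                output_bps=8)

# Development pass 2: same raw file, but with our own WB multipliers
# (R, G1, B, G2) chosen after the fact -- something a baked-in JPEG
# would not let us redo without quality loss.
with rawpy.imread("dress.NEF") as raw:
    custom_wb = raw.postprocess(user_wb=[1.9, 1.0, 1.4, 1.0],
                                no_auto_bright=True, output_bps=8)

imageio.imwrite("dress_camera_wb.png", camera_wb)
imageio.imwrite("dress_custom_wb.png", custom_wb)
```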
I find it hilarious (in a sad way) that people are willing to drop some serious cash on pro or prosumer equipment and yet have zero knowledge of how to use it, producing results not far from those of shitty consumer cameras.