I'm glad that you are enjoying the discussion. It wasn't my intent to offend anyone.
I didn't confirm the nonsense. Shrinking the picture simply makes its details imperceptible to you. If you can't see a bacterium, it doesn't mean there are none. With a magic shrinking machine, the details are made smaller but preserved, so you could still see them by shrinking yourself or perhaps using a microscope. When you shrink the image on your screen or print a thumbnail, however, you are simply losing information. Just as, to a half-blind person, all your images can look equally "sharp" or equally "blurry": to him, sharp and blurry look the same. CoC is about perception. DoF is not; it is about information, as is photography. Once the light of an optical image hits the sensor, it is gone; all that's left is the information gathered by the electronics. If you shoot a picture with nothing in focus, it doesn't matter what resolution you downsize it to, no new information will appear (except false information). You can manipulate the image any way you want, but relative to reality, DoF won't change a bit. If photography is just a form of art for you and perception is the only thing that matters, then perhaps you are not even trying to understand what I'm talking about.
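The claim that downsizing discards information rather than creating it can be sketched in a few lines of Python. This is a toy 1D model, not a real imaging pipeline; the signals, blur width, and downsample factor are all made-up numbers. Once defocus blur has averaged the fine detail away, two very different scenes become identical, and no thumbnail of the blurred result can tell them apart.

```python
import numpy as np

# Two different "scenes": fine alternating detail vs. featureless gray.
scene_a = np.tile([0.0, 1.0], 8)   # 16 samples of fine detail
scene_b = np.full(16, 0.5)         # 16 samples of flat gray

def box_blur(x, k=4):
    # Circular box blur wider than the detail period: a crude stand-in
    # for defocus (a CoC wider than the detail it covers).
    return np.mean([np.roll(x, s) for s in range(k)], axis=0)

# After "defocus", the two scenes are indistinguishable: detail is gone.
print(np.allclose(box_blur(scene_a), box_blur(scene_b)))  # True

def thumbnail(x, factor=4):
    # Downsizing = averaging blocks of samples; it can only merge
    # information further, never restore what the blur destroyed.
    return x.reshape(-1, factor).mean(axis=1)

print(thumbnail(box_blur(scene_a)))  # same flat gray as scene_b's thumbnail
```

Both thumbnails come out as uniform 0.5, which is the point: the alternating pattern in `scene_a` is unrecoverable from either the blurred image or any downsized version of it.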
Well, by your standard, nothing in reality is ever 'in focus'. The focus 'plane' is a hypothetical thing with zero thickness, and on that 'true' focus plane every light point has a diameter of 0. Anything in front of or behind this zero-thickness hypothetical plane is deemed out of focus because it has a CoC greater than absolute 0.
The sensor sees something as in focus not because it is in focus, but simply because its CoC is smaller than the sensor's pixels can distinguish. So what are you saying? That the image the sensor captures is the real world? It is not.
If the above is correct, then consider an example: if I shoot a photo with a 320x240 pixel FF sensor, what is my DoF? Even if my lens gives a blurry mess, I would still get a 320x240 photo that is sharp at the pixel level. Does this represent 'reality'?
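That dependence of DoF on the chosen CoC can be put into numbers with the standard thin-lens hyperfocal formula, H = f²/(N·c) + f. The 50 mm focal length and f/8 aperture below are purely illustrative assumptions: pick a larger CoC, as a coarse 320-pixel-wide sensor effectively forces you to, and the computed DoF balloons.

```python
def hyperfocal_mm(f=50.0, N=8.0, c=0.030):
    # Standard hyperfocal distance: with the lens focused here,
    # everything from H/2 to infinity renders "sharp" for the
    # chosen CoC diameter c (all values in mm).
    return f * f / (N * c) + f

# Conventional full-frame CoC for print viewing: c = 0.030 mm
print(hyperfocal_mm(c=0.030))       # about 10467 mm (~10.5 m)

# CoC = one pixel of a 320-pixel-wide FF sensor: 36 mm / 320 = 0.1125 mm
print(hyperfocal_mm(c=36.0 / 320))  # about 2828 mm (~2.8 m)
```

A shorter hyperfocal distance means more of the scene counts as 'in focus', which is exactly why the coarse sensor yields a picture that is 'sharp at the pixel level' from a blurry optical projection.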
The thing is, reality is far weirder than you can ever imagine. This is a photography forum, so yes, photography is just a form of art for me and perception is the only thing that matters. I learn from my output photos and prints so I can control my equipment to get the result I want.
Let's leave the underlying physical, electrical, or philosophical discussions to someone else, or somewhere else.
No. In reality nothing is a 0. Zero is not a thing; zero is just a tool in mathematics. Every point has dimensions, and it can be represented as an image of at least 1 pixel. When you view an ~18MP image on a ~2MP screen, 1 dot (color) on the screen represents a group of 9 pixels of the image. The sensor does not capture the real world. The projection of an optical image onto the sensor is limited by all kinds of information manipulation by the lens (diffraction, aberrations, vignetting, coma, color tint, distortion, flares and CoC). If the 9 combined pixels carry enough information to represent 1 real-world dot, it will be sharp. If not, it will be blurry (or noise). At 1:1 (100%) it is similar, but with much more false color and noise. If you shrank the blur into oblivion and got some kind of real-world information out of it, that only means you destroyed all the rest, and the whole blur carried only that little.
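The "1 screen dot represents 9 image pixels" mapping is just 3x3 block averaging. A minimal sketch, with a tiny 6x6 array standing in for the ~18MP image:

```python
import numpy as np

def block_average(img, b=3):
    # Each output dot is the mean of a b x b block of input pixels,
    # like an ~18MP image shown on an ~2MP screen (9 pixels -> 1 dot).
    h, w = img.shape
    h2, w2 = h - h % b, w - w % b              # drop ragged edges
    return img[:h2, :w2].reshape(h2 // b, b, w2 // b, b).mean(axis=(1, 3))

img = np.arange(36, dtype=float).reshape(6, 6)
print(block_average(img))  # -> [[7, 10], [25, 28]]: each dot averages 9 pixels
```

Whatever variation existed inside each 3x3 block survives only as a single averaged value, which is why detail finer than one screen dot simply disappears at that viewing size.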
The sensor and electronics "see" nothing in focus, just the color and contrast of neighboring pixels. A 320x240 pixel FF sensor cannot mimic human vision. There are artificial eye implants that allow blind people to see the world in just a few hundred pixels, and trust me, it's nothing like the real thing. It's a blurry mess, and they can only see a letter or a digit in close-up.