October 25, 2014, 10:49:01 AM

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - rs

Pages: [1] 2 3 ... 45
1
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 23, 2014, 10:29:55 AM »
I don't agree with this "amount of light" argument. Consider a full frame sensor and an APS-C sized sensor with pixels the same size as the full frame, taking photos with the same lens at the same f-stop and the same distance from the subject. The signal to noise ratio for each pixel in the APS-C sensor will be the same as the S:N ratio of the corresponding pixels in an APS-C sized area of the FF.

True, but the 2.56x greater area of the FF sensor will gather more total light.  Comparing noise at the pixel level isn't the same as comparing noise at the image level.

You lost me on the image level noise, Neuro. It seems that an APS-C sized crop of the FF image and the APS-C image in this case would be identical. The number of photons hitting each pixel is the same and assuming the downstream operations are identical, what's the difference?

I don't know about you, but as a photographer, I strive to compose correctly, and I look at the image as a whole myself. Your above scenario doesn't make a huge amount of sense as the resulting photos would be very different. I'd start off by using an appropriate focal length on each format to capture the chosen image, not to mention choosing an aperture to achieve the desired DoF, and an ISO to achieve the desired shutter speed/exposure.

If you truly do use the same length lens, settings and distance from subject with the larger format, it would require cropping in post to achieve the framing you've strived towards with the APS-C body. Now we're left with the same image as an APS-C body would have taken - just under 40% of the image. In other words, just under 40% of the light. So to enlarge this small crop up to the same viewing or printing size that you would have wanted should the whole frame of that FF body have been filled correctly, you've now magnified what signal is left by 2.56 times. And strangely enough, the noise has been magnified by that exact same amount too.

If we could just keep cropping with no enlargement penalties such as additional noise, why would anyone bother with these huge telescopes in Hawaii? Surely we can just equip the optics in an iPhone with a sensor 10,000 times smaller than the 1/3rd inch sensor they currently have? We could see the most distant galaxies with unimaginable magnification and clarity, all from a device in your pocket? After all, it is an f2.2 lens  ;)
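To put rough numbers on the cropping argument, here's a small Python sketch. It assumes photon shot noise dominates (so SNR scales with the square root of the light gathered) and uses Canon's sensor dimensions:

```python
import math

FF_AREA = 36.0 * 24.0    # 864 mm^2
APSC_AREA = 22.5 * 15.0  # 337.5 mm^2 (Canon APS-C)

area_ratio = FF_AREA / APSC_AREA        # ~2.56x

# Same pixel size, same exposure: total photons scale with sensor area,
# so the APS-C crop keeps just under 40% of the light.
photons_ff = 1_000_000.0                # arbitrary total for the FF frame
photons_crop = photons_ff / area_ratio

# Shot-noise-limited SNR goes as sqrt(photons), so enlarging the crop to
# the same output size costs sqrt(2.56) = 1.6x in image-level SNR.
snr_penalty = math.sqrt(area_ratio)

print(f"{area_ratio:.2f}x area, crop keeps {photons_crop / photons_ff:.0%} of the light")
print(f"image-level SNR penalty: {snr_penalty:.2f}x")
```

The per-pixel SNR is indeed identical in the two cases; the 1.6x penalty only appears once both images are enlarged to the same viewing size.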

2
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 23, 2014, 03:58:21 AM »
Sure, if you are doing portraits, where DoF matters, you'll go with FF. The same holds with low light where you want to minimize noise.  However, if I were going to visit Alaska or Yellowstone, I think I'd buy a MFT camera and the Zuiko 300/2.8 lens rather than the 600/5.6 for my Canon.  The combo will be cheaper, lighter, smaller and unless I'm shooting at dusk or dawn, the grizzlies will show the same size (and probably comparable IQ) on the same size print, or my screen.

Indeed.  Because we all know having deeper DoF makes for better wildlife images.  For example, the first image is much better than the second - it's much better that all the distracting stuff behind the subject is in decently sharp focus.   :o

Oh c'mon neuro, DoF certainly matters, but this is a shot at 55mm, f/5.6 and a downward angle. Nobody would expect it to have a shallow DoF.  If anything, this shot is an argument against the significance of sensor size, because I took the following shot with my crop sensor and it has much nicer bokeh.
The amount of background blur isn't just a function of sensor size and aperture. The relative distances between camera, subject and background all play a very important part too, so that owl shot has no relevance to the bear shots with a telephoto.

I can even get some notable separation with my iPhone if I'm shooting a subject at minimum focus distance with a background at or near infinity.
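The geometry behind that claim can be sketched with the thin-lens approximation. The phone focal length, aperture and focus distances below are illustrative guesses, not measured values:

```python
def bg_blur_fraction(f_mm, f_number, subject_mm, sensor_diag_mm):
    """Blur disc for a background at infinity, as a fraction of the
    frame diagonal (thin-lens approximation)."""
    blur_mm = (f_mm / f_number) * f_mm / (subject_mm - f_mm)
    return blur_mm / sensor_diag_mm

# FF at 55mm f/5.6 with the subject 3m away:
ff = bg_blur_fraction(55, 5.6, 3000, 43.3)
# Phone-class camera (assumed 4.15mm f/2.2) at a ~8cm minimum focus distance:
phone = bg_blur_fraction(4.15, 2.2, 80, 6.0)

print(f"FF 55/5.6 @ 3m:        {ff:.4f} of the diagonal")
print(f"phone 4.15/2.2 @ 8cm:  {phone:.4f} of the diagonal")
```

Despite the tiny sensor, the phone at minimum focus renders a distant background several times blurrier (relative to the frame) than the 55mm shot at 3m, which is exactly the subject-distance effect described above.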

3
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 22, 2014, 05:18:10 PM »
Crop vs. Full frame.

As a 6D owner - my first full frame - I'm very unimpressed by its low ISO quality. There is no 3D-ish POP like I have seen in so many 5D2/5D3 images over the years - image quality that I could easily see was not reproducible on my crop cameras.

To these ultra pixel peeping eyes, the 6D is only slightly better at dynamic range than my T2i.  High ISO handling is generally FANTASTIC, which is why I figure the low ISO takes an image quality hit. Low light photography is a whole other ball game compared to my crop bodies.

So while I enjoy my 6D, it's nowhere close to the full frame experience I thought it would be. Still dreaming of a 5D3 and non-ancient AF.  :(

What lenses are you using?

4
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 22, 2014, 05:16:27 PM »
I must have misread. I was under the impression that some people were arguing that an f2.8 lens lets in the light of an f8 when on the micro 4/3 mount.
Jarrod
As Neuro said, two stops (f5.6), not three.

To understand this, you need to understand the difference between total amount of light, and the intensity of light. Think of a shaft of sunlight - use a magnifying glass to concentrate that light into a smaller area - you get no more light, but the intensity is increased. Just shrouding more of the light to make a narrower shaft leaves the intensity the same, and reduces the total amount.

A greater intensity of light is what's needed to make a smaller area receive the same amount of light. Simply cutting/cropping out some light, and then magnifying/enlarging what's left afterwards results in less light captured. That's otherwise known as a lower signal, which requires more amplification/enlargement, typically resulting in more noise.

rs, you might be right about light, but you are at 666 posts, so you got to post again to avoid being evil :-)
Good point. Ok, here's one to sort out the number.

The Canon 300/2.8 is about the same price as the Olympus 300/2.8, yet the Canon is smaller, and even with the extra heft of the 2x TC and bulkier FF body, lighter. Plus it doubles up as a 4/3rds equivalent of a 150/1.4

I know which I'd rather buy (not taking into account the dropped 4/3rds mount) and carry with me  ;)

5
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 22, 2014, 05:05:00 PM »
I must have misread. I was under the impression that some people were arguing that an f2.8 lens lets in the light of an f8 when on the micro 4/3 mount.
Jarrod
As Neuro said, two stops (f5.6), not three.

To understand this, you need to understand the difference between total amount of light, and the intensity of light. Think of a shaft of sunlight - use a magnifying glass to concentrate that light into a smaller area - you get no more light, but the intensity is increased. Just shrouding more of the light to make a narrower shaft leaves the intensity the same, and reduces the total amount.

A greater intensity of light is what's needed to make a smaller area receive the same amount of light. Simply cutting/cropping out some light, and then magnifying/enlarging what's left afterwards results in less light captured. That's otherwise known as a lower signal, which requires more amplification/enlargement, typically resulting in more noise.
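The two-stop figure can be sanity-checked with a few lines of Python (crop factors assumed: 2.0 for micro 4/3, 1.6 for Canon APS-C):

```python
import math

def equivalence_stops(crop_factor):
    """Difference vs. FF in 'total light' / DoF terms: the equivalent
    f-number scales by the crop factor, and each stop is a factor of
    sqrt(2) in f-number."""
    return 2 * math.log2(crop_factor)

# Micro 4/3 (2.0x crop): f/2.8 frames and gathers light like f/5.6 on FF
print(equivalence_stops(2.0))            # exactly two stops, not three
print(2.8 * 2.0)                         # equivalent f-number: 5.6
print(round(equivalence_stops(1.6), 2))  # Canon APS-C: ~1.36 stops
```

The 1.6x crop result also matches the "stop and a third" figure quoted elsewhere in this thread.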

6
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 22, 2014, 09:13:26 AM »
the Zuiko is as fast as a 2.8.
You are correct, it is f2.8. And by that exact same premise, the Zuiko is 300mm long too.

What I'd like to know is, if small sensors lead to smaller, lighter systems, how come this lens designed for only a quarter of the frame size manages to weigh in at close on 50% more than the Canon 300/2.8? Even with a 2x TC and bigger body, there's a massive weight saving for the large sensor system, not to mention better handling.

7
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 22, 2014, 04:25:52 AM »
It's a stop and a third of noise performance and DoF control (assuming you can keep constant framing through either a change in focal length at the same f-stop or a change in subject distance).  You pay for that just like you do with lenses.

exactly.

It is a matter of how many photographic possibilities and capabilities, and how much image quality, you need or want and are willing and able to pay for.

Also, while FF cameras and lenses are larger, heavier and more expensive than APS-C gear, the relation is certainly not proportional to sensor size. In real life, FF with more than 200% of Canon APS-C imaging area comes with a 0% [e.g. Sony A7/R, and all tele lenses>135 mm] to max. 50% size, weight, price "penalty".
mFT and 1" sensored gear scales even less proportionately against FF in terms of capabilities, size, weight and cost. :-) 
Have you seen the size/weight of the Pentax Q?  :o

8
Reviews / Re: Camera Store Trashes New G7X
« on: October 20, 2014, 09:53:17 AM »
Wait, the RX100 does not do 85mm or 100mm, the most common portrait focal lengths.

X100 focal length actually covers both 85mm and 100mm (as 35mm "equivalent")
?? The X100 only covers the 35mm focal length equivalent.

If you're on about the RX100 range, then the mk I and mk II do go all the way up to 100mm equiv., at the expense of the wide end. However, the mk III gains the wide end and loses the long end.

9
EOS Bodies - For Video / Re: 5D Mk 3 'time remaining' display error???
« on: October 19, 2014, 12:28:11 PM »
It records with a variable bit rate (VBR). Because of that, it's impossible to accurately predict how long a given amount of storage will let you record - the bit rate depends on the complexity of the footage, which isn't known until it's been recorded.
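The best you can do is bracket the remaining time between a low and a high assumed bit rate. The figures below are illustrative, not Canon's actual VBR bounds:

```python
def remaining_seconds(free_bytes, bitrate_mbps):
    """Rough record time left at an assumed constant bit rate."""
    return free_bytes * 8 / (bitrate_mbps * 1_000_000)

free = 16 * 1000**3  # 16 GB free on the card
# With VBR the true rate floats with scene complexity, so any single
# on-screen 'time remaining' number is only an estimate:
for label, mbps in (("simple scene", 60), ("complex scene", 90)):
    mins = remaining_seconds(free, mbps) / 60
    print(f"{label} @ {mbps} Mb/s: ~{mins:.0f} min")
```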

10
EOS Bodies / Re: Patent: Canon 2mm f/1.4 Lens for Small Sensors
« on: October 18, 2014, 12:16:32 PM »
This looks like a rectilinear lens. A 2.03mm lens on a 1/3" (iPhone sensor size) 7.21x crop sensor frames like a 14.64mm lens on FF. If the projection of typical Canon fisheyes is anything to go by, ~15mm gives a 180° FoV across the diagonal. A 14.64mm distortion-free rectilinear lens would give a 112° FoV - remarkably close to the 59.18° half angle quoted in the patent.
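Those numbers drop out of the standard rectilinear field-of-view formula:

```python
import math

def rectilinear_fov_deg(focal_mm, sensor_dim_mm):
    """Angle of view of a distortion-free rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_mm)))

crop = 7.21                 # 1/3" sensor relative to full frame
equiv = 2.03 * crop         # FF-equivalent focal length: ~14.64mm
fov = rectilinear_fov_deg(equiv, 43.27)  # 43.27mm = FF diagonal

print(f"{equiv:.2f}mm equivalent")
print(f"diagonal FoV: {fov:.1f} deg (half angle {fov / 2:.1f} deg)")
```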

11
Technical Support / Re: 100-400mm will not stop down
« on: October 13, 2014, 01:08:20 AM »
Try stopping down the lens, and then pressing the DoF preview button. If the aperture blades do work, the image in the viewfinder should get darker. You can also look down the barrel of the lens and see the aperture blades while that button is pressed (again, if they're working). You can confirm how this should work by trying it with a working lens.

If they're definitely not working, it could be worth cleaning the electronic contacts on the lens; failing that, I'd suggest a return trip to a Canon service centre for that lens.

12
EOS Bodies / Re: AA Filter: Still Relevant, Marketing Ploy, or Obsolete?
« on: October 12, 2014, 06:02:10 PM »
Only if you don't understand the way aliasing is created and if you ignore the way system resolution is arrived at. Stop thinking this lens out resolves that sensor or the other way around, it just doesn't work like that.

Look at the Nyquist limit, or as we often refer to it in digital photography, Diffraction Limited Aperture (DLA), the more pixels we get the less aperture we have to show off those pixels. Already pixel density is such that f5.6 gives us the "sharpest" images, more pixels will demand ever better glass and the "sweet spot" will get lower and lower such that we will have less dof to display that resolution.

It isn't that diffraction will get worse, but at the moment we can resolve the diffraction above f8, which is the main reason the 36mp cameras don't actually return much better resolution figures than 24mp cameras, more mp will enable us to resolve the diffraction at apertures faster than f5.6. Once the diffraction limit (Nyquist limit) is hit then aliasing is no longer an issue and neither are AA filters. It will be a very long time before Nyquist limits are hit for very fast apertures.
I was just putting it in simple terms there by using the examples of sensor vs lens resolving ability. I still stand by the underlying principle: it's pointless to declare a sensor (if you ever genuinely could) to have such a high resolution as to not need an AA filter, as the resolution is so high (in those declared scenarios, which can no doubt be overcome with a different, possibly future, lens or technique etc) that the AA filter is no longer the limiting factor anyway.

Any form of blurring prior to sampling has the potential to mitigate aliasing. Diffraction, subject motion blur, camera shake, focus accuracy, AA filter, optical aberrations etc.

I did some calculations on MP and diffraction caused by aperture ratio some time ago (I think I posted it here on CR) - while we're closing in on the point of diminishing returns with normal use apertures right now (f5.6 to f8), I think it was something in the region of 300+ MP with an f1.4 lens. But realising that resolution to its full potential would require beyond-Otus levels of wide open sharpness (there's no way the current Canon 50/1.4 would do it, regardless of the identical DLA calculations), one hell of a steady tripod with mirror lockup, and probably an electronic shutter too.
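For anyone wanting to redo that estimate, here's one way to frame it. The criterion for "diffraction limited" (how many pixels the Airy disc is allowed to span) is a judgment call, and moving it between one and two pixel widths swings the answer by roughly a factor of four, which is why such figures should be taken as ballpark only:

```python
WAVELENGTH_UM = 0.55  # green light, the usual reference wavelength

def airy_diameter_um(f_number):
    """Diameter of the Airy disc: 2.44 * wavelength * f-number."""
    return 2.44 * WAVELENGTH_UM * f_number

def ff_megapixels(pixel_pitch_um):
    """Pixel count of a 36x24mm sensor at a given pixel pitch."""
    return 36.0 * 24.0 / (pixel_pitch_um / 1000) ** 2 / 1e6

# How many FF megapixels before f/1.4 is 'diffraction limited', if the
# limit is declared when the Airy disc spans k pixels:
for k in (1, 2):
    pitch = airy_diameter_um(1.4) / k
    print(f"k={k}: {pitch:.2f}um pitch -> ~{ff_megapixels(pitch):.0f} MP")
```

With k=1 the estimate lands in the low hundreds of megapixels, broadly consistent with the ~300MP figure above; with k=2 it is closer to a gigapixel.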

13
EOS Bodies / Re: AA Filter: Still Relevant, Marketing Ploy, or Obsolete?
« on: October 12, 2014, 05:27:31 PM »
If anti-aliasing is so hard to do, then how do computer games have 16x AA filters running at video game frame rates?
The computer can calculate what the original object and all its details are before any aliasing is baked into the image.

If a computer only gets hold of the image after the aliasing is baked in (such as from a camera), applying the filter after the fact is pretty hard.

If you can describe (in words) exactly what the optical filter is doing to the light, then you can create a program that will do the same in software. In general, algorithms that increase the spread of light (such as an anti-alias filter) are easy to implement and fast to compute. It is the reverse (undo the spreading of light) that is computationally difficult.

While I do not know the details of what the optical filters are actually doing, I see no reason why it should be computationally difficult to spread the light around and anti-alias a scene or portion of a scene. I would love for someone to educate me otherwise.

An AA filter creates a subtle blur before sampling takes place. It's a low-pass filter, and these are commonly used before A/D conversion in all sorts of sampling applications, not just imaging:

http://en.wikipedia.org/wiki/Low-pass_filter

Their effects cannot be fully recreated post capture. How could you filter out the aliasing above without destroying detail or colour?
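A tiny numerical example of why the filtering has to happen before sampling: once a frequency above the Nyquist limit has been captured, it is literally identical to a lower frequency, so no after-the-fact filter can separate them.

```python
import math

# A 9-cycle cosine sampled at only 10 points sits above the Nyquist
# limit (5 cycles), so it folds back and becomes indistinguishable
# from a genuine 1-cycle cosine after sampling.
n = 10
aliased = [math.cos(2 * math.pi * 9 * k / n) for k in range(n)]
genuine = [math.cos(2 * math.pi * 1 * k / n) for k in range(n)]
print(all(abs(a - g) < 1e-9 for a, g in zip(aliased, genuine)))  # True

# A low-pass (AA) filter applied *before* sampling removes the 9-cycle
# component, so there is nothing left to fold back; applied afterwards,
# it cannot tell the two signals apart.
```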

14
Business of Photography/Videography / Re: 4K, 5K, 6K and Up Video
« on: October 12, 2014, 05:10:39 PM »
I can't believe people are saying you can "barely see the difference" between 4K and 1080p. The difference is huge and immediately noticeable, even on a 1080p monitor.
:o

Try some 1080p footage taken from a 10mp 3:2 sensor (a 16:9 crop of that sensor gives 8mp) - this is effectively 4k downscaled to 1080p at capture. Ignoring compression artefacts, frame rates and subject matter (which we're not discussing here), how is that 1080p video worse in any way, shape or form on a 1080p output device than the same 8mp 16:9 frame recorded at 4k and then downsized to 1080p on the playback device?  ???
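The pixel counts are easy to check (the 3888x2592 sensor dimensions are illustrative for a ~10MP 3:2 chip):

```python
w, h = 3888, 2592      # ~10MP 3:2 sensor
crop_h = w * 9 // 16   # height of the 16:9 crop: 2187

print(w * crop_h / 1e6)           # ~8.5 MP in the 16:9 crop
print(3840 * 2160 / 1e6)          # ~8.3 MP in a UHD 4k frame
# Scaling the crop to 1920x1080 at capture averages ~4 sensor pixels
# per output pixel -- the same reduction as downscaling 4k at playback.
print(w * crop_h / (1920 * 1080))
```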

15
Business of Photography/Videography / Re: 4K, 5K, 6K and Up Video
« on: October 12, 2014, 04:57:37 PM »
These higher res displays look radically better!

Man that new Dell sounds amazing! Maybe I got the UP2414Q too soon!

I mean think about it, these displays are like getting INSTANT, FREE 8MP and 14MP 24" and 27" prints!
It looks so much better than regular HD monitors, that it is not even funny. My PA241W HD monitor looks so fuzzy now and pixellated it's got to go!

And some of the 4k video samples I've seen are pretty amazing. It's so much more like you are really there looking at something.
I fully agree - for stills, these displays with that DPI/PPI and size are getting to the point where there's no need for further improvement. It's just like a scaled up retina display. No longer are you tied to seeing pixelated images. Just everything appearing like a perfect print (if the viewing angles, colour gamut and all that are good enough).

However, I disagree about the need for 4k video (and beyond) at current frame rates. Video is usually shot with a 180° shutter - in other words, 1/50th for PAL (25fps) or 1/60th for NTSC (30fps). Each frame of 4k footage is approx 8MP. How many images of moving scenes with a 1/50th shutter speed would resolve much more than 2MP? The background is typically not moving much but is out of focus, and the foreground will have motion blur. 4k (8MP) and 8k (32/33MP) are great - if the temporal resolution is there to match the spatial resolution. NHK have been playing about with 8k at 120fps (allowing for a natural-looking 1/250th shutter), and that should be great.
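The 180° rule and the resulting smear are simple to quantify. The 2-second frame-crossing time below is just an illustrative figure for a moving subject:

```python
def shutter_180(fps):
    """Shutter speed (seconds) for a 180-degree shutter."""
    return 1 / (2 * fps)

def blur_px(frame_width_px, crossing_time_s, shutter_s):
    """Motion blur, in pixels, for a subject crossing the frame."""
    return frame_width_px / crossing_time_s * shutter_s

print(shutter_180(25))                       # 0.02 -> 1/50s for PAL
# Hypothetical subject crossing a 4k frame in 2 seconds:
print(blur_px(3840, 2.0, shutter_180(25)))   # ~38px of smear per frame
```

Tens of pixels of smear per frame is why the extra spatial resolution of 4k is hard to see on fast-moving footage without a higher frame rate to go with it.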

Most 4k footage you see in showrooms uses a very clever trick - it all has minimal movement between frames - either a time-lapse with a fixed camera position, slow motion, or footage of a waterfall or some other scene which doesn't really move. In other words, with the slow frame rate they've cheated and found a way around the whole unnatural shutter speed while maximising resolution. However, watch any real life 4k footage and it'll fall apart. I remember when 1080p was a new thing - they were advertising it on standard def TV, and the footage always looked amazing. However, everything was always in slow motion to make the SD feed appear sharper.

For stills, these displays make perfect sense right now.

I can't agree, I've been looking at lots of 4k samples and they tend to look a lot better to me. Maybe for a 100% non-stop action movie it's less of a big detail, but for other stuff and certainly for nature videos, wow.
For any footage with minimal movement between frames, 4k at 25p/30p will yield benefits over 1080p at the same frame rate. But slow-moving footage such as those samples used to sell 4k TVs is not the norm. Certainly parts of some nature documentaries could fit the bill, but not all. Almost all TV shows and films move much faster, as the creators want the content to capture your attention, not the technology.

My point is merely that for the extra spatial resolution of 4k to be noticeable over 1080p for typical TV/movie footage (not slow motion/time lapses/tripod based static samples you see in TV showrooms, or, at a pinch, slow moving nature documentaries), the temporal resolution will have to increase too.

4k and beyond is no doubt the future for video. It has big benefits for big screens. But let's not have just one aspect of resolution increased with the other left in the dust. Let's keep some balance between spatial and temporal resolution.
