December 22, 2014, 07:21:35 PM

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - rs

Pages: 1 2 [3] 4 5 ... 47
EOS Bodies - For Stills / Re: How to differentiate crop vs. FF
« on: October 22, 2014, 04:25:52 AM »
It's a stop and a third of noise performance and DoF control (assuming you keep constant framing, either by changing focal length at the same f-stop or by changing subject distance). You pay for that just like you do with lenses.


It is a matter of how much photographic possibility, capability and image quality you need or want, and are willing and able to pay for.

Also, while FF cameras and lenses are larger, heavier and more expensive than APS-C gear, the relationship is certainly not proportional to sensor size. In real life, FF - with more than 200% of the Canon APS-C imaging area - carries anywhere from a 0% (e.g. Sony A7/R, and all tele lenses >135 mm) to at most a 50% size, weight and price "penalty".
mFT and 1" sensored gear scales even less proportionately against FF in terms of capabilities, size, weight and cost. :-) 
Have you seen the size/weight of the Pentax Q?  :o

Reviews / Re: Camera Store Trashes New G7X
« on: October 20, 2014, 09:53:17 AM »
Wait, the RX100 does not do 85mm or 100mm, the most common portrait focal lengths.

X100 focal length actually covers both 85mm and 100mm (as 35mm "equivalent")
?? The X100 only covers the 35mm focal length equivalent.

If you're on about the RX100 range, then the mk I and mk II do go all the way up to 100mm equiv., at the expense of the wide end. However, the mk III gains the wide end and loses the long end.

EOS Bodies - For Video / Re: 5D Mk 3 'time remaining' display error???
« on: October 19, 2014, 12:28:11 PM »
It records with a variable bit rate (VBR). Because of that, it's impossible to predict accurately how much recording time a given amount of storage will hold until the footage has actually been recorded.
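As an illustration of why (toy numbers, nothing Canon-specific): any "time remaining" estimate has to assume a bitrate, and with VBR the bitrate depends on scene complexity that hasn't been recorded yet.

```python
# per-second bitrates of a hypothetical VBR recording, in megabits
bitrates = [40, 95, 60, 30, 88, 72]

free_megabits = 8 * 16_000           # a roughly 16 GB card
avg = sum(bitrates) / len(bitrates)

pessimistic = free_megabits / max(bitrates)   # assume worst-case bitrate
optimistic = free_megabits / min(bitrates)    # assume best-case bitrate
estimate = free_megabits / avg                # still just a guess

print(round(pessimistic), round(estimate), round(optimistic))  # three very different answers
```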

EOS Bodies / Re: Patent: Canon 2mm f/1.4 Lens for Small Sensors
« on: October 18, 2014, 12:16:32 PM »
This looks like a rectilinear lens. A 2.03mm lens on a 1/3" (iPhone sensor size) 7.21x crop sensor frames like a 14.64mm lens on FF. If the projection of typical Canon fisheyes is anything to go by, ~15mm gives a 180° FoV across the diagonal. A 14.64mm distortion-free rectilinear lens would give a 112° FoV (a 56° half angle) - remarkably close to the 59.18° half angle quoted in the patent.
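The arithmetic above can be sketched out with the post's numbers (the 43.27mm FF diagonal follows from the 36 x 24mm frame):

```python
import math

# numbers from the patent discussion: 2.03 mm lens, 7.21x crop factor
crop_factor = 7.21
focal_ff = 2.03 * crop_factor               # full-frame equivalent focal length
half_diag = math.hypot(36, 24) / 2          # half the ~43.27 mm FF diagonal, in mm

# diagonal field of view of an ideal (distortion-free) rectilinear lens
half_angle = math.degrees(math.atan(half_diag / focal_ff))
print(round(focal_ff, 2))     # ≈ 14.64 mm
print(round(2 * half_angle))  # ≈ 112 degrees
```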

Technical Support / Re: 100-400mm will not stop down
« on: October 13, 2014, 01:08:20 AM »
Try stopping down the lens, and then pressing the DoF preview button. If the aperture blades do work, the image in the viewfinder should get darker. You can also look down the barrel of the lens and see the aperture blades while that button is pressed (again, if they're working). You can confirm how this should work by trying it with a working lens.

If they're definitely not working, it could be worth cleaning the electronic contacts on the lens, but failing that, I'd suggest a return trip to a Canon service centre for that lens.

EOS Bodies / Re: AA Filter: Still Relevant, Marketing Ploy, or Obsolete?
« on: October 12, 2014, 06:02:10 PM »
Only if you don't understand the way aliasing is created, and if you ignore the way system resolution is arrived at. Stop thinking this lens out-resolves that sensor or the other way around; it just doesn't work like that.

Look at the Nyquist limit, or as we often refer to it in digital photography, Diffraction Limited Aperture (DLA), the more pixels we get the less aperture we have to show off those pixels. Already pixel density is such that f5.6 gives us the "sharpest" images, more pixels will demand ever better glass and the "sweet spot" will get lower and lower such that we will have less dof to display that resolution.

It isn't that diffraction will get worse, but at the moment we can resolve the diffraction above f8, which is the main reason the 36mp cameras don't actually return much better resolution figures than 24mp cameras, more mp will enable us to resolve the diffraction at apertures faster than f5.6. Once the diffraction limit (Nyquist limit) is hit then aliasing is no longer an issue and neither are AA filters. It will be a very long time before Nyquist limits are hit for very fast apertures.
I was just putting it in simple terms there by using the example of sensor vs. lens resolving ability. I still stand by the underlying principle: it's pointless to declare a sensor (if you ever genuinely could) to have such a high resolution as not to need an AA filter, because in those declared scenarios - which can no doubt be overcome with a different, possibly future, lens or technique - the AA filter is no longer the limiting factor anyway.

Any form of blurring prior to sampling has the potential to mitigate aliasing. Diffraction, subject motion blur, camera shake, focus accuracy, AA filter, optical aberrations etc.
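As a sketch of how those independent blur sources stack up, a common rule of thumb (an approximation, not an exact optical model) adds them in quadrature; the numbers below are made up for illustration:

```python
import math

def system_blur(*blurs_um):
    # rule of thumb: independent blur diameters add in quadrature
    return math.sqrt(sum(b * b for b in blurs_um))

# hypothetical blur diameters in microns: lens aberrations, pixel aperture, diffraction
print(round(system_blur(3.0, 4.0, 2.0), 2))  # 5.39 - worse than any single source
```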

I did some calculations on MP and diffraction caused by aperture ratio some time ago (I think I posted them here on CR). While we're closing in on the point of diminishing returns at normal-use apertures right now (f5.6 to f8), I think it was something in the region of 300+ MP with an f1.4 lens. But realising that resolution to its full potential would require beyond-Otus levels of wide-open sharpness (there's no way the current Canon 50/1.4 would do it, regardless of the identical DLA calculations), one hell of a steady tripod with mirror lockup, and probably an electronic shutter too.
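For a rough idea of where such figures come from, here is one possible criterion (pixel pitch equal to the Airy disk diameter for green light). Other thresholds give different totals, which is presumably why the figure above differs:

```python
wavelength = 550e-9                  # green light, metres
f_number = 1.4
sensor_w, sensor_h = 0.036, 0.024    # full-frame sensor, metres

# Airy disk diameter for a circular aperture: 2.44 * lambda * N
airy_diameter = 2.44 * wavelength * f_number     # ≈ 1.88 µm

# megapixel count at which pixel pitch matches the Airy disk diameter
megapixels = (sensor_w * sensor_h) / airy_diameter**2 / 1e6
print(round(megapixels))  # ≈ 245
```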

EOS Bodies / Re: AA Filter: Still Relevant, Marketing Ploy, or Obsolete?
« on: October 12, 2014, 05:27:31 PM »
If anti-aliasing is so hard to do, then how do computer games have 16x AA filters running at video game frame rates?
The computer can calculate what the original object and all its details are before any aliasing is baked into the image.

If a computer gets hold of an image with aliasing already baked in (such as one from a camera), applying the filter afterwards is pretty hard:

If you can describe (in words) exactly what the optical filter is doing to the light, then you can create a program that will do the same in software. In general, algorithms that increase the spread of light (such as an anti-alias filter) are easy to implement and fast to compute. It is the reverse (undo the spreading of light) that is computationally difficult.

While I do not know the details of what the optical filters are actually doing, I see no reason why it should be computationally difficult to spread the light around and anti-alias a scene or portion of a scene. I would love for someone to educate me otherwise.

An AA filter creates a subtle blur before sampling takes place. It's a low-pass filter, and these are commonly used before A/D conversion for all sorts of sampling, not just imaging:

Their effects cannot fully be recreated post-capture. How could you filter out aliasing like that after the fact without destroying detail or colour?
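The same idea is easy to demonstrate in one dimension. The sketch below (made-up sampling rates, nothing camera-specific) samples a 130 Hz tone at 100 Hz: without pre-filtering it shows up as a spurious 30 Hz tone, while a crude box-blur "AA filter" applied before sampling largely suppresses it:

```python
import math

def sample(signal, fs, n):
    return [signal(k / fs) for k in range(n)]

def amplitude_at(samples, freq, fs):
    # amplitude of one frequency component via correlation (a single DFT bin)
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * k / fs) for k, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * k / fs) for k, s in enumerate(samples))
    return 2 * math.hypot(re, im) / n

fs = 100.0                                          # sampling rate, Hz (Nyquist = 50 Hz)
sig = lambda t: math.sin(2 * math.pi * 130.0 * t)   # tone well above Nyquist

raw = sample(sig, fs, 1000)
a_raw = amplitude_at(raw, 30.0, fs)                 # 130 Hz aliases to |130 - 100| = 30 Hz

def blurred(t, taps=16):
    # box average over one sample period: a crude low-pass "AA filter"
    dt = 1.0 / fs
    return sum(sig(t + (i + 0.5) * dt / taps - dt / 2) for i in range(taps)) / taps

pre = sample(blurred, fs, 1000)
a_pre = amplitude_at(pre, 30.0, fs)

print(round(a_raw, 2), round(a_pre, 2))             # ≈ 1.0 vs ≈ 0.2
```

Once the aliased 30 Hz tone is in the samples, it is indistinguishable from a real 30 Hz tone, which is why it cannot be filtered out afterwards.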

Business of Photography/Videography / Re: 4K, 5K, 6K and Up Video
« on: October 12, 2014, 05:10:39 PM »
I can't believe people are saying you can "barely see the difference" between 4K and 1080p. The difference is huge and immediately noticeable, even on a 1080p monitor.

Try some 1080p footage taken from a 10mp 3:2 sensor (a 16:9 crop of that sensor gives roughly 8mp) - this is effectively 4k downscaled to 1080p at capture. Ignoring compression artefacts, frame rates and subject (which we're not discussing here), how is that 1080p video worse in any way, shape or form on a 1080p output device than the same 8mp 16:9 frame recorded at 4k and then downsized to 1080p by the playback device?  ???
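The pixel counts check out; a quick sketch with hypothetical sensor dimensions:

```python
w, h = 3872, 2582            # hypothetical ~10mp 3:2 sensor
crop_h = w * 9 // 16         # height of a 16:9 crop at the same width
print(round(w * crop_h / 1e6, 1))   # 8.4 (MP) - essentially a 4k UHD frame (3840x2160, ~8.3MP)
```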

Business of Photography/Videography / Re: 4K, 5K, 6K and Up Video
« on: October 12, 2014, 04:57:37 PM »
These higher res displays look radically better!

Man that new Dell sounds amazing! Maybe I got the UP2414Q too soon!

I mean think about it, these displays are like getting INSTANT, FREE 8MP and 14MP 24" and 27" prints!
It looks so much better than regular HD monitors, that it is not even funny. My PA241W HD monitor looks so fuzzy now and pixellated it's got to go!

And some of the 4k video samples I've seen are pretty amazing. It's so much more like you are really there looking at something.
I fully agree - for stills, displays with that DPI/PPI and size are getting to the point where there's no need for further improvement. It's just like a scaled-up retina display. No longer are you tied to seeing pixelated images - everything appears like a perfect print (provided the viewing angles, colour gamut and all that are good enough).

However, I disagree about the need for 4k video (and beyond) at current frame rates. Video is usually shot with a 180° shutter - in other words, 1/50th for PAL (25fps) or 1/60th for NTSC (30fps). Each frame of 4k footage is approx 8MP. How many images of moving scenes with a 1/50th shutter speed would resolve much more than 2MP? The background is typically not moving much but is out of focus, and the foreground will have motion blur. 4k (8MP) and 8k (32/33MP) are great - if the temporal resolution is there to match the spatial resolution. NHK have been playing about with 8k at 120fps (allowing for a natural-looking 1/240th shutter), and that should be great.
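The shutter-speed figures follow directly from the shutter-angle rule; a quick sketch:

```python
def exposure_time(fps, angle_deg=180.0):
    # exposure per frame = (shutter angle / 360) * frame period
    return (angle_deg / 360.0) / fps

for fps in (25, 30, 120):
    print(fps, "fps ->", round(1 / exposure_time(fps)))   # 50, 60, 240 (i.e. 1/50, 1/60, 1/240)
```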

Most 4k footage you see in showrooms uses a very clever trick - it all has minimal movement between frames: either a time-lapse from a fixed camera position, slow motion, or footage of a waterfall or some other scene that doesn't really move. In other words, at the slow frame rate they've cheated and found a way around the whole unnatural shutter speed issue while maximising resolution. Watch any real-life 4k footage, though, and it'll fall apart. I remember when 1080p was a new thing - they were advertising it on standard-def TV, and the footage always looked amazing. However, everything was always in slow motion to make the SD feed appear sharper.

For stills, these displays make perfect sense right now.

I can't agree, I've been looking at lots of 4k samples and they tend to look a lot better to me. Maybe for a 100% non-stop action movie it's less of a big detail, but for other stuff and certainly for nature videos, wow.
For any footage with minimal movement between frames, 4k at 25p/30p will yield benefits over 1080p at the same frame rate. But slow-moving footage like the samples used to sell 4k TVs is not the norm. Certainly parts of some nature documentaries could fit the bill, but not all. Almost all TV shows and films move much faster, because the creators want the content to capture your attention, not the technology.

My point is merely that for the extra spatial resolution of 4k to be noticeable over 1080p for typical TV/movie footage (not slow motion/time lapses/tripod based static samples you see in TV showrooms, or, at a pinch, slow moving nature documentaries), the temporal resolution will have to increase too.

4k and beyond is no doubt the future for video. It has big benefits on big screens. But let's not have just one aspect of resolution increased while the other is left in the dust - let's keep some balance between spatial and temporal resolution.

EOS Bodies / Re: AA Filter: Still Relevant, Marketing Ploy, or Obsolete?
« on: October 12, 2014, 03:36:28 PM »
More MP negating the need for an AA filter is a paradox.

If you have a low MP sensor and a sharp lens, most users should be able to take a picture which shows pixelation. Any patterns in the image stand a chance of exciting moire/aliasing, so an optical low pass filter is very useful to avoid that - at the slight expense of ultimate resolution.

If you have a high MP sensor and a lens not capable of resolving as finely as the sensor, then under no circumstances will pixelation, moire or aliasing show up. So why not dispense with the AA filter?

However, there are some facts which have been overlooked in that second argument:

If it's impossible for the system to create that fine per pixel detail, why get rid of the AA filter? What possible resolution advantage can you gain by dispensing with it when there's a surplus of pixels anyway?

And what happens if that high-MP body is paired with a sharper lens which can out-resolve the sensor? Then we're back to square one: pixelation can occur in certain circumstances, and some users will find moire appearing in their pictures - all because their lens/technique is able to make use of the extra resolution gifted to them by the camera manufacturer. Great, eh?

I doubt anyone owns one, as they're only available for pre-order at the moment.

There are some sample images out there:

Canon General / Re:
« on: October 07, 2014, 01:17:32 AM »
Well the timer ran out... Nothing there... Whats with Canon guys ?
Just read like four posts up bud. Read before posting
For everyone who says the countdown timer is showing the wrong time: is the time zone on your device set incorrectly? This page uses a client-side script based on UTC - if your clock reads correctly but the time zone is wrong, the UTC your device calculates will be wrong too.

Canon General / Re:
« on: October 05, 2014, 01:12:32 PM »
You can read anything you like into this. I'm hoping this means they've got new sensor fabrication, resulting in a new lineup of sensors which make Exmors seem old hat. But it's probably promotional material for a PowerShot N2...

Does someone want to mention to the OP that switching from Canon, with its free DPP software (which doesn't work on his computer, but appears to work fine for everyone else), to Nikon will mean paying for their DPP equivalent? Last time I looked, Capture NX 2 was $180, which is $30 more than Lightroom!

It gets the site indexed in Google for those keywords. It'll drive traffic to the site, which usually means money for the site owner.
