« on: October 19, 2014, 12:28:11 PM »
It records using a variable bitrate (VBR). Because of that, it's impossible to accurately predict how long any given amount of storage will let you record until the footage has actually been recorded.
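The best you can do with a VBR stream is bound the recording time between the codec's minimum and maximum bitrates. A minimal sketch (the bitrate figures below are hypothetical, not any specific camera's spec):

```python
# With a variable bitrate (VBR) you can only bound the recording time,
# not predict it exactly: the actual bitrate depends on scene complexity.

def recording_time_bounds(storage_gb, min_mbps, max_mbps):
    """Return (worst_case, best_case) recording time in minutes."""
    storage_mbit = storage_gb * 8 * 1000   # GB -> megabits (decimal units)
    worst = storage_mbit / max_mbps / 60   # stream pinned at peak bitrate
    best = storage_mbit / min_mbps / 60    # stream pinned at minimum bitrate
    return worst, best

# Hypothetical codec bounds: 24-90 Mbit/s on a 32 GB card
worst, best = recording_time_bounds(32, min_mbps=24, max_mbps=90)
print(f"32 GB holds between {worst:.0f} and {best:.0f} minutes")
```

The spread between the two bounds is why cameras can only show a rough remaining-time estimate that jumps around while recording.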
Only if you don't understand how aliasing is created, and if you ignore how system resolution is arrived at. Stop thinking this lens out-resolves that sensor, or the other way around; it just doesn't work like that.

I was just putting it in simple terms there by using the example of sensor vs. lens resolving ability. I still stand by the underlying principle: it's pointless to declare a sensor (if you ever genuinely could) to have such a high resolution that it no longer needs an AA filter, because in those declared scenarios (which can no doubt be overcome with a different, possibly future, lens or technique) the resolution is so high that the AA filter is no longer the limiting factor anyway.
Look at the Nyquist limit, or as we often refer to it in digital photography, the Diffraction Limited Aperture (DLA): the more pixels we get, the less aperture we have to show off those pixels. Pixel density is already such that f/5.6 gives us the "sharpest" images; more pixels will demand ever better glass, and the "sweet spot" will get lower and lower, leaving us less depth of field in which to display that resolution.
It isn't that diffraction will get worse, but at the moment we can resolve the diffraction above f/8, which is the main reason the 36 MP cameras don't actually return much better resolution figures than 24 MP cameras; more megapixels will enable us to resolve the diffraction at apertures faster than f/5.6. Once the diffraction limit (Nyquist limit) is hit, aliasing is no longer an issue and neither are AA filters. It will be a very long time before Nyquist limits are hit at very fast apertures.
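The relationship between pixel pitch and the DLA can be sketched numerically. One common rule of thumb equates the Airy disk diameter (roughly 2.44·λ·N for f-number N) with about two pixel widths; conventions vary, so treat the output as an order-of-magnitude estimate, and note the pixel pitches below are approximate, not manufacturer specs:

```python
# Rough DLA estimate: the aperture at which the Airy diffraction disk
# (first-minimum diameter ~ 2.44 * wavelength * f-number) spans about
# two pixels. Different sources use different thresholds.

AIRY_FACTOR = 2.44     # first-minimum diameter factor of the Airy pattern
WAVELENGTH_UM = 0.55   # green light, in micrometres

def dla(pixel_pitch_um):
    """f-number where the diffraction blur spans ~2 pixel widths."""
    return 2 * pixel_pitch_um / (AIRY_FACTOR * WAVELENGTH_UM)

# Approximate pixel pitches: ~6.0 um (24 MP full frame) vs ~4.9 um (36 MP)
for pitch in (6.0, 4.9):
    print(f"{pitch} um pixels -> DLA ~ f/{dla(pitch):.1f}")
```

As the post argues, the denser 36 MP sensor hits its diffraction limit at a wider aperture than the 24 MP one, so stopping down costs it resolution sooner.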
If anti-aliasing is so hard to do, then how do computer games run 16x AA at video-game frame rates?

The computer can calculate what the original object and all its details are before any aliasing is baked into the image.
If you can describe (in words) exactly what the optical filter is doing to the light, then you can create a program that will do the same in software. In general, algorithms that increase the spread of light (such as an anti-alias filter) are easy to implement and fast to compute. It is the reverse (undo the spreading of light) that is computationally difficult.
While I do not know the details of what the optical filters are actually doing, I see no reason why it should be computationally difficult to spread the light around and anti-alias a scene or portion of a scene. I would love for someone to educate me otherwise.
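The "spreading is easy, un-spreading is hard" point can be illustrated directly. A minimal sketch, assuming the optical low-pass filter can be modelled as a tiny blur kernel (the 2x2 weights below are a hypothetical stand-in for a real four-way beam-splitter filter):

```python
# Spreading light (convolution) is a cheap, local operation: each output
# pixel is just a weighted sum of its neighbours. Undoing it
# (deconvolution) is an ill-posed inverse problem, which is why the
# reverse direction is computationally hard.

import numpy as np

def spread_light(image, kernel):
    """Naive 2D convolution with edge padding."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

# Hypothetical "beam splitter" kernel: each point of light is split
# evenly across a 2x2 neighbourhood, as a four-dot AA filter does.
aa_kernel = np.array([[0.25, 0.25],
                      [0.25, 0.25]])
```

Running this on an image with a single bright pixel spreads that pixel's energy across four neighbours, which is exactly the kind of controlled blur that suppresses aliasing before sampling.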
I can't believe people are saying you can "barely see the difference" between 4K and 1080p. The difference is huge and immediately noticeable, even on a 1080p monitor.
For any footage with minimal movement between frames, 4K at 25p/30p will yield benefits over 1080p at the same frame rate. But slow-moving footage such as the samples used to sell 4K TVs is not the norm. Certainly parts of some nature documentaries could fit the bill, but not all. Almost all TV shows and films move much faster, as the creators want the content to capture your attention, not the technology.

These higher-res displays look radically better!

I fully agree - for stills, displays with that DPI/PPI and size are getting to the point where there's no need for further improvement. It's just like a scaled-up Retina display. No longer are you tied to seeing pixelated images; everything appears like a perfect print (if the viewing angles, colour gamut and all that are good enough).
Man that new Dell sounds amazing! Maybe I got the UP2414Q too soon!
I mean think about it, these displays are like getting INSTANT, FREE 8MP and 14MP 24" and 27" prints!
It looks so much better than regular HD monitors that it's not even funny. My PA241W HD monitor looks so fuzzy and pixelated now, it's got to go!
And some of the 4k video samples I've seen are pretty amazing. It's so much more like you are really there looking at something.
However, I disagree about the need for 4K video (and beyond) at current frame rates. Video is usually shot with a 180° shutter - in other words, 1/50th for PAL (25 fps) or 1/60th for NTSC (30 fps). Each frame of 4K footage is approximately 8 MP. How many images of moving scenes shot at a 1/50th shutter speed would resolve much more than 2 MP? The background is typically not moving much but is out of focus, and the foreground will have motion blur. 4K (8 MP) and 8K (32/33 MP) are great - if the temporal resolution is there to match the spatial resolution. NHK have been playing about with 8K at 120 fps (allowing for a natural-looking 1/250th shutter), and that should be great.
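The motion-blur argument can be put in rough numbers: at a 180° shutter, the blur streak from a moving subject easily spans many pixels, swamping 4K's finer pixel pitch. A back-of-envelope sketch (the panning speed below is a made-up example scene):

```python
# How long is the motion-blur streak, in pixels, for an object moving
# across the frame during the shutter's open time? Any detail finer
# than the streak length is smeared away regardless of sensor resolution.

def blur_px(frame_width_px, seconds_to_cross_frame, shutter_s):
    """Blur streak length in pixels for an object panning across frame."""
    px_per_second = frame_width_px / seconds_to_cross_frame
    return px_per_second * shutter_s

# Hypothetical scene: subject crosses the frame in 5 s, shot at 25p
# with a 180-degree shutter (1/50 s exposure per frame).
for width, label in ((1920, "1080p"), (3840, "4K")):
    print(f"{label}: blur streak ~ {blur_px(width, 5, 1/50):.0f} px")
```

Even this gentle pan smears moving detail across roughly 15 pixels in a 4K frame, which is the post's point: without higher frame rates (shorter 180° exposures), the extra spatial resolution mostly resolves blur.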
Most 4K footage you see in showrooms uses a very clever trick: it all has minimal movement between frames - either a time-lapse from a fixed camera position, slow motion, or footage of a waterfall or some other scene which doesn't really move. In other words, with the slow frame rate they've cheated and found a way around the unnatural shutter speed while maximising resolution. Watch any real-life 4K footage, however, and it falls apart. I remember when 1080p was new - they advertised it on standard-def TV, and the footage always looked amazing. However, everything was always in slow motion to make the SD feed appear sharper.
For stills, these displays make perfect sense right now.
I can't agree; I've been looking at lots of 4K samples and they tend to look a lot better to me. Maybe for a 100% non-stop action movie it's less of a big deal, but for other stuff, and certainly for nature videos, wow.
For everyone who says the countdown timer is showing up wrong: is the time zone on your device set wrong? This page uses a client-side script based on UTC - if your clock is correct but the time zone is wrong, your UTC time will be off.

Well, the timer ran out... nothing there... What's with Canon, guys?

Just read like four posts up, bud. Read before posting.
As the DSLR market falters, what is your strategy for mirrorless?

m43 is bigger than CX
The mirrorless market is growing rapidly, thanks to products from manufacturers like Sony. We know that there is a certain body of demand for larger sensors in mirrorless products. Although we already have the 1 inch sensor in the 1 System, we don't want to deny the possibility of future large-sensor mirrorless cameras. So maybe if there is enough demand we may be able to provide another type of mirrorless camera with larger sensors. This is one of the solutions.
I've not found any info about anyone seeing/reporting on this lens from Photokina. Was it not actually there, or is there somehow just so little interest in it?
It's already been announced; see below.