August 22, 2014, 03:51:15 AM

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Aglet

Pages: [1] 2 3 ... 62
1
Third Party Manufacturers / Re: D810 users are seeing spots
« on: August 21, 2014, 09:50:48 PM »
I also got my 5DII when it was first released and have never had an issue. Neither have I ever updated the firmware - 'cos I never do  ;)

The 5DII must be remembered as one of the best-sorted cameras right from its inception; not a good example to use!

HEHE.. that makes me chuckle. :P

Whilst it may not have impacted you, I know people for whom it did delay their purchasing decision until after it was fixed/resolved.

Yup, I remember seeing those xmas-light black-dot examples, and I waited until after a FW update before getting one. But no matter what firmware version I ran on mine, it was a bandy bass-terd (read: banding-prone) of a camera.

2
Photography Technique / Re: APOLLO missions - image inconsistencies
« on: August 19, 2014, 03:46:02 PM »
...
And the incredible Eidophor Projectors they used in mission control!!
http://en.wikipedia.org/wiki/Eidophor

Eidophors used an optical system somewhat similar to a conventional movie projector but substituted a slowly-rotating mirrored disk or dish for the film. The disk was covered with a thin film of high-viscosity transparent oil and through the use of a scanned electron beam, electrostatic charges could be deposited onto the oil, causing the surface of the oil to deform. Light was shone on the disc via a striped mirror consisting of strips of reflective material alternated with transparent non-reflective areas. Areas of the oil unaffected by the electron beam would allow the light to be reflected directly back to the mirror and towards the light source, whereas light passing through deformed areas would be displaced and would pass through the adjacent transparent areas and onwards through the projection system. As the disk rotated, a doctor blade discharged and smoothed the ripples in the oil, readying it for re-use on another television frame.


That is really cool!  I'd never heard of this thing before.

3
The idea of being able to do this is absolutely drool-worthy.
Will be interesting to see how well it performs and how much movement range it will have on a FF.

www.imaging-resource.com/news/2014/08/14/hartblei-introduces-hcam-master-ts-14-24mm-tilt-shift-optic-for-sony-e-moun

Dang!  Sony's the only system I don't carry... yet.

4
EOS Bodies - For Stills / Re: Canon mirrorless: Status?
« on: August 18, 2014, 05:27:27 PM »
I'm intrigued that cameras with a fixed-mount, simple, high-quality zoom lens aren't offered more often.  Right now, the best bets for a fixed-mount lens with a small, high-quality zoom are some "cheaper" APS-C Leicas or the high-end point-and-shoots like the Sony RX100 series or the G1X II.  Fuji's X10 and X20 bodies do this as well, I think...

I think the reason is that manufacturers want lens pull-through dollars, so the added cost/hassle of making things modular across as many body designs as possible is more profitable in the longer term.  Just guessing, though.

- A

...and there's that great new Panasonic too.
But I think you've nailed it: it's more about profit than about making the best possible all-in-one that would meet the needs of 95% of people, 95% of the time.
Build such an ideal camera and you'll sell a lot of them only until you saturate the market; then you'll be out of customers and unable to sustain a robust operation.

5
Photography Technique / Re: APOLLO missions - image inconsistencies
« on: August 18, 2014, 05:21:15 PM »
I'll answer just this one.  The lander is up on legs and next to a slight hill on camera right.  This makes its shadow appear to go in a different direction when it doesn't.  Look carefully at the lander and its legs and you'll see that its shadows are consistent with those in the rest of the image.

http://www.lpi.usra.edu/resources/apollo/images/print/AS14/68/9487.jpg

By the way, this one and many more were debunked by the Mythbusters.  Go watch the episode.


Thanks for the link; it certainly changes the appearance and perspective of the image when the right side of it isn't cropped away!  Lots of other great images there to explore too.

This is a gag post, right?  I went to CR and the Onion site breaks out.


HAHA!  Not completely gag. :)
Just that poor-quality, compromised images, and the explanations offered for them, leave too much wiggle room.  Combine that with the pressures such a mission would have faced, from so many angles, and it's conceivable that a lot of effort also went into generating some contingency images....  Then again, this was the pre-Nixon era... but not by much. ;)

Thinking about it now, that would have been an exceptional feat to accomplish with the technology of the time!  So there's room for doubt.. and deception.


Bull. The 747 was designed from scratch and brought into production in 28 months during that period, without computers. Remember the Concorde?

It was an exceptional feat. That's the kind of thing you get when thousands of dedicated individuals, with extraordinary resources, work for a long time on a common goal.


I agree.  Feats like this are possible when great leaders inspire great minds and many talented, smart individuals to work together toward a common goal.  Those 1960s folks did all that with slide rules and an enormous amount of guts, sweat and sacrifice.  (Not to mention copious amounts of coffee and cigarettes!)

Unfortunately, I'm afraid we have mostly forgotten that spirit of self sacrifice and achievement in this country.  Our leaders are largely selfish, corrupt or just too weak to make a difference.   The majority of our youth are now poorly educated, apathetic and self centered.  (And those are the ones that aren't on public assistance programs.)  Our smart and motivated youth are in short supply these days.  It's a terrible shame.  Now more than half of our populace fears the future is less hopeful than the past.   :'(


That's another good point.
The Cold War era saw tremendous innovation and expansion of technical capability, really pushing the limits of the technology of the day.
For all the technological knowledge and small-scale capability we have now, I don't think we've seen any big-picture technology push comparable to what was done back then.  Lots of great unmanned-probe work has been accomplished, but manned exploration, unfortunately, seems to have faltered due to budget cuts and possibly also a different mindset in general.
The muddling about with the space station and shuttles was a good example of how that became inefficient.

If the moon landing program had not been done back then, and were, for whatever reason, a project of this time, I wonder how smoothly it would progress, how much it would cost, and how long it would take.
Perhaps I need to look at some news from China and India to get an idea.

7
Photography Technique / Re: APOLLO missions - image inconsistencies
« on: August 18, 2014, 01:56:49 AM »

OK, some of you are jumping to the extreme end of a different argument about whether or not the moon landings happened at all.  My post is not about that.

I'll reiterate my point, quoted below:

They present a variety of interesting discontinuities and other inconsistencies which could lend some credence to some of these images being produced in ways that are not congruent with the official story.
Whether differences in lighting or physical geometry, some things just don't look right.

Now I'll expand that a little by saying it certainly does appear that some of the images were made in a staged production environment, or in near-earth orbit, or composited in such a way as to try to make them appear genuine when they may not be.  There are many reasons why this may have been done; frankly, many of them are what I would also think of as contingency items if I were part of that team at the time.  Still others may have been manipulated for purely artistic and marketing reasons and not disclosed as such.

If you don't watch the whole video, try watching these short segments.
Otherwise it's a lot of time to spend when you could be reading CR instead. ;)

E.g:

0:20:50 thru 0:21:40  -  scene items appear in FRONT of the camera reticle marks; these do not necessarily look like the "bleed outs" described in the debunk, but if not, it would be sloppy compositing work.

0:22:40 thru 0:26:40  -  shadows that don't line up for a light source effectively at infinity (the sun).  I'd like to hear a good explanation of how photos like this were taken on the moon and not under artificial light.  I've never seen a lens create this kind of anamorphic distortion, and the surface does not appear to have any topographical features that would create an apparent shift in shadow directions.  See the screen shot below.

0:31:45 thru 0:33:14  -  sure looks like a fill-light hot spot (the artificial-lighting argument continues a few more minutes) and does not match the wiki debunking image I looked at.

0:38:30 thru 0:43:15  -  seriously uneven lighting causing falloff, plus some front fill light, and the horizon is possibly too high relative to where the camera should be.  Another few minutes on, the central reticle is not in the center of the shot, or even close; it looks reframed.
All these items are plausibly debunkable, but what was the source of the image being analyzed?  Was it purported to be an unadulterated copy, or was it a concocted one?  Likely the latter.

As for the Aussie woman, she's not alone in seeing the Coke bottle, and the question is the live broadcast vs. a later rebroadcast:
0:49:20 - 0:53:00


There's more, but the premise I would not quickly discount is that, although the Apollo program accomplished what was intended, it sure looks like some images and video were also produced under a secret contingency plan and used in support of the main program.

It sure would be nice to see high-res scans of the disputed original images with their full DR and detail available.


8
Canon General / Re: Canon lens comparible to a 150-500 or 150-600
« on: August 18, 2014, 12:00:57 AM »
I just got a tiny little Olympus 75-300mm for MFT; it only weighs about a pound.  Yes, it's very slow at f/5.6-6.7, but it was only $450 new and, attached to a new E-M10, it performs fairly well.  My intent was to use it as a smaller, lighter (and way cheaper) version of the 100-400 L + 60D when I need to travel with less.
So far, it's looking like a strong performer from 75-200mm (~150-400mm equivalent), even wide open.  But I think my old L glass is pulling ahead at the long end AND, at least so far, I can get better, more consistent results from the Canon kit as far as sharpness goes.  I still need to learn how to optimize my use of the MFT system to squeeze the most performance from it, as the IBIS may be less effective at that focal length.
As a mirrorless bonus, though, I can AF and then MF while the camera EVF instantly zooms in on my AF spot (to a level I define), so I can focus on a bird partly hidden by a branch....  Can't do that too readily with an SLR.

OTOH, a good friend is very satisfied with the performance of his 150-600mm Tamron on his D7100; he finds it quite sharp even beyond 400mm, though it does get noticeably softer between 500 and 600mm.

So, even if CaNikon don't make such a lens at that price point, there are viable options.

9
Photography Technique / APOLLO missions - image inconsistencies
« on: August 17, 2014, 07:36:51 PM »
This seems about the most appropriate forum heading to post this so here goes.

I've often heard of controversy around various aspects of images from Apollo missions.  I'd never taken it too seriously but an interesting old video on youtube ..

www.youtube.com/watch?v=W79mIGx9Ib4


.. caught my attention the other night and I started watching it.
I've only gone thru the first 1.5 hours, it's ~3:40 total.

They present a variety of interesting discontinuities and other inconsistencies which could lend some credence to some of these images being produced in ways that are not congruent with the official story.
Whether differences in lighting or physical geometry, some things just don't look right.

None of this was obvious, when I was a wide-eyed kid watching these events unfold, back in the day on a small B&W TV.  Thinking about it now, that would have been an exceptional feat to accomplish with the technology of the time!  So there's room for doubt.. and deception.

Has there been a good discussion on this topic on this site before?

If not, with all the expertise available here, it should be possible to have a very interesting one.

For convenience's sake, if you comment on this, or on any other video you reference on the subject, please include the video time relevant to your reply.

10
Reviews / Re: DxO reviews Sony A7s: king of low light photography?
« on: August 06, 2014, 01:51:49 PM »
are you saying you disagree, that you can't squeeze 13 effective bits from properly processing a 12 bit data set?

Are you saying that information not captured initially (lost due to clipping at the bottom or top of the range) can be recovered by 'proper processing'?

If so, you may want to have a conversation with those fish you mentioned, which is about as logical as your suggestion.

so you now want to alter the parameters of the argument to include clipped data so that you're not wrong?

doesn't negate the fact that you can obtain somewhat more resolution from an ADC than the number of useful bits of that device.  That's just stat-math.
Sure, you can't recover data clipped at the high end, but it is possible to obtain more data at the low end.
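
Since this keeps coming up, here's the stat-math in a quick numpy sketch of my own (a toy ideal quantizer, nobody's real camera pipeline): a signal sitting between two ADC codes is unrecoverable below 1 LSB without dither, but with some random noise and plenty of averaging the sub-LSB value comes right back.

import numpy as np

rng = np.random.default_rng(0)

LSB = 1.0                # quantization step of our toy ADC
true_value = 37.3 * LSB  # input sits between codes 37 and 38
N = 4096                 # number of oversampled readings

def adc(x):
    """Ideal quantizer: round to the nearest LSB."""
    return np.round(x / LSB) * LSB

# Without dither, every conversion returns the same code, so averaging gains nothing.
no_dither = adc(np.full(N, true_value))

# With ~1 LSB rms of noise, the codes toggle between neighbours in proportion to
# where the true value sits, so the mean recovers detail below one LSB.
dithered = adc(true_value + rng.normal(0.0, LSB, N))

print("true value       :", true_value)
print("mean, no dither  :", no_dither.mean())          # 37.0, stuck on one code
print("mean, with dither:", round(dithered.mean(), 2)) # ~37.3, sub-LSB resolution

Same reason stacking or downsampling noisy camera files pulls extra usable shadow data out of the low end: the read noise is the dither, and averaging does the rest.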

11
Reviews / Re: DxO reviews Sony A7s: king of low light photography?
« on: August 06, 2014, 02:44:44 AM »
So, yes, it's possible to get more bits of data than you have available bits of an ADC.

Sure.  It's possible to upsample an image and get a print that looks good, too.  Doesn't mean the data are real. 

are you saying you disagree, that you can't squeeze 13 effective bits from properly processing a 12 bit data set?

if so, you might want to have a conversation with the engineers at NASA, Texas Instruments, HP/Agilent/name du-jour, Tektronix, National Semiconductor, etc.



Quote
Please go back to shooting with the lens cap on and pushing 4+ stops in post.  That seems to be something you enjoy...

you apparently still like trolling with the same old hook. ;)
your fishin' licence has expired

12
Third Party Manufacturers / Re: DXO uh-oh?
« on: August 06, 2014, 02:29:53 AM »
Quote
Why? Because when you integrate multiple frames, even if you don't do any kind of dark or bias frame subtraction to remove read noise, you're averaging those frames together. Averaging reduces noise. So, let's say you have the option of shooting one ISO 100 shot at 1/10th of a second, or four ISO 400 shots at 1/40th of a second. Integrate the ISO 400 shots, and you reduce noise by averaging. You reduce ALL noise, including deeper shadow read noise. In a Canon camera, ISO 400 has as much DR as ISO 100, so you did not lose anything by doing that, but because you could use a higher shutter speed, in the end, after integration, you gain something.
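
The basic averaging benefit is easy to demonstrate; here's a quick numpy toy of my own (made-up noise numbers, not real camera data) that averages four noisier frames of the same flat scene and compares them to one cleaner frame:

import numpy as np

rng = np.random.default_rng(1)

signal = 100.0     # flat test "scene" level (arbitrary units)
read_100 = 3.0     # pretend read-noise sigma for the single ISO 100 frame
read_400 = 3.5     # pretend per-frame read-noise sigma at ISO 400 (a bit worse)
n_pix = 1_000_000  # pixels to simulate

# One ISO 100 frame.
single = signal + rng.normal(0, read_100, n_pix)

# Four ISO 400 frames, averaged.  Averaging n frames divides random noise by sqrt(n).
frames = signal + rng.normal(0, read_400, (4, n_pix))
stacked = frames.mean(axis=0)

print("noise, single ISO 100 frame:", round(single.std(), 2))   # ~3.0
print("noise, 4x ISO 400 average  :", round(stacked.std(), 2))  # ~3.5 / 2 = ~1.75

With those made-up numbers the stack wins easily; whether it wins with real DxO-style midtone numbers is the small hole I'm about to poke.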

I'm gonna poke a small hole in this argument, even if i've made some math errors.
Otherwise, I know the basic premise is correct and I agree with you.

Averaging 4 shots at ISO 400 vs. 1 at ISO 100 will not net as much of a return, because the SNR of any sensor also gets worse at higher ISO by a ratio that's mathematically pretty close to the benefit of the stacking, when the ISO multiple matches the number of images stacked.
According to DxOMark, that's 5.1 dB worth (39.7 - 34.6) on the 5D3, for instance, at 18% gray (screen measurement), and similar for the D800 as well.
Stacking 4 gives about 6 dB of benefit, so the net is only a 0.9 dB improvement, not likely noticeable.
You'd need to stack more images at the same iso to get better results.
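
For anyone who wants to check my bookkeeping, here's the arithmetic as a tiny script (the SNR figures are the approximate DxOMark screen values quoted above, from memory):

import math

# Approximate DxOMark screen SNR at 18% gray, in dB (5D3, quoted from memory).
snr_iso100_dB = 39.7
snr_iso400_dB = 34.6

iso_penalty_dB = snr_iso100_dB - snr_iso400_dB   # ~5.1 dB worse at ISO 400

# Averaging n frames improves SNR on random noise by sqrt(n), i.e. 20*log10(sqrt(n)) dB.
n_frames = 4
stack_gain_dB = 20 * math.log10(math.sqrt(n_frames))   # ~6.0 dB

net_dB = stack_gain_dB - iso_penalty_dB
print(f"ISO penalty : {iso_penalty_dB:.1f} dB")
print(f"stack gain  : {stack_gain_dB:.1f} dB")
print(f"net benefit : {net_dB:.1f} dB")   # ~0.9 dB at the midtones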

However, where it's needed most, in the deeper shadow areas, the SNR difference between ISO 100 and ISO 400 is much smaller, so averaging 4 ISO 400 shots will certainly make an improvement over 1 at ISO 100.  This is a characteristic of Canon's sensors in particular that can be exploited effectively with this method.

Also, this is only applicable to random noise.
If there's FPN (fixed-pattern noise), then averaging can make it worse by reinforcing it, unless you also apply a spatial shift to the images you're stacking and then realign them later.
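
A quick toy illustration of that caveat (my own made-up numbers, and a crude stand-in for real FPN): random noise averages down with more frames, but a pattern that repeats identically in every frame does not.

import numpy as np

rng = np.random.default_rng(2)

n_frames, n_pix = 16, 200_000
fpn = rng.normal(0, 2.0, n_pix)                       # fixed pattern, identical in every frame
random_noise = rng.normal(0, 2.0, (n_frames, n_pix))  # fresh random noise in each frame

stack_random_only = random_noise.mean(axis=0)
stack_with_fpn = (fpn + random_noise).mean(axis=0)

print("stacked, random noise only:", round(stack_random_only.std(), 2))  # ~2.0/sqrt(16) = 0.5
print("stacked, random + FPN     :", round(stack_with_fpn.std(), 2))     # ~2.06: the pattern survives

That's why the shift-and-realign (dither) trick helps: moving the frames relative to the scene makes the fixed pattern land on different scene pixels each time, so after realignment it averages down more like random noise.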

13
Third Party Manufacturers / Re: DXO uh-oh?
« on: August 05, 2014, 08:41:12 PM »
So now <the D810> certainly IS competition for the 5d3 in more types of shooting.

Yet Nikon IS still predicting greater sales losses than Canon.  Some competition...   ::)

as long as SoNikon sell enough to stay in production and a step ahead I'm OK with that.
I don't own stock in any of them. :P

Edit: typo

14
Reviews / Re: DxO reviews Sony A7s: king of low light photography?
« on: August 05, 2014, 08:38:18 PM »
Nope, it's an oversimplified example to illustrate a simple point: when you downscale an image you gain additional information per pixel in the downscaled image (assuming you use a sensible downscaling algorithm).  Each pixel in the downscaled image will use information from multiple pixels in the original.  In practice it is really trivial to observe: take a somewhat noisy image and downscale it; which is noisier, and thus has less dynamic range, a pixel in the original image or a pixel in the downscaled image?  Obviously you don't gain editing latitude for the image as a whole (you lose it), but each pixel individually gains DR.


Gains DR to a point.  There's a ceiling, and that ceiling is the maximum DR at capture.  Sorry, but your suggestion that combining four pixels captured at 14 bits yields 16 bits of real data is ludicrous.  But you're probably making dilbert happy as you waft the stench of DxO's BS (aka Biased Scores) around the forums.  Nothing to be proud of, IMO.


It sounds like you're dismissing it out of hand when you really should be saying something else.

I'm sure you can remember a time, back when (are you old enough?) early digital scientific instrumentation often made use of (noise) dithering to improve dynamic range and reduce noise.
I don't care to get into all the math, so I don't know just HOW much effect it can have, but it does work.
In the case of a camera sensor, averaging 4 pixels is gonna buy some extra DR at the expense of spatial resolution.
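
Here's that 4-pixel tradeoff as a quick numpy sketch (invented numbers, plain 2x2 averaging standing in for the "downscale"):

import numpy as np

rng = np.random.default_rng(3)

h, w = 1000, 1000
signal = 50.0
full_res = signal + rng.normal(0, 4.0, (h, w))   # per-pixel noise sigma = 4

# 2x2 bin: average each block of 4 neighbouring pixels -> half the linear resolution.
binned = full_res.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print("per-pixel noise, full res:", round(full_res.std(), 2))  # ~4.0
print("per-pixel noise, binned  :", round(binned.std(), 2))    # ~2.0, about 1 stop more per-pixel DR

The highlight ceiling argument still stands, of course; nothing that clipped at capture comes back, but the per-pixel noise floor genuinely drops.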

Read about it here, for those interested:

www.analog.com/library/analogDialogue/archives/40-02/adc_noise.html

EDIT:  adding quote from above referenced article to keep things interesting:

Digital Averaging Increases Resolution and Reduces Noise
The effects of input-referred noise can be reduced by digital averaging. Consider a 16-bit ADC which has 15 noise-free bits at a sampling rate of 100 kSPS. Averaging two measurements of an unchanging signal for each output sample reduces the effective sampling rate to 50 kSPS—and increases the SNR by 3 dB and the number of noise-free bits to 15.5. Averaging four measurements per output sample reduces the sampling rate to 25 kSPS—and increases the SNR by 6 dB and the number of noise-free bits to 16.



Yes, I've played with digital averaging and dithering (both before and – with much less consternation – after MATLAB).  My $300 EOS M does multishot noise reduction.  Woot.

Dismissing out of hand?  No.  Simply defining boundaries beyond which the logic breaks down.  I notice your pasted example indicates signal processing to achieve 16-bits of real data from a 16-bit ADC.  Would you care to provide an example demonstrating signal processing which delivers more than the bit depth of real data initially acquired (as in msm's suggestion of combining 14-bit pixels to achieve data with true 16-bit depth)?


Canon's only got about 12 bits of useful data out of a 14-bit conversion.
So you're telling me those last 2 bits of noise are a FEATURE, a built-in noise dither? ;)

If only the noise were purely random, it could qualify.  :D  They are getting closer to that.

In fact, it's entirely likely that Canon went to a 14-bit ADC back then precisely so that noise dither would improve tonal transitions; I remember something like that being stated in a promo for the 40D in its day.  Although "noise" was not mentioned; can't have people getting any negative impressions or customer confusion from that word!

Back in the '80s, I used to get annoyed by articles from math geeks telling us designers how to get an extra half or full bit (or more?) of effective resolution from a 12-bit ADC by dithering and processing.  Those were the early days of digital signal processing.
I never paid much attention because, for the instruments I was building, I could get ADCs with enough precision and resolution without having to resort to any DSP for a result.
If I needed 16 bits, I'd buy 18- or 20-bit converters; I didn't have to worry about cost-cutting for consumer-scale production.
If noise was an issue, filtering and averaging were easily done with simple software routines that the end users preferred to have control of anyway.

see

DITHER

http://en.wikipedia.org/wiki/Analog-to-digital_converter#Dither


and

OVERSAMPLING

http://en.wikipedia.org/wiki/Analog-to-digital_converter#Oversampling


So, yes, it's possible to get more effective bits of data than the ADC natively provides.
The process is similar whether you have a 16-bit ADC with 2 LSBs' worth of noise or a 14-bit ADC that you dither and oversample.

So I think there are already examples showing that the effective resolution of a 14-bit ADC can potentially exceed 14 bits.
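
In bit terms, the rule of thumb behind that Analog Devices quote works out like this (my own paraphrase of the standard relation, roughly 6 dB of SNR per bit, and it applies to random noise only):

import math

def extra_bits_from_averaging(n_samples: int) -> float:
    """Effective extra bits gained by averaging n samples of a signal with random noise.

    SNR improves by sqrt(n), i.e. 20*log10(sqrt(n)) dB, and ~6.02 dB equals one bit.
    """
    snr_gain_db = 20 * math.log10(math.sqrt(n_samples))
    return snr_gain_db / 6.02

for n in (2, 4, 16, 256):
    print(f"average {n:>3} samples -> ~{extra_bits_from_averaging(n):.2f} extra bits")

# average   2 samples -> ~0.50 extra bits   (the 15 -> 15.5 noise-free bits in the AD example)
# average   4 samples -> ~1.00 extra bits   (15 -> 16)
# average  16 samples -> ~2.00 extra bits
# average 256 samples -> ~4.00 extra bits

No magic and no recovered highlights; it's a straight trade of sample rate (or pixel count) for resolution at the bottom end.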

Again, plenty of astute signal-processing math geeks here can explain how that works.
Jrista's astrophotography examples, pulling nebula detail from stacks of dozens or hundreds of images, are a good demonstration of how to extract more bits of info from a limited ADC plus noise, which equates to a terrific amount of DR and effective bits of conversion.

15
Third Party Manufacturers / Re: DXO uh-oh?
« on: August 05, 2014, 06:33:46 PM »
Ya, the D810's many little improvements make it a much better all-around camera than the previous D800s.
So now it certainly IS competition for the 5D3 in more types of shooting.

As for Tony's videos... I'm not a fan.
