

Messages - jrista

1471
Landscape / Re: Deep Sky Astrophotography
« on: March 01, 2014, 01:00:27 PM »
Oh, I forgot about planetarium software and plate solving. You will need that as well:

Planetarium Software:
Microsoft WorldWide Telescope
Stellarium
Cartes du Ciel
[others]

Planetarium software allows you to find things in the night sky to observe or image, and can be used to instruct the mount to go to (point the scope to) those objects. By far the easiest way to navigate the sky is with a planetarium.

I personally use Microsoft WWT. It offers a full-color, ultra-high-detail, seamless map of the sky made from actual photographic images, with very fluid and smooth panning, zooming, tracking, etc. It has a very useful search tool, and it can connect to your mount directly.

Largely a matter of taste, however.

Plate Solving:
AstroTortilla
Elbrus
Pinpoint
(SGP)*

Plate solving is a process by which an image of the sky is "solved": all the stars and other celestial objects in it are identified by matching them against an index of the sky. Plate solving allows the control software on your computer to know exactly where your mount is actually pointing. There is usually some residual error in pointing, so when you tell the mount to point to "IC434" or "M45", it may not actually center that object in the view. With brighter objects, that isn't such a huge problem; you can "star hop" until you find the object visually.

For dimmer objects, it's impossible to see them unless you have a telescope with a HUGE aperture (i.e. 14" or more) and a large eyepiece. With astrophotography, however, you generally don't have eyepieces, often don't have large apertures, and have only computer control software with which to find things. Accurate pointing is then quite important, and plate solving lets you automate the process of working out the error in your pointing.

AstroTortilla is the simplest way to start plate solving. It's free, simple, and effective...once you get it configured right (you MUST learn how to set the min and max scaling, scale factor, and sigma values for the actual FoV of whatever lens or scope you are using in order for it to work). AT will work with control software like BackyardEOS, SGP, or MaxIm DL to automatically take an image of whatever the telescope is pointing at, solve the image, use the exact region of sky the solution identifies to "sync" a new, more accurate model of the sky with the scope, and repoint. The process can be repeated automatically until your pointing accuracy is within a certain configurable precision...by default 1" (one arcsecond).

After this, you should be able to point to anything on that side of the meridian (the imaginary line that runs directly overhead from north to south and divides the sky into eastern and western halves). After a meridian flip (the action required when an object starting in the east moves past the meridian, where the mount is reoriented in the opposite configuration to point at objects on the western side of the sky), you may need to plate solve again for pointing accuracy to be perfect again.
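The solve/sync/repoint loop described above can be sketched as a toy simulation. Everything here is illustrative: the mount model, the 0.9 sync correction factor, and the function names are made up for the sketch, not any real control-software API:

```python
TOLERANCE_ARCSEC = 1.0  # the default precision mentioned above


class SimulatedMount:
    """Stand-in mount with a pointing offset that each sync shrinks."""

    def __init__(self, offset_arcsec=120.0):
        self.offset = offset_arcsec   # initial pointing error
        self.pointing = 0.0

    def point(self, target):
        # The mount *thinks* it is on target but lands off by its offset.
        self.pointing = target + self.offset

    def solve(self):
        # Plate solving reveals where the scope is actually pointing.
        return self.pointing

    def sync(self, solved, target):
        # Syncing refines the mount's sky model, shrinking the offset.
        self.offset -= (solved - target) * 0.9


def solve_and_repoint(mount, target, max_iters=10):
    """Point, image+solve, sync, repoint -- until error <= tolerance."""
    for attempt in range(1, max_iters + 1):
        mount.point(target)
        solved = mount.solve()
        error = abs(solved - target)
        if error <= TOLERANCE_ARCSEC:
            return attempt, error
        mount.sync(solved, target)
    return max_iters, error


attempts, error = solve_and_repoint(SimulatedMount(), target=0.0)
```

With the toy numbers above, a couple of arcminutes of initial error converges to sub-arcsecond pointing in a handful of solve/sync cycles, which is the same shape of behavior AstroTortilla or SGP automates for you.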

For fully automated imaging, you can use Sequence Generator Pro (SGP). SGP supports automatic meridian flipping and integrates plate solving to support highly accurate, fully automated all-night imaging. Without plate solving, due to the error often inherent in a mount+telescope setup (called cone error, where the optical axis is not exactly at a 90° right angle to the declination axis), pointing is often off by a few minutes in RA and a degree or two in Dec after a meridian flip. That forces meridian flips to be manual, and if you are not paying close attention to the imaging process, a meridian flip can cost you time. SGP can detect when the mount has moved past the meridian from the east, pause your imaging sequence, automatically perform the flip, plate solve and remodel the sky, repoint EXACTLY at your object (not just point, but frame it the same as well), and automatically resume your imaging sequence.

Drivers:
ASCOM

In order to properly control your telescope and mount setup from a computer, you will need ASCOM. This is the standard component object model and driver platform that virtually all telescope equipment and software use to communicate with each other. You will need ASCOM itself, plus an ASCOM telescope driver. If you use an Orion Atlas and EQMOD, EQMOD includes an ASCOM-compatible driver.

1472
Landscape / Re: Deep Sky Astrophotography
« on: March 01, 2014, 03:35:23 AM »
Close-up crop of the Flame, Horsehead, and NGC2023:


1473
Landscape / Re: Deep Sky Astrophotography
« on: March 01, 2014, 03:31:32 AM »
Someone asked about the specifics of my exact setup, as they use a 500mm f/4 lens. Here they are, for those who are interested in getting started the same way I have:



HARDWARE

Mount:
Orion Atlas EQ-G | $1499

The mount is the most important part, hands down. Unless you're going to dive head first into scopes larger than 11", you should be looking at the Celestron CGEM or the Orion Atlas EQ-G. I did a lot of research, spent almost a month on it. Celestron mounts are quite pretty, and they are actually manufactured by the same company that manufactures the Orion Atlas (which is the same as the Skywatcher EQ-6). Since its introduction, however, the Celestron CGEM (and its slightly larger sibling, the CGEM DX) has had some intrinsic gearbox problems, as well as a gear cogging problem. Celestron has done a couple of things to address these issues, but the gearbox is pretty fundamental to the design of the mount. There seems to be no real solution, which introduces a source of non-periodic error (or rather, periodic error at a different cyclic rate than the standard worm gear causes; when the two converge & diverge, it creates a problem for autoguiding).

The Orion Atlas is a very good mount. Feature-wise, it's a bit better than the Celestron CGEM. It comes with a "dual" saddle built in. There are two main types of dovetails used with astronomy gear: the V-type or Vixen, and the D-type or Losmandy. D-type is much larger and much more stable than V-type, and is essential for larger gear (like a 500mm f/4 lens). The Orion always comes with a built-in polar finder scope (it's situated in the right-ascension axis, and is used to align the mount itself with the exact celestial pole). The polar finder scope is an add-on option for the Celestron CGEM. The Orion polar finder scope comes with a built-in red LED to help you see the reticle diagram...the Celestron's add-on polar finder does not (although there are guides online that can help you solder your own onto the mount's control board, if you're up for it).

The really nice thing about the Orion Atlas mount is that there is a good, high-quality belt drive upgrade for it. Belt drives are what the higher-end mounts use; direct drive is what the ASA DDM (Direct Drive Mount) line of mounts uses, and part of why they have exceptional accuracy and practically no problems with gear backlash. The existence of this upgrade (it costs a couple hundred dollars) is one of the key reasons I purchased an Orion Atlas. I will be adding this upgrade in the future...from what I hear, it greatly smooths out the tracking and allows it to be more precise, with and without guiding. There are also other upgrades for the Orion Atlas, as well as a service called "HyperTuning" offered by DeepSpaceProducts (hypertuning is also available for the CGEM).

My recommendation is to definitely get the Orion Atlas. When you get the Atlas, also search for and buy an "EQDIR" adapter. EQDIR is a special TTL cable that lets you plug your computer directly into the mount and control it with software called EQMOD. EQMOD is another bonus for the Orion Atlas that the CGEM does not have...very powerful, very capable software that can do everything the hand controller does, and much more. It also enables "plate solving"...more on that in a bit. (The EQDIR cable costs about $40.)

Saddle:
ADM D-series Side-by-side "Dual" | $249

ADM makes a number of accessories for astronomy equipment. I purchased the DSBS "Dual", a D-type side-by-side "dual saddle" adapter, so I could mount both my lens+camera as the "telescope" and an external automatic guiding setup with its own lens. I'll get into that in a bit.

There are a number of saddle options, and a number of side-by-side saddle options. You can get V-type or D-type, or dual, which takes both Vixen and Losmandy dovetails. I chose the dual so that I would never have to spend more money adapting anything to my mount. It's $50 more expensive, but ultimately it is going to be worth it, because it gives you more freedom to expand in the future. With longer focal length telescopes, you inevitably need larger guide scopes.

Dovetail:
Astro-Tech Losmandy Style 7.9" | $50

I purchased this dovetail plate to attach my 600mm f/4 lens to the ADM DSBS saddle. There are a bunch of dovetail plates on the market. The nice thing about this one is that it has a slot down the middle, rather than a set of pre-spaced holes. The slot is essential to properly attaching your lens to the plate. With other dovetails, you are stuck either sinking your own hole for the second screw, or using a single screw...and THAT is bad news. One screw will not cut it: as the mount moves the scope to track across the sky, its angle can change considerably. Once the lens is more parallel to the ground, a lens bolted to the plate with a single screw can easily slip...and that can send the entire setup, mount and all, crashing to the ground. On top of your lens. The overall weight of the entire setup is about 70 pounds...not even a Canon Great White lens would survive that.

Get this dovetail. Then, go to your hardware store, and buy some 1/4"-20 hex head screws with the small round heads that are about 1/2" long. The hex head ones with a small round head are important, because anything else is generally too large to fit into the slot on this dovetail, and will not hold the lens to the plate securely enough. Use those to bolt the dovetail to BOTH screw holes on your lens' foot.

Guider:
Orion SSAG w/ 50mm Mini Guidescope | $349

For a starter guiding package, you really can't go wrong with the Orion 50mm Mini Guidescope and the Orion SSAG (StarShoot AutoGuider). Don't get roped in by the "deluxe" version, which is about $70 more. All that adds is a helical focusing ring on the scope, but the scope can already be focused in two ways: either by unlocking the front part that holds the objective, rotating it, then locking it again; or by sliding the SSAG mini guider (which is really just a webcam sensor packaged with a logic chip that can control your mount through what is called an ST-4 port) in and out of the other end until you're focused, then locking it down. Save yourself the $70; just get the Orion SSAG with the 50mm Mini scope for $349.

There are other guiding options out there. The only other one I think is worthwhile is the SBIG ST-i guider. The SBIG is a higher-quality guider, nice and compact, but it is a lot more expensive. Between a scope and the guider itself, you'll probably spend $800. The ST-i is really the option you want if you eventually move to a full-blown cooled CCD camera. The SBIG STF-8300M is a great midrange peltier-cooled mono astro cam, and in a package deal you can get the cam, a filter wheel, and an OAG (off-axis guider) adapter that the ST-i plugs into. Off-axis guiding is generally more effective than external guiding, as it observes the sky through the same optical tube you're imaging through. That allows it to correct for things like flexure. Flexure is one of the bigger problems with getting very precise guiding (essential at longer focal lengths, not really a problem at 200-1000mm). Flexure is the flexing...the bending and twisting...of an OTA (optical tube assembly) as it is moved by the mount during tracking. It can result in elongation and wobble of stars that makes them look "unnatural" and "not round".

If you think you're really going to get into astrophotography, and figure you'll eventually get a proper telescope (i.e. a Newtonian, SCT, or RC), then you might want to start with the ST-i. You can still attach it to an external guide scope (any smaller refracting telescope will work, or even the Orion 50mm mini guidescope).

Misc:
AC power adapter for your DSLR (for use when you're at home)
DC power adapter for your DSLR (for use with a deep cycle battery when you're at a "dark sky" site)
DC power adapter for your laptop (for use with deep cycle battery ...)
2-3x DC cigarette plug with positive/negative clamps (for use with deep cycle battery...)
A 20-30 Ah deep cycle battery to power the mount at dark sky sites
A 100 Ah or larger deep cycle battery to power your camera and laptop

SOFTWARE

In order to effectively use the equipment above, you need the proper software. You need control software to actually control the camera and any other accessories (such as focusers and filter wheels, if you're using them). You need stacking software. You need processing software. You will also need guiding software to make guiding actually work. Finally, you will need planetarium software to tell the mount to point at various things in the sky.

Control Software:
BackyardEOS
Nebulosity
Sequence Generator Pro
MaxIm DL

Control software connects to the camera at the very least, and possibly to multiple accessories like a motorized focuser, robotic filter wheel, and even the mount. You use control software to frame, focus, and set up sequences for imaging. Imaging sequences can be light frames (i.e. 30x240s f/4 ISO 400 with the lens cap off), dark frames and bias frames (with the lens cap on), and flat & dark flat frames. Certain objects in the sky have exceptionally high dynamic range; the Orion Nebula is the best example. Sometimes you need very long exposures, medium exposures, short exposures, and very short exposures to fully capture the dynamic range. But in astro imaging, each "exposure" is ultimately an integration...a calibrated ((light frame - dark frame - bias frame) / (flat frame - dark flat frame)) image stack. So for each of those groups of integrations, you have multiple individual light frames, as well as the corresponding dark frames. You don't just take an image and you're done, like you would with normal photography. You can potentially take hundreds of individual frames of the same object, often over the course of several nights, just to make ONE image in the end.
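The calibration arithmetic above can be sketched with plain Python numbers (real software does this per pixel across whole frames, and the specific values here are made up). One common refinement, assumed here, is normalizing the flat so that dividing by it corrects vignetting without rescaling the signal:

```python
# Synthetic single-pixel values, for illustration only:
BIAS      = 100.0   # fixed readout offset, from a zero-length exposure
DARK      = 20.0    # thermal (dark-current) signal for the light exposure
FLAT      = 5000.0  # evenly illuminated frame, records vignetting/dust
FLAT_DARK = 10.0    # dark frame matching the flat's exposure time
SIGNAL    = 300.0   # the sky signal we actually want to recover

light = BIAS + DARK + SIGNAL   # what the camera records for a light frame


def calibrate(light, dark, bias, flat, flat_dark, flat_mean):
    """(light - dark - bias) / (flat - flat_dark), with the flat field
    normalized to mean 1 so the division only corrects vignetting."""
    flat_field = (flat - flat_dark) / flat_mean
    return (light - dark - bias) / flat_field


# With a perfectly uniform flat, the flat field is exactly 1 and the
# calibration recovers the 300.0 units of sky signal:
calibrated = calibrate(light, DARK, BIAS, FLAT, FLAT_DARK,
                       flat_mean=FLAT - FLAT_DARK)
```

In practice each of those terms is itself a stacked "master" frame (master dark, master bias, master flat), not a single exposure.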

Control software is essential. It takes a LOT of the tedium out of the process. Which package you choose is largely personal preference; I highly recommend looking into BackyardEOS and Sequence Generator Pro.

Guiding Software:
PHD2

In order to guide, you need guiding software. There is really only one option here: PHD2. MaxIm DL, the control software above, also supports guiding, but PHD2 is free and it's the de facto standard. It's highly capable, very easy (the acronym literally stands for "Push Here Dummy"!), and free.

Stacking Software:
DeepSkyStacker
AstroStack
RegiStax
Images Plus
IRIS
[many others]

Stacking software is the first step in the post-processing stage. You bring in all your light, dark, bias, flat, and dark flat frames. You configure the application for stacking. You register, then you stack (sometimes those two are part of the same step). Once stacked, you export an unmodified 32-bit TIFF, and you're ready for the next step.

DSS, or DeepSkyStacker, is easy and free. Probably best to start there. There are a bunch of options here, with different capabilities for different purposes. Some are better for planetary stacking, some are better for galaxy and nebula stacking. Experiment.
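As a toy illustration of the register-then-stack flow: real stackers do sub-pixel registration on 2-D frames, while this 1-D sketch (with made-up helper names) only shows the two-step structure — find each frame's offset against a reference, then average the aligned values:

```python
def register_offset(reference, frame, max_shift=3):
    """Find the integer shift that best aligns frame to reference
    (brute-force least-squares over a small shift range)."""
    best_shift, best_score = 0, float("inf")
    n = len(reference)
    for s in range(-max_shift, max_shift + 1):
        score = sum(
            (reference[i] - frame[i + s]) ** 2
            for i in range(max(0, -s), min(n, n - s))
        )
        if score < best_score:
            best_shift, best_score = s, score
    return best_shift


def stack(frames):
    """Register every frame against the first, then average per pixel."""
    ref = frames[0]
    shifts = [register_offset(ref, f) for f in frames]
    out = []
    for i in range(len(ref)):
        vals = [f[i + s] for f, s in zip(frames, shifts)
                if 0 <= i + s < len(f)]
        out.append(sum(vals) / len(vals))
    return out


# A "star" at index 3, and a second frame of the same scene shifted by one:
scene = [0.0, 0.0, 10.0, 50.0, 10.0, 0.0, 0.0, 0.0]
shifted = scene[1:] + [0.0]
result = stack([scene, shifted])   # star stays at index 3 after alignment
```

DSS and friends do this with star detection, sub-pixel transforms, and sigma-clipped combination, but the register/stack split is the same.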

Processing Software:
Adobe Photoshop
PixInsight

Processing software is what you use to "stretch" your integrations. Calibrating and stacking images into a single integration GREATLY reduces noise and enhances dynamic range. Once you understand the noise and dynamic range levels that are fairly common in astrophotography, even the much-vaunted 14 stops of DR you get with a D800 looks laughably pathetic! :P With a DSLR and a proper stack of dozens or even a hundred deeply exposed frames (several hundred seconds each, at least), along with dark, bias, and flat calibration, you can have well more than 14 stops of DR. If you were using a multi-stage cooled CCD imager, where dark current noise is in the 0.1e- or less range, your dynamic range after stacking, from the darkest bits of dark nebula to the brightest pinpoint peaks of stars, could be as much as 20 stops, maybe even more (in which case you're most definitely using 32-bit float TIFF images, which can store what is effectively unlimited dynamic range, at least for the purposes of mankind's endeavors. :P)
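The DR claim above comes down to averaging: stacking N frames cuts random noise by roughly sqrt(N), which lowers the noise floor by about 0.5 × log2(N) stops. A quick stdlib-only simulation (signal and noise values are arbitrary):

```python
import math
import random

random.seed(42)
TRUE_SIGNAL = 100.0   # arbitrary pixel value
NOISE_SIGMA = 10.0    # per-frame random noise (read + shot, lumped together)


def stacked_noise(n_frames, trials=5000):
    """Std-dev of one pixel's value after averaging n_frames noisy reads."""
    means = []
    for _ in range(trials):
        frames = [TRUE_SIGNAL + random.gauss(0, NOISE_SIGMA)
                  for _ in range(n_frames)]
        means.append(sum(frames) / n_frames)
    mu = sum(means) / trials
    return math.sqrt(sum((m - mu) ** 2 for m in means) / trials)


single = stacked_noise(1)          # ~10: the raw per-frame noise
stack_64 = stacked_noise(64)       # ~1.25: noise after a 64-frame stack
extra_stops = math.log2(single / stack_64)   # ~3 stops gained at the floor
```

So a 64-frame stack buys roughly 3 extra stops of usable dynamic range at the shadow end, before any stretching even starts.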

Photoshop is pretty much a given. It's the staple of image processors, for everything. If you are a photographer and do not have PHOTOshop, then you're a twit. Get it. You need it. ;) There are thousands of tutorials on the net, video and written, that cover using Photoshop for processing astro images. There are bunches of PS actions, free and for sale, that take care of common tasks that are just essential parts of stretching and enhancing astro images. You really can't do astrophotography without it.

PixInsight is the 800-pound gorilla when it comes to editing astro images. It brings to the table a whole broad range of mathematical tools that let you process your images in ways Photoshop could never even touch, using advanced algorithms that deal with noise in ways you never imagined and extract detail from background depths so black you wouldn't have imagined there was any detail there at all. It even offers "pixel math", giving you the ability to apply any algorithm you can imagine to your images, effectively allowing an infinite number of ways to process, calibrate, color balance, stretch, extract, and otherwise enhance your images. Oh, and beware...it has a REALLY balls-to-the-walls, on-crack, brain-melting kind of whacked-out UI design that will totally make you go BONKERS for the first few days...but that eventually passes.

Nothing beats PixInsight for astro editing. But you still need Photoshop, because there are still some things Photoshop just does better. You need both of these. (But start with Photoshop.)

1474
EOS Bodies / Re: EOS 7D Mark II Announcement in Q2 of 2014 [CR1]
« on: February 28, 2014, 09:43:58 PM »
And noise reduction software is dramatically better at removing noise and preserving detail than block averaging is.  Plus, smaller pixels mean a higher-corner-frequency AA filter.  Both effects mean that the smaller pixels give you lower noise and better resolving power in the same light and exposure.

Noise reduction software applies to all images, regardless of pixel size.

And it works way better when there is more detail in the original.

Studies have shown that the lower color fidelity of smaller pixels (as enforced by a lower actual charge level, which requires a higher gain at all ISO settings than larger pixels) poses specific problems for NR. Color blotchiness, specifically, becomes a problem MUCH sooner when performing NR on images taken with smaller pixels.

Quote
You can't bring software into the hardware equation here.

Sure I can.  The entire process, from optics to processing, works together to produce the final image.

You're overcomplicating the issue by bringing in software. Software is a highly subjective matter. As far as I am concerned, as far as this discussion goes, software does not apply. Too many options, too many techniques, too many results.

And, again, anything you can apply to images taken with smaller pixels can be applied to images taken with larger pixels. There is no specific advantage to sensors with smaller pixels as far as software is concerned. It can effectively be reduced to a constant in the equation.

Quote
Sensors are hardware. From a hardware standpoint, smaller pixels/bigger pixels, so long as the total sensor area is the same, it really doesn't matter.

Then why not have just one enormous pixel?

I'm not even going to justify this with a response.

Quote
As for the pixels. I've never said they are bad. Small pixels out-RESOLVE large pixels, they do not necessarily out-PERFORM large pixels.

Not necessarily, but usually.

Just factoring in pixel size, always. If you factor in more than pixel size, such as the AA filter, then sure. But that's an additional mark AGAINST smaller pixels. It's harder to create an AA filter that performs ideally for smaller pixels than for larger pixels. That is evident in the rather wide range of AA filter strengths for APS-C cameras (just look at DPR sample images and see how widely moire varies...whereas with larger sensors, the variation is much less).

Quote
But small pixels can only out-resolve large pixels in certain circumstances.

Virtually every circumstance.

Wrong. When it comes to identical framing, more pixels will always win, in which case full frame sensors with larger pixels will trounce an APS-C sensor with smaller pixels. TROUNCE.

Quote
Smaller pixels will always outresolve larger pixels, but they do not normally outperform larger pixels. The only case where smaller pixels might literally outperform larger pixels is if the smaller pixels had considerably better technology than the larger pixels.

Nope.

Prove it. (BTW, the image below? It doesn't prove it. ;P)

Quote
Pixel performance is a fairly complex thing. I challenge you to pit G15 sports, wildlife, and bird photos against the same kinds of photos from the 1D X.

The 1DX will win because of a bigger sensor and bigger optics, not because of larger pixels.  If it had the G15's pixels, it would do even better.



The images above actually prove my point. The smaller pixels are considerably noisier. They do have more detail, but they are a lot noisier. Your original comment was that smaller pixels were less noisy. That is completely false. Your own images clearly prove they are far noisier.

When it comes to identical output magnification, again your images prove my point. The first column clearly demonstrates that the lower image has less noise but roughly the same detail as the upper image; the upper image's very slight edge in detail comes with definitely more noise.

Smaller pixels may resolve more than larger pixels, but they will never have less noise than larger pixels, given the same magnification.

1475
EOS Bodies / Re: Will the next xD cameras do 4k?
« on: February 28, 2014, 09:30:43 PM »
I hope so! Just got the new Dell 24" UHD display and checked out some 4k video clips. Whoa!!!! Out of this world! So awesome!

And yes, that means all those downers who insist that 4k is just marketing nonsense hype unless you run 90" or larger HAVE LESS THAN ZERO CLUE. The difference between 1080p and UHD video is perfectly clear on my 24" monitor, and it can be striking at times!

(And of course UHD is AWESOME for photos! They look so much more natural, like giant projected slides or something, without all the digital artifacts and visible pixels you get on 1200p or lower displays, and the extra detail really makes things look so much more realistic.)

(Oh and text! Man the web looks SOOO much better since all the text now looks like a printed page!)

The 90" size is for TVs, not computer screens. And the best size for a TV really depends on how far your couch is from the screen. There certainly is no requirement that you have an 80" or larger screen, and in some cases a 46" or 50" screen may be a lot better for your setup. Regardless, the point is to double the pixel density at whatever your viewing distance is. If you currently sit 12 feet from your 60" 1080p TV, then getting a 60" UHD TV would be absolutely incredible!

As far as a Dell 24" UHD goes, I would be willing to bet it is no more than three feet from your face, and likely closer than that. ;)
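For what it's worth, the geometry behind the viewing-distance argument is easy to work out. This back-of-the-envelope sketch assumes the common ~1 arcminute figure for normal visual acuity and 16:9 panels; it's an illustration, not a vision model:

```python
import math

ACUITY_ARCMIN = 1.0  # approximate resolving limit of normal vision


def pixel_angle_arcmin(diagonal_in, horiz_pixels, distance_in):
    """Angle one pixel subtends at the eye, for a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from diagonal
    pitch_in = width_in / horiz_pixels                # pixel pitch
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60


# A 24" monitor viewed from about 24" away:
m_1080 = pixel_angle_arcmin(24, 1920, 24)   # ~1.56 arcmin: pixels resolvable
m_uhd = pixel_angle_arcmin(24, 3840, 24)    # ~0.78 arcmin: at/below the limit
```

At typical desk distances, 1080p pixel structure on a 24" panel sits above the ~1 arcmin limit while UHD's sits below it, which is consistent with the 1080p-vs-UHD difference being visible at that size.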

1476
Reviews / 6D Noise Levels and Comparison Tests
« on: February 28, 2014, 09:25:33 PM »
Someone on the CloudyNights forum performed some useful tests of 6D noise levels at different temperatures and astrophotography exposure lengths. Very interesting stuff, for those who are interested. You can find the images here in the original thread:

http://www.cloudynights.com/ubbthreads/showflat.php?Cat=0&Number=6402677&page=0&view=collapsed&sb=5&o=all&fpart=1&vc=#Post6402677

One of the very interesting things is you can see how much temperature affects read noise levels. The images are taken at +21°C, +7°C, and -7°C, with exposure times of 300 seconds.


1477
EOS Bodies / Re: EOS 7D Mark II Announcement in Q2 of 2014 [CR1]
« on: February 28, 2014, 09:05:31 PM »
And noise reduction software is dramatically better at removing noise and preserving detail than block averaging is.  Plus, smaller pixels mean a higher-corner-frequency AA filter.  Both effects mean that the smaller pixels give you lower noise and better resolving power in the same light and exposure.

Noise reduction software applies to all images, regardless of pixel size. You can't bring software into the hardware equation here. Sensors are hardware. From a hardware standpoint, smaller pixels/bigger pixels, so long as the total sensor area is the same, it really doesn't matter.

Trying to bring in post-processing aspects introduces a massive amount of subjectivity into the discussion, and then it becomes impossible to gauge anything. Person A might use Topaz DeNoise, Person B might use Neat Image, Person C might just use LR/PS built-in NR. Let's keep the argument to concrete information that we can all agree on: sensor area/output magnification. That's all that would really matter. Smaller pixels have the potential to produce sharper results, but overall, noise is going to be the same (at best).

I would prefer the 16-18mpx low noise, high DR option myself.

For the millionth time, lower pixel counts do NOT mean lower noise and higher DR!  In fact, the other way is more likely.

Hmm, strange then that the Canon 5D and 40D were both approx 10mpx cameras of the same generation, but the IQ, noise, and DR of the 5D are clearly better than the 40D's (at a given ISO). Or, if you prefer, the Nikon D300 and D3 were both c. 12mpx cameras of the same generation, and guess what: the D3 has better IQ, noise, and DR! So regardless of the maths or anything else, when the chips are down, large pixels seem to outperform small ones.

Large sensors out-perform small sensors.  Small pixels out-perform large pixels as long as you don't get so small that the smaller pixels are too small for the manufacturing technology making them.

The 70D, even with 40MP out-performs the 7D with 18MP.  The G15 with its teeny, tiny pixels out-performs the 1Dx in DR even with its enormous pixels.

The idea that small pixels are somehow bad is long, long out-of-date.

First, I really have to quash this idea, because it is fundamentally WRONG: the 70D is NOT NOT NOT a 40mp camera!!! The 70D has 20.2 million pixels. Only the center 80% of those pixels (16.16mp) have split PHOTODIODES. A photodiode and a pixel are not the same thing. The 70D has, only has, always has had, and will only ever have, 20.2 million PIXELS. The 16.16 million pixels in the center rectangle have split photodiodes. There are 32.32 million photodiodes packed into 16.16 million pixels, which comprise the center 80% of the sensor's 20.2 million pixels in total.

When it comes to DPAF, photodiodes != pixels. DPAF pixels have two photodiodes, but they are still one pixel. The split photodiodes sit underneath the same microlens and color filter...so you could never read 32.32 million pixels out independently and have it be anything better or different than reading those 16.16 million pixels out. The split halves are the same pixel, under the same filter and same microlens. If they were separate pixels, DPAF simply wouldn't work. The entire point of the technology is that you can read light from each half of the lens, and therefore detect the phase differential, from each and every individual PIXEL. The 70D has 20.2 million pixels. Only. In which case, the gap between the 7D and 70D is 2.2mp...which is practically trivial, since both sensors have roughly the same total area. (The 70D's real advantage is that its sensor is actually slightly larger in dimensions than the 7D's...more total area, more total light, albeit a nearly trivial "more".)
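The bookkeeping in the paragraph above, spelled out (the figures are the ones quoted in the post):

```python
# A 20.2 MP sensor whose center 80% of pixels carry split (dual) photodiodes:
TOTAL_PIXELS_MP = 20.2
DPAF_FRACTION = 0.80

dpaf_pixels_mp = TOTAL_PIXELS_MP * DPAF_FRACTION   # 16.16 MP with split diodes
photodiodes_mp = dpaf_pixels_mp * 2                # 32.32 M photodiodes there
# The imaging resolution is still TOTAL_PIXELS_MP either way: each pair of
# split photodiodes sits under one microlens/filter and reads out as one pixel.
```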



As for the pixels. I've never said they are bad. Small pixels out-RESOLVE large pixels, they do not necessarily out-PERFORM large pixels. But small pixels can only out-resolve large pixels in certain circumstances. Sometimes, having more pixels for identical framing means large pixels can effectively outresolve smaller pixels...because you can either use a longer lens, or get closer, and still achieve the same framing. If resolving power is all that matters to you, and you have excellent skill with noise reduction (which is arguably more difficult to apply to images made from smaller pixels than images made from larger pixels), then smaller pixels will certainly be better for your use case.

Smaller pixels will always outresolve larger pixels, but they do not normally outperform larger pixels. The only case where smaller pixels might literally outperform larger pixels is if the smaller pixels had considerably better technology than the larger pixels. If you packed in ultra high Q.E. silicon materials (i.e. black silicon), ultra low noise readout (i.e. slower frequency readout, thermal cooling), backside illumination, high power microlenses and double microlens layers, etc. then sure, you could produce smaller pixels that might be capable of outperforming larger pixels....for a time... But the same technology can always be applied to larger pixels. On a normalized basis, where the technology field is flat, (and where you don't assume some specific post processing software is used to change the output of the sensor), smaller pixels cannot perform --> better <-- than larger pixels. At best, they could perform as well, at worst...well, they would perform worse.

Pixel performance is a fairly complex thing. I challenge you to pit G15 sports, wildlife, and bird photos against the same kinds of photos from the 1D X. I'm willing to bet good money that, assuming you find work from skilled photographers who actually know how to effectively work the equipment in hand, you will NEVER find any G15 images that are better than 1D X images. The G15 may have greater DR per pixel, but the 1D X trounces it in terms of sensor area.

1478
Reviews / Re: Nikon D4s VS Canon 1Dx Comparison
« on: February 28, 2014, 08:32:05 PM »
208ms shutter lag on the D4s, seriously?

Not sure where you're getting your data…
From their analysis:


Fair enough.  More evidence of their incompetence as reviewers. 

208 ms is the value Snapsort reports for the D4, and they got that from Imaging Resource - except they took the time for AF + shutter release, instead of just the shutter lag (which is actually 43 ms).

It would have to be 43ms, rather than 208 ms.

There is no way the shutter lag could be 208ms. There are only 1000ms per second, and 1000/208 is 4.8. If the total per-frame lag were 208ms, the D4 could only achieve 4.8 frames per second. Shutter lag has to be less than the total time to initiate exposure, actuate the shutter, end exposure, and flip the mirror/read the sensor, because that TOTAL lag time is what determines the maximum frame rate. Total inter-frame lag time would have to be about 91ms for the D4/D4s to achieve 11fps.
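The arithmetic above, as a quick sanity check:

```python
def max_fps(per_frame_ms):
    """Upper bound on burst rate given the total time spent per frame."""
    return 1000.0 / per_frame_ms


fps_if_208ms = max_fps(208)          # ~4.8 fps: far below the D4's 11 fps
budget_ms_at_11fps = 1000.0 / 11     # ~91 ms for the entire exposure cycle
# A 43 ms shutter lag fits inside that per-frame budget; 208 ms cannot.
```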

1479
Reviews / Re: Nikon D4s VS Canon 1Dx Comparison
« on: February 28, 2014, 05:54:55 PM »
Not only is the article clearly biased towards Nikon, it is exceptionally shallow. There is no real testing going on here, no real depth, so it is very easy for the writer to make subjective claims; they didn't gather empirical data of high enough quality and consistency to back those claims up.

The individual writing the review certainly doesn't seem to know the 1D X either. There is no need to pre-lock on your subject with the 1D X. In my experience, it nails focus wherever your focus points are based on the AF mode, using the RGB metering sensor and dedicated DIGIC 4 processor to compute all the necessary information. The 1D X has all the same capabilities as Nikon's "3D" AF system, and certainly seems more effective at computing accurate focus quickly.

1480
EOS Bodies / Re: EOS 7D Mark II Announcement in Q2 of 2014 [CR1]
« on: February 28, 2014, 03:45:04 PM »
yes perfect. My original point was that I would prefer fewer, larger pixels, because IMHO larger pixels mean less noise etc. Comparing same-generation, same-MP sensors, one APS-C and the other FF, proves that point in the real world. Both Canon and Nikon are as close as they can be in all but pixel size. Lee Jay is suggesting exactly the opposite: that more (smaller) pixels would provide less noise

If you are not changing sensor size, then more/fewer pixels doesn't mean much, assuming you are using the full frame. If you are reach limited, then smaller pixels have a definite and intrinsic value: you can crop more and still have good detail. You definitely won't have less noise...you'll have more noise, however cropping higher resolution detail with more noise is often better than cropping lower resolution detail with less noise. Especially in the APS-C world.

1481
EOS Bodies / Re: EOS 7D Mark II Announcement in Q2 of 2014 [CR1]
« on: February 28, 2014, 03:33:34 PM »
People seem to be suggesting (not just on this forum) that Canon are scratching their heads over the potential specifications of a 7D Mk II. I'd have thought that it was pretty obvious: an APS-C sensor version of the 5D Mk III with a higher frame rate (i.e. 8-12fps).

The elephant in the room is whether the 20MP sensor from the 70D is good enough for their "flagship APS-C camera" or whether Canon are waiting to launch a new generation of sensors in the 7D Mk II. The more time passes from the 70D's introduction, the more likely I think the 7D Mk II will be the launch vehicle for the new generation of sensor; I would therefore expect any announcement to be just prior to Photokina. [Sod's law they will announce it next month and make this prediction wrong!  ::)]

Purely speculation but I would imagine one of Canon's biggest concerns is a new 7D Mark II potentially eating into 1D X and the big white market. They need to make it attractive enough, but not so attractive to take away any of the market from their flagship body and lenses.

This is such an old and tired argument. Canon has nothing to fear from the 7D II stealing from the 1D X. The 1D X is going to be a superior camera in every respect. If someone can afford it and wants the best quality they can get, they are going to get the 1D X. In my previous comment, I explain why. Ultimately, noise is more about sensor area than pixel size. When it comes to pixel peeping, pixel size matters, but pixel peeping isn't photography...it's just a waste of time. FF sensors have more total area than APS-C sensors. For identically framed subjects, that means FF always has the potential to gather more light. More light, less noise. If you choose to stop down, then that is an artistic or technical choice, not a limitation of the technology.

In no way, regardless of what features Canon puts into the 7D II or how good they are, will it ever really steal sales away from the 1D X. On the contrary, by making the 7D II as good as they possibly can at the cheapest price point they can, it will GREATLY increase their sales. The simple fact of the matter is many, many people would probably LOVE to have a 1D X; they simply cannot afford it. The biggest thing stealing sales away from the 1D X is its price. A feature-rich, highly capable "Mini 1D X" in the 7D II would give all those people a far more affordable option that is in reach...increasing total DSLR sales.

1482
EOS Bodies / Re: EOS 7D Mark II Announcement in Q2 of 2014 [CR1]
« on: February 28, 2014, 03:28:57 PM »
Seeing as the D4s is coming with a 'new' 16 mp sensor, I'm going to be brave and guess the 7DII will also be 16 mp, aps class leading low light performance, very fast and no pop up flash. See you in the second quarter.
Please God hear our prayers. Just a 16-megapixel camera with usable ISO 3200 free of bothersome noise, costing less than $2000.

I'd rather have 24, 32 or even 72MP.  More resolution and less noise that way.

That's a misconception. If you account for noise as a factor of total sensor area, it doesn't really matter how large or small your pixels are. The expectation is that you are downsampling any and all of those sensors to some common output size...i.e. the same magnification.

Otherwise, smaller pixels are always going to have more noise at the pixel level. Any technology you might apply to smaller pixels is applicable to larger pixels. Any potential technological gains you might have that allow smaller pixels are only going to make bigger pixels better. In no way can smaller pixels be less noisy than larger pixels. They may resolve more detail, but assuming Q.E. remains roughly the same, that detail WILL be noisier.


All else being equal, if you have 6 micron pixels and 3 micron pixels, the 3 micron pixels are going to have 1/4 the FWC. A 6 micron pixel might have a 60,000e- max charge at ISO 100, whereas a 3 micron pixel is going to have a 15,000e- max charge. Since shot noise is the square root of the signal, you have about 245e- noise with 6 micron pixels, and about 122e- noise with 3 micron pixels. In other words, you have roughly a 245:1 SNR with 6 micron pixels, and a 122:1 SNR with 3 micron pixels.

The only way to make those smaller pixels equal to the larger pixels is to downsample by a factor of two.
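
The full-well arithmetic above can be sketched in a few lines (a hypothetical helper, assuming pure photon shot noise, i.e. noise = sqrt(signal), and ignoring read noise and dark current):

```python
import math

def shot_noise_snr(full_well_e):
    """SNR at saturation, assuming photon shot noise only (noise = sqrt(signal))."""
    signal = full_well_e
    noise = math.sqrt(signal)          # shot noise in electrons
    return signal / noise              # equals sqrt(signal)

# 6 micron pixel (~60,000e- full well) vs. 3 micron pixel (1/4 the area, ~15,000e-)
print(round(shot_noise_snr(60000)))    # ~245:1
print(round(shot_noise_snr(15000)))    # ~122:1
```

Halving the pixel pitch quarters the full well, which halves the per-pixel SNR; that's the 2x factor the downsampling has to buy back.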

What's the problem with having a high resolution sensor that allows detailed images at low ISO and then downsampling to reduce noise when you need to use higher ISOs?

I'm asking because you seem to know your stuff and I'd like to get this cleared up once and for all!

Oh, there is absolutely nothing wrong with it. It just won't give you LESS noise. Assuming we have two APS-C sensors, if we view them at 100%, the image taken with the sensor with smaller pixels will be noisier. If we sample them to the same size, noise will be equal. The sensor with smaller pixels will be crisper when scaled to the same size, but there won't be any real difference in noise.

Why? Because both sensors have the same total physical area. Assuming the same output magnification, the only thing that matters is sensor area, not pixel size.

This is a different argument than FF vs. APS-C. In the case of FF vs. APS-C, you can look at it a couple of ways. There is equivalence. You frame the same scene identically with both FF and APS-C (doesn't matter if you get closer with FF or use a longer lens). You need a narrower aperture with FF in order to achieve the same DOF as APS-C. You end up with the same amount of noise for the same output magnification. Again, total sensor area matters here, however you have normalized all factors, so noise relative to output magnification is going to be similar.

However, I don't think that is generally how photographers think. In my experience, photographers who want the same framing with FF as their APS-C counterparts ALSO want a thinner DOF and blurrier background. That is especially the case with those who do portraiture, weddings, studio work, etc. with shorter and medium focal lengths. In that case, FF is always going to be vastly superior to APS-C. Not only do you have greater total sensor area, but you have larger pixels AND a faster aperture. No contest. Smaller pixels on a smaller sensor cannot compete in any way, shape, or form.

In any case, in none of the above scenarios will smaller pixels give you BETTER noise characteristics. They may allow sharper images, but from a noise standpoint, the best you can get out of smaller pixels is the same noise performance for the same sensor size. Smaller pixels on a smaller sensor will never be as good as larger pixels on a larger sensor in common use cases; at best, for the same sensor size, they will only be "as good".
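
The "equal area, equal noise after downsampling" claim is easy to check with a quick Monte Carlo sketch (NumPy; the photon counts are hypothetical illustration values, not real sensor data):

```python
import numpy as np

rng = np.random.default_rng(42)
photons_per_unit_area = 1000   # assumed incident photon density (hypothetical)
n_trials = 100_000

# One "large" pixel covering 4 units of area vs. a 2x2 block of "small" pixels
# covering 1 unit each -- same total sensor area either way.
large = rng.poisson(4 * photons_per_unit_area, n_trials)
small = rng.poisson(photons_per_unit_area, (n_trials, 4))

# Downsample: sum each 2x2 block of small pixels to the large pixel's footprint.
binned = small.sum(axis=1)

snr = lambda x: x.mean() / x.std()
print(f"large pixel SNR: {snr(large):.1f}")
print(f"binned 2x2 SNR:  {snr(binned):.1f}")
# Both land near sqrt(4000) ~ 63 -- per-pixel the small pixels are noisier
# (SNR ~ sqrt(1000) ~ 32), but after binning to equal area the noise matches.
```

Viewed at 100% the small pixels lose by a factor of two; normalized to the same output size, area is all that matters, which is the whole argument above.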

1483
EOS Bodies / Re: Will the next xD cameras do 4k?
« on: February 28, 2014, 03:01:51 PM »
4K video is now featuring on SmartPhones:
- Sony Xperia Z2 brings 4K video to its flagship smartphone
http://connect.dpreview.com/post/4506883679/sony-xperia-z2-records-4k-video
- Samsung Galaxy S5 adds 16MP camera with 4K video
http://connect.dpreview.com/post/7372383200/samsung-galaxy-s5-features16mp-and-4k-video

Will the 7D Mark II be the first "normal" DSLR with 4k video??

I think Sony and Samsung have every reason to put 4K in their phones, given that they have 4K televisions to sell as well. It's an incentive: buy one, and get the other because they're compatible with each other. I can see the advertisements now: Don't have enough 4K content yet? That's fine, make your own 4K content with our phones. Then buy our TVs to view your 4K content.

In my opinion, get yourself a GoPro 3+ Black Edition...borrow one...buy one, use it for a week or two and return it.
Test it out... Record 4K and see what you can actually do with it. I know it's not exactly 4K at the desired frame rates you want...but you will see how limited you are with it.
Just do it...it will explain everything I was saying up till now.

Well, I shoot video and I don't use a cell phone to do it. Trust me, 4K will be a big improvement over what we have now.

Remember, even if you are delivering in 2K, having your source footage in 4K is a significant advantage in many ways.

This is very true, for sure. However, it also kind of assumes you know why it is an advantage, and that you have the post-processing software to take advantage of it. I still don't see this as a reason for Canon to put 4K in all of their upcoming DSLR releases. It might be grounds for them to release firmware updates for the 1D X and 5D III to support 24fps 4K, though.

1484
EOS Bodies / Re: Is Dual Pixel Tech Coming to the EOS 5D Mark III?
« on: February 28, 2014, 03:00:11 PM »
I thought Magic Lantern had already done this with the 5D3??

cayenne

Dual pixel tech is a hardware feature. It cannot be added with firmware. Canon would have to have actually manufactured the 5D III sensor with dual pixel technology at the time they released it in order to add the capability with a firmware update later on.

Given the work that ML has done, if that was the case, I would have expected them to have figured that out by now, what with all of the things they have been doing with the 5D III lately. Given that they have not, it seems unlikely that the 5D III sensor was actually manufactured with DPAF tech.

If Canon does offer an upgrade, it would be a "Send your camera in and we'll replace the main board with one that has a DPAF sensor". And, that would probably cost a pretty penny, too! They may release an interim update to the 5D III, like the 5D IIIdp, that includes just a new sensor and no other model changes. Canon has done small interim camera model updates in the past, like the 1D IIn.

1485
Landscape / Re: Deep Sky Astrophotography
« on: February 28, 2014, 01:25:27 PM »
Having not even read all of that yet, and doubt I will tonight...let me just say that, I think you knew I was being sarcastic, because that actually is really what I think of you.  And I know you think even worse of me, so I know you were also being sarcastic.  But it's kind of fun to not let our personal mutual disgust get in the way of other important things such as photography.

I also wanted to say that, having not thought much about what I asked above (apparently), I can answer my own question with a simple answer (as in, why not get a scope instead, etc.).

(Besides the fact that you are a birder)...It's because you want a wider field of view than most telescopes provide, correct?  I'm pretty sure most of the astro images I've seen, that needed a wider field of view, were not shot with telescopes, but rather SLR cameras and lenses.

There are some very good short focal length refractors out there specifically designed for wide field work. There is also the HyperStar option for Celestron SCT scopes, which focal reduces them to f/2 instruments. If you have a 2800mm f/10 11" EdgeHD, and convert it with Hyperstar, you now have a 560mm f/2 imager...that is an even wider field than I get with my 600mm. Some refractors are as short as 400mm.
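
For what it's worth, the HyperStar numbers above fall straight out of dividing by the reduction factor (a hypothetical helper; the ~5x factor is inferred from the 2800mm to 560mm figures in this post, not from Starizona's published spec):

```python
def reduced_optics(focal_mm, f_ratio, reduction_factor):
    """Effective focal length and f-ratio after a focal reducer.

    Aperture is unchanged, so both focal length and f-ratio
    shrink by the same factor.
    """
    return focal_mm / reduction_factor, f_ratio / reduction_factor

# Celestron 11" EdgeHD (2800mm f/10) with a ~5x HyperStar-style reduction
fl, fr = reduced_optics(2800, 10, 5)
print(f"{fl:.0f}mm at f/{fr:g}")  # 560mm at f/2
```

Same aperture, 25x the light per unit of sensor area (f/10 to f/2 is about 4.6 stops), which is why the HyperStar conversion is so attractive for wide field imaging.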

I think a lot of people DO use their camera lenses for very wide field work, for sure...although I think that is more a matter of convenience than anything. A good apochromatic 80mm f/4 scope (320mm focal length) can cost a pretty penny (several thousand), whereas a 300mm f/4 camera lens might cost $1000 less (though it can still cost a thousand or two itself). The actual apo 80mm f/4 refractor will be a much better device for imaging, and if you're serious about your astrophotography, it's the better route to go for wide field work (unless you're talking Canon Great White telephotos, in which case the Canon lenses will be better until you get into the real high end range of apo scopes).

There are actually some professional scientific groups that use arrays of Canon lenses to do deep sky imaging. I know that EF 200mm f/1.8 L and f/2 L, EF 300mm f/2.8 L, and EF 400mm f/2.8 L lenses have all been used in ultra fast (i.e. f/1!) telescopic arrays. Some have been used to find ultra dim deep field objects (super distant, dim galaxies); others have been used to find the dimmest nebula and galactic disc detail ever. According to some papers about an array that uses 12 EF 400mm f/2.8 L lenses in an f/1 configuration, the size of the average galaxy is significantly larger than what normally shows up in average visible light imaging...at f/1, you gather so much light that you can see the dimmest structures in the universe, with the exception of what Hubble itself sees.

Ok, I read some of the end of one of your posts. 9 micrometers for a pixel on a medium format imager...impressive. Would you happen to know what sort of imagers some of the well known observatories use? I'm sure it's probably customized, or "bespoke", componentry, but I was just curious. I imagine the sensor is even larger than medium format. The one in the Hubble Space Telescope, I assume, is quite large, but probably not the largest. Perhaps the "wide field" space scope uses an even larger imager (the one that hunts extra-solar planets, detects phase shifts from stars)...I think this is not even really called an imager, is it?

Hubble has some large imagers, but its newer and more advanced ones are not all that large. Certainly not the largest.

A lot of professional observatories use PlaneWave scopes on Paramount ME II mounts with FLI imagers as the lowest end imagers they might use. There are some much larger imagers out there. Some have diagonals as large as 90mm, which is utterly massive: that's a 64mm x 64mm sensor. These sensors also tend to have around 70dB of dynamic range. When you factor in a multi-stage watercooled TEC with a 70°C to 80°C Delta-T and sub-electron read noise levels, they utterly blow the crap out of your average DSLR sensor, or even a cooled $10,000 astro CCD imager. Imagers like that tend to cost a hundred grand a piece.

The larger PlaneWave scopes, including the $200,000 28", have become pretty standard these days for professional installations. They are usually set up as arrays and calibrated to point at the same locations in the sky synchronously. So, you might have an array of five PlaneWave 28" CDKs all with the high end 65mm or 90mm (diagonal) sensors. Your average multi-scope array setup for a university probably costs a couple million bucks, but in terms of combined relative aperture and sensitivity, such a setup can rival a mountain top observatory for total light gathering capacity, at a tenth the cost or less.

The largest telescopes on Earth, like the 10 meter Keck, are f/1.74 monstrosities. The Keck observatory houses multiple scopes, uses active optics, and dozens of imaging devices. I doubt any branded cameras were used...they probably use sensors from Teledyne, E2V, etc., built directly into the system.
