November 23, 2014, 09:42:03 PM

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - jrista

Pages: 1 ... 83 84 [85] 86 87 ... 309
1261
Animal Kingdom / Re: Show your Bird Portraits
« on: May 07, 2014, 01:42:53 AM »
Jrista, cool bird, nice shots!  Winter has returned with snow the last two days - ugh.

As your Prez says, expect worse weather from here on! 

Jack

Well, our Prez, the narcissistic god-complexed emperor-savior, is a blazing idiot-buffoon, so don't listen to him! :P

Winter is ending. It's only really been in the last two weeks that things went from still freezing at night to 65 degrees at night and high 70s during the day. If that only just happened to us down here, it's probably still a couple of weeks out for you guys farther north. Life always finds a way to force itself forth again.

1262
Animal Kingdom / Re: Show your Bird Portraits
« on: May 07, 2014, 01:13:11 AM »
Glossy/White-faced Ibis Hybrid

Common to Colorado are the White-faced Ibis. They are beautiful birds, with long slender necks, burgundy feathers, green wings, and a patchwork of faintly colored feathers on their backs. These birds are fairly elusive here in Colorado, and they have been a target of mine for a couple of years now. I've seen flocks flying overhead, usually at sunset, and have only been able to get distant silhouette shots.

A couple of days ago, at the Cottonwood Creek wetland, a good-sized flock of Ibis was hanging out, bathing and preening in the calmer backwaters of the wetland ponds. After some time carefully getting into position, I finally managed to get some nice shots of these beautiful waders. Once they were finally framed in my lens, I realized that at least one, if not a few, looked like Glossy Ibis. A VERY similar bird, the Glossy Ibis is endemic to the Everglades of Florida and very rarely ventures anywhere else. The key difference is the very thin white border around the Glossy's face, whereas the White-faced has a much larger border that blends into its burgundy and green head and neck feathers. However, the face on the Ibis in front of me was a thin and mottled white line...somewhat different from a Glossy.

The White-faced and Glossy Ibis cohabit only a very small region of the Gulf Coast, and only a couple of times a year. In my research to identify the bird I captured here, the only photos that looked identical were labeled "White-faced/Glossy Ibis Hybrid". I'm honestly unsure how common a cross between White-faced and Glossy Ibis is; however, given the small overlap in their ranges, I suspect it can't be much more common in Colorado than the Glossy itself.

Hybridized Ibis
White-faced and Glossy

Cottonwood Creek Wetland
Cherry Creek, Colorado

Canon EOS 7D
Canon EF 600mm f/4 L II
Gitzo GT3532LS + Jobu Pro 2

Read more on my blog.

1263
Photography Technique / Re: So I really stepped into it....
« on: May 07, 2014, 12:56:01 AM »
Good summary. Also, don't forget everyone on Facebook is an expert at everything...

My motto is similar, but different:

Just forget Facebook!! :P

 ;D ;D ;D

@JD:

Facebook, like most other social networks online, is a cesspool. It doesn't matter what goes in; everything that comes out is covered in sh*t. I was one of the early members of Facebook, within a month or so after it came online, back when MySpace was THE place EVERYONE was...and everyone gave you a quizzical look when you said "I'm on Facebook!" These days...I honestly wish I'd never registered my account. I no longer have any personal info on there. I've denied most everything access to my Facebook account, I've eliminated every game that somehow became linked into it, etc. I only use it for occasional updates about my photography and a very few other things. I'm also on Twitter...it's pretty much purely about my photography, and that is handled automatically when I update my WordPress blog.

If this all happened on Facebook, I'd just completely forget it. I think your demonstration images with the figurine were excellent and very explanatory. The advice you gave was solid. The problem with people these days is that, more often than not, they already think they are experts at whatever it is they think they are experts at...when they ask for advice, they aren't looking for advice...they are simply looking for someone to reinforce their already-formulated and overly inflated opinion of themselves and their skill. I think you got sucked into one of those inverted vortices where yes means no and "Help me" really means "Assert my own opinion of myself...validate me, so I can feel good!"

Bleh. Facebook. BARF.

1264
EOS Bodies / Re: New Sensor Technology Coming From Canon? [CR1]
« on: May 06, 2014, 11:37:49 PM »
Why do the etchings always have to go in the same direction?
I guess it relates to how they are cut out? What do they use, a saw? ;D

Well, there is no specific reason why they couldn't etch some additional sensors in the perpendicular direction, but it would be costly. Sensor fabrication works by etching the silicon with deep or extreme ultraviolet light through a template (mask). The template is oriented in a single direction, and the wafer is moved underneath the light beam so that multiple sensors can be etched. Etching a single sensor is a multi-step process, with various steps involving masking, etching, dissolution of masks, more etching, doping, and layering of new materials, then masking and etching again. This stuff has to be precise to the level of a few nanometers at most, so it is entirely automated. Rotating the wafer to etch additional sensors in a different direction introduces a source of error that could hurt yield.

I was under the impression that chip vendors typically used a single mask (template) for the entire wafer, though.  If so, then the additional work to add a sensor in the other direction would be limited to modifying the mask with an additional set of clear spots for the additional chip's features, modifying the cutting program slightly, and then modifying the picker to grab that one chip and rotate it ninety degrees.

If they aren't using one mask per wafer, then I suspect they're in a world of hurt, yield-wise, because the alignment of the mask would have to be perfect twenty to eighty times per pass across a given wafer, whereas with a single mask, it only has to be perfect once per pass across the wafer.

If Canon hasn't done this already, they should probably sit down, do the math on what percentage of chips are full-frame, and then design masks to etch the full-frame sensors at the center of the wafer, and surround them with crop sensors to maximize the surface coverage.  In theory, they could also mask the DIGIC chips, lens microcontrollers, etc. in the borders, so that only a tiny bit of the silicon wafer is wasted (because I'm pretty sure the robots have to have some bare spots near the edge of the wafers so that they can safely grab them).

Granted, you can't do that for every combination of chips—IIRC, some silicon parts likely require significantly different doping—but for parts that are fairly similar, you should be able to do so.  At a bare minimum, I would expect that you could combine different sizes of sensors almost arbitrarily, including not only full-frame and crop sensors, but also smaller sensors for use in camera phones and point-and-shoot cameras.

If you are assuming they use a single mask in a single exposure to generate an entire wafer of sensors, then you would be incorrect. Remember that the whole point of using a mask and deep or extreme ultraviolet light wavelengths is that it allows the mask to be orders of magnitude larger than the actual CMOS device being fabricated. We're talking many thousands to millions of times larger...macro scale vs. nano scale. A mask large enough to expose an entire wafer at once would be...immense. Generating and focusing the light beam would be an equally immense undertaking (assuming it's even possible to bend light enough to do it). You seem to think that making a single mask to expose the wafer in one shot is easier...if it were, I'm sure everyone would have moved to that approach decades ago. Fabbing one die at a time is how it's done in all industries, including CPUs, GPUs, etc. (which are considerably more complex devices than an image sensor, and use smaller processes as well).

Fabricating a sensor is a multi-step, multi-layer process, per sensor (or per CPU, per GPU, per IC), not per wafer. They design a sensor, generate the templates necessary to etch and layer the materials for all of the transistors, wiring, and other components involved in that sensor, then use that template again and again to fabricate multiple sensors per wafer. For each pass, the wafer is coated with a photoresist, which changes its chemical structure when exposed to DUV or EUV light. Every die on the wafer is exposed one after the other with the first template, then the entire wafer is bathed in chemicals to remove the exposed photoresist, etch away the exposed silicon, and dope the remaining silicon if necessary. The rest of the photoresist for the first pass is removed, a new layer of silicon or silicon-based material is added, another layer of photoresist is added, and the wafer is sent through the stepper again. Rinse, repeat, etc.

There are steppers, and there are scanners. Some large CMOS devices (like the very large ultra-sensitive CMOS sensor Canon developed a few years ago) cannot even be exposed in a single shot; to get proper focus, the beam has to be smaller than the full size of the template. Photolithography scanners therefore allow larger devices to be fabricated via a longer exposure, by moving the wafer and the UV reticle opposite each other during exposure. Canon manufactures both photolithography steppers and scanners, and according to their site, these devices support 200mm and 300mm wafers. Their latest devices can apparently use some techniques to image below the 90nm diffraction limit of the DUV light they use, so Canon is more than capable of fabricating sensors on a 180nm process with their own photolithography technology, and on 300mm wafers at that.

It's all automated and computerized; human hands aren't directly involved in moving the wafer or anything like that (at least not until it's done), so redirecting the beam or moving the wafer can be exceptionally precise. There has to be some negative space around each sensor anyway to allow them to be cut out of the wafer, but that's a careful balance of exactly the right amount of space...not too little, as you risk damaging dies during cutting, and not too much, so you don't waste space. The thing is, it all works in one orientation. While the wafer and reticle can be moved horizontally, from what I've read about photolithography devices, there is no rotation of the wafer or template or anything like that. The wafer moves under the template and UV beam, out to the chemical bath for etching and processing, on to have another layer of silicon deposited, back under the template, and so on. It is probably possible to build a fab that could fabricate devices in multiple orientations; however, I'm certain there are multiple challenges to making that possible, and it would likely increase cost dramatically. It wouldn't just be changes to the stepper or scanner...you would have to make sure the entire manufacturing pipeline could deal with devices of differing orientation, including the steps involved in cutting the wafer and separating out each die, and packaging the die, which involves adding pins or a land grid array and the like.
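To put rough numbers on the wafer-packing idea discussed above (full-frame dies surrounded by smaller dies to maximize coverage), here is a minimal sketch in Python. All of the dimensions, the scribe-line width, and the edge-exclusion margin are illustrative guesses, not Canon's actual figures; it simply counts rectangular dies whose corners fall inside a circular wafer.

```python
import math

def dies_per_wafer(wafer_mm, die_w_mm, die_h_mm, scribe_mm=0.1, edge_mm=3.0):
    """Rough count of rectangular dies fitting on a circular wafer.

    Walks a grid of candidate die positions and keeps those whose four
    corners all fall inside the usable radius (wafer radius minus an
    exclusion ring for handling). The scribe lines between dies are the
    "negative space" left for cutting. Margins are illustrative guesses.
    """
    r = wafer_mm / 2 - edge_mm
    pitch_w = die_w_mm + scribe_mm
    pitch_h = die_h_mm + scribe_mm
    count = 0
    y = -r
    while y + die_h_mm <= r:
        x = -r
        while x + die_w_mm <= r:
            corners = [(x, y), (x + die_w_mm, y),
                       (x, y + die_h_mm), (x + die_w_mm, y + die_h_mm)]
            if all(math.hypot(cx, cy) <= r for cx, cy in corners):
                count += 1
            x += pitch_w
        y += pitch_h
    return count

# Full-frame (36x24 mm) vs. APS-C-sized (~22.3x14.9 mm) dies on a 300 mm wafer.
print(dies_per_wafer(300, 36, 24))
print(dies_per_wafer(300, 22.3, 14.9))
```

Even this naive grid (no clever staggering, no mixed die sizes) makes the economics visible: many more crop dies fit per wafer than full-frame dies, which is why filling the leftover border area with smaller chips is attractive.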

1265
Animal Kingdom / Re: Show your Bird Portraits
« on: May 06, 2014, 03:18:57 PM »
Ruddy Shelduck, female
This is a very rare visitor; it is actually the first time I have seen one. I suspect it is a runaway from a duck farm somewhere south of here, though. But a very beautiful duck it is.
1DX, 600mm f4L IS II + 1.4xIII
1/800s, f8.0, ISO800

What a beauty! You're a lucky guy!

1266
Animal Kingdom / Re: Show your Bird Portraits
« on: May 06, 2014, 11:46:59 AM »
Float tubes - thanks, guys. I was thinking an inflatable boat, but they are much larger and heavier, so this may be a very handy alternative. Here's a good article: http://flyfish-edmonton.webs.com/floattubing.htm

Jack

I'd still be pretty worried about losing my gear. If I had some kind of floating stand to put the camera on...something very stable that couldn't be swamped, then I might feel safer...but even with a float tube, if I'm just holding my gear.... *shudder*

1267
Third Party Manufacturers / Re: Landscape Filters
« on: May 05, 2014, 10:56:02 PM »
LEE is notoriously sold out here in Germany.
I heard the LEE ND grad filters are polished on the thighs of virgins...and there is a shortage of virgins. ;)

LOL! Well, I guess that's why I have such a hard time finding LEE Grads for sale. :P

I bought into the Lee Filter System a while ago, maybe almost five years ago now. While I will say that it was difficult to buy in, as Lee is perpetually behind on producing enough supply for their demand, their filters are definitely worth it. I've tried other filters, and while quality seems to be improving these days, five years ago it wasn't uncommon to see a marked reduction in IQ when using off-brand filters vs. Lee's filters. They really are a step above the rest in most cases.

I still find that there are filter shortages; Lee filters almost always seem to be sold out. However, I now have most of the filters I need, so it's pretty rare that I need another. (One recent case is my broken polarizer...I haven't replaced it yet; it's been out of stock on the relatively rare occasions I look for it.)

1268
EOS Bodies / Re: More Sensor Technology Talk [CR1]
« on: May 05, 2014, 10:13:04 PM »
Welcome.

Re A7R - http://www.sansmirror.com/cameras/a-note-about-camera-reviews/sony-nex-camera-reviews/sony-a7-and-a7r-review.html

Scroll down to "How do they Perform?"

I believe that only applies to their 11-bit "RAW" encoding. That would be something akin to Canon's sRAW and mRAW, not necessarily in encoding, but in lossiness. Neither is actually a RAW file; they encode data in a specific way. In Canon's case, the m/sRAW formats are Y'Cb'Cr' formats, or Luminance + Chrominance (blue-yellow) + Chrominance (red-green). The Y or Luminance channel is stored at full resolution; however, the Cb and Cr channels are stored "sparse". In Canon's case, all of the stored values are still 14-bit precision, but they do store lower-resolution chrominance data. Canon's images would be superior to Sony's, in that they store more information in total and at a greater bit depth...however, both suffer from the same limitation: the information is not actually raw, which severely limits your editing latitude.

Generally speaking, the fact that these formats store lower-resolution color information doesn't matter all that much. Because of the way our brains process information, if done carefully, the lower resolution of the chrominance goes unnoticed in favor of a higher level of detail. YCbCr formats have been around for a long time, since the dawn of color TV even. The luminance channel was extracted and sent at full detail, while the blue/yellow and red/green channels were sent separately in a more highly compressed form. This actually allowed color information to be piggybacked on the same signal that "black and white" TV channels were sent on, making it possible for B&W TVs to pick up the same signal as color TVs.

If you have paid any attention to Canon's video features, you've already heard of similar video compression techniques. You may have heard of 4:1:1, 4:2:2, or 4:4:4. Those numbers refer to the Y, Cb, and Cr channel encoding. A 4:1:1 encoding has full luminance and 1/4-resolution Cb and Cr channels. A 4:2:2 encoding has full luminance and 1/2-resolution Cb and Cr channels. As you might expect, a 4:4:4 encoding uses the same sampling rate for all three channels, and is effectively "full resolution". A standard RAW image is also technically a 4:4:4 R'G'B' image.
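The storage savings behind those ratios are easy to sketch. Here is a quick illustration in Python using the simplified "1/2 and 1/4 chroma" reading described above (real codecs express these as horizontal and vertical sampling factors, but the per-frame totals for these cases come out the same); the 1920x1080 resolution is just an example value.

```python
def frame_samples(width, height, scheme):
    """Total samples per frame for common Y'CbCr subsampling schemes.

    Uses the simplified model above: Y is always stored at full
    resolution; Cb and Cr each keep the given fraction of the pixels.
    """
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:1:1": 0.25}[scheme]
    luma = width * height                                # full-resolution Y
    chroma = 2 * int(width * height * chroma_fraction)   # Cb + Cr combined
    return luma + chroma

full = frame_samples(1920, 1080, "4:4:4")
half = frame_samples(1920, 1080, "4:2:2")
quarter = frame_samples(1920, 1080, "4:1:1")
print(half / full)     # 4:2:2 needs 2/3 the samples of 4:4:4
print(quarter / full)  # 4:1:1 needs 1/2 the samples of 4:4:4
```

So even before any entropy coding, 4:2:2 cuts a frame to two thirds of full 4:4:4 size, and 4:1:1 to half, which is exactly why "sparse" chroma is such an attractive trade for formats like sRAW/mRAW.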


1269
Animal Kingdom / Re: Show your Bird Portraits
« on: May 05, 2014, 06:47:17 PM »
Feeding Cardinal

I am actually fairly certain that is a bird with a severely diseased, deformed beak. A few of the house finches each year around here end up with corrupted, diseased beaks like that. It's kind of sad. It usually happens to the birds that become malnourished due to an injury or lost eye during a fight (house finches can get pretty brutal during mating season).

1270
Animal Kingdom / Re: Show your Bird Portraits
« on: May 05, 2014, 06:05:38 PM »
Eldar, funny, I was just thinking of buying hip waders for shooting, but wondered how safe they are for gear if you stumble?! Any good stories?

Jack
He he, the advantage of being a flyfisher is that you are used to balancing on slippery rocks. Best advice is probably to have a wading stick; that way you always have an extra support on the bottom. I try not to go in too deep, though. Luckily I have no fun stories to tell, meaning all my equipment has survived so far ;)

I would strongly recommend wading pants though. You can get fairly good ones fairly cheap.

I wish I could wade in the waters around me. Most of our lakes are part of wetlands, which means they don't have rocky shores or rock-covered bottoms...it's all decaying plant matter, which ultimately results in this soft black muck that is several feet deep. Step in it, and at the very least you're going to lose your shoe...try to actually walk through it, and you might actually lose yourself as well, and certainly your gear. :\

1271
Animal Kingdom / Re: Show your Bird Portraits
« on: May 05, 2014, 04:56:30 PM »
Beautiful shots, Eldar! That 1D X is a creamy background machine...man, what I would give to have that kind of SNR.

1272
EOS Bodies / Re: More Sensor Technology Talk [CR1]
« on: May 05, 2014, 04:54:22 PM »
(f-ratio doesn't usually matter for planetary, as you image planets by taking videos with thousands of frames for anywhere from a couple minutes to as long as a half hour...then filter, register, and stack the best frames of the video, which is basically performing a superresolution integration...that eliminates blurring from seeing, and effectively allows you to image well beyond the diffraction limit.)

This is very interesting, and news to me. Dare I ask how that is possible? I assumed stacking would take the image to the theoretical best the setup can produce - how does it deal with diffraction? I was using my 500L with extenders to photograph planets using stacking recently, and assumed softness due to diffraction (I was at 4000mm f/40 for Jupiter and 5600mm f/56 for Mars).

There are different ways to stack. The most common is averaging: basic averaging, weighted averaging, or sigma-kappa clipping averaging. Those forms of stacking are usually used on star field images (nebulae, galaxies, clusters) to reduce noise. Noise is reduced by a factor of SQRT(stackCount), so stacking 100 frames reduces noise by a factor of 10.

You can also use "drizzle" stacking and other forms of superresolution stacking. The purpose of these methods is less to reduce noise (although they do help) and more to increase detail. Stacking for superresolution aims to choose the best version or versions of any given pixel out of thousands of frames, and to sample each pixel in each frame and across frames multiple times with alternate "rotation" factors or something similar. That allows the algorithm to extract the maximum amount of information for each point of your subject.

While diffraction certainly limits your resolution when doing planetary imaging, seeing limits it to a FAR greater degree. The vast majority of blurriness in planetary imaging is due to atmospheric turbulence and poor transparency, by about an order of magnitude compared to diffraction. Stacking thousands of frames with a superresolution algorithm easily cuts through both, assuming you get enough high quality frames. Because these algorithms pick the best version of a pixel and multisample each pixel, you can end up with surprisingly high-detail images, despite the effects of seeing and diffraction.
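The SQRT(stackCount) improvement from basic average stacking mentioned above can be sanity-checked with a toy simulation. This is a pure-Python sketch, not real astro-processing code: the "frames" are just a constant signal plus Gaussian noise, and the signal level, noise sigma, pixel count, and frame count are arbitrary illustration values.

```python
import random
import statistics

random.seed(42)

def noisy_frame(n_pixels, signal=100.0, sigma=10.0):
    """One simulated frame: a flat signal plus Gaussian noise per pixel."""
    return [random.gauss(signal, sigma) for _ in range(n_pixels)]

def stack_mean(frames):
    """Basic average stack: the mean of each pixel position across frames."""
    return [statistics.fmean(pixel_values) for pixel_values in zip(*frames)]

frames = [noisy_frame(2000) for _ in range(100)]
single_noise = statistics.stdev(frames[0])        # noise in one raw frame
stacked_noise = statistics.stdev(stack_mean(frames))  # noise after stacking

# Averaging 100 frames should cut noise by roughly SQRT(100) = 10x.
print(single_noise / stacked_noise)
```

Running this, the ratio lands close to 10, matching the SQRT(stackCount) rule; superresolution methods like drizzle go further by resampling pixel positions across frames, which this toy average does not attempt.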

1273
Good timing in Copenhagen

Photo shot with: Canon 6D and Canon 70-200 2.8

https://www.flickr.com/photos/77973666@N06/

Wow. I think we need to redefine what "good" means now...

1274
Software & Accessories / Re: Microfibre Cloths for Lens Cleaning
« on: May 05, 2014, 01:34:05 PM »
Thanks for all the feedback and terrific suggestions. I had no idea this was going to turn into such an interesting topic. I have always just used a microfiber cloth to clean my lenses but am now considering some of the suggestions above.

Btw - what's wrong with putting the microfibre cloth in the washing machine? What happens to it?

When I was collecting crystal whisky glasses I was advised not to dry them with a cloth that had been washed with softener as it, and other chemicals we tend to put in washing machines, can cloud the glass. How this correlates to the glass found on camera lenses I do not know, but I tend to just get new cloths rather than wash them. That said, the guy I was chatting to has never had a problem washing his, in fact he wishes the company were still producing them.

I don't think it's a huge issue but I can see the potential for problems depending on what the cloth is exposed to in the laundry process.  I agree that cloth can retain various chemicals or compounds from a wash process.  If I were to wash an important item like a lens cloth, I would probably just hand wash it so I can control what is introduced to the cloth in the form of dirt or other contaminants from other dirty items, soaps, grit, etc.  All you are trying to do is remove some light oils, dust and light dirt from the cloth anyway.  Woolite or some other delicate detergent would probably work great, then simply hang dry the cloth.  If you've ever held a dryer softener sheet, you will get an idea what is left on clean clothes in the dryer.  Nice for skin maybe but not for leaving smudges on lens glass.

Yeah, that's a good point. I don't want leftover detergent/softener or lint on it, plus our washing machine isn't the best at completely removing all that junk! I think I'll just hand wash them from now on. Thanks for the tip!

Washing your cleaning cloths the same way you wash your clothes is a bad idea. Most laundry detergents and softeners are explicitly designed to leave behind scent molecules to "freshen" up your clothing. Not all detergent gets rinsed out either, unless you use a doubly-long extended rinse cycle, and even then, you're still going to have soap residues in the fabric.

Washing your cleaning cloths with your clothes, or in the same way as your clothes, is a sure way to ruin them. You want very clean cloths, without any residues, detergents, or other molecules of any kind.

One of the best ways to clean cleaning cloths is to use activated water. This is water that's been sent through electrolysis, which slightly changes the pH and also creates "charge bubbles", electrically charged nodules of water molecules that bond to dirt in a similar way to detergent. Since it's really just water, there is nothing left behind.

1275
Diving for Fish

Cherry Creek, a state park, wetland, and nature reserve only a few minutes from my home, has really started to heat up with a whole ton of bird arrivals. Last night, I had a Black-crowned Night Heron practically pose for me, and at one point, he dove off his branch in an attempt to catch a fish. Sadly, the fishcapade was a failure, but I did capture a rather awesome flight shot.

Black-crowned Night Heron
Cottonwood Creek Wetland
Cherry Creek, Colorado

Canon EOS 7D
Canon EF 600mm f/4 L II
Gitzo GT3532LS + Jobu Pro 2
