December 22, 2014, 03:02:48 PM

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - jrista

Macro / Re: Flower macros
« on: July 07, 2014, 03:41:39 PM »
DPC, you really have an eye for this stuff! Amazing images!

EOS Bodies / Re: Eos7D mk2, How EXCITED will you be if . . .?
« on: July 07, 2014, 03:38:55 PM »
I'll get excited if the 7DII has substantially better IQ... AND that sensor gets used in the next EOS M!

Better IQ and much higher performing AF, and used in the next EOS M. I'd be interested in that. I want a more capable ILC for days when I generally can't bring my big DSLR kit...something with high resolution, fast focus, and good IQ (across the board, including better DR) would be nice to have. I could use it for wildlife/birds and landscapes.

EOS Bodies / Re: EOS 7D Mark II Announcement September 5, 2014?
« on: July 07, 2014, 03:36:58 PM »
It's funny...I have anticipated this camera for so long, as I've loved my 7D. Now that it's just around the corner...the only cameras I can think about have nothing to do with terrestrial photography! :P

As for Surface...Microsoft's future is dependent upon the entire Microsoft ecosystem being directly competitive with Apple products, specifically. To be quite blunt, Microsoft's hardware partners SUCK ASS. They NEEDED a big, fat, PAINFUL kick in the rear end to knock some sense into them. The mobile windows hardware market has been failing for years...products have gotten cheaper and cheaper, and the quality of those products has tanked right along with price and profit margin.

Well, if that's the way you see it. I see it a bit differently - Microsoft reduced the profitability of these companies to the point that their R&D became mainly small incremental upgrades, and now it enters the same market, able to underprice their hardware thanks to software licensing costs. In fact, because of this threat, Linux got considerably better video card support from AMD just last year as a result of Microsoft's actions, and that's just off the top of my head. As for products getting cheaper, that's probably true. What I don't agree with is quality.

Comparing something to Apple products doesn't really impress me; it's a company that can't even get its OFFICIAL chargers working safely (i.e. cutting corners on electrical safety to reduce the size of the charger). Under the legislation we had before entering the EU, it would not even have been possible to sell the OFFICIAL Apple chargers here, due to safety regulation violations.

Generally in Europe, it's considered a bad move to enter the same market as your customers - it's guaranteed to create ill will, so you really shouldn't be surprised by this reaction. The funny thing is, this is exactly the kind of recent behavior that makes Microsoft disliked, yet you're downplaying this example by calling it a genius move. Well, I don't know; it could be strategic genius at play, but chances are you're also taking the risk of alienating your OEMs. It doesn't happen overnight, though, and Microsoft has cash to play with. See what I'm getting at?

There is what is "considered", and there is what's actually happening. In terms of what's actually happening, Microsoft's entry into the tablet market has forced their competitors to become competitive. It was a stagnant market. The "cheap" products from Microsoft partners kept getting cheaper and cheaper, shoddier and shoddier, with price points down to a few hundred bucks. There was no quality, because the third-party product manufacturers had built their reputations on cheap and replaceable instead. That wasn't Microsoft's doing.

With Surface now a competitor, and Microsoft primarily competing with the higher end Apple, vs. the ultra low end crap that used to be standard fare for Windows-based products, Microsoft is forcing their PARTNERS to step up their game, enter a higher quality realm that also brings with it the potential for higher profits (as clearly demonstrated by Apple's high profit margins with high quality parts.) Consumers expect, and demand, QUALITY products now...the Windows ecosystem was dying because Microsoft partners designed it to be a CHEAP products venue. Something had to be done about that, otherwise the Windows platform WOULD have died, probably a silent death that no one noticed because there was nothing worth buying.

BTW, Microsoft is NOT price undercutting their other competitors in the windows ecosystem...Microsoft's Surface line is actually fairly expensive, and price cuts have only been because they were NECESSARY in order to increase sales...for comparable hardware, there are many cheaper options than Microsoft's products. There are also even higher quality products from others, like Dell, that rival the value and cost of Apple products.

Is Microsoft's move into the market as a direct hardware player liked by their partners-become-competitors? No, surely not. However, that doesn't change the fact that it was necessary. Without Microsoft FORCING their partners-become-competitors to actually BE COMPETITIVE, the entire market would have died. It was RACING towards death already, and not really because Microsoft products suck...they don't. It was racing towards death because NONE of the Microsoft/Windows ecosystem products were even remotely competitive with THEIR PRIMARY COMPETITOR: Apple!

I don't deny that Microsoft's move is unpopular and disliked. That doesn't change that it was utterly essential for Microsoft to FORCE their partners to step up their game, and drag themselves out of the muck of the ultra-cheap, ultra-low-quality crapware products they were making, into the higher level game that Apple plays. Apple is the focal point of the mobile computing industry, there is no question about that. Whether they deserve the reputation and respect they have or not, people do adore them and their products. Apple is the baseline...everything else has to be judged by that. A year ago, things still looked pretty bleak for the Windows ecosystem. Today? I just purchased a Dell XPS 15 that tops the specs of a MacBook Pro and Air combined, for less than two grand. It's a SOLIDLY built device that is just as beautiful as any Apple product, well built, blazing fast, fully touch capable. It's a wonderful product. And I HONESTLY do not believe it would have ever come into existence if Microsoft hadn't become a competitor in their own ecosystem.

Sometimes popularity isn't what saves a company...sometimes making the toughest decision possible to spur competition and innovation, even when it's incredibly unpopular, is the right decision. (Just ask Icahn... :P)

The ribbon was a DIRECT response to years of customer feedback on the Office UI. People hated having to dig multiple levels deep within menu systems to find features in Word and Excel primarily. Microsoft designed the ribbon in an effort to solve that exact problem, based on explicit CUSTOMER feedback about the problems with their old Office design. Ribbon was a success in that it brought everything right to the surface, one level deep in a series of tabs.

I know the background of the Ribbon. I have to F______ use it every F______ day. Including in Paint (seriously, what the hell, Microsoft?) and ZEMAX, whose latest update incorporated it despite CUSTOMER FEEDBACK not to go there. Luckily, with professional software they still have to implement a menu structure - and I've seen no one using the Ribbon in CAD software in our house. When it comes to Office, I agree that user feedback triggered the change, but the change itself is still botched.

You are saying that the Ribbon put everything on the surface, right? Take a look at the attached PNG. What is the circled button that I see there? You know, the one that EXPANDS the options in the Ribbon? The thing that should NOT exist based on the design criteria? This is basically a REVAMPED menu structure, except that it's actually WORSE. The expansion button is so small that it's harder to hit than the older text-based menu. I actually couldn't find the button the first time I needed it!

Add on top of that the fact that the Ribbon icon size is more or less fixed (I only need the text part, not the graphic icon, to begin with - deciphering icons is harder than reading text). I would like to place many more buttons there, but I can't! Because of that, I still can't orient the Ribbon vertically to take advantage of today's wide display aspect ratios. And yes, I've made my opinion known to Microsoft.

The little chevron you're talking about only appears when the screen size or window size is too small to display the entire ribbon. It's an adaptive thing. There is a LOT of functionality in Microsoft products. Microsoft's options are either to drop functionality, which is 100% guaranteed to cause an uproar...or...find some way of making all the necessary tools available even on screens that are too small to display them all at once.

Try using Office maximized on a larger screen. That little chevron you're bitching about? It'll disappear...and the entire contents of the ribbon will show up on the screen.

Sorry, but I find your complaints about the ribbon to be just an angry dude finding a reason to be angry about something...

Now you're just speculating about Microsoft forcing anything on its customers. You can still, and will always be able to, buy Office stand-alone. I did. I own a couple of stand-alone copies. I opted for that instead of the much cheaper $99/yr Office cloud standard edition. I prefer to store my data locally...but not everyone does. Some people, some corporations and smaller businesses, much prefer to offload the once-necessary costs and complexities of managing their own computer networks and systems onto a larger business entity that has more talented and effective resources for managing such things.

It could be. And I thought I made it clear this is speculation (though based on several snippets of fact). Getting back to the point: there's no legislation in place for data storage similar to what exists for, say, bookkeeping, which small enterprises typically outsource too - and data storage is actually a much more sensitive area. In Europe, I don't think this would fly - you're simply considered stupid if you do this before the legal standing is clear. Add on top of that the fact that cloud servers on US soil are subject to US government action at any second. This is not to say that your average worker cannot upload anything to the cloud, but he bears the brunt of the responsibility if data loss happens.

I'm very glad I don't live in Europe. The EU has demonstrated for decades that it has a fairly anti-business stance, and the penalties they have levied on large corporations are rather extreme at times. It's a punitive system, constantly punishing, punishing, punishing. I'm not really surprised you hold the opinions you do...I guess the actions of the EU make a lot more sense now...

Cloud is Microsoft's strength. Their biggest competitor there is actually Amazon, and they are making headway, helping spur a competitive market in the cloud services business.

This doesn't make any sense. You're saying Microsoft's cloud is for the enterprise, but as far as I know, Amazon is for consumers. Which is it?

You HAVE heard of Amazon Web Services, right? Amazon is the world's largest online retailer. They couldn't be that if they hadn't developed the technology to support that kind of infrastructure. Many years ago, Amazon started offering web services to expose some of the technological infrastructure they had built, and today they are the largest provider of core cloud services (i.e. big data, compute cycles, virtualized hosting, etc.) of anyone. Those services are used by enterprise businesses to host...pretty much anything. Even Netflix is hosted on Amazon's cloud servers.

Microsoft Azure directly competes with Amazon Web Services. Microsoft's cloud productivity services (i.e. Office in the cloud) directly compete with Google's web apps. Overall, Microsoft's cloud initiatives are gaining a lot of ground on their competitors.

The way app stores are run isn't really a Microsoft thing. Apple started that trend, and in many ways it is essential to the protection of consumers. Just look at how many problems and security issues can and have occurred on the Android platform, with its open app store, vs. how many of those kinds of issues occur on Apple or Microsoft devices. There needs to be some level of buffer, some small barrier to entry, to help weed out the apps designed by data and identity thieves for the purposes of data and identity theft, fraud, etc.

I agree with you about store safety on Android. But you're saying the app store isn't a Microsoft thing. I think here you'll need to look into the future and not the past, as you so readily advised me. Apple is the most profitable high-tech (HAH!) company on Earth, so it stands to reason that Microsoft has an incentive to go the same way - and this includes orientation towards the consumer. So the software companies building on the Windows ecosystem can also predict that in the future their profit margins will drop as Microsoft takes a larger share through the Microsoft Store. Which is fine; Microsoft can do whatever they want with their ecosystem, and I suppose you get something back for the price, but I'm saying there will be consequences and market share erosion, as not everybody will find the benefits worth their money. As you are already seeing with the case of Valve. And I never said this had anything to do with Windows 8, but with general Microsoft strategy.

Valve was pissed that Microsoft wanted to take a small cut of all in-app sales. Again, that isn't a strategy that Microsoft pioneered...Apple already does that. Valve would have the same problem if they tried to create an app in the Apple store.

As for cost, Microsoft takes the same amount as Apple. They always have. As a matter of fact, Microsoft often gives discounts to app developers as an incentive to get them onto the platform. Fundamentally, though, app developers on both platforms pay $99/yr to develop apps and get 70% of the revenue from sales. Both companies take 30%, which is then used to cover credit card transaction fees, infrastructure support costs, and the company's cut (which is less than 20% for both companies).

FYI, I was actually championing Windows over Linux when 7 was released. It's only now that 8 has been released, Microsoft's strategy is clear, and it seems consistent UI changes are the norm, that I'm considering switching to Linux with my next computer upgrade. Microsoft has actually never made the jump easier.

I'm not sure what is "consistent" about the UI changes. The only two things that changed between 7 and 8 were the start menu...which became a start screen...and the use of ribbons in the core desktop apps (i.e. Explorer). People on Windows have been using the ribbon for years now, so it isn't something new. I haven't heard much about that being a sticking point with potential upgraders, either...the biggest complaints are about the start screen. But as you can see from other participants in this thread, the vast majority of the complaints about the start screen are entirely unfounded.

Not to mention, if you really want a start menu, you can have it. There are free and cheap utilities to bring it back if that's something you REALLY REALLY want. It isn't enough of a reason to avoid upgrading, because everything else about Windows 8 has been improved over Windows 7.

Third Party Manufacturers / Re: Nikon's D800E 30% sharper than D800
« on: July 07, 2014, 10:58:25 AM »
That is laughable for several obvious reasons. First, they are saying the Zeiss lens is perfect and causes zero resolution loss; that is impossible - it is either breaking the laws of physics, or their measurements are suspect yet again. And just read any Nikon forum where people own both (and there are a surprising number): they will tell you that is simply not true. Yes, the E does resolve slightly more, but 30% more? No.

Not exactly perfect, just able to use the full resolution of the sensor.

If the sensor were 8 MP and a lens resolved 8 MP, would you call that perfect? It just pushes the limit of the sensor.

No, that isn't how it works; there is a complex relationship between each individual element's efficiency and the system's efficiency. Pretty much all lenses can actually resolve far more than any sensor - just look at the difference between a lens tested on an optical bench and one measured through a camera sensor: the difference is huge.

So if the sensor were a 20MP sensor, and the lens and sensor were both perfect, you'd expect to get 20MP of resolution - this is what DxO are claiming for the D800E and Zeiss 135 combo. However, if we ignore all other factors and the lens is only 99% perfect, it can only resolve 99% of a perfect sensor's resolution - and no sensor/camera is perfect. So the perfect sensor and 99%-perfect lens could equate to 19.8MP, in simplified form.

For the full equations look here under "System Resolution":

(Note, this response is for the benefit of everyone, it is not just a reply to PBD):

I wouldn't necessarily say it's complicated, but total optical system resolving power is non-obvious.

One thing: "perfect" resolution actually means "infinite" resolution. The resolving power of an optical system (i.e. a whole camera with lens and sensor) is limited by the resolving power of the least capable component. If that is the sensor, then the resolving power of the whole has an asymptotic relationship with the resolution of the sensor.

I think it's tough to say that a lens resolves 99% of "perfection"...since perfection requires infinite resolving power (at an infinite aperture, to be explicit). What is 99% of infinity? Lens resolving power also falls off as the aperture is made smaller. Lens bench tests often test at max aperture and at f/8, but that is not guaranteed. So one must be specific when discussing the resolving power of a system.

If we have a lens at f/4, and that lens achieves the maximum diffraction-limited resolving power, it resolves 173lp/mm. A theoretical 8mp APS-C sensor would resolve about 80lp/mm. The resolution of the whole camera, lens and sensor combined, can be closely approximated by taking the RMS of the minimum resolvable spot for each, and converting back to lp/mm. The lens resolves a spot of 2.9µm, the sensor a spot of 6.2µm. The two when working together convolve to produce a spot size of 6.85µm, or a system resolution of 73lp/mm. The two together resolve LESS than the resolving power of the least this case, the sensor.

If we dice all the pixels in our 8mp sensor into quarters (make the pixels half as large), we end up with a 32mp sensor capable of resolving 161.3lp/mm. Combined with the same lens, the system resolution is 117lp/mm. If we make a sensor with the same resolving power as the lens, we have a 39.7mp sensor. The resolving power of the system is 122lp/mm. We are still short of the 173lp/mm of the lens. We haven't actually achieved "perfect" resolving power, despite increasing the resolution of our sensor. You never actually get there: as you increase the performance of one component or the other, the bar just keeps getting higher...the mechanisms that convolve the image signal into the final output are constantly working against you, keeping you from achieving the real potential of either component. You would have to RADICALLY increase the performance of one in order to approach the limit of the other. To actually resolve the 173lp/mm spatial resolution possible with an f/4 lens, you would need pixels smaller than 0.25µm, or 250nm, in size. That is smaller than the wavelengths of all visible light! It's even smaller than near UV, getting into deep UV. A sensor with pixels that small would be a 5.34 GIGApixel sensor!  And that camera would still resolve only 171.7lp/'s still falling short of the 173lp/mm theoretical maximum of an ideal f/4 lens.
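The RMS spot-size combination described above is easy to sketch in a few lines of Python. This is a rough model, not an exact optics simulation: it assumes (as the figures above do) that a resolution of N lp/mm corresponds to a minimum resolvable spot of 1/(2N) mm, and that lens and sensor spots combine in quadrature. The function name is mine.

```python
import math

def system_resolution(lens_lpmm, sensor_lpmm):
    """Approximate whole-system resolution (lp/mm) by root-sum-squaring
    the minimum resolvable spot sizes of the lens and the sensor."""
    lens_spot = 1.0 / (2.0 * lens_lpmm)        # spot size in mm
    sensor_spot = 1.0 / (2.0 * sensor_lpmm)
    combined_spot = math.sqrt(lens_spot**2 + sensor_spot**2)
    return 1.0 / (2.0 * combined_spot)          # convert back to lp/mm

# Diffraction-limited f/4 lens (~173 lp/mm) + 8 MP APS-C sensor (~80 lp/mm):
print(round(system_resolution(173, 80)))     # ~73 lp/mm
# Same lens + 32 MP sensor (~161.3 lp/mm):
print(round(system_resolution(173, 161.3)))  # ~118 lp/mm (the post rounds to 117)
```

Note how the system always lands below the weaker component, which is exactly the asymptotic behavior described above.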

The only way to achieve perfect resolution is to have both a lens and a sensor with infinite resolving power. Obviously, such a lens does not exist. The best you can hope for is diffraction limited behavior at a lens' maximum aperture. Few lenses achieve diffraction limited behavior at f/4, most still have a small amount of optical aberrations, especially around the periphery. The Otus is one lens that approaches ideal performance pretty closely, though.

EOS-M / Re: Cheap 400mm advice
« on: July 06, 2014, 02:21:01 PM »
Just because a lens is faster for a given focal length doesn't mean it has more resolving power.  It does mean it has more potential resolving power due to larger aperture but aberrations do matter, and small fast mirror lenses are often much poorer optically than larger slower telescopes.

It's not necessarily that it's faster, really. Resolving power is related to the total surface area of the objective (i.e. the primary mirror in a reflector), which is set by the physical aperture (physical aperture, not relative aperture), which is ultimately responsible for gathering light. It's a simple test that can be done with stars: point any two lenses at the same place in the night sky. Regardless of which one is actually "faster", the one with the larger physical aperture will resolve more, and smaller, stars. F-ratio is simply that, a ratio...all it really does is describe in common terms how large the physical aperture of a lens will be for a given focal length. I wasn't trying to say that a "faster" 800mm lens is going to resolve more in my example with 800mm lenses...I was saying that the larger physical aperture is going to gather more information per point on any given subject, and thus it will have higher resolving power.

This is almost exclusively true with telescopes, which are almost always diffraction limited. It is true that cheaper optics in a lens have the potential to introduce aberrations. However in the case of astrophotography, all that really means is instead of resolving a single crisp, bright point of light for a star, you resolve a bright point of light that has some kind of halo around it. Optical aberrations don't necessarily reduce resolution, they just muck with the quality of the image.
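To put rough numbers on the aperture/star-resolution link above, here's a quick sketch using the Rayleigh criterion (θ ≈ 1.22λ/D) for a diffraction-limited circular aperture. The criterion itself is standard optics rather than something from the posts, and the 550nm mid-visible wavelength default is my assumption:

```python
import math

def rayleigh_limit_arcsec(aperture_mm, wavelength_nm=550):
    """Smallest resolvable angular separation (arcseconds) for a
    diffraction-limited circular aperture: theta = 1.22 * lambda / D."""
    theta_rad = 1.22 * (wavelength_nm * 1e-9) / (aperture_mm * 1e-3)
    return theta_rad * (180.0 / math.pi) * 3600.0  # radians -> arcseconds

# Doubling the physical aperture halves the resolvable angle,
# which is why the bigger objective splits closer/dimmer stars:
print(round(rayleigh_limit_arcsec(75), 2))   # ~1.85 arcsec
print(round(rayleigh_limit_arcsec(150), 2))  # ~0.92 arcsec
```

Note that focal length never appears here: only the physical aperture diameter sets the diffraction limit, which is the point being made above.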

EOS-M / Re: Cheap 400mm advice
« on: July 06, 2014, 01:29:21 PM »
How about one of these stuck on an EOS M? That should give me 1280mm.

It ultimately depends on what your goals are. Long focal length is certainly important, and I think around 800mm is a good place to start for shooting the moon with APS-C.

There is another factor, however. Fundamentally, resolving power is linked to the physical size of the aperture. This usually isn't as apparent in normal photography as it is in astrophotography, but when you start resolving the very fine detail that exists in objects in space, this fact begins to become very important.

Assume you had an 800mm f/4 lens, an 800mm f/5.6 lens, and an 800mm f/8 lens. Most people's inclination would be to think: they are the same focal length, so they should perform the same as long as I expose longer with the f/5.6 and f/8 lenses. In terms of the brightness of the object, that is true...however, the f/5.6 and f/8 lenses won't resolve as fine a level of detail as the f/4 lens, and the f/8 won't resolve as fine a level as the f/5.6.

It isn't simply a matter of magnifying more; it's maintaining your resolving power as you magnify more. With the EF 600mm f/4 L II and a 1.4x TC, I have an f/5.6 lens. The reason my moon photos are so sharp and detailed is that the combo maintains high resolving power thanks to a large aperture (remember, the entire surface area of the lens is gathering light for every single mathematical point on your subject...the more light gathered for each point, the more complete and refined those points will be when focused on your sensor).

If you just go with a 1250mm f/13.9 lens, the moon will be very large, but you won't actually resolve more detail than with, say, an 800mm f/8 lens. The 1540mm f/12.1 lens is actually going to be a better option than the 1250mm f/13.9, because it has a much larger physical aperture: 127mm vs. 89mm. A 127mm aperture is actually very nice...close to the 600mm f/4's, and it would be my top recommendation from the list of telescopes offered by Lee Jay. A 1600mm f/12 optic is going to be a powerhouse for resolving moon detail...not to mention you could do some amazing planetary imaging with it and a Barlow lens as well (at 4800mm with a 3x Barlow, a simple webcam or something like the QHY5L-II color planetary camera, some video imaging software (I think the QHY5L-II comes with some), and a tool like RegiStax, you could create AMAZING planetary images, as well as some awesome close-ups of the moon itself).
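The physical apertures quoted above fall straight out of the definition of f-ratio (N = focal length / aperture diameter), so they're easy to sanity-check. A quick sketch; the function name is mine, the numbers are from the post:

```python
def physical_aperture_mm(focal_length_mm, f_ratio):
    """Physical aperture diameter implied by a focal length and f-ratio,
    since f-ratio is defined as focal length divided by aperture."""
    return focal_length_mm / f_ratio

print(physical_aperture_mm(1250, 13.9))  # ~89.9 mm (the post quotes 89mm)
print(physical_aperture_mm(1540, 12.1))  # ~127.3 mm
print(physical_aperture_mm(600, 4))      # 150.0 mm -- the EF 600mm f/4's aperture
```

So despite the shorter focal length, the 1540mm f/12.1 scope has the bigger light-gathering objective, which is why it resolves more lunar detail.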

WordPress should do what you want. It is a multi-user system, so you can set up other users for your WordPress site, and they can upload content if they have the right user level (i.e. contributor or editor). WordPress has a LOT of themes, although the best ones are for-pay (usually worth it; a one-time cost for any given site). WordPress sites hosted on WordPress's own service have a decent set of features, usually enough for the vast majority of people to get by.

If you need ultimate flexibility, you can always find a cheap web host that offers WordPress hosting, and you can install any of thousands of plugins that make WordPress one of the most powerful web site and blog platforms on the planet.

If you want to see what WordPress can do, take a look at my personal site: It's built entirely on WordPress, including the blog, photo carousel, gallery, and individual pages.

EOS-M / Re: Cheap 400mm advice
« on: July 06, 2014, 05:31:28 AM »
Hey guys I was wondering which would be better for achieving ~ 400mm focal length with the M. I would like to take some occasional pics of the moon. I've done it before and found 400mm to be long enough with a bit of cropping.

You want a LOT more focal length than 400mm to image the moon. I used an 840mm lens (EF 600mm f/4 L II w/ 1.4x TC) to produce this image:

Look Jon, stop playing these amateur games and get real. This is what you need.

Haha! Now a MOON LENS! :D And apparently, one hell of a giant EOS as well...  :o

EOS-M / Re: Cheap 400mm advice
« on: July 06, 2014, 01:34:32 AM »
Hey guys I was wondering which would be better for achieving ~ 400mm focal length with the M. I would like to take some occasional pics of the moon. I've done it before and found 400mm to be long enough with a bit of cropping.

You want a LOT more focal length than 400mm to image the moon. I used an 840mm lens (EF 600mm f/4 L II w/ 1.4x TC) to produce this image:

I used the 7D and 600mm bare to produce this image:

The moon doesn't even fill the frame at 840mm on the 7D. It would be even smaller on a full frame camera. If you're really interested in imaging the moon, you want AS MUCH focal length as you can get your hands on. I haven't tried it yet (well, I tried it, but I couldn't get the darn thing stable enough to actually take any images...I now have better gear, so maybe it's time to try again), but I would be willing to bet that 1200mm on the 7D would STILL not result in the moon filling the frame entirely.

I'd add that I don't want to scroll for like 20 seconds to reach the calculator in Greek Win8, while in Windows 7 I could just press start and type calc.

In addition to the previous post's solution (just typing on the start screen = instant search, just as fast as Win7), in Windows 8.1 you can also press Win+R to get a Run dialog box immediately.

Quote from: joemod
Also, if Microsoft's decision on the Win8 UI was correct, why do they revert it in Win9?
Please accept that, regardless of my prejudice against Microsoft (which I admit I have, ever since their crusade against Linux), Win8 is unusable for me as a desktop user, while I am pretty fond of Windows 7.

Win9 does not revert it, it refines it. Windows 7 is the past; while it may be great on the desktop, it is a failure on tablets due to the small UI elements. Ignoring tablets would be the death of Windows in time. So Windows 8 was the beginning of an OS that can do desktop and tablet. Windows 9 refines it; it does not revert it. The start screen will still be there for tablets, and desktop will get a blend. Windows Store apps will need to remain a part of desktop Windows, as over time it will be the way most apps are delivered.

Most of the points brought up about Windows 8.1 here are by those who do not know how to fully use Windows 8.1. Your calc "problem" is an example of this, as the procedure is virtually identical for 7 and 8 - you just start typing at the start menu / screen respectively (or use the Run hotkey). Another poster claimed it is hard to find the desktop, when the desktop is the default boot-up screen for Windows 8.1 when not using a tablet - which makes me wonder if people have even made a solid attempt at 8.1 or are just haters. Windows 9 will perhaps make some of these tricks more obvious and make the OS easier to grasp, but in reality it will be the same OS - just polished, like 7 is a polished version of Vista.

Hi Ruined.  :D Nice to meet you!  ;D

I'd add that I don't want to scroll for like 20 seconds to reach the calculator in Greek Win8, while in Windows 7 I could just press start and type calc. Also, if Microsoft's decision on the Win8 UI was correct, why do they revert it in Win9?
Please accept that, regardless of my prejudice against Microsoft (which I admit I have, ever since their crusade against Linux), Win8 is unusable for me as a desktop user, while I am pretty fond of Windows 7.


It's so sad that so much misinformation about Windows 8 has permanently infected people's brains. :P

In Windows 8 you can STILL just hit start (i.e. the Windows key on the keyboard) and just start typing! In Windows 8, search is integrated and FIRST CLASS. On the start screen, you can just start typing...type ANYTHING, and it will search in multiple contexts. If you start typing "calc", a panel will slide out from the right-hand side of the screen, and you'll see a filtered list of apps, then other things, that match "calc". The FIRST thing that comes up is the calculator:

Once you see it listed and highlighted (takes about 0.02 seconds), you just hit enter and it runs...ON THE DESKTOP! :D

If you run a search, and Windows determines it did not actually find exactly what you're looking for, hitting enter runs a Bing universal search:

Personally, when I first got Windows 8, I didn't make any assumptions about things I figured probably wouldn't work. I just started using the start screen how I'd always used the start menu. I simply started typing on the start screen to search for apps and other things...just like I always did in Windows 7. I wasn't even surprised when it worked...OF COURSE IT WORKED!  ::)

This is why I so actively defend Microsoft. People make a LOT of wild assumptions, then figure their assumptions are actual fact, when in reality they are the farthest thing from it. You ASSUMED that Windows 8 couldn't search for apps just by typing the app name on the start screen. That is fundamentally incorrect. Search is a first-class citizen of Windows 8. Not only can you directly search for apps, files, and anything else local...but when Windows can't find exactly what you're looking for locally, you can launch the universal search, which does a deeper search of everything everywhere...locally, and whatever is indexed by Bing. Such was the case with my "Downtown Denver" search above.

People are shortchanging themselves by assuming incorrectly about Windows 8 and sticking with Windows 7. Windows 7 is more power hungry, slower, and less capable than Windows 8. That's all there is to it. I would be willing to bet that over 90% of the assumptions about things that are supposedly missing, moved, or improperly implemented in Windows 8 are flat out wrong.

Also if Microsoft's decision on win8 UI was correct why do they revert it in win9?

Well, first, there is no Windows 9 yet. So, Microsoft hasn't "reverted" anything. All the press releases indicate Microsoft is going to be building on the changes in Windows 8.x when Windows 9 finally rolls around. The start screen isn't going anywhere, but it sounds like it will be greatly enhanced. The dual-mode nature of the platform will remain. Deeper integration and reduction of independent code bases for Windows 9 on the desktop and tablets, and Windows Phone 9 on phones, will be reduced even further, bringing us closer to a truly unified OS that runs on everything (probably won't happen before Win 10, but as I said before, things take time, especially in an iterative world.)

If you are referring to Windows 8.1 rather than 9, well, again, nothing has been reverted. New capabilities and features have been ADDED, but nothing has been taken away. The start screen, for example, is still there in Windows 8.1. The only difference is that users can now choose whether to BOOT to the desktop or the start screen. That's an extremely simple change, and an obvious one. Why wasn't it in the original release? Who knows, however it isn't surprising that not every single feature imaginable by a billion customers makes it into the FIRST release of anything. Every software development project has to pick its battles and solve the most important problems first. Boot to desktop could very well have already been on Microsoft's TODO list...and it just didn't make the cut.

If you want an idea of how Microsoft's internal processes work, read Eric Lippert's blog. He was a lead on the C# compiler team for many years. He is a public figure and a regular participant in large software development communities like StackOverflow. Being a public figure like that, he was a front man for EVERYONE's feature requests for the C# language. He wrote blog posts on many of them, the ones he received most often, and explained why they could not be added, or why, if they were added, it had to be done EXTREMELY carefully, or why, even when they were easy to add, they sat at the very bottom of the carefully prioritized list of things that needed to be done with C#. Windows is no's software. All software projects have goals and requirements, and that list of goals and requirements is prioritized in order from the most critical down to the nice-to-have but not actually necessary, for any given release. Windows is never going to have every single feature that every single user wants every time it's released...but many of the most important or most frequently demanded features are likely to end up in the product, if feasible, in subsequent releases.

EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 04:45:40 PM »
jrista, I don't see how this press release quantifies performance. It says nothing about SNR or DR etc.; it doesn't even try to claim that the sensor is competitive in those regards. It may also require hardware around it which is neither feasible nor practical in today's cameras. For instance, to read out and process all that data would require a lot more readout channels and processing power than what you see in a 1DX today.

Press releases are also typically written by PR or marketing personnel for purposes other than scientifically describing their findings.

As for patents, I don't read them, but they don't actually give any data about how well the actual implementations perform, do they? Without that data we can't tell if it's awesome or not. What seems great on paper might be bad in practice.

SNR and DR aren't the epitome of sensor performance, though. They are only factors of sensor performance. Both are heavily affected by readout noise, and it's been demonstrated that column-parallel ADC designs produce less read noise, by at least two companies now (Sony and Toshiba, and I believe other high end sensor manufacturers have similar designs in the works as well). Canon described some kind of hyperparallel on-die ADC for the 120mp APS-H.

Assuming the silicon process was the same generation as the cameras of the time it was released, it's logical to assume it has the same fundamental characteristics as the 1D IV. The 1D IV had around 45% Q.E. and the same DR limitations as all Canon cameras (due to read noise). I see no reason to assume this sensor would be significantly different in those fundamental statistics at worst, and it could be better if its highly parallelized readout offers similar improvements to Sony's and Toshiba's. Canon silicon hasn't really changed much over the years...the most significant improvement each generation is a few percent jump in Q.E.
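To make the "DR limited by read noise" point concrete, here's a minimal sketch of the standard engineering dynamic range calculation. The full-well and read-noise numbers are purely illustrative assumptions, not measured specs for any Canon or Sony sensor:

```python
import math

def dynamic_range_stops(full_well_e, read_noise_e):
    """Engineering dynamic range in stops: full-well capacity over read noise.

    DR (stops) = log2(full_well / read_noise). Lower read noise directly
    raises DR without touching the pixel or its full-well capacity.
    """
    return math.log2(full_well_e / read_noise_e)

# Hypothetical electron counts for illustration only (NOT real camera specs):
# same full well, but a noisier off-chip readout vs a quieter parallel one.
print(dynamic_range_stops(60000, 25))  # noisier readout, ~11.2 stops
print(dynamic_range_stops(60000, 3))   # quieter readout, ~14.3 stops
```

The point of the sketch: the pixel itself is unchanged between the two calls; only the readout noise differs, and that alone accounts for a roughly three-stop DR gap.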

You're also misunderstanding the point of using a column-parallel ADC. You actually DON'T need as much processing horsepower to read out more pixels faster when you hyperparallelize the ADC units. The problem with having too few units is that each unit MUST be powerful enough to handle the hundreds of thousands or millions of pixels it has to process. That means higher frequency, and it also means more attention must be paid to the design of those units to limit the amount of noise they add to the signal (and even then, they are noisy parts because of the high frequency).

By using one ADC unit per column, each ADC can operate at a lower frequency. The lower frequency immediately offers a benefit in terms of read noise. With other techniques, such as moving the clock and driver off to a remote area of the die (like Exmor), you can reduce noise even further (Exmor took it one step farther and used a digital form of CDS, which they claim was better than using analog CDS...however, ironically, they added analog CDS back into the mix with later versions of Exmor; for video they do both analog and digital CDS). You trade die space for the ability to operate at a lower frequency and power. With a 180nm process, that's a no-brainer. This HAS BEEN DONE...both Sony and Toshiba have working CP-ADC designs built into CMOS sensors that are actually used in consumer products. Sony has a number of technical documents that explain how they achieved exactly what Canon describes in their 120mp APS-H papers and patents...low power, high speed readout of high resolution sensors via hyperparallel ADC.
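The frequency argument above is just arithmetic, so here's a quick sketch of it. The sensor dimensions are rough assumptions for a 120 MP APS-H-like part at 9.5 fps, and the "8 shared ADCs" figure is a made-up comparison point, not a real Canon readout topology:

```python
def conversions_per_second_per_adc(width_px, height_px, fps, num_adcs):
    """How many pixel conversions each ADC must perform per second."""
    total_rate = width_px * height_px * fps
    return total_rate / num_adcs

# Illustrative dimensions for a ~120 MP sensor at 9.5 fps (assumed numbers):
w, h, fps = 13280, 9184, 9.5

# A handful of fast shared ADCs vs one slow ADC per column:
shared = conversions_per_second_per_adc(w, h, fps, 8)
per_column = conversions_per_second_per_adc(w, h, fps, w)

print(f"shared ADCs:     ~{shared / 1e6:.0f} million conversions/s each")
print(f"column-parallel: ~{per_column / 1e3:.0f} thousand conversions/s each")
```

With one ADC per column, each converter only has to chew through one column's worth of pixels per frame, so it can run three orders of magnitude slower than a shared converter, which is exactly where the noise and power savings come from.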

So, even though Canon's 120mp APS-H isn't in an actual consumer grade product that we can buy, it uses technology that mirrors products from other brands that we can buy, and that have been tested. The most telling are Sony security video cams that use Exmor sensors, which can operate at very high frame rates in very low light...they are not only doing high speed readout with very, very low noise and relatively high DR, they are also doing processing with image processors that are packaged to the bottom of the sensor, and wired directly to it.

To be straight, I am speculating a bit, but it's very educated speculation. It isn't like it's just 100% completely unfounded drivel. :P

For instance, Foveon sensors seem like a great technology on paper, do they not? No CFA wasting away 2/3rds of the light, and no demosaic algorithm interpolating data and making images soft in 100% view. Yet in real life Foveon is outperformed by standard CFA sensors; it gives the resolution but does not perform well in other aspects. Real life performance is what counts, and Foveon sensors don't have it (yet; I would like to see that change).

As for Foveon, I think you're incorrect in your assessment. Foveon only "fails" at ONE thing: resolving power. There have been debates in the past on these forums where Foveon fans claim that because it has a 100% fill factor for all colors, it has as much or higher RESOLUTION than bayer sensors. Those claims are wrong, as bayer sensors get largely the full benefit of the raw sensor resolution in terms of luminance...they only really suffer in color resolution and color fidelity (both areas where Foveon excels).

For what Foveon is, at its REAL spatial resolution, they are actually very good. Their red channels are a little noisier, but their blue channels are less noisy than bayer. No surprise, given the layering order of color photodiodes in the Foveon. Even though image dimensions/resolving power for Foveon is lower than in bayer sensors, those smaller images usually exhibit high quality. I do think that color fidelity with Foveon cameras is superior to what I get with my Canon DSLRs (I just like my resolution too much to give it up :P). So I think it's unfair to claim that the real-life performance of Foveon is bad or even poor. For what it is, its real-life performance is very good.
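A rough sketch of the resolution comparison I'm making above. The ~0.8 demosaic efficiency factor is a commonly quoted rule of thumb (not a measured constant), and the megapixel counts are hypothetical, not specific products:

```python
def bayer_luma_resolution_mp(sensor_mp, demosaic_efficiency=0.8):
    """Approximate effective luminance resolution of a Bayer sensor.

    Demosaicing recovers most of the raw luminance detail; the ~0.8
    factor is a rule-of-thumb assumption, not a measured value.
    """
    return sensor_mp * demosaic_efficiency

def foveon_spatial_resolution_mp(photosites_mp_per_layer):
    """Foveon resolves full color at every spatial location, so its
    spatial resolution is simply the per-layer photosite count."""
    return photosites_mp_per_layer

# Hypothetical sensors: a 20 MP Bayer vs a 15 MP-per-layer Foveon.
print(bayer_luma_resolution_mp(20))      # ~16 MP of effective luminance detail
print(foveon_spatial_resolution_mp(15))  # 15 MP spatial, but full color per site
```

This is why "3x the photodiodes" doesn't translate into 3x the resolution: the Bayer sensor keeps most of its raw pixel count as luminance detail, while Foveon's three layers all sit at the same spatial locations.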

The only drawback of Foveon is its resolving power...and I truly believe that Sigma has done Foveon a big disservice by trying to upsell it as having more resolution than it really does, or by somehow claiming that because it gathers full color information per pixel, upsampling it somehow beats bayer sensors for resolution and detail. Actual real-world examples that do exactly that have proven otherwise. Foveon's problem isn't that it's bad's that Sigma owns it, and Sigma doesn't have the marketing power nor the R&D budget to really make Foveon shine and become a highly competitive alternative. Sigma is much more a lens company than a camera or sensor company, IMO. I do believe it COULD be highly competitive in the hands of a wealthier corporation that could more richly fund its development.

EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 04:15:37 PM »
Even 180nm is, what, 8 generations ago?

Not really. You have to take it in proper context. For large sensors, APS-C and larger, I don't know of any that use a process smaller than 180nm. The ultra-tiny form factors, the sensors that are a fraction the size of a fingernail, are the sensors that use very small fabrication processes, but even those aren't eight generations more advanced. I think a 65nm process is used for sensors with 0.9µm and the upcoming 0.7µm pixel sizes. That would be three generations smaller transistor size (180nm -> 130nm -> 90nm -> 65nm).

You also can't use CPU transistor fabrication technology as a basis of comparison. CPUs are primarily on 22nm, with 14nm parts supposedly due this year (maybe they are already here; I haven't looked into it). But that is a whole different market. We know that Canon's DIGIC 5 used a 65nm process, manufactured by Texas Instruments, I think. But that's still a processor. You can't mix that with sensor tech.

At the moment, I think 65nm is the smallest fabrication process currently used for image sensors. The next step would be 45nm, and I've read a couple of patents that describe sensors with 0.7µm (700nm) pixels that would be fabricated with a 45nm process, but I haven't actually seen anything yet that indicates it's being done. Even if there were sensors being manufactured with 22nm gates, that is six generations...not eight. Given that larger form factor sensors don't even remotely need a 65nm gate to be highly efficient, I wouldn't say that a 180nm process is out of date for APS-C and FF sensors. If it is out of date, it would only be out of date by one generation, 130nm.

I did read about some high sensitivity sensors recently called SPADs, or Single-Photon Avalanche Diodes, which are designed for specialized purposes (medical imaging such as PET and FLIM, scientific imaging, astrophotography, etc.). These are pretty bad-ass CMOS devices with ultra high sensitivity (basically photon counters). They have been fabricated on 180nm, 130nm and 90nm processes. They generally seem to be 130nm parts and are usually fairly small sensors (smaller than APS-C), with pixel sizes maybe a little smaller than current APS-C parts. They have a specialized pixel structure, but overall are not all that much different from your average CMOS sensor. These are CUTTING EDGE devices...really cutting edge. They do their job extremely well, and really don't need transistors smaller than 90nm. It actually seems that smaller processes make it more difficult to fabricate these high end sensors than larger ones do.
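To illustrate what "basically photon counters" means statistically, here's a toy simulation of a photon-counting pixel. It's a deliberately simplified model (no dead time, no dark counts, no afterpulsing, which real SPADs do have) that just demonstrates the Poisson nature of photon arrivals:

```python
import math
import random

def simulate_photon_counts(mean_photons, trials, seed=42):
    """Simulate an idealized photon-counting pixel over many exposures.

    Toy model: arrivals are Poisson-distributed; dead time and dark
    counts are ignored. Poisson samples drawn via Knuth's method,
    which is fine for small means.
    """
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        L, k, p = math.exp(-mean_photons), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                break
            k += 1
        counts.append(k)
    return counts

counts = simulate_photon_counts(mean_photons=10, trials=5000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# For a shot-noise-limited counter, variance ≈ mean (Poisson statistics).
print(mean, var)
```

The takeaway: even with zero read noise, a photon counter is still shot-noise limited, with SNR growing only as the square root of the collected photon count.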

So I really don't think that 180nm is old and outdated, not for the size of sensors we're talking about.

EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 02:38:29 PM »
The thing operated at 9.5fps, and it really doesn't matter if it had small pixels, because fundamental IQ is related to total sensor area and Q.E., not pixel area.

Except that the more rows and columns that are present, the more space is lost to the barriers in-between. If they can keep the area covered by pixels constant whilst reducing the pixel size to provide more pixels then yes, you're right. This comes down to the manufacturing process, where Canon has been using a larger process. Canon is using an old .5 µm process, while Sony and Toshiba have advanced to .25 µm and .18 µm processes. See the Chipworks site for more info.

Canon used a smaller process for this. From what I recall of either a press release or some other specs listed somewhere, the sensor was actually stitched together from separately manufactured parts. (A lot of Canon's prototype sensors are actually made that way, by stitching multiple separately fabricated parts into a single device. Their ultra high sensitivity 0.1 Lux large format sensor, for example, is manufactured that way as a simple matter of necessity.) I suspect they fabricated it with the same fab they produce their small form factor sensors with. Chipworks verified years ago that Canon has had a 180nm copper interconnect process, one even capable of producing sensors with light pipes, for a while now.

So yes, you are correct, Canon's current APS-C and FF sensors are built on a 500nm process, which wouldn't support a sensor like this (well, it's just that the photodiodes would be really tiny, and therefore the fill factor, the total actual light-sensitive area, of the sensor would be lower than that of an identically sized sensor with larger pixels). However, there was a rumor not long ago that Canon was revamping its fabs, moving to larger wafers. Either they are repurposing some of their existing fabs for the small form factor stuff, or they are building new fabs to expand their 180nm fab capacity.
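A crude sketch of why a coarser process hurts fill factor as pixels shrink. It models the non-photosensitive wiring/transistor region as a fixed-width border around each photodiode; all the micron figures are illustrative assumptions, not Canon's actual design rules, and the model ignores microlenses (which recover much of the loss in practice):

```python
def fill_factor(pixel_pitch_um, wiring_overhead_um):
    """Fraction of a pixel's area that stays light-sensitive after a
    fixed wiring/transistor border is subtracted (front-illuminated,
    no-microlens toy model)."""
    active = pixel_pitch_um - wiring_overhead_um
    if active <= 0:
        return 0.0
    return (active / pixel_pitch_um) ** 2

# Hypothetical overheads: a coarse 500nm-class process needs a wider
# border per pixel than a finer 180nm-class process (assumed numbers).
for pitch in (6.4, 4.1, 2.0):
    coarse = fill_factor(pitch, 1.5)
    fine = fill_factor(pitch, 0.5)
    print(f"{pitch}µm pitch: coarse {coarse:.2f}, fine {fine:.2f}")
```

The pattern the sketch shows: at a big 6.4µm pitch the process barely matters, but as the pitch shrinks toward 2µm, the fixed overhead of a coarse process eats nearly the whole pixel, which is exactly why a 120 MP APS-H part needs a finer process than Canon's 500nm mainline.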
