November 22, 2014, 09:43:09 PM

Show Posts



Messages - jrista

991
I'd add that I don't want to scroll for like 20 seconds to reach the calculator in Greek Win8, while in Windows 7 I could just press Start and type calc. Also, if Microsoft's decision on the Win8 UI was correct, why are they reverting it in Win9?
Please accept that, regardless of my prejudice against Microsoft (which I admit I have had since their crusade against Linux), Win8 is unusable for me as a desktop user, while I am pretty fond of Windows 7.

*Sigh*

It's so sad that so much misinformation about Windows 8 has permanently infected people's brains. :P

In Windows 8...you can STILL just hit start (i.e. the Windows key on the keyboard) and just start typing! In Windows 8, search is integrated and FIRST CLASS. On the start screen, you can just start typing...type ANYTHING, and it will search in multiple contexts. If you start typing "calc", a panel will slide out from the right-hand side of the screen, and you'll see a filtered list of apps, then other things, that matched "calc". The FIRST thing that comes up is the calculator:



Once you see it listed and highlighted (takes about 0.02 seconds), you just hit enter and it runs...ON THE DESKTOP! :D

If you run a search, and Windows determines it did not actually find exactly what you're looking for, hitting enter runs Bing universal search:



Personally, when I first got Windows 8, I didn't make any assumptions about things I figured probably wouldn't work. I just started using the start screen how I'd always used the start menu. I simply started typing on the start screen to search for apps and other things...just like I always did in Windows 7. I wasn't even surprised when it worked...OF COURSE IT WORKED!  ::)

This is why I so actively defend Microsoft. People make a LOT of wild assumptions, then figure their assumptions are actual fact, when in reality they are the farthest thing from it. You ASSUMED that Windows 8 couldn't search for apps just by typing the app name on the start screen. That is fundamentally incorrect. Search is a first-class citizen of Windows 8. Not only can you directly search for apps, files, and anything else local...but when Windows can't find exactly what you're looking for locally, you can launch the universal search, which does a deeper search of everything everywhere...locally, and whatever is indexed by Bing. Such as the case with my "Downtown Denver" search above.

People are shortchanging themselves by assuming incorrectly about Windows 8 and sticking to Windows 7. Windows 7 is more power hungry, slower, and less capable than Windows 8. That's all there is to it. I would be willing to bet that over 90% of the assumptions about things that are supposedly missing, moved, or improperly implemented in Windows 8 are flat out wrong.

Quote
Also, if Microsoft's decision on the Win8 UI was correct, why are they reverting it in Win9?

Well, first, there is no Windows 9 yet, so Microsoft hasn't "reverted" anything. All the press releases indicate Microsoft is going to be building on the changes in Windows 8.x when Windows 9 finally rolls around. The start screen isn't going anywhere, but it sounds like it will be greatly enhanced. The dual-mode nature of the platform will remain. The independent code bases for Windows 9 on the desktop and tablets, and Windows Phone 9 on phones, will be reduced even further, bringing us closer to a truly unified OS that runs on everything (probably won't happen before Win 10, but as I said before, things take time, especially in an iterative world).

If you are referring to Windows 8.1 rather than 9, well, again, nothing has been reverted. New capabilities and features have been ADDED, but nothing has been taken away. The start screen, for example, is still there in Windows 8.1. The only difference is that users can now choose whether to BOOT to the desktop or the start screen. That's an extremely simple change, and an obvious one. Why wasn't it in the original release? Who knows; however, it isn't surprising that not every single feature imaginable by a billion customers makes it into the FIRST release of anything. Every software development project has to pick its battles and solve the most important problems first. Boot to desktop could very well have already been on Microsoft's TODO list...it just didn't make the cut.

If you want an idea of how Microsoft's internal processes work, read Eric Lippert's blog. He was a lead on the C# compiler team for many years. He is a public figure and a regular participant in large software development communities like StackOverflow. Being a public figure like that, he was a front man for everyone's feature requests for the C# language. He wrote blog posts on many of them, the ones he received most often, explaining why they could not be added, or why, if they were added, it had to be done EXTREMELY carefully, or why, even when they were easy to add, they sat at the very bottom of the carefully prioritized list of things that needed to be done with C#. Windows is no different...it's software. All software projects have goals and requirements, and that list is prioritized from most critical to nice-but-not-actually-necessary for any given release. Windows is never going to have every single feature that every single user wants every time it's released...but many of the most important or most frequently demanded features are likely to end up in the product in subsequent releases, if they're feasible.

992
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 04:45:40 PM »
jrista, I don't see how this press release quantifies performance. It says nothing about SNR or DR, etc.; it doesn't even try to claim that the sensor is competitive in those regards. It may also require hardware around it which is neither feasible nor practical in today's cameras. For instance, reading out and processing all that data would require a lot more readout channels and processing power than what you see in a 1DX today.

Press releases are also typically written by PR or marketing personnel, for purposes other than scientifically describing their findings.

As for patents, I don't read them, but they don't actually give any data about how well the actual implementations perform, do they? Without that data we cannot tell if it's awesome or not. What seems great on paper might be bad in practice.

SNR and DR aren't the epitome of sensor performance, though. They are only factors of sensor performance. Both are heavily affected by readout noise, and it's been demonstrated by at least two companies now that column-parallel ADC designs produce less read noise (Sony and Toshiba, and I believe other high-end sensor manufacturers have similar designs in the works as well). Canon described some kind of hyperparallel on-die ADC for the 120mp APS-H.

Assuming the silicon process was the same generation as the cameras of the time it was announced, it's logical to assume it has the same fundamental characteristics as the 1D IV. The 1D IV had around 45% Q.E. and the same DR limitations as all Canon cameras (due to read noise). I see no reason to assume this sensor would be significantly different in those fundamental statistics at worst, and it could be better if its highly parallelized readout offers similar improvements to Sony's and Toshiba's. Canon silicon hasn't really changed much over the years...the most significant improvement each generation is a few percent jump in Q.E.
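
As a quick sketch of why Q.E. matters the way it does: per-pixel shot noise goes as the square root of the electrons collected, and Q.E. is just the fraction of arriving photons that become electrons. The photon count below is an arbitrary illustration, not a measured value for any real sensor:

```python
import math

# Shot-noise sketch: SNR of a pixel is sqrt(electrons collected), where
# electrons = photons hitting the pixel * quantum efficiency (Q.E.).
# The photon count is invented for illustration.

def shot_noise_snr(photons_per_pixel, qe):
    electrons = photons_per_pixel * qe
    return math.sqrt(electrons)

# 10,000 photons at 45% Q.E. (roughly the 1D IV figure cited above):
print(round(shot_noise_snr(10000, 0.45), 1))  # sqrt(4500) ≈ 67.1
```

Doubling Q.E. only improves this SNR by sqrt(2), which is why year-over-year Q.E. gains of a few percent move the needle so little.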

You're also misunderstanding the point of using a column-parallel ADC. You actually DON'T need as much processing horsepower to read out more pixels faster when you hyperparallelize the ADC units. The problem with having too few units is that each unit MUST be powerful enough to handle the hundreds of thousands or millions of pixels it has to process. That means higher frequency, and it also means more attention must be paid to the design of those units to limit the amount of noise they add to the signal (and even then, they are noisy parts because of the high frequency).

By using one ADC unit per column, each ADC can operate at a lower frequency. The lower frequency immediately offers a benefit in terms of read noise. With other techniques, such as moving the clock and driver off to a remote area of the die (like Exmor), you can reduce noise even further (Exmor took it one step farther and used a digital form of CDS, which Sony claimed was better than analog CDS...ironically, however, they added analog CDS back into the mix with later versions of Exmor for video cameras...now they do both analog and digital CDS). You trade die space for the ability to operate at a lower frequency and power. With a 180nm process, that's a no-brainer. This HAS BEEN DONE...both Sony and Toshiba have working CP-ADC designs built into CMOS sensors that are actually used in consumer products. Sony has a number of technical documents that explain how they achieved exactly what Canon describes in their 120mp APS-H papers and patents...low-power, high-speed readout of high-resolution sensors via hyperparallel ADC.
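
A rough back-of-the-envelope sketch of the workload argument, with made-up sensor dimensions and frame rate (nothing here is a real chip's spec):

```python
# Per-ADC workload: shared-ADC readout vs. a column-parallel design.
# Sensor dimensions and frame rate are illustrative assumptions only.

COLS, ROWS, FPS = 4000, 3000, 10  # hypothetical 12 MP sensor at 10 fps

# Shared design: 8 ADC units split the columns between them.
shared_adcs = 8
pixels_per_shared_adc = (COLS // shared_adcs) * ROWS  # 1,500,000 pixels
shared_rate_hz = pixels_per_shared_adc * FPS          # 15 MHz per ADC

# Column-parallel design: one ADC per column handles only that column's rows.
pixels_per_cp_adc = ROWS                              # 3,000 pixels
cp_rate_hz = pixels_per_cp_adc * FPS                  # 30 kHz per ADC

print(shared_rate_hz // cp_rate_hz)  # 500x lower conversion rate per ADC
```

The 500x drop in per-unit conversion rate is the headroom that lets each ADC run slow, cool, and quiet, at the cost of 4000 ADC units' worth of die space.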

So, even though Canon's 120mp APS-H isn't in an actual consumer grade product that we can buy, it uses technology that mirrors products from other brands that we can buy, and that have been tested. The most telling are Sony security video cams that use Exmor sensors, which can operate at very high frame rates in very low light...they are not only doing high speed readout with very, very low noise and relatively high DR, they are also doing processing with image processors that are packaged to the bottom of the sensor, and wired directly to it.

To be straight, I am speculating a bit, but it's very educated speculation. It isn't like it's just 100% completely unfounded drivel. :P

For instance, Foveon sensors seem like a great technology on paper, do they not? No CFA wasting away 2/3rds of the light, and no demosaic algorithm interpolating data and making images soft at 100% view. Yet in real life Foveon is outperformed by standard CFA sensors; it gives the resolution but does not perform well in other aspects. Real-life performance is what counts, and Foveon sensors don't have it (yet; I would like to see that change).

As for Foveon, I think you're incorrect in your assessment. Foveon only "fails" at ONE thing: resolving power. There have been debates in the past on these forums where Foveon fans claim that because it has a 100% fill factor for all colors, it has as much or higher RESOLUTION than bayer sensors. Those claims are wrong, as bayer sensors get largely the full benefit of the raw sensor resolution in terms of luminance...they only really suffer in color resolution and color fidelity (both areas where Foveon excels).

For what Foveon is, at its REAL spatial resolution, Foveon sensors are actually very good. Their red channels are a little noisier, but their blue channels are less noisy than bayer's. No surprise, given the layering order of color photodiodes in the Foveon. Even though image dimensions/resolving power for Foveon are lower than in bayer sensors, those smaller images usually exhibit high quality. I do think that color fidelity with Foveon cameras is superior to what I get with my Canon DSLRs (I just like my resolution too much to give it up :P). So I think it's unfair to claim that the real-life performance of Foveon is bad or even poor. For what it is, its real-life performance is very good.

The only drawback of Foveon is its resolving power...and I truly believe that Sigma has done Foveon a big disservice by trying to upsell it as having more resolution than it really does, or somehow claiming that because it gathers full color information per pixel, upsampling it somehow beats bayer sensors for resolution and detail. Actual real-world examples that do exactly that have proven otherwise. Foveon's problem isn't that it's bad technology...it's that Sigma owns it, and Sigma doesn't have the marketing power or the R&D budget to really make Foveon shine and become a highly competitive alternative. Sigma is much more a lens company than a camera or sensor company, IMO. I do believe Foveon COULD be highly competitive in the hands of a wealthier corporation that could more richly fund its development.

993
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 04:15:37 PM »
Even 180nm is, what, 8 generations ago?

Not really. You have to take it in proper context. For large sensors, APS-C and larger, I don't know of any that use a process smaller than 180nm. The ultra-tiny form factors, the sensors that are a fraction the size of a fingernail, are the ones that use very small fabrication processes, but even those aren't eight generations more advanced. I think a 65nm process is used for sensors with 0.9µm and the upcoming 0.7µm pixel sizes. That would be three generations smaller transistor size (180nm -> 130nm -> 90nm -> 65nm).

You also can't use CPU transistor fabrication technology as a basis of comparison. CPUs are primarily on 22nm, with 14nm parts supposedly due this year (maybe they are already here; I haven't looked into it). But that is a whole different market. We know that Canon's DIGIC 5 used a 65nm process, manufactured by Texas Instruments I think. But that's still a processor. You can't mix that with sensor tech.

At the moment, I think 65nm is the smallest fabrication process used for image sensors. The next step would be 45nm, and I've read a couple of patents that describe sensors with 0.7µm (700nm) pixels that would be fabricated with a 45nm process, but I haven't actually seen anything yet that indicates it's being done. Even if there were sensors being manufactured with 22nm gates, that is six generations...not eight. Given that larger form factor sensors don't even remotely need a 65nm gate to be highly efficient, I wouldn't say that a 180nm process is out of date for APS-C and FF sensors. If it is out of date, it would only be out of date by one generation, 130nm.

I did read about some high-sensitivity sensors recently called SPADs, or Single-Photon Avalanche Diodes, which are designed for specialized purposes (medical imaging such as PET, FLIM, etc., scientific imaging, astrophotography, and the like). These are pretty bad-ass CMOS devices with ultra-high sensitivity (basically photon counters). They have been fabricated on 180nm, 130nm and 90nm processes. They generally seem to be 130nm parts, are usually fairly small sensors (smaller than APS-C), with pixel sizes maybe a little bit smaller than current APS-C parts. They have a specialized pixel structure, but overall are not all that much different from your average CMOS sensor. These are CUTTING EDGE devices...really cutting edge. They do their job extremely well, and really don't need transistors smaller than 90nm. It actually seems smaller processes make it more difficult to fabricate these high-end sensors than the larger processes do.

So I really don't think that 180nm is old and outdated, not for the size of sensors we're talking about.

994
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 02:38:29 PM »
...
The thing operated at 9.5fps, and it really doesn't matter if it had small pixels, because fundamental IQ is related to total sensor area and Q.E., not pixel area.

Except that the more rows and columns that are present, the more space is lost to the barriers in between. If they can keep the area covered by pixels constant whilst reducing the pixel size to provide more pixels, then yes, you're right. This comes down to manufacturing process, where Canon have been using a larger process. Canon is using an old 0.5 µm process, while Sony and Toshiba have advanced to 0.25 µm and 0.18 µm processes. See the Chipworks site for more info.

Canon used a smaller process for this. From what I recall of either a press release or some other specs listed somewhere, the sensor was actually stitched together from separately manufactured parts. (A lot of Canon's prototype sensors are actually made that way, by stitching multiple separately fabricated parts into a single device. Their ultra-high-sensitivity 0.1 lux large format sensor, for example, is manufactured that way as a simple matter of necessity.) I suspect they fabricated it with the same fab they produce their small form factor sensors with. Chipworks verified years ago that Canon has had a 180nm copper interconnect process, one even capable of producing sensors with light pipes, for a while now.

So yes, you are correct, Canon's current APS-C and FF sensors are built on a 500nm process, which wouldn't support a sensor like this (well, it could...it's just that the photodiodes would be really tiny, and therefore the fill factor, the total actual light-sensitive area of the sensor, would be lower than on an identically sized sensor with larger pixels). However, there was a rumor not long ago that Canon was revamping its fabs and moving to larger wafers. Either they are repurposing some of their existing fabs for the small form factor stuff, or they are building new fabs to expand their 180nm fab capacity.
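
To illustrate the fill-factor point, here's a toy model that assumes a fixed border of circuitry around each photodiode, set by the process size; the pitch and overhead numbers are invented purely for illustration:

```python
# Toy fill-factor sketch: with a fixed per-pixel overhead for wiring and
# transistors (roughly set by the fabrication process), shrinking the
# pixel pitch eats a growing fraction of each pixel's light-sensitive area.
# All numbers are illustrative, not measurements of any real sensor.

def fill_factor(pitch_um, overhead_um=1.0):
    # assume a square photodiode inside a fixed circuitry border
    diode = max(pitch_um - overhead_um, 0)
    return (diode / pitch_um) ** 2

print(round(fill_factor(6.0), 2))  # large pixel: ≈ 0.69
print(round(fill_factor(2.0), 2))  # small pixel: ≈ 0.25
```

Same overhead, a third the pitch, and the light-sensitive fraction collapses, which is why small pixels on a coarse process are such a bad trade.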

995
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 02:09:56 PM »
Also, you said you know the 120mp sensor performed great because of a press release. A press release written by whom, toward what public, and for what purpose?

From the horse's mouth:

Quote
Canon successfully develops world's first APS-H-size CMOS image sensor to realize record-high resolution of 120 megapixels
TOKYO, August 24, 2010—Canon Inc. announced today that it has successfully developed an APS-H-size*1 CMOS image sensor that delivers an image resolution of approximately 120 megapixels (13,280 x 9,184 pixels), the world's highest level*2 of resolution for its size.

Compared with Canon's highest-resolution commercial CMOS sensor of the same size, comprising approximately 16.1 million pixels, the newly developed sensor features a pixel count that, at approximately 120 million pixels, is nearly 7.5 times larger and offers a 2.4-fold improvement in resolution.*3

With CMOS sensors, while high-speed readout for high pixel counts is achieved through parallel processing, an increase in parallel-processing signal counts can result in such problems as signal delays and minor deviations in timing. By modifying the method employed to control the readout circuit timing, Canon successfully achieved the high-speed readout of sensor signals. As a result, the new CMOS sensor makes possible a maximum output speed of approximately 9.5 frames per second, supporting the continuous shooting of ultra-high-resolution images.

Canon's newly developed CMOS sensor also incorporates a Full HD (1,920 x 1,080 pixels) video output capability. The sensor can output Full HD video from any approximately one-sixtieth-sized section of its total surface area.

Images captured with Canon's newly developed approximately 120-megapixel CMOS image sensor, even when cropped or digitally magnified, maintain higher levels of definition and clarity than ever before. Additionally, the sensor enables image confirmation across a wide image area, with Full HD video viewing of a select portion of the overall frame.

Through the further development of CMOS image sensors, Canon will break new ground in the world of image expression, targeting new still images that largely surpass those made possible with film, and video movies that capitalize on the unique merits of SLR cameras, namely their high mobility and the expressive power offered through interchangeable lenses.

*1   The imaging area of the newly developed sensor measures approx. 29.2 x 20.2 mm.
*2   As of August 20, 2010. Based on a Canon study.
*3   Canon's highest-resolution commercial CMOS sensor, employed in the company's EOS-1Ds Mark III and EOS 5D Mark II digital SLR cameras, is equivalent to the full-frame size of the 35 mm film format and incorporates approximately 21.1 million pixels. In 2007, the company successfully developed an APS-H-size sensor with approximately 50 million pixels.
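
The headline figures in the release are easy to sanity-check. (The 5616-pixel width for the EOS-1Ds Mark III used in the linear-resolution check is my own addition from memory, not stated in the release.)

```python
# Sanity check of the figures quoted in the press release above.

# "nearly 7.5 times larger" pixel count vs. the 16.1 MP APS-H sensor:
mp_new, mp_old_apsh = 120e6, 16.1e6
print(round(mp_new / mp_old_apsh, 1))  # ≈ 7.5

# The "2.4-fold improvement in resolution" is linear, vs. the 21.1 MP
# full-frame sensor in footnote 3 (1Ds Mark III, 5616 pixels wide --
# that width is my assumption, not stated in the release):
print(round(13280 / 5616, 1))          # ≈ 2.4

# Full HD from "approximately one-sixtieth" of the sensor area:
print(round(120e6 / 60 / 1e6, 2))      # ≈ 2.0 MP, vs. 1920*1080 ≈ 2.07 MP
```

So the pixel-count, linear-resolution, and video-crop claims are all internally consistent with each other.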

This is one of a few press releases, actually. There is another one that describes some of the more technical aspects, on-die processing and the like. I haven't found that one yet. To break out the important parts:


Quote
With CMOS sensors, while high-speed readout for high pixel counts is achieved through parallel processing, an increase in parallel-processing signal counts can result in such problems as signal delays and minor deviations in timing. By modifying the method employed to control the readout circuit timing, Canon successfully achieved the high-speed readout of sensor signals. As a result, the new CMOS sensor makes possible a maximum output speed of approximately 9.5 frames per second, supporting the continuous shooting of ultra-high-resolution images.

The technology Canon developed to increase readout speed was even greater parallelism. The description used when this was first announced, as well as the subsequently granted patent for DS-CP-ADC, describes something VERY similar to Sony Exmor, which uses one ADC unit per pixel column, unlike past sensor designs, which used one ADC unit per group of columns. For example, in a camera with 8 readout channels and 4000 columns of pixels, every ADC would be responsible for processing 500 columns of pixels, which, when you factor in the row count, is hundreds of thousands to millions of pixels per ADC. With column-parallel ADC, each ADC unit is only responsible for processing a few thousand pixels. With the ADC units on the sensor, the distance between pixel and ADC unit is greatly shortened, which allows them to solve the timing issues. Thanks to hyperparallelism, each ADC unit has to do less work, so you can actually achieve faster readout at a lower frequency, thus improving readout performance without hurting IQ.

Quote
Images captured with Canon's newly developed approximately 120-megapixel CMOS image sensor, even when cropped or digitally magnified, maintain higher levels of definition and clarity than ever before.

Even if each pixel itself was noisier than in Canon's 16mp APS-H sensor, it doesn't matter. Noise is related to total sensor area, quantum efficiency, and in small part to read noise (which only affects the deep shadows). Ignoring dynamic range for a minute (there is no information about the DR of the 120mp APS-H sensor, so I honestly cannot speak to it), if you downsample a 120mp APS-H image to the same dimensions as a 16mp APS-H image, the per-pixel noise is going to average out. Since the two sensors have the same total area, there are unlikely to be any measurable differences. Given the 120mp sensor used a better readout system, I'd be willing to bet good money that it actually had the better noise characteristics. And there is absolutely no question it would have much sharper, clearer details.
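
The averaging argument can be put in numbers. Assuming uncorrelated per-pixel noise and equal total area and Q.E., binning the ~7.5 small pixels that cover one 16mp-scale pixel's area adds their signals linearly while the noise adds in quadrature, so per-pixel SNR at the 16mp scale improves by the square root of the ratio:

```python
import math

# Downsampling sketch: combining N pixels with uncorrelated noise
# improves per-pixel SNR by sqrt(N). Here N is the pixel-count ratio
# between the 120 MP and 16 MP APS-H sensors (same total area assumed).
pixels_binned = 120 / 16
snr_gain = math.sqrt(pixels_binned)
print(round(snr_gain, 2))  # ≈ 2.74x per-pixel SNR at the 16 MP scale
```

Which is the whole point: at matched output size, the small-pixel sensor gives up nothing in noise while keeping the option of far more detail.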

I'll see if I can find one of the other press releases, the more technical one, and share it. I'm not trying to mislead anyone. I really try not to assume; whenever possible, everything I say is based on some fact or piece of official data somewhere. I have a bit more depth of knowledge than just what this one press release offers, because I've read everything there is to read about things like Canon's 120mp APS-H sensor (and plenty more from other sensor manufacturers), so I have a larger body of knowledge to draw from.

I really honestly do believe that Canon's 120mp APS-H sensor, which does actually exist in prototype and uses some of Canon's still photography patents, could be one hell of a powerhouse for IQ. Not necessarily greatly reducing noise...but massively increasing detail and sharpness, either allowing photographers to print really large without having to upsample, or by allowing significant improvements in overall IQ simply by downsampling.

996
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 01:53:05 PM »
... Guess that's the most concerning thing about Canon. They have some amazing technology...but they aren't using it...so it isn't making money. ...

I just want to comment on this. You assume that they have some amazing technology. How do you know it is amazing? If it really was amazing and they could make money on it, it should make its way out into real products. If it doesn't, we can only assume it doesn't perform competitively. Maybe they patent ideas in case they can be used in the future, if they manage to overcome the hurdles which make them infeasible today.

Like the demoed 120mp sensor we read about earlier, does anyone outside Canon actually know how it performed? Seems like many just assume it was great, for all I know it could have been terrible.

I'm not assuming. I know for a fact. How? Because I've READ the patents. Canon DOES have some really amazing technology. Many of Canon's patents are similar (but not identical to) patents from Sony and Aptina. The only real difference, as far as I can tell, is Sony and Aptina are actually turning their patents into actual products. Canon...well, so far at least, they seem to just sit on them. I'm hoping that changes with the 7D II.

One of the ones I hope they actually implement is their Dual-Scale CP-ADC patent, since, based on the patent, it sounds like the closest thing to the Sony Exmor design I've found. If Canon can bring Exmor-like technology to their own cameras, even if it isn't quite as good, it will still be better than what they have.

I also believe that patent is the same technology that Canon used in the 120mp APS-H prototype sensor. We actually know how that performed as well, because Canon published a press release describing its performance. They described the architecture of the sensor, which clearly indicated some kind of hyperparallel on-die processing (i.e. CDS, ADC, etc.). That is exactly what CP-ADC is. The thing operated at 9.5fps, and it really doesn't matter if it had small pixels, because fundamental IQ is related to total sensor area and Q.E., not pixel area. It would have been at least as good as the 1D IV at the time, and any APS-H sensor will have better IQ than an APS-C sensor in identical framing situations. At 120mp, the thing cranked out more resolution than any larger format sensor on the planet...before or since.

Do these patent descriptions quantify the real performance?

The actual physical prototype 120mp APS-H sensor that Canon actually produced, tested, and gathered data on quantifies the real performance.

997
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 01:15:51 PM »
... Guess that's the most concerning thing about Canon. They have some amazing technology...but they aren't using it...so it isn't making money. ...

I just want to comment on this. You assume that they have some amazing technology. How do you know it is amazing? If it really was amazing and they could make money on it, it should make its way out into real products. If it doesn't, we can only assume it doesn't perform competitively. Maybe they patent ideas in case they can be used in the future, if they manage to overcome the hurdles which make them infeasible today.

Like the demoed 120mp sensor we read about earlier, does anyone outside Canon actually know how it performed? Seems like many just assume it was great, for all I know it could have been terrible.

I'm not assuming. I know for a fact. How? Because I've READ the patents. Canon DOES have some really amazing technology. Many of Canon's patents are similar (but not identical to) patents from Sony and Aptina. The only real difference, as far as I can tell, is Sony and Aptina are actually turning their patents into actual products. Canon...well, so far at least, they seem to just sit on them. I'm hoping that changes with the 7D II.

One of the ones I hope they actually implement is their Dual-Scale CP-ADC patent, since, based on the patent, it sounds like the closest thing to the Sony Exmor design I've found. If Canon can bring Exmor-like technology to their own cameras, even if it isn't quite as good, it will still be better than what they have.

I also believe that patent is the same technology that Canon used in the 120mp APS-H prototype sensor. We actually know how that performed as well, because Canon published a press release describing its performance. They described the architecture of the sensor, which clearly indicated some kind of hyperparallel on-die processing (i.e. CDS, ADC, etc.). That is exactly what CP-ADC is. The thing operated at 9.5fps, and it really doesn't matter if it had small pixels, because fundamental IQ is related to total sensor area and Q.E., not pixel area. It would have been at least as good as the 1D IV at the time, and any APS-H sensor will have better IQ than an APS-C sensor in identical framing situations. At 120mp, the thing cranked out more resolution than any larger format sensor on the planet...before or since.

998
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 02:18:38 AM »
Well we can only judge on what we can buy  :)

As for the patents, Canon have been very aggressive patent filers for a very long time, which is comical when you examine their early history. But that aside, patents do not products make; we have had rumours of hundreds of them over the years here, and few see the light of day. We all know Canon are innovative and do a lot of R&D, but most of the time companies patent just to block everybody else.

They have struggled to make DO lenses work from the word go; they seem to be convinced there is something there and won't let it die, but we are not there yet. The 70-300 DO is the biggest piece of $1,400 crap ever; I'd love to know the sales figures for the 400 DO.

I still don't see how DPAF helps SLR stills shooters.

As for AF and metering, well, they introduced the 45-point AF back in 2000, so it isn't like they didn't have time to put a bit more thought into it, though it isn't "radically" different, is it? Dedicated processor and all, but the same phase-detect chip behind a sub-mirror arrangement since in-body AF started. Nikon have had colour-sensitive metering for years, and not just in the one top-of-the-line body.

As for the 7D MkII having potential, I must, respectfully, disagree. Even if it bests the D7100 in sensor metrics by half a stop or so, so what? That still makes it slightly worse in overall IQ than the 6D.

Don't get me wrong, I am not picking a fight and I am not out to bash Canon; I just see the last few years' developments with my eyes wide open. Stills are not the driving force they were even five years ago when the 7D made such an impact. In my opinion, stills are not seen as the future by Canon.

If the products haven't landed on a shelf yet then all the R&D in the world is no use to me.

The other truth is that stills are a very mature market; the quality and capability we have now vastly outstrip most users' needs. The 5D MkIII is probably the most complete stills shooter's camera ever, and Canon clearly don't believe in much higher MP, DR, blah blah sensor specs at this point. I believe we are on a technology plateau with no signs of the next BIG thing.

For me personally, put the 5D MkIII sensor in the 1Dx MkII, get me those TS-E lenses and I don't care, I'll be retired before my customers or I need more than that.

Very true, patents don't make a product. My point was only that there is an R&D budget for still photography at Canon, and money is clearly being spent there. Patents do need to actually make their way into a product on a shelf to be meaningful, though...you're dead-on there.

I really wonder why Canon doesn't bring more of their innovations into being...it almost smacks of Nokia a few years ago...they had a MASSIVE patent library, but it was just IP...they didn't wield it and make competitive products with that technology...and look where they are now. Guess that's the most concerning thing about Canon. They have some amazing technology...but they aren't using it...so it isn't making money.

I still don't see how DPAF helps SLR stills shooters.

I can see it being very useful for focusing landscape shots, which I've always focused manually in live view. DPAF could automate that process.


The other truth is that stills are a very mature market; the quality and capability we have now vastly outstrip most users' needs. The 5D MkIII is probably the most complete stills shooter's camera ever, and Canon clearly don't believe in much higher MP, DR, blah blah sensor specs at this point. I believe we are on a technology plateau with no signs of the next BIG thing.

I don't know if I agree with that. DR is obviously a VERY important thing to photographers these days. It is easily the most controversial and common subject when it comes to Canon vs. the others. Even if it isn't as important as many individuals and certain organizations seem to insist, it's clearly a sticking point, and clearly a perception issue between Canon and their customers. I have a hard time believing Canon doesn't know that...not after the last two years and all the debates and conversations and reviews and videos that cover the topic of how much better Nikon/Sony DR is than Canon's.

For Canon to ignore that, and release ANOTHER product without an improvement in that area...well, I think we could actually see some REAL brand migration over the next few years if the 7D II (and worse, the 5D IV/1D X II) hit the streets without a DR improvement. It may not actually matter in most cases, but it matters perceptually...and I think the company's reputation would finally be hurt by them not showing any real interest in their sensor IQ. I've seen Canon respond directly to the loudest demands from their customer base in the past. The 1D X and 5D III are prime examples of that, in multiple ways. Canon can't ignore the demand for better DR. It would be reputation-damaging...

(My words above certainly don't mean Canon is actually going to do anything about it...I guess there is a very good chance they won't...but I do indeed believe it would be damaging to their reputation in the long run if they ignored the single most important demand of their customers after so many years of having that demand levied.)

999
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 01:13:13 AM »
I have no doubt that the push to video in the C line, not their consumer video camera line, along with a plethora of specialist CN-E lenses, has seriously impacted the stills-orientated camera R&D potential.

I believe Canon see video as the DSLR saviour, and maybe it is, but their very heavy push above the stills market has had repercussions.

We have had interminable delays with some lenses while the CN-E line gets major new lenses at the drop of a hat, and a complete abandonment of the "studio" stills-orientated pro camera while the C line gets massive upgrades via firmware and hardware. Apart from the RT flash system, which is damn good, I can't think of one innovative Canon feature in recent years that isn't video-centric. Good but slow IS primes with STM: video. Dual pixel AF: video. Etc., etc.

Sure, the 16-35 f4 IS, the 24-70 MkII, and the 70-200 IS MkII are sterling lenses, though they are just as useful to wedding video shooters. But where are the 35L MkII (the C line got their 35mm T1.5 ages ago, and there is no way that is a tweaked MkI 1.4), the bread-and-butter stills 100-400 MkII, a 400 f5.6 with IS, the stills-market 45mm and 90mm TS-E MkIIs? I'll tell you where they are: they are in B&H under the Cine line banner.

Stills have jumped the shark as far as Canon are concerned; surveillance video cameras, bread-and-butter TV, documentary and news video are the next cash cows, and the niche is studio/movie video. Stills Explorers of Light are getting dumped for videographers, and the TV ads are pushing quality video as the core selling point of DSLRs.

Everything you say is very true, so far, for RELEASED products.

You're ignoring all the patent filings and that one major upcoming product release that could very well change that, in a big way. Since the introduction of the 7D, Canon has filed a number of still photography sensor patents, including layered sensors and recently a portraiture sensor. They have filed a good number of DO lens patents, as well as a number of patents for other lenses. There were certainly also a couple DPAF patents in there as well; however, that improves both video and live view focus, so it isn't purely a video-only feature. With the release of the 1D X and 5D III, we saw the introduction of a radically redesigned new 61pt AF system, and a new metering system for the 1D X. (We never saw patents on those ahead of time...they just showed up in the final products, to everyone's pleasure and surprise.) Canon has also released patents related to readout technology a few times over the last 2-3 years...including an on-die, dual-scale CP-ADC patent, a power source decoupling patent (which might have the potential to eliminate dark current noise), etc.

The 7D II has the potential to change a lot for stills photographers. Canon has mentioned on a couple occasions that they are working on other sensor IQ improvements. That includes some kind of thermal regulation of the sensor (again, could reduce dark current noise), and probably a fab process shrink.

So sure...all of the recently released products from Canon have been video related. But there is plenty of evidence that Canon has continued to innovate on the still photography front the last few years as well. The only difference is that we haven't seen any of those still photography innovations actually land on a shelf in a product....YET.

1000
EOS Bodies / Re: Canon EOS 5D Mark IV To Feature 4K Video?
« on: July 05, 2014, 12:31:01 AM »

And, I have no idea whether adding and upping the video features on a DSLR increases the cost or helps pay for the R&D for features that still photographers are coming to love (e.g., live view).

Live View comfortably predated video.

So did the 40D, and the 1D MkIII, which predated both of them, had it too.


My 50D had live view.

(This is in response to the whole entire quoted chain of posts above, not just the latest post by pbd.)

While Live View came along a good while ago, I'd offer that it is a very primitive form of "video." There was certainly additional R&D invested into developing that (assuming that is even the real foundation of DSLR video) into something that gave the 5D II its epic sales numbers and cinematographers cause to use it in major TV and movie productions. I don't think we got HD video "for free" just because we had Live View.

I also honestly don't know if Canon's R&D budget for DSLR video really takes anything away from R&D for stills or not. It certainly seems logical to think so in one context...when thinking only about the photography division of Canon. Canon is a large company, though, and they have long had a video/camcorder division. Who is to say, when you expand the context within which logic applies, that DSLRs aren't simply benefiting from a separate R&D budget, and that the video features we're getting are actually fairly cheap because Canon already does R&D into that, and they have a well-established body of experience there?

I haven't used the video features of my DSLR much, however the 7D was never really geared for it. Now that I have a 5D III, I may well start using the video features for wildlife stuff. I am not sure if I'd really appreciate still photography R&D budget being used for video features if that is where Canon is primarily spending the budget...however on the other hand, video is now an endemic feature of DSLRs. Canon has the benefit of a large R&D budget in general to produce highly integrated product lines. If they were to lose competitiveness because of the removal of video features...that too could hurt their ability to fund improvements for still photography.

Honestly, I don't think we can really know how video features in a DSLR affect Canon's progress on stills features. I think video is now a standard part of the package. I don't think that is going to change any time soon. We have no real evidence that it's hurting their still photography features, however it certainly expands the marketability of the products. If there is anything Canon is good at, it's maintaining and expanding their customer base...and that can only have a positive effect on their final revenues and R&D budgets in the end. So I consider video in a DSLR a good thing....long term, it just means more features overall, a larger customer base, and more funds that allow Canon to keep making better products in the future.

1001
jrista, the way you normally write about Canon's product line and strategic decisions, and now about Microsoft and their latest products, reminds me a lot about the time when I was a teenager and fell in love for the first time. It's a long time since then, but I still remember the mind set I was in.

To put some ugly zits on that Microsoft crush you seemingly have developed, I may draw your attention to the fact that Microsoft has not abandoned their predatory style. Just remember the way their henchman Stephen "trojan horse" Elop took over Nokia, killed its long-awaited new product strategy, and turned a profitable company into a loss-making, demoralized corporate cadaver that was ultimately coup de grace'd by Microsoft themselves - with a paycheck for Elop that easily matches all of CR's membership taken together. One should not be surprised that computer folks, who got burned by Microsoft's tactics twenty years ago, are still a bit touchy, especially when the company and their products are presented like ... well, see my first paragraph here.

Back to the original topic: If Microsoft would have wanted access to DSLR or lens related patents, they could have gotten a similar deal from Nikon for a lot less. After all Nikon is a much, much smaller company, most likely with a much smaller patent portfolio, and they are still able to manufacture and market competitive DSLRs and lenses. Let's not forget that Canon's camera division is just a small part of the whole enterprise, and I can well imagine that Microsoft saw a lot more utility in Canon's large office product line and IP, and that this was the real motivation behind the deal.

You're making some wild accusations about the Elop thing. I think they are unfounded, and I think THAT is the kind of crap Microsoft gets a bad rap for that they do not deserve. Elop is an idiot. He always has been, and always will be. If Microsoft had chosen Elop to be their new CEO, then I'd have probably ditched MS products in the long term...Elop would have UTTERLY DESTROYED Microsoft. He would have sold off their most lucrative brands and catered to the stockholders' every whim. They would have been a completely dead brand outside of a niche enterprise market within less than a decade.

I'm not happy Elop is still around, and I am SURE there are a lot of people at MS who feel the same...but that's the world of business. One thing Elop does know is how to maneuver himself into lucrative positions and extract a few monster paydays here and there. For some, that's just the world of business; it's what they do. I find it despicable. I'm still reserving judgement on MS' new CEO. He deserves some time to learn the reins, make a mistake or two and learn from them, before I either label him another idiot or the potential savior of a company that has a lot of (unrealized) potential.

As for just being a mindless fan, no, I'm not. I am a fan, don't get me wrong. But I've been through many phases with Microsoft. I generally abhor Apple. Always have. Never liked their approach, their products, their vendor lock-in, or Jobs' insistence on having just one friggin button! :P There was a time when I was so dissatisfied with Microsoft that I moved to iPhone...that was a MASSIVE change for me. I stuck with it for years, too, and when the iPhone 4S came around, I thought the product was finally getting somewhere...but that is where it's stayed for the last couple of years, and many of the key problems were never fixed. (That's one of those things Apple does...a lot of people hate Microsoft for changing things every few years; other people hate Apple for ignoring the same old problems for years and years.) Right now, I think Microsoft is a great company. They are producing better products, some of them are excellent, their stock is rising fast (which indicates I'm not alone in my assessment), and I am eminently familiar with the brand.

My big thing is I think Microsoft takes a bigger hit when it comes to people ragging on it than they deserve, while some, like Apple, don't get nearly enough. I think Microsoft needs a defender who will set some of the record straight. I am not calling them a perfect company...they have their crummy products, and they have certainly made their mistakes. But things change. Things have changed for the better at Microsoft in recent years, and I am happy to recognize that. It may not remain that way...and if it does not...well, I'll at the very least stick with my current products and avoid upgrading, and I'll wait for the next cycle where things get good again. And, maybe, try out some alternatives in between. I try not to hold a grudge. (BTW, I still own and use Apple products...the ones I think are worth it...if/when Apple makes some significant changes to iOS to fix the issues it has that I don't like, I'll happily give a future iPad a try...I miss some apps that haven't yet made it to the MS ecosystem.)

1002
Windows 8 is a dual-mode operating system. On a desktop, if you prefer, you can still use the classic Windows desktop all the time. You can boot to it and use it pretty much exclusively. The only explicit change is the replacement of the start menu with the start screen. But the start screen works 100% perfectly well with mouse and keyboard (and, for that matter, it also works with a TV remote when using a Media Center remote control). There is absolutely NOTHING about Windows 8 that makes it difficult to use on the desktop with a kb/mouse. I've been doing it since day one. This is Microsoft's greatest mistake...not properly educating their customers as to what their OS can do. Windows 8 is Windows 7, with more. That's it. There hasn't been a loss of compatibility.

I'm pounding away on a keyboard right now, in Chrome, on the desktop, on a standard computer with no touch screen...in Windows 8.1. Touch is not a requirement in Windows 8. It's an option.

Well, I'll have to look into it more deeply then.

I understand that touch is not a requirement, but how to get there (1) isn't clear and (2) is filled with misinformation.

There is a tile on the start screen for the desktop. It's a pretty big tile by default. You can also always hit the Windows key to swap back and forth between the desktop and the start screen. You can also get to it via WinKey-Tab (which cycles through your tasks). On a touch system, swipe from the left to cycle through tasks. The desktop is now just a task like any other, so it will always come up when cycling through apps.

There are a lot of ways of getting to the desktop. In Win8.1 Update 1, Microsoft actually asks you where you want to boot to by default...desktop or start screen. It's pretty easy to get to the desktop. Once you're there...well, if you've used Windows over the last two decades at all, then you know exactly how to use it. ;)

I apologize in advance for the following. Jrista you seem to be too offensive to posters who don't like Microsoft or Windows 8. I wanted to reply to Ruined writing something against the usability of windows 8 and I am afraid that you will bash me. Why that?
By the way it seems that most of the replies have nothing to do with the announcement.

I am not being offensive, and it isn't "bashing"...I am simply direct. I don't like beating around the bush. I'm the guy on the forums who doesn't like misinformation, and I correct it at every opportunity. If you can't handle that, you can feel free to ignore me on the forums (it's an actual feature, you could block my posts forever).

There is FAR too much misinformation and unfounded hate for Microsoft. They have a past, like any of us...but the Microsoft of today is a very different company from the Microsoft of decades past. The reasons people hate Microsoft (and I really do use that word explicitly; it's pure unadulterated hate for many people) are old, archaic, and usually unfounded. I think people are missing out in many cases if they are choosing to avoid upgrading to Microsoft's latest operating systems or software because of misinformation, hearsay, and the outright lies about the company and its products online.

In the 90's and early 2000's, Windows was known more for the BSOD than anything else. To be 100% perfectly honest...I haven't seen a blue screen in an absolute minimum of a year. On my own computers, I haven't seen a BSOD in years. Whenever any kind of issue has occurred, the operating system has usually corrected itself. I've had hardware issues (I was actually underpowering a high-powered video card by about 5 amps for a long while), and drivers would crash, and the operating system would detect that failure, reset the video card, reset the driver, and restore everything to working condition (let me see a Mac do THAT!! HA!)

I honestly don't believe Microsoft deserves the bad rap they get. I honestly don't understand the long-term, persistent hate they get from so many people, over products that are long dead and irrelevant in today's world. I honestly don't understand why the same old tired excuses people use to justify their MS hate, which are now a decade old or more, are constantly recycled and regurgitated across the net incessantly by people who appear to have last used Windows in the 1990's!!!! It's illogical, and it's just plain dumb.

I've had plenty of problems with the iPhones I've owned over the years. The worst of all was the atrocious call quality through GENERATIONS of iPhones...call quality that I originally blamed AT&T for until I finally switched to a Windows Phone 7 device...and then, suddenly, out of the blue...my call quality was PERFECT. Crystal clear, crisp, loud. I was blown away. I've had iOS devices lock up on me often. I've seen more broken iPhone screens than I can count or even remember. I've never once had any of those problems with my old HTC WP7 phone or my Lumia. I've dropped the Lumia on a few occasions...once directly on its glass face. The thing still doesn't have a scratch on it, and there is one tiny microscopic nick in the glass that can only be seen when the light is at the right angle. The device has worked flawlessly, and continued to provide that perfect phone call clarity, throughout its now two-year-old life.

There is no perfect company, and Apple, of all companies, is more monopolistic in its practices than Microsoft ever was (HUGE multi-billion dollar lawsuits over ROUNDED CORNER ICONS??? Purposely locking people into their vendor-specific connections, instead of being compatible with the rest of the world and all the rest of the world's devices???)

Sorry, but the kind of loyalty that sticks with a company to a fault really irks me. Apple has a fanatical following that dwarfs any kind of following Microsoft has ever had, and it's all IN SPITE of the drawbacks and failings of Apple, IN SPITE of their monopolistic and hefty vendor lock-in tactics, IN SPITE of the long-term terrible working conditions of their overseas factory workers. Apple is no more deserving of that kind of...what...LOVE...from their consumers than any other company, and yet they get it anyway. Why? Because they are not Microsoft? Totally illogical.

I switched brands when Microsoft stuff got sucky. Gave another brand a try. Changed my loyalties. The grass really wasn't greener on the other side; it was more costly, riddled with vendor lock-in, and still had bugs and hardware issues. I switched back, and while it still isn't the brilliant deep emerald blue-green I want, the grass definitely tastes better and is more often a deeper green on the Microsoft side these days.

1003
not properly educating their customers as to what their OS can do.

If you have to "educate" your customer you've already lost.

Every company educates their customers about their products. A significant part of that is done through advertising, on TV and elsewhere. Microsoft could have educated their customers with a 15 second spot on TV, showing Windows 8 in both the new touch/metro mode as well as the old desktop mode. That's as simple as it would have had to be. For anyone who has actually used Windows 8, it only takes a moment to find the desktop tile on the start screen and get back into the desktop.

The majority of the complaints come from people who don't actually seem to use Windows 8. They then seem to regularly make the assumption that Windows 8 is only touch. Let's not even call it education...let's just call it communication. Microsoft's only real failure with Windows 8 was a failure to communicate the FACT that it is 100% compatible with standard desktop computers with a keyboard and mouse. I speak from personal experience, having used Windows 8 since beta: there are ZERO difficulties with using Windows 8, any version of it, with a keyboard and mouse. ZERO. Microsoft simply needed to communicate and show that full compatibility; the misconception that still seems to persist today, that Windows 8 doesn't work well with kb/mouse or doesn't work without touch, is FALSE.

That is Microsoft's greatest failing. They don't communicate as well as Apple. I don't think Ballmer really cared that much about advertising and communicating their products' capabilities to consumers. I hope that will change under new management...if Microsoft can overcome the issues they have communicating their products' capabilities to their customers, their customers could stop assuming incorrect things about Microsoft products and simply get along with using them.


I'm a software developer with over thirty years of experience in UI design; I know what I'm talking about. The Windows 8 UI design has several flaws - would you like an EOS 1 Mark 8 or EOS 5 Mark 8 with only touch controls - no dials, no buttons? I think you would find it very hard to use. And with a UI that, when you modify settings, wholly hides the image you're working with, displaying a lot of huge, useless content you're not interested in? Then it could be a great camera, but you would find it hard to use because it doesn't work as you expect a camera should work. And you wouldn't like it if Canon or whoever else told you "you just need to be (re)educated".
That said, I like Windows 8 on my phone and on my Surface 2 Pro tablet - but they are different devices. To take photos on a phone I can accept a touch interface, but I would never accept it on my DSLR. Different devices need a different UI.

I've been writing code since the age of 8, and writing software since the 90's. I've been doing graphic design and UI development for about the same amount of time. ;) I'm honestly not sure what UI design flaws you're talking about:

"And with a UI that, when you modify settings, wholly hides the image you're working with, displaying a lot of huge, useless content you're not interested in?"

What exactly are you referring to, here? I have never had any experience with Windows 8 that would match that rather vague description of...something...

Also, I do not believe this patent-sharing deal between Canon and Microsoft really has anything to do with putting Windows Phone into a Canon DSLR. Maybe, at some point, some years down the road, I think we might see a Canon touch UI powered by the Windows Phone PLATFORM. I don't think it would actually be the WP8 we currently use on our phones today...I think it would be more like the Xbox One. It would make use of the technology Microsoft has, the core operating system and the app platform, to build something custom that actually worked quite ideally with a Canon camera. Something that supported external buttons in ADDITION to a multi-touch UI.

Others have made this argument before. Personally, I'm a button-and-dial guy. But when you get into the menu system...it would actually be really nice to have touch capabilities. Or when you're on the settings grid on a Canon DSLR, it would be nice just to be able to touch one of the cells, rather than press a button, to configure any of the common settings...EC, WB, AF, ISO, etc.

You're thinking a little too literally here. Canon having access to Microsoft's patents is a REALLY GOOD THING. They won't just drop WP8 on their next DSLRs or compact cameras. However, they could utilize Microsoft's multitouch patents, or even their mobile OS kernel, as a platform upon which to build something better suited to their products. It's just the PATENTS that are shared...the underlying technological concepts. Not any of Microsoft's OSes themselves. (I actually assume that Canon would have to pay a license fee to actually use WP8 itself on a DSLR.)

1004
Well apparently, Microsoft wants to have some Canon tech in their imaging stuff now. Unfortunately, I guess nobody told them that it's not exactly the same thing to transfer from system camera level imaging to mobile imaging. It does make me wonder whether when Nokia imploded the people who really knew the details of Pureview technology jumped ship... As far as I know, Nokia lost a ton of talent at the moment when Windows strategy was announced.

Gotta back claims like that up, Mika. There have been no mentions of a mass of talent leaving the company since Microsoft acquired it. There shouldn't be, either, as it should be business as usual...Microsoft owns the Lumia unit now, that doesn't mean they are going to change everything right off the bat (or change anything...Lumia is the most successful Windows phone, and it's driving the growth of Windows phone in the market...best not mess with something that works.)

PureView is the best camera technology in a phone right now. Why you're complaining about that now that it's in Microsoft's hands, I cannot fathom.

Anyways, Microsoft hate is not because Windows 8 didn't work, or had underlying issues. The hate is because Microsoft doesn't listen to customers or just does business moves that people see are going to cost them more in the long run. And that they are trying to push their monopolistic software attitude to other business areas where they have no foothold. Or backstabbing their hardware buddies with releasing Surface to begin with.

This is again a scrap out of the 1990's. Microsoft has been directly listening to customer feedback for many years now. They have been an extremely open and cooperative company, vs. a monopolistic company, since the whole anti-trust suit. This very deal is a PERFECT example of the NON-competitive nature of the Microsoft of today. You're once again living in the past.

As for Surface...Microsoft's future is dependent upon the entire Microsoft ecosystem being directly competitive with Apple products, specifically. To be quite blunt, Microsoft's hardware partners SUCK ASS. They NEEDED a big, fat, PAINFUL kick in the rear end to knock some sense into them. The mobile windows hardware market has been failing for years...products have gotten cheaper and cheaper, and the quality of those products has tanked right along with price and profit margin.

I just purchased a brand new Dell XPS 15, with an i7, 16GB RAM, a 512GB SSD, and a 15.6" 3200x1800 pixel QHD+ screen. For less than two grand. This thing is built like a MacBook Pro, and it runs circles around one. It is a BEAUTIFUL device, with a backlit keyboard and a construction quality like I've never before seen in the Microsoft ecosystem. I am also 100% ABSOLUTELY CERTAIN that it would have never existed if Microsoft hadn't entered the game and produced a driving incentive for their own business partners to one-up them. Microsoft's strategy with Surface worked, IMO. Their PARTNERS, who are now also their competitors, are building better products. They are building competitive products that not only compete with the Surface (which is a good device; I have a Surface Pro myself), but also compete directly with Apple products.

The Dell XPS 15 is a beautiful example of the genius behind Microsoft making themselves a competitor in their own ecosystem...it was an essential move to revitalize their industry. No one sees that...because everyone is stuck in the late 1990's and an anti-trust suit that wasn't satisfactory to their hateful expectations. Times have changed...time to get up to date.

Vista with Office updates that forced non-customizable Ribbon was bad enough, and on top of that, the companies had to pay to get people on courses how to use 20 year old tools again.

Now, add another Office change (2010), the UI didn't remain constant, although I think it was a general improvement to 2007. Add on Windows 8 screwing the operating system UI again with Microsoft marketing trying to push it as a "vast improvement" where real world experience was completely different. Especially when beta testers WARNED the company about this.

The ribbon was a DIRECT response to years of customer feedback on the Office UI. People hated having to dig multiple levels deep within menu systems to find features in Word and Excel primarily. Microsoft designed the ribbon in an effort to solve that exact problem, based on explicit CUSTOMER feedback about the problems with their old Office design. Ribbon was a success in that it brought everything right to the surface, one level deep in a series of tabs.

The problem, again, is that people simply don't seem to like change, if that change is coming from Microsoft. (If it is Apple changing something, everyone hails it as revolutionary genius...such as in the iOS 7 change...which, ironically, was simply to make their semi-3D rounded corner icons mostly flat rounded corner icons...oh, and to add a little bit of translucency in a few new places...flat...which, ironically, was largely pioneered by Microsoft with their Metro UI design). Change is the focal point of progress. Everything has to change at some point to be improved. Microsoft has made REASONABLE changes to things like Office, such as with the introduction of the Ribbon UI, and usually in direct response to customer feedback.

I've seen changes made to the Zune desktop player and XBox Music UIs based directly on my feedback...I asked for a couple explicit features directly to Microsoft over the phone, based on an issue I was having. I referenced a number of threads on Microsoft forums where the same feature was being asked for. Within maybe a month, an update was pushed that added the feature and one other change I'd asked for.

Microsoft's ecosystem is huge. For as many people as use iPhones, Microsoft's installed base of Windows computers is well over a BILLION now. The majority of those are Windows 7 and Windows 8.x (and the server counterparts), with a rapidly fading presence in XP. When you have an installed base in the billions, it's impossible to make any change that satisfies 100% of your user base (not even Apple could accomplish that...iOS 7/8 has had its fair share of detractors, sometimes audible in the throng of brainwashed fanaticism.)

It doesn't help that Windows 8.1 removes a part of the forced stupidity (though I wouldn't cross my fingers), the version name is already tainted. It has to be Windows 9 and an attitude change to recover from this. The point is, if the most downloaded third-party application is Classic Shell, the UI was ****ed to begin with. Note that this holds for the business side experience when using desktops with large screens.

Unfortunately, Microsoft also started to push for cloud integration in Office, and at this part of the world, there's not a lot of businesses who would like to upload critical information to servers based in the US given the current legislation that can confiscate the data at any point. I'm pretty sure Microsoft's plan is to start forcing cloud services down on our throats gradually to charge the usage basis for monthly services, and that I don't want.

Now you're just speculating about Microsoft forcing anything on its customers. You can still, and will always be able to, buy Office stand-alone. I did. I own a couple of stand-alone copies. I opted for that instead of the much cheaper $99/yr Office Cloud standard edition. I prefer to store my data locally...but not everyone does. Some people, and some corporations and smaller businesses, much prefer to offload the once-necessary costs and complexities of managing their own computer networks and systems onto a larger business entity that has more talented and effective resources for managing such things.

The Cloud, as far as Microsoft is concerned, isn't about the end consumer. The cloud is about the enterprise and the business user. Microsoft's cloud business is actually one of their more successful business units, as well. They have been seeing consistent growth in the Azure cloud and cloud-based service offerings. A lot of people and a lot of corporations WANT cloud offerings. With social and search services like Google and Facebook coming under fire for misuse of customer data, Microsoft has just been plugging away doing what they do...enterprise systems and support. They offer a truly viable alternative to Google, one that is more secure and untainted by a history of data abuse, spying, or controversial "social experimentation" on an unknowing populace of users.

Cloud is Microsoft's strength. Their biggest competitor there is actually Amazon, and they are making headway, helping spur a competitive market in the cloud services business.

I also definitely don't like the Microsoft Store's integration into the computer's UI, and from the look of it, neither did the entertainment industry. Consider how bad it had to be for Valve to switch to developing their own operating system!

The way app stores are run isn't really a Microsoft thing. Apple started that trend, and in many ways it is essential to the protection of consumers. Just look into how many problems and security issues can and have occurred on the Android platform, with its open app store, vs. how many of those kinds of issues occur on Apple or Microsoft devices. There needs to be some level of buffer, some small barrier to entry, to help weed out the apps that are designed by data and identity thieves for the purposes of data and identity theft, fraud, etc.

The other issue here is costs and revenue. Microsoft runs the server farm that manages their app store, just like Apple does. It is also a key source of long-term revenue, for both companies. It's a business choice those companies made. Again, when you have such a massive ecosystem, you cannot make decisions that satisfy 100% of your customers.

The Valve problem would have been the same if Steam had wanted to do it on the iOS platform. Valve did not want to share its revenue with Microsoft. Ok, fine. That's the business decision Valve has made. That doesn't make it some kind of referendum against Windows 8. It simply means that Valve doesn't want to share their revenue, and that's certainly a decision they are allowed to make. It doesn't matter in the end anyway...Windows 8 is still Windows 7 when you're on the desktop, and Steam has always worked the same as it always has. There is no loss for Valve here...there is no requirement that they move to an app store model.

There is also no reason that Valve couldn't work with Microsoft on a deal to have a Steam metro app that worked in a unique way to support Valve's needs. Microsoft worked with the VLC media player team to help them create a version of VLC that would operate under the (necessary, for security purposes) sandboxing and library limitations of standard Windows 8 apps. VLC makes use of some key low-level C libraries for the kind of performance they require, which are normally not allowed in metro apps. And yet...the first version of VLC for Windows 8 was released a number of months ago.

This is, of course, from my point of view. If you ask me, Windows 8 could've worked had the preferred UI been a simple question at the beginning. The Ribbon would work better if it were customizable. Microsoft's name would look better if it weren't seen nowadays as a potential competitor to its own customers, and so on.

You have clearly never been part of a software development project, certainly not on any large-scale project with a large installed base of users. You have to START somewhere. You have to decide which things you're going to include, so you can allocate the resources to implement those things, then send them through the long and complex pipeline of proper testing, QA, refinement, patent generation, pre-production testing, release preparation, stock production, and final release to the storefront shelf and consumers. Microsoft made their decisions about where to START with Windows 8. They have been making progressive updates and improvements that, once again based directly on customer feedback, are greatly improving the product. It's a process. Processes take time.

(BTW, the Ribbon IS customizable...highly. You don't quite seem to have your facts straight about Microsoft or their products...probably because you abandoned Microsoft a decade ago, and have simply been regurgitating the same old drivel about mean, predatory, hateful old behemoth "Microsoft the Monopoly" for the same amount of time. Things have changed...and you're seeing everything through a lens that keeps you stuck in the past.)

1005
People just like to hate on Microsoft, even when they've done good. Windows 8 is the only truly universal platform that runs on multiple devices, in multiple operating modes, simultaneously. Not even Apple has topped that, and I don't think they will. I purchased an original Surface Pro, because I'd been waiting for years to have a fully touch-capable device for when I'm out roaming around, without actually having to leave my full Windows desktop capabilities at home. I was probably one of the first people using Lightroom, Photoshop and EOS Utility on a Windows 8 tablet, tethered to my Canon 7D, out in the mountains, taking landscape photos and processing them on the spot with a pressure-sensitive pen in fully featured photo editing software (not some limited or otherwise gimped "app," as is the case on the iPad.)

I also think a lot of people miss the magnitude of this.

W8 is essentially the same interface across a phone, tablet/laptop and desktop PC.

You don't have a bunch of stuff that no longer works after OS upgrades, like on the fruity side. No proprietary connectors that get changed every other generation so the accessories no longer work...

If the top Nokia Lumia phone had been available on T-Mobile instead of AT&T, I would have gotten one. Big mistake on Nokia's part.

I think MS dropped the ball with the "RT" version of the Surface.

I would also immensely prefer the iPad's 4:3 screen on a Surface.

I'll grant that the W8 interface is different from XP and 7. I like W8 on a touchscreen but my desktop isn't touch, so I'm a little leery of setting it up. My computer has an included upgrade to W8.1 if I decide to try it.

I have had only 2 issues with my older hardware not working in W8, and as far as I can tell, they stem from the manufacturers no longer supporting the products, so not the fault of MS. Everything else I have works. Even downloading Epson 3800 drivers to a Surface Pro 2.

Any insights on using W8 in a non-touch environment?

Windows 8 is a dual-mode operating system. On a desktop, if you prefer, you can still use the classic Windows desktop all the time. You can boot to it and use it pretty much exclusively. The only explicit change is the replacement of the start menu with the start screen. But the start screen works 100% perfectly well with mouse and keyboard (and, for that matter, it also works with a TV remote when using a Media Center remote control). There is absolutely NOTHING about Windows 8 that makes it difficult to use on the desktop with a kb/mouse. I've been doing it since day one. This is Microsoft's greatest mistake...not properly educating their customers as to what their OS can do. Windows 8 is Windows 7, with more. That's it. There hasn't been a loss of compatibility.

I'm pounding away on a keyboard right now, in Chrome, on the desktop, on a standard computer with no touch screen...in Windows 8.1. Touch is not a requirement in Windows 8. It's an option.
