An article Canon should read.


ZeuZ

Guest
Peter Hill said:
"Actually here in Belgium there is one : canonline.be, the only photography store which sells only Canon."

No, that's not a Canon store, that's just a shop selling only Canons. There are heaps of those, and that's not what I meant. Canon, unlike Apple, does not do retail business.

:) Tomayto, tomahto - it's a store, and it only sells Canon, so it's a Canon store. But you're right, Canon doesn't run retail the way Apple does.
 
neuroanatomist
archangelrichard said:
"Clearly, DJL329 was stating that the 1DsIII and the 5DII have essentially the same sensor. Do you disagree with that?" -- yes, I do. --

I'm going to assume you disagree with the clause about the 1DsIII and the 5DII having the same sensor. If you're disagreeing that that was the intent of DJL329's statement, reading comprehension is clearly an issue for you, and you might want to go re-read his original post, which I quoted.

archangelrichard said:
"If so, you're disagreeing with Canon's published statements" -- well. Canon never said they were the same sensor but Similar; or as dpreview puts it "21 megapixel CMOS sensor (very similar to the sensor in the EOS-1Ds Mark III)"

They are similar, true. The 5DII's sensor has modified circuitry for noise reduction, but the photosites themselves are identical in size, number, and electronic characteristics.

archangelrichard said:
being so much longer, your basic notion of handholding at 1 over the focal length is out the window; that depended on lenses with normal geometry that have a smaller arc of movement; the longer the lens the greater the arc, so you have to use the actual length - and just eyeballing it, it is close to 135mm real length at 17mm focus, so you hand hold at 1/125th of a second and no slower

Silly me, I didn't realize that the 1/focal length rule was based on the physical length of the lens. I thought it derived from the field of view relative to the impact of angular motion. It's great to know that I can handhold a 200mm f/2.8L II prime at 1/125 s, but a fully-extended 70-200mm f/2.8L zoom needs 1/200 s. Speaking of the 70-200mm, as an internal zoom lens I suppose it can be handheld at the same shutter speed at both the 70mm and 200mm settings, because the physical length of the lens doesn't change - right? Oh, and that must also mean that sensor size has no effect on the shutter speed at which a lens can be handheld, because a different sensor won't change the physical length of a lens. But maybe you were talking only about the 17-85mm lens - you can handhold that at a minimum of 1/125 s whether it's set to 17mm or 85mm? Since Canon bodies are programmed to use the 1/focal length rule to select the shutter speed in Av mode, I suppose the camera would avoid shutter speeds less than 1/125 s with that lens set at 17mm? Try it... I bet you know something Canon's engineers don't.

Maybe you should send the engineers at the Canon Optics R&D Center in Utsunomiya those links to Wikipedia and the MIT lecture notes, because your understanding of optics probably surpasses theirs as much as it surpasses my own. But thanks for the link to the lecture notes/slides. A bit basic, though - since I've built multiphoton microscope systems from the ground up, I've got a reasonable understanding of optical physics.
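For the record, here's a minimal sketch of the rule as it's conventionally stated - it keys off field of view (focal length times crop factor), not barrel length. The numbers are illustrative only:

```python
# Conventional 1/focal-length handholding rule (a rule of thumb, not
# Canon's actual Av-mode logic): the slowest "safe" handheld shutter
# speed is roughly 1 / (focal length x crop factor) seconds.

def min_handheld_shutter(focal_length_mm: float, crop_factor: float = 1.0) -> float:
    """Slowest 'safe' handheld shutter speed, in seconds."""
    effective_focal_mm = focal_length_mm * crop_factor
    return 1.0 / effective_focal_mm

print(min_handheld_shutter(200))       # 0.005    -> 1/200 s on full frame
print(min_handheld_shutter(200, 1.6))  # 0.003125 -> 1/320 s on a 1.6x crop body
print(min_handheld_shutter(17, 1.6))   # ~0.0368  -> ~1/27 s, not 1/125 s
```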

archangelrichard said:
"Because Apple only makes one model of computer, right?" - actually; NO. Please get your facts straight

"Because Mac OS X doesn't include the Terminal app for a command-line Unix shell interface that allows me to alter most system parameters at will." - It doesn't? Really?

NO, I don't know everything but clearly I know and more importantly understand more than some

Clearly, you can't include sarcasm on that short list of things which you understand. In case it wasn't clear to you, the above discussion of the 1/focal length rule was replete with sarcasm.

But just as clearly, I'm guilty of feeding the troll. Apologies to the members of this forum who maintain the ability to be civil and carry on intelligent discourse.
 

Eagle Eye

Recovering Full-Framer
If Canon were more like Apple, they would install a direct print button on their cameras that would interface with Canon printers...

People, let's settle this thread down. Obviously Apple and Canon are both doing something right, as they are both profitable companies. Complain all you want about one or the other, but you can't dismiss that hard fact.
 

DJL329

EOS R5
neuroanatomist said:
But just as clearly, I'm guilty of feeding the troll. Apologies to the members of this forum who maintain the ability to be civil and carry on intelligent discourse.

Quite right, neuroanatomist. The best course of action for us all is to simply ignore this person. Don't read his responses and certainly don't reply to them.
 
archangelrichard said:
this is for lettherightlensin

"I hate Apple worship when it comes to tech since their Apple II's were very soon outdated junk" - this hasn't happened yet; and for reasons of what the Apple II is - completely modifiable. Don't like the keyboard? Get any ascii encoded keyboard and build a cable end (I did and had 25 feet of cable to boot) Don't like the video? You can add a video board and run at higher resolutions (e.g. I had a Spies Labs "double Hi-Res" board with 80 columns of text and 64 colors (well, OK; 8 were black and variations of black). Want Sound? Choose from a number of sound cards from aftermarket manufacturers like the Solid State Music board with 16 voices and synthesizer in and out ports (in fact the standard speaker could be connected to an external speaker - I did with a 12" shelf speaker and the tone drops a bit like a boy going through puberty). Add-on tenkey keyboards, add on 8088 processor board with the Original Seattle Computer DOS (bought by Microsoft and changed into PC DOS / MS DOS) AND add in an 8080 board and run CP/M (OK so I ran BOTH - at the same time!); etc. Disk drives? you can get dual double side double density, Hard drives, 8" floppies, you name it. Memory? There is available on the aftermarket a 786K board (more than you could put on a PC)

Yeah, and how many programs actually took advantage of all that stuff?
Exactly.

It had a slow 1 MHz 6502 chip which had to drive not just regular calculations but audio and video as well. No custom graphics or audio bus. No nothing. The most bare-bones, boring design you could imagine. It's not like the other makes couldn't take expansions either; you could plug stuff into them as well, although not many expansion cards were made for the others.

The Apple II didn't usher in any modern hardware design; there's not a custom chip or advanced bus in the thing.
I give it credit for starting the whole home computer thing; it gets a lot for that. But boom, it was quickly blown away by the Atari 800, which foreshadowed the modern home computer by using custom chips for various tasks as standard.

There is no way to obsolete a generic computer like this, but what made it end up obsolete was the added convenience of other computers that came out later based on what Apple did: for example, the reason the first PC had openly documented slots was to compete with Apple, it used the Shugart Associates standard floppy interface to compete with Apple, and it had openly accessible video standards to compete with Apple. And the bad news? Apple's 1.85 MHz processor had a 1 MHz throughput, but IBM's 4.77 MHz 8088 "16-bit" (not really, but) had to PAGE the processor to read commands OR inputs (unlike the true 16-bit 8086, which needed 32-bit memory) AND had to page memory to get above 64K because of this; it had a throughput of 1 MHz! (The CP/M-based 8080 computers got up to 8 MHz throughput but had no graphics capabilities, nor were they as easy to interface as the Apple II's 6502.)

Yes, the original IBM PC was every bit as much of a wreck as the Apple II. IBM didn't even believe in the home market and put a trash team on it since they didn't want to devote any good resources to it.

At the same time people were spending thousands and thousands on creaky IBM clones running MS-DOS - or, believe it or not, still on Apple IIs, or on the original Macs - some people were getting machines like the Amiga 1000, with an 8 MHz fully 16-bit 68000 CPU (OK, the Mac had that part), a custom audio/graphics bus, advanced custom chipsets to drive audio and graphics (with display processors that could sync graphics code to the exact location the beam was scanning on the CRT), DMA to discs/ports, and a fully pre-emptive multitasking OS with a GUI that had some modern innards Linux only wishes it had.

And remember the mess people had to go through each time they installed even a basic Sound Blaster or something in a clone or an Apple? Well, it was autoconfig plug-and-play, baby, with the Amiga from day one. Plug it in, flick the on switch, and it works perfectly, automatically.

All included, all standard, supported by all programs, and all for only a fraction of the price of the PC clones or the Mac - and, shockingly, for not even much more than a bare-bones Apple II. Come on.

How long did it take Windows or the Mac to get pre-emptive multitasking? First they didn't do any sort of multitasking at all, then they did non-preemptive fixed slices and other messes. It took them a while. Years. The Amiga had it back in 1985! While clones were clunking along with MS-DOS junk and then the early rudiments of Windows, which couldn't even handle "windows" properly at first. Windows on the Amiga were 100% real windows, as on the Mac, and they could also run at different resolutions on different vertical segments of the monitor, by using the Copper chip to reprogram the display output on each scanline.

Plus the code was a heck of a lot more efficient, needing only 1/4 and 1/16th the amount of code to be run through to accomplish each task switch (once the other two finally even got around to offering task switching).

And look at a modern PC, with its custom PCIe buses, DMA, and custom chips offloading everything. The hardware looks a heck of a lot more like an old Amiga than an old Apple or clone.

It sure is a shame that it isn't an OS like the Amiga's or BeOS or something that everyone is using now (obviously in more modern form).

This is a horrible oversimplification - which matches the extremely horrible one that you used; completely ignorant lies like "when there were advanced computers running at 16x the speed" - NO, there were not; the TI 99 was real 32-bit and the most advanced, but their midsize computer division insisted it be so handicapped that they never made a dent in the PC market, and NOBODY ran at 16x the speed. "with 4096 colors at once" - I don't know what you are talking about; Atari couldn't do this (but they could "strobe" the color register so it would appear to go through its 16 colors and rotate through others); the PC's CGA did 16 where Apple did 8, and EGA only got to 256 colors, and that was years later. "stereo wave sampled sound built in" - actually, built-in is an issue with new technology, as it locks you into something that could easily get obsoleted, but you could easily add this to an Apple II. "pre-emptive multi-tasking" - you could do this with add-in processor boards, as I said I did above

Really, so now the Apple II OS was pre-emptive multitasking? Not even the Mac did that until years after release.

The Amiga did 4096 colors at once in 1985. Granted, that is too late to compare fairly to the Apple II, except.... Apple fanboys used to routinely tell people not to get Atari or CBM toys and to at least get an Apple II instead! Yes, I even had dealers tell me that the Apple II was a far better buy than any Amiga! Some told me that because the Amiga used custom chips to offload audio and graphics, and could do sprites and multiple hardware-overlaid bitmaps and show 4096 colors at once, that meant it was a toy, since no serious, professional computer of any use for anything real would ever have 4096 colors or sprites or custom graphics chips (and the same jokers I see today bragging about their latest nvidia cards and how serious their machines are, LOL). So it is fair to compare the Amiga to the Apple II, since the other side made it fair by trying to promote the Apple II over the Amiga back then, utterly laughable as that was.

And yeah, the Amiga did have 16x - actually more than that - the computing power of an Apple II (and sure, plug in some new Apple card that a few obscure pieces of software use; well heck, then plug one into an Amiga, etc.). And 4096 colors at once, etc.

"The MAC was basically junk within a year compared to competition" -- and again you just have no idea and spout the same fanboy ignorance; what competition? Who made computers that did what the first Mac did?

wow.
you must be kidding.
Atari ST??
Amiga 1000??


Windows 3 was almost ten years away; nobody else was using graphics fonts - a very basic technology compared to what we do today; the Mac used SCSI (a parallel interface rather than the cheaper one-bit-at-a-time serial like IBM used); if you don't understand the technology, you don't know what you are talking about

Hate to say it, but there were machines other than the clones, and some of them did use graphics fonts and SCSI, and they even had direct DMA for floppies, stored more on a floppy, and had faster throughput than the Mac.

They had a GUI like the Mac, only they also had much more serious and robust command-line interfaces as well, complete with UNIX-like pipes and so on, and one of them even had multitasking back then.

There are no facts to what you posted, no proof, no sense. All you did, in the worst fashion imaginable, was lash out ignorantly for no apparent reason - thus is the illogic of the fanboy

Funny, then, how your precious Apple (and IBM and MS) hid Amigas under the desks to run their own booths at some computer trade shows, since their own computers couldn't cut it half as well. If the actual companies decided to hide other brands under the table rather than show off using their own on the table top, then maybe that says something. And maybe you are the fanboy.
 
gmrza said:
gene_can_sing said:
....

If in the future, Nikon or Sony even had 1/2 the vision of Apple (which I doubt they ever will), Canon in all their conservatism would fall like a house of cards, just like what Apple has done to their numerous competitors.

Time to play devil's advocate...

  • With the AE-1 Canon led the charge to mass-market adoption of SLRs. Supposedly the AE-1 was the first camera with an embedded microcontroller.
  • Canon introduced the world's first inkjet printer.
  • Didn't Canon lead the market to the adoption of a full electronic lens interface? (EOS 650 and EF lens mount in 1987)
  • Canon led the market for full frame studio professional DSLRs with the 1Ds. (It wasn't the first full frame pro DSLR, but it was a major departure in terms of utility.)
  • Wasn't Canon the first company to market an enthusiast-level full frame DSLR? (5D Classic)
  • Wasn't Canon the first to release HD video in a full frame DSLR? (5D mkII)
  • Wasn't Canon the first major camera manufacturer to launch a 70-200mm image stabilised lens?

Canon has made some daring bets -

Yes they have. They were often at the forefront pushing things, and during this time you did see the sidelines at sporting events turn from black to white lenses. They really did drive things forward.

More recently they have bragged about being infinitely far ahead of Nikon on FF (just months before Nikon released FF, hah) and about how they didn't need to do anything other than rest on their high throne. And they have become super stingy and marketing-droid-driven when it comes to body specs or showing any recent ingenuity and foresight. So yeah, until a few years ago, give them props. But they are not doing what they could in this computer-inside-a-camera day and age they are fully into now, and they have been doing dumb little things like going back and removing MFA from the 60D just to entice 7D upgrades, and even with the 7D, heaven forbid they just give it the full 1-series AF, etc. etc.

A few years ago they had it in them to basically bury Nikon and take over, but instead they went into hyper-protect-current-products mode and segmentation, and now they are actually losing market ground a bit. (Although it is possible that in the long run it is better they did this, since if they had blasted away the competition they might have become sluggish and expensive for years upon years.)
 

Civius

Guest
Yeah, it would be so super-great if Canon was more like Apple. We wouldn't have these stupid memory cards, but you could choose between a 7D 16GB, a 7D 32GB and a 7D 64GB. Oh, and the prices for all the products would be double what they are now, but you wouldn't mind because, c'mon, it's a Canon product, so it's just pure awesomeness and really, really easy to use, because there would be only one button.
 
Very interesting article/business case indeed. But I happen to disagree with it. I agree with the general notion of disruptive technology and how these things have been handled pretty smartly by Mr. Jobs. He is an exceptional leader. In no way does the situation compare to the auto industry. Detroit's problems are rooted in running companies like bureaucracies - thanks in part to the outrageous deals that the unions were able to make over decades. This is not about "cheaper and smaller cars" as some people claim. Today's European and Japanese car fleets for the most part are neither cheaper nor smaller.

Canon and Nikon are in a unique market segment. The only similarity that I see with Apple is that a lot of the sales are based on image and perception. And I would argue that Canon and Nikon actually live up to a lot of the expectations of the pro-level and mid-range buyer, while mostly satisfying the vast number of P&S buyers. Not a bad business concept in my book. Whether Apple, on the other hand, will survive without their big follower-creating Steve Jobs has yet to be seen. I don't see how most Apple products are in any way superior. I came pretty close a few times to buying an Apple computer and decided against it every time. Those that are used by pros in the creative industry are very expensive and at the same time very limited in a number of ways. If photography and/or music were not only my passion but my day job, I'd probably have one of the big ones. Everything else is a toy. Pretty styling (obviously important if you sit at Starbucks all day) and good enough to write emails and to be on Facespace or whatever. Other than that, I find most of their stuff more than annoying. I do like my iPod classic, though - simple and big enough to hold my entire CD collection at a halfway decent sample rate. But their laptops are a joke: either big and expensive, or they don't even have the FireWire connection that I need for a bunch of hardware that still serves me well. And anything touchscreeny just makes me cringe. That's why I still love my ThinkPads, regular PCs with Win XP, and my old-fashioned BlackBerry with a real keyboard.

But that's me. Obviously a lot of way more hip people think otherwise. The question remains for how long. Apple got really lucky with the iPod. Nobody was ever able to come up with an alternative. The iTunes store was a very smart move. I doubt that they will be able to continue this with their other gadgets. There are already plenty of alternatives to the iPad and iPhone for those folks who like this approach. Image alone may not cut it for much longer, especially after Mr. Jobs' retirement.
 

archangelrichard

Guest
let -

It is obvious you are a fanboy (of Atari and Amiga, no less), but you still have your fundamental facts wrong

The Apple II came long before the Atari ST and Amiga (as did the IBM PC), so you are comparing Apples to oranges here. NO, the 6502 Apple used was 1.85 MHz (it had a 1 MHz throughput - just like the 4.77 MHz 8088 IBM used; that chip had to page itself for instructions and then page through a number of 64K pages of RAM)

Custom chips? The deal was that Apple wanted others to make video and audio cards (which would have those chips). NO, it did NOT do audio; it strobed (flashed on or off) a speaker, faster or slower, which generated different tones; if you wanted audio you bought a sound card or a synthesizer card or..... NO, it did NOT do video (other than text); that was the LANGUAGE chips (Integer BASIC or Applesoft)

The Atari 800 used the same 1.85 MHz 6502 Apple did, but they designed their system to be all Atari - and paid the price (from dumb things like a dual drive unit with one controller - if one drive went down, it took the other with it - to plug-in memory modules and ROM cartridges to ALL SERIAL interfaces); this was a toy in design, as opposed to the Apple, which was a hobbyist's computer

Coincidentally, I was a dealer and supported the Silicon Valley club, which had several Atari employee members (and, yes, Nolan Bushnell, who founded Atari, was an alcoholic who ended up buying a bar in Sunnyvale, as it ended up being cheaper than paying his present and expected future bills)

I was also a Commodore dealer, but all their products were a little off - from the CBM 8032 and PET, which were designed as instrument controllers/readers, to the VIC-20 (which had 3 major ROM versions, each larger than the last and each disrupting the start point of aftermarket games) to the 64 (which still did not have a real DOS) and on. (Jack Tramiel left the company he turned from a calculator company into a computer company and formed Sirius Computer - later renamed Victor - which made IBM semi-compatibles: he had better video, which wasn't compatible and needed Sirius drivers; he had a 4-speed floppy drive with over 4 times the capacity of the IBM standard - it spun faster as it went inward - but it could not read IBM standard disks; he had a special keyboard mode that, when you hit the "Calc" key, became a calculator until you hit that key again, at which point it would transfer the current total into whatever program you were in - but some IBM programs choked when you did this; again, it needed special drivers.)

The Amiga was not designed by Commodore but was bought, and yes, it had special video and audio chips (as did the Mac II - its competitor - and the Atari ST as well), and for those who watched Babylon 5, all the graphics were Amiga-based, running the Video Toaster

"the Apple II didn't usher in any modern hardware design, not a custom chip or advanced bus in the thing" - how is that modern hardware design? The PClones still don't have custom chip and just what do you think is an advanced bus? The only two bus designs at the time were S-100 (generic CP/M computers but not enough bandwidth for 16 bit computers) and Apple II; you are again confusing the mother design for the daughters - and that would be the Mac II comparing to the Atari ST and Amiga

"But boom it was quickly blow away by the Atari 800" - Uh, NO! As i said above the Atari was a consumer product, you could not do even ten percent of what you could with an Apple II and that WAS their purpose (coincidentally the all serial usage was because of TV interference, Atari was in a consumer market but Apple was not) the Atari had GAMES, GAMES, and more GAMES; while the Apple had a lot of business and personal software (and the educational market pretty much to itself as they gave Apple II's to schools); there was no way to make an Atari run 80 column nor using it with a higher res monitor rather than a TV

The Atari had Shepardson BASIC (which had funny math and no floating point; they later added a Microsoft BASIC on floppy, but it had few users) and an absolutely horrible DOS (there was an aftermarket DOS, and a better BASIC from another company in Cupertino whose owner spec'd the Shepardson BASIC while at Atari - a friend of mine); it was designed against modification (I confused one of the Atari tech people at my store - I had changed the default background color by adjusting the potentiometer, and he thought it must be broken)

This goes back to the fanboy issue; these computers were in different markets, but you fail to recognize that; FLAO

"And remember the mess people had to go through each time they installed even a basic soundblaster or something in a clone or an Apple?" -- I'm sorry you don't remember but Apple had Plug-and-play in the Mac;s and IT WORKED (that nasty forcing manufacturers to follow the rules in order to sell software guaranteed to work on the Mac)

I am sorry you weren't there when this was happening, as you clearly demonstrate with your confusion of generations; I was. There were a lot of computers coming out almost daily, and then the industry partially collapsed during the first Reagan recession. Atari's home computer division got sold to Warner Bros; TI (which actually had a real live 32-bit processor but could not sell the computer the small computer division wanted, because it stepped all over the midsize computers in performance) got out of the market; Osborne bombed when they announced a full-screen version years before they could deliver; Eagle Computer's founder, Dennis Barnhart, discovered that Ferraris do not fly well when upside down at 140+ mph (it went off the street and into a city park, and trees do not move out of the way); George Morrow could not make enough money out of his innovative designs and sold his "lunchbox" portable to Zenith, who made millions from it; the main authors of WordStar formed a new company (NewStar) but could not find enough of a market (they were programmers, not salespeople), so they merged back into MicroPro (WordStar) in time to write the best-selling version (3); and on and on

See, what fanboys do not understand is that everything has its market, even the Timex Sinclair (which did develop an aftermarket). An individual computer may be the best IN ITS TARGET market and lousy for other people looking for other things. Rant all you want; nothing is going to change reality, and this is the reality: Commodore had a better computer (the Amiga) for what it did, but that wasn't enough; the Mac II beat it because you could adapt it to do all that and then some; the Atari ST was tied to a company that wanted consumer products, and it was not one

This is not what the thread is about; that is about companies. (For a good short book, "The Macintosh Way" by Guy Kawasaki describes how Apple had product evangelists with the Mac who went to companies developing hardware and software for the Mac, made sure they had current examples of the systems and any Apple tech support they needed, and generally promoted the product to retail as well; no one else was doing this. Canon does this with marketing reps who come to stores, give demos, usually do rebates with the store to push sales, etc.)

Unlike computers, cameras are evolutionary; you don't change the lens mount without a damn good reason; you don't change the basic shape or function of cameras (the new EF is an example of a trend growing up from pocket cameras and the consumerization of the DSLR); so there is no particularly disruptive technology here (unless you think that smartphone cameras are in the same market as DSLRs)

Canon and Apple aren't in the same kind of market, and I don't agree with the article, which oversimplifies what Apple does
 
Isn't it fair to say that built into Apple's nature is a willingness not to be nostalgic, and not to mind reducing lines and making changes? This is much harder for some other companies.

An interesting example of this may soon be what happens to the iPod. It's been superseded, so where it goes we'll find out shortly.
http://tech.fortune.cnn.com/2011/08/24/corporate-antibodies-why-apple-seems-to-be-immune/

Apple has structured the company so they can basically create walled-off skunkworks assigned to specific projects that no one apart from top-level staff knows about (see the recompile to Intel or the iPhone/iPad design, for example). Almost all Apple staff find out what their colleagues have been working on the day of the release, at the same time consumers do. Then they get to work on making their own area of code/projects fit in once it's announced.

If I were at Canon, I'd be wondering if the convergence train steamrolling in from Android and iOS isn't going to start biting harder - when do DSLR manufacturers get seen as dumb hardware makers who need someone else to provide a better OS?
It's not there yet, but 8MP in a smartphone is biting at point-and-shoots. We've already seen apps to control DSLRs, but when do consumers start asking for their DSLR to be slaved remotely/tethered wirelessly to their tablet/iPhone, with live preview? Wouldn't many people want to review the focus/depth of field on a shot, but are hindered by the screen on their camera?
Maybe it won't happen yet, but the skunkworks that Google is doing on computational photography could be interesting.
Especially as cameras become better video recorders - who wouldn't want pre-created focus pulling, automated focus/aperture/ISO/other bracketing, quick review, wireless flash control via a smartphone app, etc.?
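That bracketing, at least, is plain arithmetic an app could do today. A minimal sketch (the helper is hypothetical, not part of any real camera SDK):

```python
# Sketch of automated exposure bracketing: compute the shutter speeds an
# app could send to a wirelessly tethered camera. Illustrative only; this
# function is hypothetical and not from any real camera SDK.

def bracket_shutter_speeds(base_seconds: float, ev_step: float, frames: int) -> list:
    """Shutter speeds centered on base_seconds, spaced ev_step EV apart."""
    half = frames // 2
    # Each +1 EV doubles the exposure time; each -1 EV halves it.
    return [base_seconds * 2 ** (ev_step * (i - half)) for i in range(frames)]

# A 5-frame bracket around 1/100 s in 1 EV steps:
for t in bracket_shutter_speeds(1 / 100, 1.0, 5):
    print(f"1/{round(1 / t)} s")  # 1/400 s, 1/200 s, 1/100 s, 1/50 s, 1/25 s
```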
 

unfocused

Photos/Photo Book Reviews: www.thecuriouseye.com
If I were at Canon, I'd be wondering if the convergence train steamrolling in from Android and iOS isn't going to start biting harder

What would make anyone think that Canon is unaware of the trends that are occurring? Some of us are old enough to remember when Canon was just one more undifferentiated manufacturer of consumer cameras. Canon did not become the incredibly successful company they are by being slow-witted.

Just because we are not privy to what their marketing, engineering and other development divisions are working on doesn't mean they are not working on hundreds of innovative technologies at any one time.
 