I think the number of iPhone photos being published is a strong indication that the sensor is far less important than the photographer.
Both are important, for different reasons. Yes, you can take great photos with an iPhone under ideal conditions. Lower the light to indoor levels, though, and it starts to struggle. You'll almost never see published photos taken outdoors at night with an iPhone, and you'll never see any significant number of published sports photos from one. And so on. Basically, there's nothing the photographer can do to fix the extreme motion blur of a half-second exposure.
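To put a rough number on that motion-blur point, here's a back-of-the-envelope sketch. All of the figures (a walking subject at 1.5 m/s, a 2 m-wide scene, a 4000-pixel-wide sensor) are assumed purely for illustration, not taken from any real camera:

```python
def motion_blur_px(subject_speed_mps, exposure_s, frame_width_m, frame_width_px):
    """Approximate motion blur, in pixels, for a subject moving across the frame.

    Assumes the subject moves parallel to the sensor across a scene that is
    frame_width_m metres wide, imaged onto frame_width_px pixels.
    """
    blur_m = subject_speed_mps * exposure_s           # distance moved during the exposure
    return blur_m / frame_width_m * frame_width_px    # converted to pixels

# A walking subject during the half-second exposure a phone might need at night:
blur = motion_blur_px(1.5, 0.5, 2.0, 4000)   # -> 1500.0 pixels of smear

# The same subject at 1/250 s, a shutter speed a larger-sensored camera can
# afford in far less light:
sharp = motion_blur_px(1.5, 1 / 250, 2.0, 4000)   # -> 12.0 pixels
```

Fifteen hundred pixels of smear is not a photograph of a person; it's a streak. No amount of skill behind the shutter button changes that arithmetic.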
And there are shots that simply cannot feasibly be taken with an iPhone, because it is not practical to physically place the phone close enough to the subject to get a decent shot. Yes, you could conceivably build a remote shooting helicopter rig for your iPhone, but really, what's the point? Zoom lenses matter.
To answer the topic question, yes, IMO, sensors make the camera. You can work around a weak autofocus system by learning to use a manual focus lens effectively. You can work around a slow repeat rate by learning to time your shots better. You can't work around a poor quality sensor; the sensor quality fundamentally defines the quality of the resulting image. And you can't change the sensor; you're stuck with it. This means that the sensor is the single most important attribute of the camera itself. All else is secondary—important, even useful, but secondary to the quality of the sensor.
To be fair, beyond the point at which the sensor is "good enough" for a particular purpose, you do start to see diminishing returns from sensor improvement. Therefore, it is not necessarily the case that improving
the sensor is more important than improving other aspects of the camera. The answer to that question depends entirely on your starting point. If you start with a lousy sensor, improving the sensor is the most important thing; if you start with a great sensor, improving it is mostly unimportant, and other factors start to dominate. But if you stuck a crappy sensor in a 5D Mark III, it would be a crappy camera, whereas Canon stuck a crappy AF system in the 6D, and it's still a great camera.
If you're looking at camera systems, the availability of glass whose quality is good enough to fully take advantage of the sensor's performance is of equal importance to the quality of the sensor. And again, all else is secondary, for the same reasons.
Pedantically, I should also add that the rest of the analog image pipeline is critical for the same reason that the sensor is. For most modern cameras, the analog image pipeline is part of the sensor, but for some reason, Canon hasn't made that leap yet. So for Canon cameras, there are other parts of the camera that are just as important as the sensor, solely because they're years behind the rest of the industry in terms of the way they design their image acquisition hardware. But I digress.
And the main differences between the 5D Mark III sensor and the 6D sensor are that the 6D sensor is slightly lower resolution, and its downstream amplifier circuitry seems to have a lower and more consistent noise floor from channel to channel, resulting in less banding and dark noise. Either that or the analog signal path is better shielded from noise sources. Either way, the result is the same.
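The banding point can be illustrated with a toy simulation. Nothing below models the actual 5D Mark III or 6D readout; the per-channel sigma values are invented solely to show why channel-to-channel variation in the noise floor shows up as visible banding, while purely random pixel noise averages away:

```python
import numpy as np

def simulate_dark_frame(channel_sigma, pixel_sigma=2.0, rows=512, cols=512, seed=0):
    """Toy dark frame: each column shares a readout channel, so per-channel
    offset variation (channel_sigma) adds the same error down a whole column,
    while read noise (pixel_sigma) is independent per pixel."""
    rng = np.random.default_rng(seed)
    column_offsets = rng.normal(0.0, channel_sigma, size=cols)
    pixel_noise = rng.normal(0.0, pixel_sigma, size=(rows, cols))
    return column_offsets[np.newaxis, :] + pixel_noise

def banding_score(frame):
    """Std-dev of the column means: a fixed per-channel offset survives
    averaging down the column, whereas random pixel noise mostly cancels."""
    return frame.mean(axis=0).std()

consistent = banding_score(simulate_dark_frame(channel_sigma=0.2))
inconsistent = banding_score(simulate_dark_frame(channel_sigma=3.0))
```

With matched channels the column means are nearly flat; with mismatched channels the frame shows distinct vertical stripes, which is exactly the dark-frame banding photographers complain about. The lesson is that the variance of the noise floor across channels matters as much as its absolute level.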