There is a reason why a standard is adopted, or not. JPEG prevailed because the industry converged on a single 8-bit format with variable compression, which was more than enough for the sensors, screens and most printing systems of the day. JPEG 2000 suffered from an excess of customization options: 1 to 39 bits per channel, different color encodings, and, to the best of my knowledge, camera manufacturers never got together to standardize on a specific set of specifications such that any application could read all JPEG 2000 files from any origin, the way they can with JPEG.
Some of us (me included) use RAW as an insurance policy, allowing for exposure correction, white balance adjustment, shadow and highlight recovery, etc., but from there we produce JPEGs to print or share with others, and now that 10-bit monitors and other facilities are here, it makes sense to look for something new. Apple does not create standards if there is a competent one out there: they went along with MP3 until it was no longer satisfactory, and now you will find most A/D and D/A converters supporting the Apple Lossless Audio Codec, which is license- and royalty-free under the Apache 2.0 license. For photography Apple adopted HEIF, a format defined by the Moving Picture Experts Group (MPEG) as ISO/IEC 23008-12, and made it popular by integrating it with the iPhone, iOS and macOS. So it is not really an Apple standard so much as a standard that was dormant until a very influential company adopted it. Canon are smart to follow that trend.
I am interested in HEIF, even if I do not (yet) have a 10-bit monitor, because I would be able to take images from the camera with the possibility to correct the exposure by ±1 EV, so I may not need to swamp my disk drives with RAW files quite so much.
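The back-of-the-envelope reasoning behind that ±1 EV claim can be sketched in a few lines. This is my own rough model, not anything from the HEIF spec: it assumes that each stop of exposure correction costs roughly one bit of tonal precision, so a 10-bit file pushed or pulled one stop still retains more distinct levels than an untouched 8-bit JPEG.

```python
# Rough sketch of tonal headroom vs. bit depth (a simplification:
# one stop of exposure correction ~ one bit of precision lost).

def levels(bits: int) -> int:
    """Distinct code values per channel at a given bit depth."""
    return 2 ** bits

def levels_after_ev_shift(bits: int, ev: float) -> float:
    """Approximate useful levels left after shifting exposure by `ev` stops."""
    return 2 ** (bits - abs(ev))

print(levels(8))                      # 256 levels in an 8-bit JPEG channel
print(levels(10))                     # 1024 levels in a 10-bit HEIF channel
print(levels_after_ev_shift(10, 1))   # 512.0 -- still finer than 8-bit
```

Under this simple model, a 10-bit channel corrected by a full stop still has twice the tonal resolution of an 8-bit original, which is why HEIF could cover the lightweight corrections I currently keep RAW files around for.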