It depends on what the AI is being used for. All the hype is around artificial image creation and getting answers from the likes of ChatGPT. Neither of those is worth much, and neither is likely to monetize well. I have a fairly small computer with an Intel B50 graphics card in it that I set up to do local AI. It will do most of the image generation the big dogs do, and Intel AI Playground is a free download, which really undercuts the case for huge data center monetization. (A rough sketch of what that kind of local setup looks like is at the end of this post.)

Now for the places where AI actually is useful. Nearly all modern cameras use some form of AI to assist with autofocus, and the improvement in AF over the last few years has been huge. If you do serious processing of photos, the Topaz suite (which uses AI extensively) is pretty much a must-have, particularly if you shoot in difficult conditions at high ISO; its noise reduction and sharpening tools are unmatched. Even Adobe is using AI for noise reduction and object removal in Lightroom Classic. And that is just in our little neck of the woods. You can question the value of the feature, but Tesla's self-driving cars actually do work remarkably well. At the other end of the spectrum (and maybe the only place the data centers could see real revenue), AI-driven warfare is being demonstrated right now and will only grow in capability. That one is more than a little scary.

In the end, I don't see any applications for AI that are going to pay for the enormous capex being thrown at data centers, not to mention the power bills (some back-of-envelope numbers are at the bottom of this post). Virtually all the useful applications for AI to date are distributed functions that don't need data centers, except perhaps once, for training.
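For anyone curious what "local AI" image generation looks like in practice, here is a minimal sketch. I'm assuming a recent PyTorch build with Intel GPU ("xpu") support and the Hugging Face diffusers library; the checkpoint name is just a common public example, not what Intel AI Playground actually uses under the hood.

    import torch
    from diffusers import StableDiffusionPipeline

    # Use the Intel GPU if PyTorch can see it; otherwise fall back to the CPU.
    device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"

    # Public Stable Diffusion 1.5 checkpoint (an assumption for this example;
    # any diffusers-compatible model loads the same way).
    pipe = StableDiffusionPipeline.from_pretrained(
        "stable-diffusion-v1-5/stable-diffusion-v1-5",
        torch_dtype=torch.float16 if device == "xpu" else torch.float32,
    )
    pipe = pipe.to(device)

    # Generate and save a single image entirely on local hardware.
    image = pipe("a lighthouse at dusk, 35mm film look").images[0]
    image.save("lighthouse.png")

The point isn't the specific model; it's that once the weights are on disk, nothing in that loop touches a data center.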
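And to put a very rough number on the payback problem, here is the kind of arithmetic I have in mind. Every figure below is a made-up round number for illustration, not a real estimate for any company:

    # Back-of-envelope payback check. All numbers are hypothetical placeholders.
    capex_usd = 100e9              # assumed data-center build-out
    depreciation_years = 5         # assumed useful life of the hardware
    power_usd_per_year = 5e9       # assumed annual power bill

    # Revenue needed each year just to cover depreciation plus power,
    # before staff, cooling, or any profit at all.
    breakeven = capex_usd / depreciation_years + power_usd_per_year
    print(f"Break-even revenue: ${breakeven / 1e9:.0f}B per year")
    # Prints: Break-even revenue: $25B per year

Scale those placeholders up or down as you like; the shape of the problem stays the same.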