> Does anybody actually think AI is worth it? I don't, for the most part. I'm sure there's some good that can come from it, but it seems like a lot of bad stuff. That being said, I don't know all of the uses, so I'm curious of others' opinions or knowledge on the matter.

I use AI daily for work. It has valid & valuable use cases. It's also still in its infancy and will continue to get better over time. Where that leads us (good or bad) is anyone's guess, but it's not going away.
> This AI bullshit has to be stopped. Governments have to step in and regulate this sector so the ordinary people can buy memory.

That is never going to happen.
> Does anybody actually think AI is worth it? I don't, for the most part. I'm sure there's some good that can come from it, but it seems like a lot of bad stuff. That being said, I don't know all of the uses, so I'm curious of others' opinions or knowledge on the matter.

It also depends on which type of AI we are talking about.
This may be odd, but I feel bad for the kids that want to build gaming rigs. I remember spending every penny I made one summer doing my first one. Some of the PC builder groups are therapy sessions at this point.
Time to switch to new Chinese brands. These companies are making a killing at the expense of consumers. Time to boycott these greedy companies.
This creates a problem for junior developers, but also for the future, as eventually there will be fewer senior developers, who are still needed... but it is unfashionable to discuss such unsavory topics!
This AI bullshit has to be stopped. Governments have to step in and regulate this sector so the ordinary people can buy memory.
It depends on what the AI is being used for. All the hype is around artificial image creation and getting answers from the likes of ChatGPT. Neither of those is worth much, nor likely to monetize well. I have a fairly small computer with an Intel B50 graphics card in it that I set up to do local AI. It will do most of the image generation that the big dogs do, and Intel AI Playground is free for the loading, which really challenges the likelihood of huge data center monetization.

Now the places where AI is useful. Nearly all modern cameras use some form of AI to assist with autofocus, and the relative improvement in AF over the last few years has been huge. If you do serious processing of photos, the Topaz suite (which uses AI extensively) is pretty much a must-have, particularly if you shoot in difficult conditions at high ISO. The noise reduction and sharpening tools are unmatched. Even Adobe is using AI for noise reduction and object removal in Lightroom Classic. That is just in our little neck of the woods.

You can question the value of the feature, but Tesla self-driving cars actually do work remarkably well. At the other end of the spectrum (and maybe the only place the data centers could see real revenue), AI-driven warfare is currently being demonstrated and will only increase in capability in the future. That one is more than a little scary.

In the end, I don't see any applications for AI that are going to pay for the enormous capex being thrown at data centers, not to mention the power bills. Virtually all the useful applications for AI to date are distributed functions that don't need data centers, except possibly once for training.
> You're engaging in the exact conflation that billions of dollars of marketing for generative AI have intended you to make. Generative AI and classical ML approaches with actual design intent are not analogous or appropriately lumped together.

Indeed. I use ML routinely for scientific image analysis, virtual chemical library screening, etc. Very much value added. But even the companies that provide the software/services have taken to calling it AI. Customers expect it, venture backers ask if companies are using ‘AI-based drug discovery’, and the misnomer perpetuates itself.
> You're engaging in the exact conflation that billions of dollars of marketing for generative AI have intended you to make. Generative AI and classical ML approaches with actual design intent are not analogous or appropriately lumped together.

Sorry if I didn't use the Generative vs ML labels, but I did point out that I have a pretty cheap machine that will do "generative AI" in my office, the consequence being that there is not likely a monetization option for data centers to do that kind of "work". Data centers built to do generative AI are really just an extension of the whole "cloud" mindset, which is all about control (marketed as convenience and safety). The number of people compromised when a "cloud" facility gets hacked continues to get bigger. At some point, customers will figure out that safety is not really one of the features of the cloud. This is particularly true of generative AI, which is constantly integrating your every interaction with it into its database. At some point, your entire persona becomes public. Not a desirable outcome for most folks.
Classical deep learning applications like noise reduction, subject recognition, etc. do work, and they are used to defend "AI" generally, meaning generative AI, which is slop that has extremely little value outside of generating spam.
It is a classic motte and bailey. "AI" is a completely worthless label that doesn't reflect any of the technologies allegedly included under it.
> Sorry if I didn't use the Generative vs ML labels, but I did point out that I have a pretty cheap machine that will do "generative AI" in my office, the consequence being that there is not likely a monetization option for data centers to do that kind of "work". Data centers built to do generative AI are really just an extension of the whole "cloud" mindset, which is all about control (marketed as convenience and safety). The number of people compromised when a "cloud" facility gets hacked continues to get bigger. At some point, customers will figure out that safety is not really one of the features of the cloud. This is particularly true of generative AI, which is constantly integrating your every interaction with it into its database. At some point, your entire persona becomes public. Not a desirable outcome for most folks.

No worries, the label is just a nuisance (imo) and can easily get in the way of communication.
Well it's that, and spam. Half the internet is gen AI slop now. I'm not sure if you can post Instagram links here, but this one relates directly to this thread.
> Time to switch to new Chinese brands. These companies are making a killing at the expense of consumers. Time to boycott these greedy companies.

Sure, let's boycott Micron. Oh wait, they already decided to stop selling to consumers entirely. The entire point here is that the big memory makers do not need to sell to consumers right now because B2B demand is so insanely high.
I have not made any purchases of SSDs or memory cards since 2024. Wow, have prices increased. In Sept. 2024 I purchased several SanDisk 4TB SSDs for $329.00/ea. from Amazon. Today the same product sells for $726.00/ea. Luckily I purchased six of them and should be set for the foreseeable future. Same goes for memory cards: I purchased several CFexpress Type B v4 cards at 1TB or greater and am set for the foreseeable future as well.
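For what it's worth, the jump described above works out to a bit more than double. A quick sketch of the math, using only the $329 and $726 per-drive figures and the six-drive quantity from the post:

```python
# Rough cost math for the SSD prices quoted above (figures are the poster's).
old_price = 329.00   # per 4TB SSD, Sept. 2024
new_price = 726.00   # same product today
qty = 6              # number of drives bought at the old price

factor = new_price / old_price          # how many times more expensive
increase_pct = (factor - 1) * 100       # percent increase
avoided = qty * (new_price - old_price) # extra cost avoided by buying early

print(f"{factor:.2f}x the 2024 price ({increase_pct:.0f}% increase)")
print(f"Buying {qty} drives early avoided ${avoided:,.2f}")
```

So "more than double" is right: roughly 2.2x, or about a 121% increase.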
Never thought I would see SSDs more than double in price.
> This may be odd, but I feel bad for the kids that want to build gaming rigs. I remember spending every penny I made one summer doing my first one. Some of the PC builder groups are therapy sessions at this point.

My nephew built a gaming rig. He was really into it. He mowed lawns, walked dogs, and did any other job he could to buy parts. He stopped recently when everything kept surging in price. He learned a lot about the economy, but he's pretty heartbroken about it. His friends stopped too.