Just tippy-tappin’ my way across the internet.
What a useless headline. God forbid they just give the actual capacity rather than some abstract, bullshit, flexible measure that means nothing to anyone.
But you’re still limited to the opinions of people who post on Lemmy, which, speaking as someone who occasionally posts on Lemmy, is not a shining beacon of quality.
Even if I just went by what I get on the first page of a Google search, I’d expect to find what I need much faster using Google than using Lemmy, based purely on the volume of info Google has access to. And that’s not even taking into account things like Google’s ability to search within other sites.
Unless Lemmy has gotten like 100 billion times better in the last week, this isn’t even a fair comparison.
Edit: lol, just realised you’re the same guy from the Nvidia thread.
Nvidia makes some of the best hardware for training AI models. Increased investment in AI will inevitably increase demand for Nvidia hardware. It may boost other hardware makers too, but Nvidia is getting the biggest boost by far.
Maybe I’m being dumb or missing something but this feels incredibly obvious to me.
LLM tools can already write basic code and will likely improve a lot, but there are more reasons to learn to code than just doing the coding yourself. Even just being able to understand and verify the shit AI tools spit out before pushing it live.
Nvidia knows that the more people who use AI tools, the more hardware they sell. They benefit directly from people not being able to code themselves and relying more on AI tools.
They want their magic line to keep going up. That’s all.
It makes no sense. AI tools will obviously have an impact on the profession’s development, but suggesting that no one should learn to code is like saying no one should learn to drive because one day cars will drive themselves. It’s utter self-serving nonsense.
This is objectively stupid. There are tonnes of things you learn in maths that are useful for everyday life even if you don’t do the actual calculations by hand.
I deleted all my posts before closing my accounts back when they were breaking third-party apps, though I’m sure they kept a private log of all posts specifically for this purpose.
To be honest, I expect AI companies are scraping Lemmy and other places for training data anyway, but I’d rather Reddit specifically not make any money off my posts.
Realistically, a couple of 10TB drives would have me covered for like a decade at least. If these massive drives bring down the price of much smaller ones, I’m a happy boy.
I don’t mean to be dismissive of your entire train of thought (I can’t follow a lot of it, probably because I’m not a dev and not familiar with a lot of the concepts you’re talking about), but everything you’ve described that I can understand would require these tools to be a fuckload better, on an order we haven’t even begun to get close to yet, for them to not be super predictable.
It’s all wonderful in theory, but we’re not even close to what would be needed to even half-ass this stuff.
That remains to be seen. We have yet to see one of these things actually get good at anything, so we don’t know how hard that last part is to do. I don’t think we can assume there will be continuous linear progress. Maybe it’ll take one year, maybe it’ll take 10, maybe it’ll just never reach that point.
Writer here, absolutely not having this experience. Generative AI tools are bad at writing, but people generally have a pretty low bar for what they think is good enough.
These things are great if you care about tech demos and not the quality of the output. If you actually need the end result to be good, though, you’re gonna be waiting a while.
This shit seems like the perfect basis for a discrimination lawsuit.
Maybe it’s time for a grown-up CEO.
You can absolutely contact Microsoft (or Apple) for support, plus basically any computer store will happily charge a small fee for basic tech support, or you can call the computer manufacturer or reseller. On the Linux side, unless you bought from something like System76, the chances of you finding an official support network that an elderly person would find usable and accessible are pretty slim.
That’s absolutely not the idea I have in my head. If you read most of my replies here, I think I explain pretty clearly that the main issue I see with Linux is not actually the software itself, it’s that there’s not a good, normie-friendly support system for when things do go wrong or things aren’t immediately obvious.
I also tend to advocate for macOS more than Windows. Although I’ve used both my whole life, I find macOS a lot more intuitive than Windows, and I would generally never recommend Windows unless there’s a specific need for it.
Mid-60s and up, around retirement age (or at least that’s retirement age where I am).
I have, but that’s not the point. There are places you can take a computer and say “hey, I’d like one Windows installation, please.” There are exceedingly few places that would help an old person set up a Linux installation, at which point they’re at the mercy of whatever nerd in their life will do it, and then just hoping they don’t move or die.
I’m specifically not expecting them to do it themselves, which is why I think Linux is not a good option.
I don’t get the point of playing what if.
If Linux somehow grew its market share to 80% of all users, then there would probably be some form of support-based business, or companies forking off their own versions and building their own supported platforms, and then we’d end up with a bunch of closed platforms competing for all the money by offering a more polished experience at a premium.
Or none of that happens. I don’t know, this is all just make-believe because it’s a scenario that’s never going to happen.
Not really. A 4K movie means nothing to 99% of people. Is it 4GB? 40? 400? How many can my phone hold? Or my computer?
This only makes things more understandable if you use a point of reference that everyone you’re talking to is familiar with. The fact that they then had to explain how big a 4K movie is in the article shows that even they know it doesn’t help people. It’s just a big flashy number.
Just for context: I’m a writer, so I understand the point of using these abstract measures to give a frame of reference. But in this case, just giving the capacity in GB/TB would have been easier to understand. It just wouldn’t have been as sensational a headline.
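If you want to see how slippery “number of 4K movies” is as a unit, here’s a quick back-of-envelope sketch. The 30 TB drive capacity and every per-movie file size below are numbers I’m making up for illustration, not figures from the article:

```python
# Back-of-envelope: how many "4K movies" fit on a drive?
# The 30 TB capacity and every per-movie size below are made-up
# assumptions for illustration, not figures from the article.

DRIVE_TB = 30  # hypothetical drive capacity, in decimal terabytes

# A "4K movie" can plausibly land anywhere in this range depending
# on codec, bitrate, and runtime.
movie_sizes_gb = {
    "heavily compressed stream": 4,
    "typical streaming download": 15,
    "UHD Blu-ray rip": 60,
    "high-bitrate master": 100,
}

drive_gb = DRIVE_TB * 1000  # decimal units, as drive makers use

for label, size_gb in movie_sizes_gb.items():
    print(f"{label} (~{size_gb} GB each): ~{drive_gb // size_gb} movies")
```

Run that and the same hypothetical drive holds anywhere from 300 to 7,500 “movies” depending on which definition you pick, a 25x spread. Which is exactly why the GB/TB figure is the one that actually means something.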