  • Alternately, what’d be really neat would be an easy way to do a mostly-complete webpage setup for someone using the free hosting options that do exist.

    Like, a tool that handles deploying something to GitHub Pages or Cloudflare Pages or whoever else offers basically free web hosting, without being so nerdy that you need a 6,000-word document to explain the steps to get a webpage from an HTML editor to actually being hosted (rough sketch of that deploy step below).

    Or, IDK, maybe going back to ye olde domain.com/~username/ web hosting could be an interesting project to take on, since handling file uploads like that should be trivial (lots and loooots of ways to do that). Just have to avoid going Straight To Jail for offering hosting to people, I suppose.
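
    As for that deploy step: here’s a minimal, hypothetical sketch of pushing a folder of static HTML to GitHub Pages, assuming git is installed, you run it from the repo root, the repo already exists on GitHub, and Pages is set to serve the gh-pages branch. Names like SITE_DIR are placeholders I made up, not anything GitHub provides.

```python
#!/usr/bin/env python3
"""Hypothetical deploy helper: push a folder of static HTML to a
gh-pages branch that GitHub Pages is configured to serve."""
import subprocess
import sys

SITE_DIR = "site"    # placeholder: folder holding your exported HTML
BRANCH = "gh-pages"  # branch GitHub Pages serves from

def run(*cmd, cwd=None):
    # Stop at the first failing git command so errors are obvious.
    subprocess.run(cmd, cwd=cwd, check=True)

def deploy():
    # Read the GitHub remote URL from the surrounding repo.
    url = subprocess.run(
        ["git", "remote", "get-url", "origin"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # Turn the site folder into a one-commit repo and force-push it;
    # Pages usually picks the new commit up within a minute or two.
    run("git", "init", cwd=SITE_DIR)
    run("git", "checkout", "-B", BRANCH, cwd=SITE_DIR)
    run("git", "add", "-A", cwd=SITE_DIR)
    run("git", "commit", "-m", "deploy", cwd=SITE_DIR)
    run("git", "push", "-f", url, BRANCH, cwd=SITE_DIR)

if __name__ == "__main__":
    try:
        deploy()
    except subprocess.CalledProcessError as err:
        sys.exit(f"deploy failed: {err}")
```

    That’s roughly the kind of thing existing tools automate, but with fewer sharp edges; the point is the whole flow should be one command, not a 6,000-word document.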


  • Fair points on VR games being fairly social. I was more thinking of the in-person social experience, which still involves some portion of the group sitting around stuffing their faces into headsets and wandering off into their own worlds.

    IMO, this is something that AR/MR stuff could do a great job of making more social by adding the game to the world, rather than taking the person out of the world and into the game. Of course, that also restricts what kinds of games you can do, so it’s probably only a partial solution and/or improvement on the current state of affairs.

    I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.

    PCVR is pretty much dead, despite its proponents running around declaring that it’s just fine like it’s a Monty Python skit. And the tech for truly untethered headsets is really owned by only a single (awful) company, and only because the god-CEO thinks it’s a fun thing to dump money on, which means it’s subject to sudden death if he retires/dies/is ousted/has to take time off to molt/has enough shareholder pressure put on him.

    Even then, it’s only on its second generation (the original Quest was… a beta, at best) and is expensive enough that you have to have a real reason to be interested, rather than it being something you could just add to your gaming options.

    I’d like VR to take off and the experiences to more resemble the sci-fi worlds that feature, or take place in, virtual reality. But honestly, I’ve thought that would be cool for like 20 years now, and we’re only very slightly closer than we were then; we just have smaller headsets and somewhat improved graphics.



  • Train to Busan, Parasite, Unlocked, Wonderland, Anatomy of a Fall, and Close are ones I’ve seen recently and liked.

    I think some of those are available on Netflix, but as I don’t use Netflix, I can’t say which ones for certain.

    Edit: I just realized some of those titles are vague and will lead to a billion other movies, lol. The first 4 are S. Korean, the last two are French, and they’re all from 2020 or newer, so anything not from there or older than that isn’t the right one.


  • You’re not wrong (and those are freaking enormous dies that have to cost Apple a goddamn fortune to make at scale), but it also isn’t an apples-to-apples comparison.

    nVidia/Intel/AMD have gone down the maximum-performance, fuck-any-heat/noise/power-usage path. They haven’t given a shit about low-power optimizations or about investing in designs better suited to low-power implementations (an M3 Max will pull ~80 W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things; I don’t design them.

    Apple picked a uarch that was already low power (fun fact: early ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused on making it as fast as possible with as little power as possible: the compute cores came from the mobile side before being turned into desktop chips.

    I’m rambling, but: until nVidia and the x86 vendors prioritize power usage over raw performance (which AMD did with Zen 5, and you saw how that spiraled into a fucking PR mess), you’re going to get next year’s die shrink, but with more transistors using the same power for slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree nVidia) has painted itself into a corner by relying on process-node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.

    I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.


  • Power consumption numbers like that are expected, though.

    One thing to keep in mind is how big the die is and how many transistors are in a GPU.

    As a direct-ish comparison, there are about 25 billion transistors in a 14900K, and 76 billion in a 4090.

    Big die + lots and lots of transistors = bigly power usage.

    I wouldn’t imagine the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect this to land in the “die shrink lowers power usage, but more transistors raise it” zone. Some napkin math below.
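
    To put rough numbers on that: the transistor counts come from above, and the wattages are just the public power limits (~253 W PL2 for the 14900K, ~450 W TDP for the 4090), so treat this as napkin math rather than measured draw.

```python
# Napkin math: watts per billion transistors, using public power limits.
chips = {
    "14900K (PL2)": (25, 253),  # (transistors in billions, watts)
    "RTX 4090":     (76, 450),
}

for name, (billions, watts) in chips.items():
    print(f"{name}: ~{watts / billions:.1f} W per billion transistors")
```

    By that measure the 4090 is actually the more efficient chip per transistor (~6 W vs ~10 W per billion); it pulls more in total simply because it has three times as many.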



  • “…movie industry that’s been complete trash for a while now.”

    This is not a callout of you in particular so don’t get offended, but that’s really only true if you look at the trash coming out of Hollywood.

    There’s some spectacularly good shit coming out of like France and South Korea (depending on what genres you’re a fan of, anyways), as well as like, everywhere else.

    Shitty movies that are just shitty sequels to something that wasn’t very good (or yet another fucking Marvel movie) are a self-inflicted wound, not really a sign that the industry can’t possibly do better.


  • Well, that’s the doomer take.

    The rumors are that the 80-series card is 10% faster than the 90-series card from last gen. That’s not a “10%” improvement: assuming the prices are the same, that’s more like a 40% improvement (napkin math below). I think a LOT of people don’t realize how shitty the 4080 was compared to the 4090 and are vastly mis-valuing that rumor.
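
    Rough arithmetic behind that 40% figure; the 1.10 is the rumor above, and the 0.78 is my assumption (reviews generally put a 4080 at roughly 75–80% of a 4090):

```python
# Napkin math for the gen-on-gen 80-series uplift.
new_80_vs_old_90 = 1.10  # rumored: 5080 = 110% of a 4090
old_80_vs_old_90 = 0.78  # assumed: 4080 is ~78% of a 4090

uplift = new_80_vs_old_90 / old_80_vs_old_90 - 1
print(f"80-series gen-on-gen uplift: ~{uplift:.0%}")  # ~41%
```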

    I’d also argue the “GAMES MUST BE ULTRA AT 4K144 OR DON’T BOTHER” take is wrong. My gaming has moved almost entirely to my ROG Ally, and you know what? Shit is just as fun and way more convenient than the 7700X/3080 12GB desktop, even if it’s 1080p low instead of 1440p120. If the only thing a game has going for it is “ooh, it’s pretty,” then it’s unlikely to be one of those games people care about in six months.

    And anyways, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher-budget stuff like BG3 is, well, probably the best RPG since FO:NV (fight me!).

    And yes, VR is in a shitty place because nobody gives a crap about it. I’ve got a Rift, Rift S, Quest, and a Quest 2, and you know what? It’s not interesting. It’s a fun toy, but it has zero sticking power, and that’s frankly due to two things:

    1. It’s not a social experience at all.
    2. There’s no budget for the kind of games that would drive adoption, because there’s no adoption to justify spending money on a VR version.

    If you could justify spending the kind of money that would lead to a cool VR experience, then yeah, it might be more compelling, but that’s been tried and nobody bought anything. I will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.

    And AI is this year’s crypto, which was last year’s whatever; it’s bubbles and VC scams all the way down and pretty much always has been. Tech hops from thing to thing to go all in on, because they can hype it and cash out. Good for them, and be skeptical of shit, but if it sticks, it sticks, and if it doesn’t, it doesn’t.



  • As someone who’s been buying (though not intentionally) exclusively AMD laptops for the past 9 years or so, yeah, this resonates pretty well with my experience as a user.

    I mean, none of the laptops were bad or defective or whatever, but the quality of support, the feature support, and just the general amount of time it takes to get things pushed out have always been shit compared to Intel stuff.

    AMD can’t manage firmware and software fixes for shit, regardless of product line, and if I were an OEM, I’d probably be pissed at their stupid slow bullshit too.

    Example: the 2022 G14 was totally getting USB4. It got a beta for it, and then Asus went, “Fucking AMD isn’t helping or providing the stuff we need, so this beta is all you’re getting; go yell at them.” Is that the whole story? Maybe not, but it certainly feels plausible given how AMD has supported everything prior to that, so I tend to think it’s enough of the story to be true.

    Good hardware (mostly), reliable enough, and it does the job, but it’s very much a don’t-expect-support kind of experience past the first couple of months after release. (And yes, I know the OEM carries a good portion of the responsibility there, but if there’s no new firmware/microcode/etc. from AMD to fix an issue, then what are they supposed to ship to you?)



  • Depends on your threat model and actual realistic concerns.

    Ultimately, if it comes down to it, there’s very little you can do that’s failsafe and 100% guaranteed: the provider has access to your disk and all the data in your instance’s RAM (including encryption keys), and can watch your processes execute in real time, down to the specific instructions your vCPU is running.

    Don’t put illegal shit on hardware you don’t physically own and control, and encrypt everything else. But if the value of your shit is high enough, you’re fucked the moment you’re on someone else’s computer. A sketch of the encrypt-before-upload part is below.
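
    For the “encrypt everything else” bit, a minimal sketch using the third-party cryptography package (pip install cryptography). The caveat above still applies: the moment you decrypt on the provider’s hardware, the key and plaintext sit in RAM they control, so only ever decrypt on machines you own.

```python
# Minimal sketch: encrypt locally, upload only ciphertext.
from cryptography.fernet import Fernet

# Generate the key on hardware you physically control, and keep it there.
key = Fernet.generate_key()
box = Fernet(key)

ciphertext = box.encrypt(b"backup contents")  # safe to hand to a provider
assert box.decrypt(ciphertext) == b"backup contents"  # only do this locally
```

    Lose the key and the data is gone, which is kind of the point.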