• 1 Post
  • 374 Comments
Joined 1 year ago
Cake day: August 15th, 2023

  • I am curious why they would offload any AI tasks to another chip. I just did a super quick search for upscaling models on GitHub (https://github.com/marcan/cl-waifu2x/tree/master/models) and they are tiny as far as AI models go.

    It's the rendering bit that takes all the complex maths, and if that is reduced, it would leave plenty of room for running a baby AI. Granted, the method I linked to was only doing 29k pixels per second, but they said it wasn't GPU-optimized. (FSR4 is going to be fully GPU-optimized, I am sure of it.)

    If the rendered image is only 85% of a 4K image, the remaining 15% of its ~8.3 million pixels works out to ~1.2 million pixels that need to be computed, and it still seems plausible to keep everything on the GPU.

    With all of that blurted out, is FSR4's AI going to be offloaded to something else? It seems like there would be significant technical challenges in creating another data bus that would also have to sync with memory and the GPU for offloading AI compute at speeds that didn't risk creating additional lag. (I am just hypothesizing, btw.)
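
    Back-of-the-envelope, that math checks out if "85%" means 85% of the pixel count. A minimal sketch, plugging in the figures quoted above (nothing measured):

    ```python
    # Rough pixel math for upscaling part of a 4K frame.
    # Assumption: "85%" refers to the pixel count, which matches the ~1.2M figure.
    PIXELS_4K = 3840 * 2160               # ~8.3 million pixels per 4K frame

    rendered = int(PIXELS_4K * 0.85)      # pixels rendered natively
    to_generate = PIXELS_4K - rendered    # pixels the upscaler must fill in

    # cl-waifu2x's reported non-GPU-optimized throughput, for scale
    px_per_second = 29_000

    print(f"pixels to generate per frame: {to_generate:,}")                    # ~1,244,160
    print(f"seconds per frame at 29k px/s: {to_generate / px_per_second:.0f}") # ~43
    ```

    At roughly 43 seconds per frame unoptimized, it's obvious why the "fully GPU optimized" part matters.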


  • It’s a markup language (ish), but it’s not a programming language. XML would be closer to programming, IMHO, since you could have simple things like recursion. Even that example is pushing what I would consider “programming”, but anyone can feel free to disagree.

    SQL is in the same category for me. It’s a query language that can get super complex and perform some basic logic, but you can’t exactly write “snake” in it (a quick sketch of that “basic logic” bit follows below). Sure, you could use xp_cmdshell or something else to do something more complex, but that would no longer be SQL.

    My simplistic expectation of an actual programming language would be that you can automate an entire platform at the OS level (or lower) instead of automating functions contained within a service or application. (JVM languages and other “containerized” languages are weird outliers, by my definition.)

    I am not trying to step on anyone’s toes here. I have just never really thought about what I personally consider a programming language to be.
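
    To make the “basic logic” point concrete, here is a minimal sketch using Python’s built-in sqlite3 (my own example, not anything from the thread): a CASE expression gives you branching, but there’s no event loop or screen I/O to build “snake” out of.

    ```python
    import sqlite3

    # SQL can branch with CASE, which covers the "basic logic" part;
    # what it lacks is the input handling and rendering a game would need.
    con = sqlite3.connect(":memory:")
    row = con.execute(
        "SELECT CASE WHEN 2 + 2 = 4 THEN 'math works' ELSE 'uh oh' END"
    ).fetchone()
    print(row[0])  # -> math works
    ```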



  • “Your TV has become a digital billboard.”

    It’s been a digital billboard for at least 40 years of my life. Radio was no different, so be sure to drink your Ovaltine.

    Have you never seen a commercial before? Cheap subsidized hardware? Bloatware loaded on phones? Bloatware on TVs? Games that require 5 mins of ad time? Google’s crippling of Chrome to break ad blockers? Unskippable ads on YouTube? Sponsored ad spots in YouTube videos? All the 3rd party logos on Smart TV boxes? Product placements in movies? Ad placements before the movie starts? The list goes on.

    The entire entertainment industry is based around advertising. Every delivery platform is designed to show you ads first and entertainment second.

    People have problems figuring that out?



  • Heck yeah it do. The brain is powered by glucose, and more brain activity will use much more of it. Jokingly, it’s how I can tell a new engineer from an extremely experienced one: a bright young engineer is usually skinny from problem-solving all day, while an older one is likely stuck in more meetings, where brainpower is a liability, and is probably on the heavier side.

    Also, as an occasional eater of magic mushrooms, I keep packs of glucose in the house, the kind typically used for diabetic emergencies. Psilocybin pushes a brain into overdrive and makes my blood sugar nosedive. (There are studies about possibly using psilocybin as a diabetes treatment to improve pancreatic function, btw…)



  • While I haven’t even thought about digging around for the root cause, I have seen different behaviors across a few Windows instances. Some pop up the UI, one still just captures the last active screen, or all screens… (It’s weird, but I don’t care enough about it to even Google the config.) Hell, it might have been patched and I haven’t noticed yet.