Report it. (New account, blog spam, funky domain, poorly configured server, etc.)
I’m in the US, but I suspect mail works pretty much the same across North 'Murica. (The government would still handle super-remote locations?)
The only mail I get is either bills or trash. Packages usually don’t ship via USPS. I rarely, if ever, send anything.
I wouldn’t cry at all if it cost me $5 to send a letter every two or three years.
Just call it X-Twitter. It ain’t twitter any more, and “X” is just dumb as fuck.
I remember they were finally able to make a ballpoint pen all by themselves in 2017.
When I actually start seeing products that aren’t contaminated with fake ICs or are actually grounded properly without hyper-strict foreign supervision, I’ll change my tune. Until then, there isn’t an article in the world that will convince me that China is actually innovating or taking steps to make quality products.
What the hell are you popping off about? All you need to do is go to Amazon or Ali Express to see the absolute plethora of Chinese product clones. Just search for 3D printers, as a good example.
Yeah, I guess I am a bit prejudiced against the bulk Chinese electronic garbage that is usually an extreme fire hazard. Maybe it’s all the fake or counterfeit chips I have had to replace that have pissed me off.
Look bub. I don’t give a flying fuck about China or the politics involved. Their manufacturing sucks ass and actual innovation from them is rare. If they copied products correctly, I would have a bit more respect for their business model.
If you want to go all-out tankie on people, maybe you should go back to your own instance.
That’s great and all, but you missed the theme of this thread.
China has been the king of bulk products for years. Saying that recent investments will alter world politics in 25 years is a bit strange. Saturating markets is what China does best.
There is a bit more history behind TSMC. You left out the bits where they partnered with other companies, like Philips, that gave them access to proprietary information. They continued building relationships with other large companies and investing back into their own business.
China isn’t doing that. China has had access to older fab equipment for years but still fails to truly innovate. If US companies could trust China enough not to steal modern tech, there could be some real benefits to having fabs in China. The world kinda figured out never to send proprietary information to China years ago. Companies still do, and it doesn’t take long for a thousand clones to pop up on Ali Express shortly after.
Trains are one thing; modern chip fab is a completely different beast. Buying older equipment is not going to get them anywhere but into the production of chips that have been on the market for 10 years already.
This is one industry where each generation has hard limits for manufacturing.
But they are buying mature-node equipment, says the article. That doesn’t mean shit other than more cloning and counterfeiting.
Future chips not affected by THIS CPU bug yet.
That gave me the idea to toss a coconut or two into the bags this year. I’ll reserve those for the “kids” that are obviously too old for this stuff.
It’s a markup language(ish) but it’s not a programming language. XML would be closer to programming, IMHO, since you could have simple things like recursion. That example is even pushing what I would consider “programming”, but anyone can feel free to disagree.
SQL is in the same category for me. It’s a query language that can get super complex and perform some basic logic, but you can’t exactly write “snake” in it. Sure, you could use cmdshell or something else to do something more complex, but that would no longer be SQL.
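Just to illustrate what I mean by “basic logic”: here’s a toy sketch of a recursive CTE, driven from Python’s sqlite3 only so it actually executes (the `fib` CTE and its columns are made up for the example). It can count and add, but it’s still fundamentally a query, not a general-purpose program.

```python
# A toy recursive CTE in SQLite, run from Python just so it executes.
# It computes Fibonacci numbers - "basic logic" - but it's still a query.
import sqlite3

conn = sqlite3.connect(":memory:")
rows = conn.execute("""
    WITH RECURSIVE fib(n, a, b) AS (
        SELECT 1, 0, 1
        UNION ALL
        SELECT n + 1, b, a + b FROM fib WHERE n < 10
    )
    SELECT n, a FROM fib
""").fetchall()

for n, value in rows:
    print(f"fib({n}) = {value}")  # fib(1) = 0 ... fib(10) = 34
conn.close()
```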
My simplistic expectation of an actual programming language would be that you can automate an entire platform at the OS level (or lower) instead of automating functions contained within a service or application. (JVMs and other languages that are “containerized” are weird outliers, by my definition.)
I am not trying to step on anyone’s toes here. I just never have really thought about what I personally consider a programming language to be.
Here is a start for you: https://www.msn.com/ko-kr/news/world/20년간-수류탄을-망치로-써온-할머니에-中-화들짝/ar-BB1oYj9a
It’s all Korean URL encoding in that link, btw (the headline roughly translates to “China startled by a grandmother who used a grenade as a hammer for 20 years”). Here is a screenshot anyway.
“Your TV has become a digital billboard.”
It’s been a digital billboard for at least 40 years of my life. Radio was no different, so be sure to drink your Ovaltine.
Have you never seen a commercial before? Cheap subsidized hardware? Bloatware loaded on phones? Bloatware on TVs? Games that require 5 mins of ad time? Google’s crippling of Chrome to break ad blockers? Unskippable ads on YouTube? Sponsored ad spots in YouTube videos? All the 3rd party logos on Smart TV boxes? Product placements in movies? Ad placements before the movie starts? The list goes on.
The entire entertainment industry is based around advertising. Every delivery platform is designed to show you ads first and entertainment second.
People have problems figuring that out?
This isn’t a new concept and it’s really stupid that Ars is presenting it that way.
If companies didn’t know this, then they are already out of business. If the viewers didn’t know this… well… I can’t help you.
Heck yeah it do. The brain is powered by glucose, and more brain activity will use much more of it. Jokingly, it’s how I can tell a new engineer from one that is extremely experienced: A bright young engineer is usually skinny from problem solving all day. An older one is likely stuck in more meetings where brainpower is a liability and is probably on the heavier side.
Also, as an occasional eater of magic mushrooms, I keep packs of glucose in the house that are typically used for diabetic emergencies. Psilocybin pushes a brain into overdrive and causes my blood sugar to nosedive. (There are studies about possibly using psilocybin as a diabetic treatment to improve pancreatic function, btw…)
While I haven’t even thought about digging around for a root cause, I have seen different behaviors across a few Windows instances. Some pop up the UI, one still just captures the last active screen, or all screens… (It’s weird, but I don’t care enough about it to even Google the config.) Hell, it might have been patched and I haven’t noticed yet.
Ok, I admit I don’t understand the humor. My immediate response was, “sounds about right because of how these things happen”. (I can be kinda dumb like that sometimes.)
Security advisories may not be announced until a patch is available. If this is in regards to FreeBSD-SA-24:08.openssh, a patch was available the day before it was announced and then refined for prod over the next few days: https://www.freebsd.org/security/advisories/FreeBSD-SA-24:08.openssh.asc
The timing of this stuff is always wonky, and it doesn’t look like it hit a couple of news places until today, about a week after: https://cyberpress.org/vulnerability-in-openssh-freebsd/
I am curious as to why they would offload any AI tasks to another chip? I just did a super quick search for upscaling models on GitHub (https://github.com/marcan/cl-waifu2x/tree/master/models) and they are tiny as far as AI models go.
It’s the rendering bit that takes all the complex maths, and if that is reduced, that would leave plenty of room for running a baby AI. Granted, the method I linked to was only doing 29k pixels per second, but they said they weren’t GPU optimized. (FSR4 is going to be fully GPU optimized, I am sure of it.)
If the rendered image is only 85% of a 4K image, that leaves ~1.2 million pixels to fill in, and it still seems plausible to keep everything on the GPU.
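Back-of-the-envelope on that, in case anyone wants to check my math (the 85%/15% split is just my assumption about how the scaling works, not anything AMD has published):

```python
# Rough pixel math for 4K upscaling (assumption: 85% of pixels rendered
# natively, leaving 15% for the upscaler to fill in).
total_4k = 3840 * 2160       # 8,294,400 pixels per 4K frame
to_fill = total_4k * 0.15    # ~1.24 million pixels per frame
per_second = to_fill * 60    # ~74.6 million pixels per second at 60 fps

print(f"{total_4k:,} pixels per frame, ~{to_fill:,.0f} to fill, "
      f"~{per_second:,.0f}/s at 60 fps")
```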
With all of that blurted out, is FSR4 AI going to be offloaded to something else? It seems like there would be significant technical challenges in creating another data bus that would have to sync with memory and the GPU to offload AI compute at speeds that didn’t risk creating additional lag. (I am just hypothesizing, btw.)