I suspect this is because of the looming end of Windows 10. There’s a large segment of Windows users, myself included, with Visual Studio being the only remaining tie to the Windows ecosystem. Extremely smart move by JetBrains, if true.
I decided to split the difference, by leaving in the gates, but fusing off the functionality. That way, if I was right about Itanium and what AMD would do, Intel could very quickly get back in the game with x86. As far as I’m concerned, that’s exactly what did happen.
I’m sure he got a massive bonus for this decision, when all the suits realized he was right and he’d saved their asses. /s
As I understand it (and assuming you know what asymmetric keys are)…
It’s about using public/private key pairs and swapping them in wherever you would use a password. Except passwords are things users can actually remember in their heads, and are short enough to be typed into a UI. Asymmetric keys are neither of these things, so actually implementing passkeys means solving the newly-created problem of “how the hell do users manage them”, and the tech world seems to be collectively failing to realize that the benefit isn’t worth the cost. That last bit is subjective opinion, of course, but I’ve yet to see any end-users actually be enthusiastic about passkeys.
If that’s still flying over your head, there’s a direct real-world analog that you’re probably already familiar with, but I haven’t seen mentioned yet: chip-enabled credit cards. Chip cards still use symmetric cryptography, instead of asymmetric, but the “proper” implementation of passkeys, in my mind, would be basically chip cards. The card keeps your public/private key pair on it, with embedded circuitry that lets it do encryption with the private key without ever having to expose it. Of course, the problem would be the same one that quite nearly killed chip cards in the US: everyone who wants to support or use passkeys would then need a passkey reader to plug into when they want to log in somewhere. We could probably make a lot of headway on this by just using USB, but that would make passkey cards more complicated, more expensive, and more prone to being damaged over time. Plus, that doesn’t really help people wanting to log in to shit with their phones.
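If it helps, here’s a rough C# sketch of the core challenge/response idea (not real WebAuthn, and I’m reusing one key object for both sides, where in reality the server only ever holds the public half):

using System;
using System.Security.Cryptography;

// One key pair stands in for the "passkey" here; the private half
// would live on the user's device and never leave it.
using var authenticator = ECDsa.Create(ECCurve.NamedCurves.nistP256);

// Server side: generate a one-time random challenge for this login attempt.
byte[] challenge = RandomNumberGenerator.GetBytes(32);

// Authenticator side: sign the challenge with the private key.
byte[] signature = authenticator.SignData(challenge, HashAlgorithmName.SHA256);

// Server side: verify the signature against the registered public key.
// The password-equivalent secret was never transmitted.
bool authenticated = authenticator.VerifyData(challenge, signature, HashAlgorithmName.SHA256);
Console.WriteLine(authenticated); // True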
Automated certificate lifecycle management is going to be the norm for businesses moving forward.
This seems counter-intuitive to the goal of “improving internet security”. Automation is a double-edged sword: convenient, sure, but also an attack vector, one where malicious activity is less likely to be noticed, because actual people aren’t involved in the process anymore.
We’ve got ample evidence of this kinda thing with passwords: increasing complexity requirements and lifetime requirements improves security, only up to a point. Push it too far, and it actually ends up DECREASING security, because it encourages bad practices to get around the increased burden of implementation.
The hell is that summary, AI-generated? Why yes, people DO work inside the TikTok building.
Talk about burying the lede, by not elaborating on that title, like the article does. “Stripping” does not mean that teenagers are being “stripped” from the platform, or from feeds, like I figured. It literally means that THEY are stripping. OnlyFans style. For gifts. Jesus fuck.
First game I ever played where I was like “yo, I actively WANT to do the speedrun achievement, and the deathless achievement.” So, first game where I ever did those things. Maybe I’m just crazy, but I found them way easier than I expected.
Also, a prime example of storytelling through music.
I appreciate the “carrot with a bite out of it” icon.
It’s the capability of a program to “reflect” upon itself, i.e. to inspect and understand its own code.
As an example, in C# you can write a class…
public class MyClass
{
public void MyMethod()
{
...
}
}
…and you can create an instance of it, and use it, like this…
var myClass = new MyClass();
myClass.MyMethod();
Simple enough, nothing we haven’t all seen before.
But you can do the same thing with reflection, as such…
// Look up the type by name at runtime, with no compile-time reference to it.
var type = System.Reflection.Assembly.GetExecutingAssembly()
    .GetType("MyClass");
var constructor = type.GetConstructor(Array.Empty<Type>());
var instance = constructor.Invoke(Array.Empty<object>());
var method = type.GetMethod("MyMethod");
// ("delegate" is a reserved word in C#, so the variable needs a different name)
var myDelegate = method.CreateDelegate(typeof(Action), instance);
myDelegate.DynamicInvoke(Array.Empty<object>());
Obnoxious and verbose, and it tosses basically all type safety out the window, but it enables some pretty crazy, interesting things: self-discovery and dynamic loading of plugins, or self-configuration of apps. It’s also often useful when messing with generics. I could dig up some practical use-cases, if you’re curious.
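Actually, here’s a minimal sketch of the plugin case off the top of my head (IPlugin and HelloPlugin are made up; a real app would scan external DLLs with Assembly.LoadFrom rather than the executing assembly):

using System;
using System.Linq;
using System.Reflection;

// Find every concrete IPlugin implementation in the current assembly
// and instantiate it, without hard-coding a single type name.
var plugins = Assembly.GetExecutingAssembly()
    .GetTypes()
    .Where(t => typeof(IPlugin).IsAssignableFrom(t) && t.IsClass && !t.IsAbstract)
    .Select(t => (IPlugin)Activator.CreateInstance(t));

foreach (var plugin in plugins)
    plugin.Execute();

// Hypothetical plugin contract.
public interface IPlugin
{
    void Execute();
}

// A sample implementation, discovered purely by reflection above.
public class HelloPlugin : IPlugin
{
    public void Execute() => Console.WriteLine("Hello from a dynamically discovered plugin!");
}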
If you’re interested in more detail, I can recommend this book: https://play.google.com/store/books/details?id=ncGVPtoZPHcC.
I think the big reasons for most people boil down to one or both of two things:
A) People having 0 trust in Google, i.e. they don’t believe that paying for its services will exempt them from being exploited, so what’s the point?
B) YouTube’s treatment of its content creators, who are what people actually come to YouTube for. Advertisers and copyright holders (and copyright trolls) get first-class treatment, while the majority of content creators get little to no support for anything.
Let’s assume the chicken has to reach a temperature of 205C (400F) for us to consider it cooked.
Remind me never to let this guy cook for me.
I mean, I’m paraphrasing, too.
Even better quote, I love using this one.
“So, with AI writing code for us, all we need is an unambiguous way to define what all our business requirements are for the software, what all the edge cases are, and how it should handle them.”
“We in the industry call that ‘code.’”
I mean, REST-ful JSON APIs can be perfectly type-safe, if their developers actually take care to make them that way. And the self-descriptive nature of JSON is arguably a benefit in really large public-facing APIs. But yeah, gRPC forces a certain amount of type-safety and version control, and gRPC with protobuf is SUCH a pleasure to work with.
Give it time, though, it’s definitely gaining traction.
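For what it’s worth, here’s roughly what I mean by “taking care” on the JSON side, in C# (the endpoint and DTO are made up):

using System;
using System.Net.Http;
using System.Net.Http.Json;

var client = new HttpClient { BaseAddress = new Uri("https://api.example.com/") };

// Deserialize straight into a typed DTO: everything downstream is
// statically typed, and a type mismatch in the payload fails loudly
// here instead of deep inside business logic.
UserDto? user = await client.GetFromJsonAsync<UserDto>("users/42");
Console.WriteLine(user?.Email);

// Hypothetical DTO mirroring the API's documented contract.
public record UserDto(int Id, string Name, string Email);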
#4 for me.
Proper HTTP Status code for semantic identification. Duplicating that in the response body would be silly.
User-friendly “message” value for the lazy, who just wanna toss that up to the user. Also, ideally, this would be what a dev looks at in logs for troubleshooting.
Tightly-controlled unique identifier “code” for the error, allowing consumers to build their own contextual error handling or reporting on top of this system. Also allows for more-detailed types of errors to be identified and given specific handling and recovery logic, beyond just the status code. Like, sure, there’s probably not gonna be multiple sub-types of 403 error, but there may be a bunch of different useful sub-types for a 400 on a form submission.
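Concretely, I’m picturing a body shaped something like this (the field names and values are just illustrative):

// Hypothetical 400 response for a form submission:
// {
//   "code": "EMAIL_ALREADY_REGISTERED",
//   "message": "An account with this e-mail address already exists."
// }
public record ApiError(string Code, string Message);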
Anyone else think there’s actually nothing at all wrong with the “New” row of icons? Except for the triangle one, which is terrible in its “Original” version as well, as it indicates absolutely nothing about its app (I believe it’s Google Drive, right?). All the rest are clearly distinguishable, and have relevance to what the app does.
Case in point: Every single thing Microsoft is doing in Windows these days.
Yeah, you need a way to specify what you want with a high degree of both flexibility and specificity. We have a term for that in the industry; it’s called “writing code”.
The hell does “single-capacity” mean here? The article doesn’t specify.
You’ve never met an average ASP.NET developer?