Found someone nice. It was sheer chance, really. Met with a new neighbor and she had a crush on me. Was friends for a while. Years later decided to get into a relationship with her.
I get very suspicious if a paper samples multiple groups and still reports p-values. You would use q-values (p-values corrected for multiple comparisons) in that case, and the fact that they didn’t suggests that nothing came up positive after correction.
Still, in my opinion it’s generally OK if they only use the screen as a starting point and do follow-up experiments afterwards
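If it helps, here’s a rough sketch of what I mean, assuming Python with numpy and statsmodels (the p-values are made up purely for illustration):

```python
# Minimal sketch: turning per-test p-values into Benjamini-Hochberg
# adjusted values (commonly reported as q-values) when many groups
# are being compared. The numbers below are invented for illustration.
import numpy as np
from statsmodels.stats.multitest import multipletests

pvals = np.array([0.001, 0.02, 0.03, 0.045, 0.20, 0.51, 0.74])

# FDR (Benjamini-Hochberg) correction across all tests
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

for p, q, r in zip(pvals, qvals, reject):
    print(f"p = {p:.3f} -> q = {q:.3f}  significant after FDR: {r}")
```

In this toy example, four of the raw p-values sit under 0.05, but only one survives the correction - which is exactly the kind of gap that makes me suspicious when a screen only reports raw p.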
Well, we are pretty big for insects
Because the ones that we hear about are the ones that are good enough to have even made it out of Japan. If a game was bad, it wouldn’t be localized to an English-speaking audience, and we wouldn’t even know it exists.
It’s the same sort of thinking as asking why (insert media here) was better in the past. The answer is simple - good songs, games, movies, etc. tend to be more memorable, and so we remember the good ones and forget the bad ones. To put it briefly, there’s survivorship bias.
Patriotic duck
Braid. Sounds like a dress up game but it’s a puzzle platformer about time travel. I have to explain it every time I talk about it
For me, everything is a belief unless it satisfies the following criteria:
I find that the one that trips up most people is #3, since some people speak in technically true but overly broad statements and the listener ends up filling in the gaps with their own biases. The listener leaves feeling like their biases have been confirmed by data, not realizing that they have been misled.
In the end, according to my criteria, very little can be categorized as true knowledge. But that’s fine. You can still make judgements from partial or biased data or personal beliefs. You just can’t be resolute about it and say that it’s true.
The Thing (1982) has consistently been my favorite horror movie
I mainly work indoors, so the brightness doesn’t really matter that much to me. But as far as I can tell, the brightness is pretty normal for a laptop - I don’t think it’s any brighter or dimmer than other laptops I’ve used in the past. According to this website that I found, the brightness ranges from 25 to 486 nits. A Google search seems to say that the average maximum brightness for laptops is somewhere around 300-400 nits.
My understanding is that the screen is generally what eats up most of the battery on a device, so if you plan to have the brightness turned up, it might be difficult to find a laptop with a long battery life.
The CPU is on the mainboard and can’t be removed, but you can replace the entire mainboard. Basically, you can upgrade, but you’ll have to upgrade a couple of other things along with it
Just tested with the normal power profile and screen brightness turned down - the battery went down by about 50% after 3 hours. I think my laptop usually dies after 3 hours because I have the screen brightness up
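For what it’s worth, the back-of-the-envelope math (just Python doing the arithmetic; the numbers are my own rough estimates from above):

```python
# Rough runtime estimate from an observed drain rate.
# "50% used over 3 hours" is my own measurement; treat it as approximate.
drain_pct = 50
hours = 3

rate = drain_pct / hours        # ~16.7% per hour
runtime = 100 / rate            # ~6 hours from a full charge
print(f"~{rate:.1f}%/h, so roughly {runtime:.1f} h from 100%")
```

So at low brightness that works out to roughly 6 hours from full, versus the ~3 hours I usually get with the brightness turned up.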
Yes, but that’s my point, you see. Because Arm has historically been used for mobile and small devices, there’s been a strong incentive for decades to emphasize power efficiency. Because x86 has historically been used for desktops, there’s been a strong incentive to emphasize performance. It’s only been very recently that Arm attempted to reach comparable performance, and even more recently that x86 attempted to reach comparable power efficiency.
Sure, Arm is currently more efficient, but the general consensus is that there’s no inherent reason why Arm must be more efficient than x86. In other words, it’s more efficient only because its designers have been focusing on efficiency for longer.
Both AMD’s and Intel’s current-gen x86 CPUs are, from what I can tell, within spitting distance of Qualcomm’s Arm CPUs in terms of battery life, and rumor has it that both x86 companies should be able to match Arm chips in efficiency by next gen.
So if efficiency is a priority for you, I think it’s worthwhile to wait and see what the CPU companies cook up in the next couple of years, especially as both AMD and Intel seem to be heavily focused on maximizing efficiency right now
My understanding is that Arm chips don’t have any fundamental advantage over x86 chips. They’re more efficient simply because they’ve been optimized for efficiency for so long. I’ve heard that upcoming Intel and AMD chips should be able to compete with the new Arm CPUs, so if you’re not going to get a new laptop soon, it seems worthwhile to just wait and see
Yes, I don’t use the external GPU. I just use the AMD APU. Also, I realized that AMD 7000 could refer to both the CPU and the GPU. Ah, AMD and their marketing
Kubuntu on Framework 16 AMD 7000 series here. Sleep is horrible - definitely drains your battery. Bag heats up, and I estimate maybe a 1% drain per hour. I’ve enabled hibernate though I rarely use it.
Battery is alright but not great. I get maybe 2-3 hours of active, light use from full battery.
No compatibility issues that I’ve noticed, though, of course, Linux has its fair share of minor non-hardware-related bugs.
Camera is serviceable but not amazing. Not sure about the microphone, but I assume it’s the same. The speakers are somewhat odd in that they’re pointed to the side rather than toward the front, but again - serviceable.
“is” is correct here, due to the implied nouns. “Data” in this case is an appositive that describes the implied noun “word,” and “plural” is short for “plural word,” with the noun implied.
You can see it most clearly if you fill in the implied nouns:
(the word) data is a plural (word)
It seems to be a per-school kind of thing. I am late millennial/early Gen Z, and my school had computer classes where we learned how to use Windows and Microsoft Office, how to touch type, what common computer terminology means, and what the basic computer parts do (e.g., “the CPU is the brain of the computer”). And later on we started learning how to use Photoshop and Illustrator.
I’m always surprised when I hear that other people don’t have that sort of in-depth tech learning in their schools, and even more so that some people don’t even have a computer class. It just always felt like what we learned in computer class was an essential skill
Technology needs to be actively taught and actively learned! If their school isn’t teaching it, maybe try subscribing to some online tech literacy courses?
Regardless of the reason, the end result is still the same, which is that new users are left with the idea that the terminal is essential for using Linux.
You can say that you set up a distro without using the terminal all you want, but as long as new users don’t know how to do that, my point still stands. Frankly, the fact that you even thought to bring that up feels, to me, like extra proof that experienced users are highly dismissive of the new user experience.
Science is like going down a Wikipedia rabbit hole. There are always more things to do and more things to check out. At some point you just have to draw the line and say that enough is enough. Other scientists are likely to ask why you stopped where you stopped, so saying “it’s outside the scope of the paper” is basically the nice way of saying that you stopped because you felt like it