I assume they’re going to use it to waste stupid amounts of electricity?
Lemmy format allows having an actual dialogue
It’s great for seeing existing dialogue, but I think it falls short for long-term discussion between more than two people.
On a non-threaded board (e.g. forums, github issues) you can watch a thread you’re interested in. On Lemmy/reddit you only get notifications for direct responses to your comments.
I think some sort of option to watch/unwatch whole subtrees of comments would help a lot.
There’s something like that near me, but it’s a Five Guys clone. Seems like a good idea if you can get established locally. Small menu, nothing complicated.
I’d just count calories and reduce the amount per day until you’re losing weight.
The time of day you eat things shouldn’t really matter. This will also teach you really quickly how to feel full on minimal calories. For me, I just try to eat something like raw carrots when I want a snack.
I love that you have a very specific and arguably moral crime in mind, and it inspired this post.
Is there a reason you’re suspicious about that particular dependency, or are you just asking about dependencies in general?
Sounds like it’s not really a downgrade, but just an unreleased beta or test build? That seems a little less sketchy, and maybe it’ll be generally released at some point.
They’re actually not that much bigger or different from mobile or game console GPUs; they just have a lot of cooling bolted to them. The cooling lets them sacrifice efficiency: they can draw much more power and deliver much more performance.
pretty soon you get really good at judging how many calories are in things.
This was the key for me. Understanding the cost of the food I enjoy let me cut back on rice and replace it with ice cream, for example.
Also when I’m logging food, it adds a bit of friction, especially for new foods, so I eat less just because of that. Usually that’s when I realise that I’m not eating because of hunger.
I think in theory it’s great. Email already has solutions for:
etc
Even if you can get past whatever VM detection they currently do, that’ll only work until they require remote attestation.
Sony has officially supported Linux distros for both the PS2 and the PS3.
However, that’s not what the games ran on, so it still fits with your definition. It’s just interesting to think about given the current state of things.
https://en.m.wikipedia.org/wiki/Linux_for_PlayStation_2
https://en.m.wikipedia.org/wiki/OtherOS
The game overlay stuff is all just noise, so I think it’s only the last line (missing libcef.so) that’s a problem. Maybe you could share the contents of the shell script, the output of readelf -d [path to]/LifeIsStrange (i.e. the actual ELF), and the output of ldd on the same ELF, and check whether there’s a copy of libcef.so anywhere in the game files?
I’m not sure if libcef (the Chromium Embedded Framework) is meant to be loaded from the Steam runtime, from the game files, or from the system. My guess is it can be fixed by altering LD_LIBRARY_PATH in the game’s shell script.
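If it helps, something like this should gather that info in one go (the install path and directory name are guesses, adjust to wherever Steam actually put the game):

```
# Path is a guess - adjust to wherever Steam installed the game
cd ~/.local/share/Steam/steamapps/common/"Life Is Strange"

# Libraries the main binary says it needs (look for libcef.so in the list)
readelf -d ./LifeIsStrange | grep NEEDED

# Which of those the dynamic linker can actually resolve ("not found" lines are the problem)
ldd ./LifeIsStrange

# Check whether a copy of libcef.so ships with the game at all
find . -name 'libcef.so*'

# If a copy does exist in the game files but isn't being picked up, adding its
# directory to LD_LIBRARY_PATH in the launch script may be enough, e.g.:
# export LD_LIBRARY_PATH="/path/to/that/directory:$LD_LIBRARY_PATH"
```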
“cons cunt” is just Aussie slang for Lisp Programmer.
I just tried the latest version on my Steam Deck and it launches okay without overriding the Proton version.
I’d run Steam in a terminal and collect the output after you press Play. I’m happy to take a look if you post it here.
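Something like this works (the log filename is arbitrary):

```
# Quit the Steam client first, then relaunch it from a terminal and capture
# everything it prints - including the game's output once you press Play
steam 2>&1 | tee ~/steam-output.log
```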
Passenger boat Chicago’s Little Lady was hosting the 1 p.m. Chicago Architecture Foundation tour of the Chicago River.
all passengers were issued refunds.
The boat’s deck was swabbed by its crew, and service was resumed for its scheduled 3 p.m. tour.
This boat crew is the most metal thing in this entire post.
I’m not sure if the game supports more than one of Vulkan/D3D11/D3D12, but those might have different CPU performance. Also, you seem to be stuck on an old version of Proton; if there’s any way to get it working on something newer, that might help.
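If you’re not sure which API it’s actually using under Proton, one option (assuming its D3D path goes through DXVK, which I haven’t verified for this game) is to enable the DXVK HUD via the game’s Steam launch options:

```
# Steam > game Properties > Launch Options.
# If the game renders D3D11 through DXVK, this overlays API, driver and FPS
# info in the corner; it has no effect otherwise.
DXVK_HUD=full %command%
```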
I don’t have the game to test, although I’ve been meaning to give it a try…
It sounds to me like it’s limited by CPU single-thread performance, which isn’t surprising at those frame rates. Looking at overall CPU usage doesn’t tell the whole story; it’s likely that some threads on your CPU are active 100% of the time.
You may have some headroom on GPU performance, so you could try increasing quality in a way that isn’t likely to impact CPU performance. I’m not sure what options the game has, but some good candidates would be: rendering resolution, texture resolution, anisotropic filtering. If you can do that, and it increases GPU use without dropping overall performance, it more or less confirms the CPU bottleneck.
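One quick way to check for that (the process name below is a placeholder, use whatever the game’s binary is actually called) is to watch per-thread CPU usage while you play:

```
# Find the game's process ID (replace GameBinary with the actual process name)
pgrep GameBinary

# Show per-thread CPU usage for that PID; one thread pinned near 100% while
# the rest sit mostly idle is the classic single-thread bottleneck signature
top -H -p <pid>
```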
It’s a totally valid approach. Most of the answers in this thread involve booting some non-Windows thing and running a tool that will do exactly this behind the scenes.
Edit: there are also a lot of far worse answers involving file-level copying, which will definitely not work.
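For reference, what those tools do behind the scenes is a raw block-for-block copy, which from any Linux live USB looks roughly like this (device names are placeholders, triple-check them, because swapping if and of will overwrite the source):

```
# List the disks and make absolutely sure which is the source and which is
# the destination before copying anything
lsblk

# Raw block-level copy of the whole drive (sdX = source, sdY = destination)
sudo dd if=/dev/sdX of=/dev/sdY bs=4M status=progress conv=fsync
```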
I keep seeing this sentiment, but in order to run the model on a high end consumer GPU, doesn’t it have to be reduced to like 1-2% of the size of the official one?
Edit: I just did a tiny bit of reading and I guess model size is a lot more complicated than I thought. I don’t have a good sense of how much it’s being reduced in quality to run locally.