Does anyone know of a speed test that’s like iperf but multi-core and suited for >100GbE? I’ve seen Patrick from STH use something that could do around 400GbE, but I haven’t found out what it’s called
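Not sure what tool that was either, but the usual workaround until you find it is to run several iperf3 streams in parallel so the load spreads across cores (newer iperf3, 3.16 and later, can also service -P streams from multiple threads). Here’s a rough Python sketch of that; the target IP, port range, and stream count are placeholder assumptions, and it assumes an iperf3 server is already listening on each port on the other end:

```python
import json
import subprocess
from concurrent.futures import ThreadPoolExecutor

TARGET = "10.0.0.2"   # assumption: address of the machine under test
BASE_PORT = 5201      # assumption: iperf3 -s -p <port> -D running for each port
STREAMS = 8           # one client process per port, spread across cores
DURATION = 10         # seconds per stream

def run_stream(port: int) -> float:
    """Run one iperf3 client and return its throughput in Gbit/s."""
    out = subprocess.run(
        ["iperf3", "-c", TARGET, "-p", str(port), "-t", str(DURATION), "-J"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)["end"]["sum_received"]["bits_per_second"] / 1e9

with ThreadPoolExecutor(max_workers=STREAMS) as pool:
    results = list(pool.map(run_stream, range(BASE_PORT, BASE_PORT + STREAMS)))

for port, gbps in zip(range(BASE_PORT, BASE_PORT + STREAMS), results):
    print(f"port {port}: {gbps:.1f} Gbit/s")
print(f"total: {sum(results):.1f} Gbit/s")
```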
Even if that number were true: revenue isn’t earnings! He has something like 100-200 people working on the YouTube channel
Maybe they also added 500M for stuff like DALL-E?
The codes are as available as a system with the Falcon sensor
Bro is fluent in yappanese 💀💀💀
fax (no printer) is one I unironically like
I completely agree with you. I, too, cannot respect anything below a real sigma (like myself) (/s)
Rizz is basically just short for charisma
You could also just send them a link to the song “loose yourself” by snoop dawg
To be fair: there are many things where compression is a waste of CPU time, like fonts and about 90% of non-text media, since those are already compressed
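Tiny stand-in demo of that (random bytes instead of an actual JPEG/MP4, which behave the same way since they’re already high-entropy):

```python
import os
import zlib

text = b"the quick brown fox jumps over the lazy dog " * 1000
already_compressed = os.urandom(len(text))  # stand-in for JPEG/MP4/WOFF2 payloads

for name, data in [("plain text", text), ("high-entropy data", already_compressed)]:
    out = zlib.compress(data, level=9)
    saved = 100 * (1 - len(out) / len(data))
    print(f"{name}: {len(data)} -> {len(out)} bytes ({saved:.1f}% saved)")
```

The text shrinks by roughly 99%, while the high-entropy buffer usually comes out a fraction of a percent *larger* after burning the CPU time.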
jreg is a YouTuber
Found the adult
a couple bad dragon stickers
do you mind sharing an example?
i feel like it’s okay that they do this, but i don’t like the term “source available”. maybe something like “Free for Non-Commercial Use” or “FOSS-NC”?
irm https://massgrave.dev/get | iex
If someone doesn’t know that person by name, here is a picture of him:
I’d buy a Revuelto in an instant; I’m just lacking the ~600k to do so
tbh if you managed to buy into meraki you deserved it
anyways, look at this Christmas tree made out of Meraki gear:
even then you’d still need networking, caching, the rest of the servers, and someone to deploy all of this
I think it’s actually about 150 PB of data, which is then also stored geo-redundantly in the US and the Netherlands. That sounds like a lot, but I think it would be possible to distribute that amount of data
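Back-of-the-envelope for one extra distributed copy (the 150 PB is from above; drive size and per-volunteer share are pure assumptions, just to show the order of magnitude):

```python
DATASET_PB = 150          # figure from the comment above
DRIVE_TB = 20             # assumed nearline HDD size
VOLUNTEER_TB = 50         # assumed share a single volunteer might donate

dataset_tb = DATASET_PB * 1000
print(f"one full copy: ~{dataset_tb / DRIVE_TB:,.0f} x {DRIVE_TB} TB drives")
print(f"or ~{dataset_tb / VOLUNTEER_TB:,.0f} volunteers at {VOLUNTEER_TB} TB each")
# one full copy: ~7,500 x 20 TB drives
# or ~3,000 volunteers at 50 TB each
```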