I bet this won’t have an impact on memory safety, and that interop means C++ compilers will have to be stricter about memory layout and reduce unspecified edge cases.
If all this piracy were running on anonymized networks like Tor or I2P, they’d have a much harder time taking stuff down and censoring it.
The U.S. destroying its own economy. Who could’ve asked for a better Christmas present? With Trump at the helm next year, it’s only a matter of time before trade partners tell the U.S. to fuck off and stop ignoring decisions like these.
I’m actually surprised there is no specification. That’s how I thought languages were written: spec first, implementation later. Do RFCs serve this purpose?
That’s pretty cool, but terrifying as well. Can’t wait for somebody to go a step further and start writing proc macros (call it `rusht`) to replace bash scripts with Rust scripts. Actually, now that I think about it, not so terrifying. They can probably be debugged better, could be safer (unless someone starts publishing malicious proc macros), allow dependencies to be added to compose better scripts without relying on the system’s package manager, and so much more.
Eventually, painfully, slowly, we’ll move to memory-safe languages. It really is a good idea. Personally, though, I don’t expect it to happen this decade. In the 2030s? Yes. The 2020s? No.
This. Unless the government starts introducing financial incentives or penalties (like fines) to force the use of memory-safe languages, ain’t nothing gonna happen.
Maybe read the article…
We still suffer from runtime errors that could’ve been caught at compile time.
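For example, a snippet like this (purely illustrative, names made up) imports and parses without a complaint, and the mistake only shows up once the code actually runs; a compiler, or even mypy, would have flagged it before execution:

```python
def total(prices):
    # Nothing stops a caller from passing strings instead of numbers.
    return sum(prices)

# Imports and parses fine; the bug only surfaces when this line executes:
print(total(["9.99", "4.50"]))  # TypeError: unsupported operand type(s) for +: 'int' and 'str'
```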
Difficult? How so? I find compiling C and C++ stuff much more difficult than anything Python. It never works on the first try, whereas with Python the chances are much, much higher.
What is so difficult to understand about virtual envs? You have global Python packages, you can also have per-user Python packages, and you can create virtual environments to install packages into. Why do people struggle to understand this?
Global packages are found thanks to default locations, which can be overridden with environment variables. Virtual environments set those environment variables so they point to different locations.
`python -m venv .venv/` means Python will execute the module `venv` and tell it to create a virtual environment in the `.venv` folder in the current directory. As mentioned above, the environment variables have to be set to actually use it. That’s when `source .venv/bin/activate` comes into play (there are other scripts for zsh and fish).
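If you want to see what activation actually changes, here’s a quick sketch (my own illustration, assuming Python 3.3+ where the `venv` module exists) that tells you whether the running interpreter belongs to a virtual environment:

```python
import sys

# Inside an activated venv, sys.prefix points at the .venv/ directory,
# while sys.base_prefix still points at the base Python installation.
# Outside a venv, the two are identical.
if sys.prefix != sys.base_prefix:
    print(f"virtual environment active: {sys.prefix}")
else:
    print(f"using the global interpreter: {sys.prefix}")
```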
Now you can run `pip install $package` and then run the package’s command if it has one.
It’s that simple. If you want to, you can make it difficult by doing `sudo pip install $package` and fucking up your global packages by possibly updating a dependency of another package - just like updating glibc from 1.2 to 1.3 and breaking every application depending on 1.2 because glibc doesn’t fucking follow goddamn semver.
As for old versions of Python, bro, give me a break. There’s pyenv for that if whatever old-ass package you’re installing depends on an ancient 10-year-old Python version. You really think building a C++ package from 10 years ago will work more smoothly than Python? Have fun tracking down all the unlocked dependency versions that “Worked On My Machine™” at the start of the century.
The only Python packages I have trouble installing are those with C/C++ dependencies which have to be compiled at install time.
Y’all have got to be meme’ing.
I like the prospect of more Linux hardware hitting the market with officially supported distros. The European Union should be funding this kind of stuff to supplant Microsoft within its borders.
Why do piracy apps still put their source on github? It’s just asking for trouble…
The “self-documenting” crowd is back in, boys.
My immediate thought was: why not NixOS as a base? Building KDE is such a nightmare that if they had to deal with it themselves on NixOS, it would help them clear up their dependencies. Right now it’s such a big mess of unnamed and implicit dependencies that exposing it to the team would also show them how to cut down on them.
My hope was also that if the KDE team were to invest in a NixOS offshoot, the OS would finally get proper GUIs or integrations into existing GUIs like Discover (why not Diskover?) or the system settings and other config management.
But, to be fair, I could understand if they considered it, took one look at the documentation and noped out.
Was IPFS considered? I’ve tried it myself but it seems like an unstable product and I’m not sure if it’s living up to its promise…
Is that a problem with Java? In fact, is it even a problem on GitHub, where repos are namespaced by user or org?
Legally, it doesn’t seem like he had much choice. The war has been going on for 2+ years now? I’m just surprised it took so long.
Regardless, this is probably going to have an impact on existing maintainers as it most likely isn’t clear who will act as replacements. I’ll bring it up again: 2% of the Linux Foundation’s money simply isn’t good enough for the Linux Kernel. It should be way way way more.
The bloody managers are the biggest problem. Most don’t understand code, much less the process of making a software product. They force you into idiotic meetings where they want to change how things work because they “don’t have visibility into the process”, which just translates to “I don’t understand what you’re doing”.
Also, forcing people who love machines but are less fond of people into leading people is a recipe for unhappiness.
But at least the bozos at the top get to make the decisions and the cheddar for being ignorant and not listening.
I’d very much welcome a crates.io alternative that doesn’t require github and supports namespacing by username or org. The dependency on a proprietary platform rubs me the wrong way.
Piracy isn’t only torrenting. Speeds are better on Tor than on I2P, so streaming websites could host the data and allow users to download it directly (DDL). They could also be on both Tor and I2P, then provide the DDLs on Tor and torrents on I2P - the best of both worlds.
Anti Commercial-AI license