It is time for the mainland to come back into the fold.
I agree the mainland should be allowed to maintain some amount of self rule during the transition.
You can list every man page installed on your system with man -k . or just apropos . (the trailing dot is a regex that matches everything).
But that’s a lot of random junk. If you only want “executable programs or shell commands”, grab only the man pages in section 1 with apropos -s 1 .
You can get the path of a man page by using whereis -m pwd (replace pwd with your page name).
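For example, on a typical Debian-ish box, whereis -m pwd prints something like this (the exact path will differ between distros, so treat it as an illustration):

pwd: /usr/share/man/man1/pwd.1.gz

Note the trailing colon glued to the name; the incantation further down strips it off with ${id::-1}.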
You can convert a man page to html with man2html (may require apt install man2html or whatever equivalent applies to your distro).
That tool adds a couple of useless lines at the beginning of each file, so we’ll want to pipe its output through tail -n +3 to get rid of them.
Combine all of these together in a questionable incantation, and you might end up with something like this:
mkdir -p tmp ; cd tmp
apropos -s 1 . | cut -d' ' -f1 | while read page; do whereis -m "$page" ; done | while read id path rest; do man2html "$path" | tail -n +3 > "${id::-1}.html"; done
List every command in section 1, extract the id only. For each one, get a file path. For each id and file path (ignore the rest), convert to html and save it as a file named $id.html.
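If squeezing all of that onto one line offends you, here is the exact same thing spread out with comments (same commands, nothing new):

mkdir -p tmp
cd tmp
apropos -s 1 . |                 # every page in section 1
  cut -d' ' -f1 |                # keep only the command name
  while read page; do
    whereis -m "$page"           # prints "name: /path/to/name.1.gz ..."
  done |
  while read id path rest; do    # ignore anything past the first path
    # strip the trailing ":" from the id, drop man2html's header lines
    man2html "$path" | tail -n +3 > "${id::-1}.html"
  done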
It might take a little while to run, but then you could run firefox . or whatever and browse the resulting mess.
Or keep tweaking all of this until it’s just right for you.
“I’m not X but <position statement that clearly requires them to be X>” and “I don’t want to Y but <proceeds to do exactly Y>” are used by people who mistakenly believe a disclaimer provides instant absolution.
On the other hand, I’ve never had anybody threaten to yuck my yum in exactly those terms, and I’m slightly intrigued by the prospect.
I was watching the network traffic sent by Twitter the other day, as one does, and apparently whenever you stop scrolling for a few seconds, whatever post is visible on screen at that time gets added to a little pile that then gets “subscribed to” because it generated “engagement”, no click needed.
This whole insidious recommendation nonsense was probably a subplot in the classic sci-fi novel Don’t Create The Torment Nexus.
Almost entirely unrelated, but I’ve been playing The Algorithm (part of the Tenet OST, by Ludwig Göransson) on repeat for a bit now. It’s also become my ring tone, and if I can infect at least one other hapless soul with it, I’ll be satisfied.
It could be anything that makes it worth paying money for the accounts in the first place.
Unfortunately, looking from the outside, it’s difficult to tell if an account has been bought, hacked, or if the original owner just decided to become a scumbag out of nowhere.
For example, have a look at https://www.reddit.com/user/fakerht, a 4-year-old account that, just 30 minutes ago, decided to promote a scam site that attempts to steal crypto by luring victims with the promise of an airdrop.
That mirrors the tension many reddit mods struggled with recently… It’s difficult to push back against Reddit without also punishing its active users in some real way.
The folks using Reddit are still real human beings. But I get that not everybody is going to draw the line in the same spot.
To push back on that a bit, many Reddit “aged accounts” are used to push scams to the great unwashed masses.
I’m not sure it’s morally okay to turn a blind eye to who’s buying those accounts or why.
I honestly don’t know. The only advice I’d have for the layman would be “just don’t do this”, but I understand that’s little more than an invitation to be ignored.
Running strange software grabbed from unknown sources will never not be a risky proposition.
Uploading the .exe you just grabbed to virustotal and getting the all clear can indicate two very different things: It’s either actually safe, or it hasn’t yet been detected as malware.
You should expect that malware writers have already uploaded some variant of their work to virustotal before seeding it, to ensure maximum impact.
Getting happy results from virustotal could simply mean the malware author tweaked their work until they saw those same results.
Notice I said “yet” above. Malware tends to eventually get flagged as such, even when it enjoys a head start of going unrecognized.
You can use that to somewhat lower the odds of getting infected, by waiting. Don’t grab the latest crack that just dropped for the hottest game or whatever.
Wait a few weeks. Let other people get infected first and let antivirus databases catch up with the new malware. Then maybe give it a shot.
And of course, the fact that keygens are often flagged as “bad” software by unhelpful antiviruses just further muddies the waters, since it teaches you to ignore or altogether disable your antivirus in one of the riskiest situations you’ll ever put yourself in.
Let’s be clear: There’s nothing safe about any of this, and if you do this on a computer that has access to anything you wouldn’t want to lose, you are living dangerously indeed.
Several times now, I’ve sent people I knew links to articles that looked perfectly fine to me, but turned out to be unusable ad-ridden garbage to them.
Since then, I try to remember to disable uBlock Origin to check what they’ll actually see before I share any links.
There’s a near-infinite number of those out there, many of which just grab other scanlation groups’ output and slap their own ads on top of it.
Mangadex is generally my happy place, but you’ll have to wander out and about for various specific mangas.
Several of the groups that post on Mangadex also have their own website and you may find more stuff there.
For example right now I’ve landed on asurascans.com, which has a bunch of Korean and Chinese long strips, with generally good quality translations.
The usual sticking point with all those manga sites is the ability to track where you are in a series and continue where you left off when new chapters are posted.
Even Mangadex struggles with that: their “Updates” page is the closest thing they have, and it’s still not very good.
If you’re going to stick to one site for any length of time, and you happen to be comfortable with userscripts, I’d suggest you head over to greasyfork.org, search for the manga domain you’re using, and look for scripts that might improve your binging experience there.
That sounds like an improbable attempt to leverage the notion that minors can’t enter into a legally binding contract into a loophole to get anything for free by simply having your kid order it.
I have a small userscript/style tweak to remove all input fields from reddit, so I’m still allowing myself to browse reddit in read-only mode on desktop, with no mobile access.
It’s a gentle way to wean myself off. I’m still waiting for my GDPR data dump anyway, so I need to check reddit fairly regularly to be able to grab it when/if it arrives.
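For the curious, here’s a minimal sketch of the idea (not my exact script; the selectors are deliberately generic, and you’d probably want to tune them for old vs. new reddit):

// ==UserScript==
// @name     reddit read-only
// @match    https://www.reddit.com/*
// @match    https://old.reddit.com/*
// @grant    none
// ==/UserScript==

// Hide anything you could type into; that's enough to make the site effectively read-only.
(function () {
  const style = document.createElement("style");
  style.textContent = `
    textarea,
    input,
    [contenteditable="true"] {
      display: none !important;
    }`;
  document.head.appendChild(style);
})();

The CSS part on its own also works as a plain user style if you’d rather skip the script manager entirely.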
One of my guilty pleasures is to rewrite trivial functions to be statement-free.
Since I’d be too self-conscious to put those in a PR, I keep them mostly to myself.
For example, here’s an XPath wrapper:
const $$$ = (q,d=document,x=d.evaluate(q,d),a=[],n=x.iterateNext()) => n ? (a.push(n), $$$(q,d,x,a)) : a;
Which you can use as $$$("//*[contains(@class, 'post-')]//*[text()[contains(.,'fedilink')]]/../../..") to get an array of matching nodes.
If I was paid to write this, it’d probably look like this instead:
function queryAllXPath(query, doc = document) {
  const array = [];
  const result = doc.evaluate(query, doc);
  let node = result.iterateNext();
  while (node) {
    array.push(node);
    node = result.iterateNext();
  }
  return array;
}
Seriously boring stuff.
Anyway, since var/let/const are statements, I have no choice but to use optional parameters instead, and since loops are statements as well, recursion saves the day.
Would my quality of life improve if the lambda body could be written as => if n then a.push(n), $$$(q,d,x,a) else a? Obviously, yes.
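Tangentially: if you’re fine asking for a snapshot instead of an iterator, XPathResult.ORDERED_NODE_SNAPSHOT_TYPE lets you skip the recursion entirely while staying statement-free. A sketch, with a made-up name:

const $$$$ = (q, d = document, x = d.evaluate(q, d, null, XPathResult.ORDERED_NODE_SNAPSHOT_TYPE)) => Array.from({ length: x.snapshotLength }, (_, i) => x.snapshotItem(i));

Same usage as above, minus the hidden accumulator parameters.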
ViolentMonkey is open source; TamperMonkey is not.
You don’t have to, but it’d be a lot cooler if you did.
I agree, but there’s a non-zero chance I don’t have a full picture of things yet, and maybe things aren’t that bad. Or won’t be that bad.
On the surface, inconsistencies like this seem like they might encourage users to group themselves on a few massive servers that have a lot of local content guaranteed to be consistent rather than spreading themselves across many small instances (power law graph goes here.)
But maybe not. I don’t know. Maybe the system naturally converges toward clusters of interests where each instance is primarily focused on a few things, and while the federation mechanism exists and is mostly useful, it is a secondary feature behind a primary use-case where folks preferentially engage with their local communities.
Overall, I wonder how much of all this is colored by expectations we’ve developed while using Reddit.
All this fediverse stuff is built on very different foundations than things like twitter or reddit, and while it’s easy to gloss over it because the UIs look superficially similar, they’ve made some fundamentally different trade-offs.
But maybe the consistency stuff could get better over time too. Maybe there’ll be a smoother experience to better flag when and why things are inconsistent (“instance X hasn’t sent us activity updates since T”, “instance X has partially defederated from us”, etc.), and maybe even offer targeted palliative measures rather than a generic disclaimer.
All this stuff is under fairly active development still, so there’s hope.
I’d say, let’s have everyone brainstorm the best way to go about this, and let a thousand flowers bloom!