Or if you are on a Boeing plane and a side panel/door spontaneously flies off, you don’t get sucked out
/s, but not really /s
Murica.
This was literally the overarching plot for the last season of curb
That assumes that an adversary has control of the browser
No it doesn’t. If they intercept an encrypted password over HTTPS, they can resend the request from their own browser to get access to your account
The big reason you don’t want to send passwords over HTTPS is that some organizations have custom certs set up
What is the problem with that? The password is secure and only shared between you and the site you are intending to communicate with. Even if you sent an encrypted password, they wrote the client side code used to generate it, so they can revert it back to its plaintext state server side anyways
It is better to just not send the password at all.
How would you verify it then?
If not sending plaintext passwords were best practice, then why do no sites follow this? You are literally posting to a site (Lemmy) that sends plaintext passwords in its request bodies to log in
Client side verification is just security by obscurity, which gains you very little.
If someone is capable of MITM attacking a user and fetching a password mid-transit to the server over HTTPS, they are surely capable of popping open devtools and reverse engineering your cryptographic code to either a) uncover the original password, or b) just use the encrypted credentials directly to authenticate with your server without ever having known the password in the first place
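Point b) is basically pass-the-hash. A minimal sketch (hypothetical names, SHA-256 standing in for whatever client-side scheme a site might use) showing why the hashed value just becomes the new password as far as the server is concerned:

```python
import hashlib

def client_side_hash(password: str) -> str:
    # hypothetical client-side "encryption" applied before sending;
    # whatever this produces is what actually travels over the wire
    return hashlib.sha256(password.encode()).hexdigest()

# the server only ever sees hashes, so it stores and compares hashes
stored = client_side_hash("hunter2")

def server_login(submitted: str) -> bool:
    # the submitted hash IS the credential - nothing here
    # requires knowing the original plaintext password
    return submitted == stored

# a MITM who captured the request body can replay it verbatim
captured = stored
```

Replaying `captured` authenticates successfully, so client-side hashing on its own just moves which string the attacker needs to steal.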
You are acting like someone checked off a “log passwords” box, as if that’s a thing that even exists
Someone configured a logger to write HTTP bodies and headers, not realizing they needed to build a custom handler that iterates through every body and header, anonymizing any fields that may plausibly contain sensitive information. It’s something literally every dev has done at some point, before they knew better.
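For anyone who hasn’t built one, that custom handler is roughly this (a sketch with an illustrative field list, not anyone’s actual implementation):

```python
import json
import logging

# field names that plausibly carry secrets - real lists are longer
SENSITIVE = {"password", "token", "authorization", "secret"}

def redact(body: dict) -> dict:
    # walk the payload recursively, masking sensitive fields
    out = {}
    for key, value in body.items():
        if key.lower() in SENSITIVE:
            out[key] = "***REDACTED***"
        elif isinstance(value, dict):
            out[key] = redact(value)
        else:
            out[key] = value
    return out

body = {"username": "alice", "password": "hunter2",
        "meta": {"token": "abc123"}}
logging.getLogger(__name__).info(
    "request body: %s", json.dumps(redact(body)))
```

Without the `redact` pass, that log line writes the plaintext password straight to disk, which is the failure mode being described.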
This is only true for steam keys sold on other platforms afaik
Top to bottom, then left to right.
Yes, but if its first instinct is “go left” on 1-2, it’s pretty apparent the reward function could use some tuning
More, but not way more - they would be licensing Windows IoT, not a full-blown OS, and they wouldn’t be paying OTC retail rates for it.
lol. Did this in my old building - the dryer was on an improperly rated circuit and the breaker would trip half the time, eating my money and leaving wet clothes.
It was one of the old, “insert coin, push metal chute in” types. Turns out you could bend a coat hanger and fish it through a hole in the back to engage the lever that the push-mechanism was supposed to engage. Showed everyone in the building.
The landlord came by the building a month later and asked why there was no money in the machines, I told him “we all started going to the laundromat down the street because it was cheaper”
If you know the key is composed of English language words you can skip strings of letters like “ZRZP” and “TQK” and focus on sequences that actually occur in a dictionary
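Rough sketch of that pruning (tiny stand-in wordlist, purely illustrative): instead of enumerating every raw string of a given length, only generate strings that split into dictionary words, which is a vanishingly small fraction of the full space.

```python
from itertools import product

# tiny stand-in for a real English wordlist (assumption)
WORDS = {"cat", "dog", "key", "rsa"}

def dictionary_candidates(length: int):
    # only emit strings built from known words, skipping
    # gibberish like "ZRZP" or "TQK" entirely
    for combo in product(WORDS, repeat=2):
        candidate = "".join(combo)
        if len(candidate) == length:
            yield candidate

full_space = 26 ** 6                               # 308,915,776 raw strings
pruned = sum(1 for _ in dictionary_candidates(6))  # 16 word pairs
```

Even with a realistic dictionary of tens of thousands of words, the pruned space stays many orders of magnitude smaller than the raw character space.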
You don’t memorize RSA keys
No, I’m saying if your password size is limited to a fixed number of characters, as is the case with RSA keys, words are substantially less secure
“can you string words to form a valid RSA key”
“Yes this is the most secure way to do it”
“No, it’s not when there is a fixed byte length”
-> where we are now
we are talking about RSA keys - you don’t memorize your RSA keys
If you rely on memorizing all your passwords, I assume that means you have ample password reuse, which is a million times worse than using a different, less-secure password on every site
Sure but we aren’t talking about that
You memorize your RSA keys?
GPT is worse and it’s not even close.
My PC can serve up a hundred requests per second running an HTTP server with a connected database with 200W power usage
It takes that same computer 30-60s to return a response from a 13B parameter model (WAY less power usage than GPT), while using 400W of power thanks to the GPU
Napkin math, the AI response uses about 10,000x more electricity
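Showing the napkin math (all numbers taken from the comments above, midpoint used for the 30-60s range):

```python
# conventional HTTP serving
http_power_w = 200                           # watts while serving
http_rps = 100                               # requests per second
joules_per_http = http_power_w / http_rps    # 2 J per request

# same box running a 13B model on the GPU
llm_power_w = 400                            # watts with GPU engaged
llm_seconds = 45                             # midpoint of 30-60s
joules_per_llm = llm_power_w * llm_seconds   # 18,000 J per response

ratio = joules_per_llm / joules_per_http     # ~9,000x
```

So "about 10,000x" checks out as an order-of-magnitude estimate, and that’s versus a local 13B model, not GPT-scale inference.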