You can subscribe from another Fediverse account, for example Lemmy or Mastodon. To do this, paste the following into the search field of your instance: !programmerhumor@lemmy.ml
Post funny things about programming here! (Or just rant about your favourite programming language.)
Rules:
Posts must be relevant to programming, programmers, or computer science.
No NSFW content.
Jokes must be in good taste. No hate speech, bigotry, etc.
Same woman?
right wing?
She must be running out of ideas.
Bey-once
0-255 was good enough for me an my grandpappy.
0-127, top bit is always 0.
Damn big-endians always looking down on the rest of us.
i thought the top bit was originally 0 or 1 depending on the evilness/odiousness of the rest of the number, as a parity check.
The standard itself is 7-bit, since wires were deemed more valuable than endpoint logic for the teletype machines way back then. If you’re running it on an 8-bit byte machine you could do it either way, although I’m not sure what the point in parity checking individual characters is. Modern software uses 0.
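(For the curious: here's a minimal sketch of what that per-character parity scheme looks like, using even parity and the top bit of an 8-bit byte. The function name is mine, not from any standard library.)

```python
def with_parity(ch: str) -> int:
    """Return the 7-bit ASCII code of ch with the 8th bit set
    so the whole byte has an even number of 1 bits (even parity)."""
    b = ord(ch)
    assert b < 0x80, "ASCII is 7-bit"
    ones = bin(b).count("1")
    # If the low 7 bits already have an even count of 1s, leave bit 7 clear.
    return b | (0x80 if ones % 2 else 0)

# 'A' = 0b1000001 has two 1 bits -> parity bit stays 0
# 'C' = 0b1000011 has three 1 bits -> parity bit set
print(hex(with_parity("A")), hex(with_parity("C")))
```

The receiver just recounts the 1 bits and flags the character if the total is odd. It catches any single-bit error, which mattered a lot more on noisy teletype lines than it does now.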
Just like pokemon
Yeah let’s drop wide characters, it was a bad idea. Let’s simplify… also let’s convert all porn back to ASCII art.
Wait Beyonce is a blonde white woman now? Is this what they mean when they say “it’s a different kind of white?”
It’s just skin tone, with the right lighting and color grading we all just look like people
You’ve been able to do diacritical marks with ASCII for over thirty years. It’s already standard. alt+0233
dunno if it’s the same character but alt+130 has been firmly drilled into my brain from my entire Pokémon childhood.
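(It is the same character: Alt+130 comes from the old DOS code page 437, Alt+0233 from Windows-1252, and both map to é, U+00E9. A quick sanity check, assuming Python:)

```python
# Alt+130 is interpreted via the DOS code page (CP437);
# Alt+0233 is interpreted via Windows-1252.
dos = bytes([130]).decode("cp437")
win = bytes([233]).decode("cp1252")
print(dos, win)  # both print é
assert dos == win == "\u00e9"
```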
deleted by creator
I want to join the C-hive
What did i just read
Yeah I think I’m OOTL, can someone explain?
I think the simulation is broken again.
Good ol’ Alt+130. I guess Beyonce has no love for Extended ASCII.
or AltGr + ’ + e. I love this shortcut.
Compose
e
’
She’s like the opposite of Prince
That symbol has a Unicode codepoint, “The Love Symbol”
Not true.
Source: https://parkerhiggins.net/2013/01/writing-the-prince-symbol-in-unicode/
Oh shit thanks
UTF-8? Anyone?
UTF-8 Random anyone?
Beyoncé
So… UTF-8 interpreted as ISO-8859-1? You have failed Unicode college >:(
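(That classic mojibake is exactly what you get when the two UTF-8 bytes of é are decoded one-by-one as ISO-8859-1, e.g. in Python:)

```python
# é is 0xC3 0xA9 in UTF-8; ISO-8859-1 treats those bytes
# as two separate characters, Ã and ©.
garbled = "Beyoncé".encode("utf-8").decode("iso-8859-1")
print(garbled)  # → BeyoncÃ©
```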
¯\_(ã )_/¯
And here I was typing out iso-8859-1 like a scrub to make sure I wasn’t misremembering the encoding when doing the analogous thing in python…
If only you knew how many huge companies have no fucking clue…
Their employees have failed Unicode college >:(
I agree! There really should be no excuses at this point… you’d think. Even in 2008 I already felt we were behind the loop, but apparently I was vastly underestimating how bigger companies just dgaf.
deleted by creator
Sorry, no, but I can support it in UTF-16.
Maybe she was sick of trying to get a terminal displaying utf correctly.
So… she’s running DD-WRT :P
deleted by creator
Myths.