The problem is that if you just blindly send a message, you can be tricked into sending spam to millions of addresses. I do one thing that prevents that, though it does violate the standard: I verify there’s only one ‘@’ in the address. This technically excludes people with an ‘@’ in their name, but they probably find it impossible to do anything with that address anyway.
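That check is a one-liner. A minimal sketch in C (the function name is mine; note this rejects the quoted local parts the RFC technically allows, which is exactly the trade-off described above):

#include <stdbool.h>
#include <string.h>

/* True if addr contains exactly one '@'.  Stricter than the RFC:
   a quoted local part like "a@b"@example.com will be rejected. */
static bool has_single_at(const char *addr)
{
    const char *at = strchr(addr, '@');
    return at != NULL && strchr(at + 1, '@') == NULL;
}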
Another benefit of working from home: I will happily spend my own money on a good chair, keyboard, etc. I spent 20 years working in an office, and there’s no way I would’ve ever brought in my own chair during that time… I would’ve had to become the chair police to prevent it from getting “reappropriated”.
Interesting. A year ago I was looking for something exactly like this for distributing data between multiple servers. Everything either required a ton of overhead or was too big to use, so I ended up just using JSON. I did discover that Brotli can compress 3 GB of JSON down to just 70 MB nearly instantly.
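If you want to try that, the one-shot encoder in the reference brotli C library is enough. A sketch, with the helper name being mine and error handling trimmed; note the quality knob, since lower values are what make it fast on gigabytes (11 is the slow, best-ratio setting):

#include <brotli/encode.h>
#include <stdint.h>
#include <stdlib.h>

/* Compress buf[0..len) in one shot; returns a malloc'd buffer and
   stores the compressed size in *out_len, or NULL on failure. */
static uint8_t *brotli_pack(const uint8_t *buf, size_t len, size_t *out_len)
{
    *out_len = BrotliEncoderMaxCompressedSize(len);
    uint8_t *out = malloc(*out_len);
    if (out && !BrotliEncoderCompress(5 /* speed over ratio */,
                                      BROTLI_DEFAULT_WINDOW, BROTLI_MODE_TEXT,
                                      len, buf, out_len, out)) {
        free(out);
        out = NULL;
    }
    return out;
}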
Isn’t that what Gists are for? https://gist.github.com/
I thought it was well known that the studies about Dvorak being superior were fabricated by Dvorak himself… but apparently that’s forgotten knowledge.
Here’s a magazine article about it: https://reason.com/1996/06/01/typing-errors/
I spent 20 years working for my local newspaper. It was a ton of fun and I constantly got to do new things. I did everything from making a Palm Pilot game to accompany our coverage of the Sydney Olympics, to an Apache module for a custom CMS, to iPhone and Android apps.
Now I can’t say that working for a news company is a good idea in 2023, but the point is there’s probably a company local to you that needs a wide variety of programming and isn’t a “tech giant”.
I’m of two minds. On the one hand, there is far too much reliance on black-box libraries to do trivial things.
On the other, this complaint is decades old. Back in the late ’80s there was a software developer for the Apple IIGS called FTA, which stood for Free Tools Association. They claimed that the tools in the OS were too slow and you should code to the raw hardware.
Feels like you want JWT.
Rust is the only language I know of that is actively being used at the kernel level all the way up to the web-app level. Compare that with Swift, which is not only mostly tied to a single ecosystem, but even the “cross platform” stuff like libdispatch is littered with code like:
if #available(macOS 10.12, iOS 10.0, tvOS 10.0, watchOS 3.0, *)
My problem with C/C++ is that the people behind the spec have sacrificed our sanity in the name of “compiler optimization”. Signed overflow behaves the same on every CPU on the planet, so why is it undefined behaviour? Even more insane, they specify that intN_t must be implemented via two’s complement… but signed overflow is still undefined, because compilers want to pretend they run on pixie dust instead of real hardware.
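Here’s the kind of thing the optimizer buys with that rule (function name is mine; “typically” because exact output depends on compiler and flags). Since signed overflow “can’t happen”, gcc and clang at -O2 typically fold this whole function down to return 1, even though on real two’s-complement hardware x + 1 wraps to INT_MIN when x == INT_MAX:

/* Signed overflow is undefined, so the compiler may assume
   x + 1 never wraps, making the comparison vacuously true. */
int always_true(int x)
{
    return x + 1 > x;
}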
> I spent about 10 of those in roles where my primary function was to write code. The other 10 have involved managing programmers, coaching them, consulting with organizations about how to manage them, running a codebase assessment practice and these days, well, actually content marketing.
Therein lies the biggest lie in development. There is no career path. I’ve been programming professionally for 25 years, and in all 25 of those years my primary function was to write code, because I turned down any promotion that would put me in management and away from doing what I love.
One of my oldest programming books is all about programming in machine language.
https://vintageapple.org/apple_ii/pdf/Apple_Machine_Language_1981_(raw-bw).pdf
At the time, even assemblers cost money. I remember saving up for Merlin, an assembler for the Apple II.
My favorite test for ChatGPT is to ask it to write a function to divide two numbers in 6502 assembly. Not only is there no DIV instruction to rely on, but the 6502 is very register-starved, so you get a lot of screwups. Here’s one example of what ChatGPT generated:
; Inputs:
; - A: dividend
; - X: divisor
DIVISION:
LDX #0 ; initialize quotient to 0
CPX #0 ; check for divide-by-zero error
BEQ DIVIDE_BY_ZERO
CLC ; clear carry flag
DIV_LOOP:
CMP X,A ; compare divisor with dividend
BCC DIV_DONE ; if divisor is greater, division is complete
SEC ; set carry flag
ROL A ; shift dividend left
INX ; increment quotient
JMP DIV_LOOP ; continue division
DIV_DONE:
RTS ; return from subroutine
DIVIDE_BY_ZERO:
; handle divide-by-zero error here
RTS
You can see it immediately overwrites the divisor with the quotient, so this thing will always give a divide-by-zero error. But even if it didn’t do that, CMP X,A is an invalid instruction. And even if that were valid, multiplying the dividend by two (and adding one) each time through the loop is nonsense.
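For contrast, one correct shape for the routine is classic restoring (shift-and-subtract) division. Sketched in C rather than 6502 assembly to keep it short (function name and out-parameter are my choices); a working 6502 version would do the same loop with ROL and SBC:

#include <stdint.h>

/* 8-bit restoring division: returns dividend / divisor and stores
   the remainder.  Caller must ensure divisor != 0. */
static uint8_t div8(uint8_t dividend, uint8_t divisor, uint8_t *remainder)
{
    uint8_t quotient = 0;
    uint16_t rem = 0;  /* needs 9 bits mid-loop, hence 16-bit */
    for (int bit = 7; bit >= 0; bit--) {
        rem = (uint16_t)((rem << 1) | ((dividend >> bit) & 1)); /* shift in next bit */
        if (rem >= divisor) {          /* trial subtraction succeeds */
            rem -= divisor;
            quotient |= (uint8_t)(1u << bit);  /* record quotient bit */
        }
    }
    *remainder = (uint8_t)rem;
    return quotient;
}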
I’m not great with gdb, but I think the x command shows them.
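If it helps, x takes a count, a format, and a unit. For example (the variable name here is made up):

(gdb) x/4xw $sp
(gdb) x/16xb &buf

The first prints four words in hex starting at the stack pointer; the second prints sixteen bytes of buf.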