What’s funny is that’s how it started. Apple sold movies as early as 2007, before Netflix or Amazon video or whatever, and expected you to host the files locally, either on your computer or your Apple TV (which had a hard disk drive at the time), and stream them locally over iTunes. If you lost the file, that was supposed to be it.
Of course, you still had to authenticate your files with the DRM service, and eventually they moved libraries online and gave you streaming access to any files you had purchased.
Yeah, but we always run them in native formats, so it’s not a big load on the processor. We only watch the 4K stuff at home where it’s got a hardwired gigabit ethernet connection.
If you saw my other comment, I’m kind of talking myself out of this upgrade since I managed to get qsv working on my current rig.
That shouldn’t be the case. I’d look into getting this fixed properly before spending a ton of money on new hardware that you may not actually need. It smells to me like the encode or decode step isn’t actually being done in hardware here.
Right you are!
Dug into it a little more. There were some ffmpeg flags that weren’t being enabled by the latest release of Photoprism. Had to move to the test build. https://github.com/photoprism/photoprism/discussions/4093
While it’s faster than real time now, Photoprism still won’t start streaming until the preview is fully generated, so longer video clips can take a minute or two to start playing. It only has to happen once per file, but it’s still annoying. There’s a feature to pre-transcode video, but it only gets the file into a streamable format. It doesn’t check bitrate/size until you actually start to play.
I might write a script to pre-generate the preview files, but either way, I don’t think I need to upgrade the server quite yet.
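Something along these lines is what I had in mind, just a rough sketch: walk the library and shell out to ffmpeg’s QSV encoder for any clip that doesn’t already have a pre-generated copy. The library path, bitrate cap, output naming, and extension list are all placeholders I made up, not what Photoprism actually expects for its previews.

```python
#!/usr/bin/env python3
# Rough pre-transcode sketch, NOT Photoprism's actual preview pipeline:
# writes an H.264 copy next to each clip using ffmpeg's QSV encoder.
# Library path, bitrate cap, and ".avc.mp4" suffix are my own assumptions.
import subprocess
from pathlib import Path

LIBRARY = Path("/srv/photoprism/originals")   # assumed library location
MAX_BITRATE = "8M"                            # assumed cap for smooth remote playback
VIDEO_EXTS = {".mp4", ".mov", ".mkv", ".avi"}

for src in LIBRARY.rglob("*"):
    if not src.is_file() or src.suffix.lower() not in VIDEO_EXTS:
        continue
    dst = src.with_name(src.name + ".avc.mp4")
    if dst.exists():  # already pre-generated on a previous run
        continue
    # Hardware decode + encode via Intel Quick Sync; audio is just copied.
    subprocess.run([
        "ffmpeg", "-hide_banner", "-y",
        "-hwaccel", "qsv",
        "-i", str(src),
        "-c:v", "h264_qsv", "-b:v", MAX_BITRATE,
        "-c:a", "copy",
        str(dst),
    ], check=True)
```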
So my job (electrical engineering) has been pretty stagnant recently (just launched a product, no V2 on the horizon yet), so I’ve taken my free time to brush up on my skills.
I asked my friend (an EE at Apple) what are some skills that I should acquire to stay relevant. He suggested three things: FPGAs, machine learning, and cloud computing. So far, I’ve made some inroads on FPGAs.
But I keep hearing about people unironically using ChatGPT in professional/productive environments. In your opinion, is it a fun tool for the lazy, or a tool that will be necessary in the future? Will employers in the future be expecting fluency with it?
Thumbnail looks exactly like Neptune
At my last job, every time they added or removed someone’s key card access, the system would reboot and everyone would be locked out for like two minutes.
We also had two floors that were connected by a fire stairwell, so you needed a card to get back in on the other floor.
At least twice my card stopped working in the middle of the work day while I was standing in the stairwell, and I figured they had just fired me and expected me to see myself out.
Survived three layoffs at that company.