Too Much JavaScript? Why the Frontend Needs to Build Better
thenewstack.io
One company found that too much JavaScript costs them $700,000 per year, per kilobyte. Here's what Alex Russell says needs to change.

Prioritizing developer experience is not the reason we use front-end frameworks. People expect the web to work like a desktop app (no page reloads). The initial request might take a little longer, but in the end a well-written front-end app will feel faster.

The problem is that people don’t worry about bundle size and cram every library off of npm into their website.

@asyncrosaurus@programming.dev

People expect the web to work like a desktop app (no page reloads).

Do users expect it, or do product owners expect it? Because in my experience, typical users dgaf whether a site is an SPA or SSR as long as it’s functional and loads quickly. When we did user surveys, the legacy WordPress version scored just as well as the fancy-schmancy React rewrite. The only time the SPA outscored a traditional web page was (obviously) on heavily interactive components (e.g. chat, a scheduling calendar).

I Cast Fist

My personal bane is SPAs with fixed scrolling. I’m on a fucking desktop, stop treating me like a fucking monkey incapable of scrolling exactly to where I want, and stop fading text in only when it’s in focus.

Josh

@variouslegumes @starman

You can get the benefit of fast page transitions (no page reloads) with Turbo combined with a traditional server-rendered stack.

https://turbo.hotwired.dev/
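
For concreteness, here is a minimal sketch of what that looks like. It assumes the pages are rendered by any traditional backend and that a bundler resolves the npm package; the route below is made up for illustration.

    // Minimal sketch: progressively enhancing a server-rendered site with Turbo Drive.
    import * as Turbo from "@hotwired/turbo";

    // Importing the package turns on Turbo Drive: link clicks and form submissions
    // are intercepted, the next page is fetched in the background, and the <body>
    // is swapped in place, so navigation feels like an SPA without a page reload.

    // Navigation can also be triggered programmatically:
    Turbo.visit("/posts/42");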

@o11c@programming.dev

The solution is quite simple though: dogfood.

Developers must test their website on a dialup connection, and on a computer with only 2GB of RAM. Use remote machines for compilation-like tasks.

@masterspace@lemmy.ca

Server rendering sucks ass. Why would I want to pay for an always-running server just to render a webpage when the client’s device is more than capable of doing so?

Centralization is just pushed because it’s easier for companies to make money off servers.

@sznio@lemmy.world

Because it’s better to deliver a page in a single request than in multiple. If you render the page on the client, you end up making a lot of requests, each one requiring a round trip and adding more and more delay.
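
To make the round-trip cost concrete, here is a back-of-the-envelope sketch; all numbers are made up, and real pages also pay download and parse time on top of this.

    // Time-to-first-render dominated by sequential round trips (hypothetical numbers).
    const rttMs = 300; // round-trip time on a high-latency mobile connection

    // Client-rendered SPA: HTML shell -> JS bundle -> API call for data,
    // i.e. three dependent round trips before anything meaningful can render.
    const spaFirstRenderMs = 3 * rttMs; // ~900 ms

    // Server-rendered page: one request returns HTML that is renderable as-is.
    const ssrFirstRenderMs = 1 * rttMs; // ~300 ms

    console.log({ spaFirstRenderMs, ssrFirstRenderMs });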

@philm@programming.dev

You don’t have to render everything on the server; a good hybrid is usually the way to go. Think SEO and initial response time. I think lemmy-ui would also benefit from it (Google results).
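
As a rough illustration of the hybrid pattern (render the first response on the server, hydrate on the client), here is a sketch using React 18 APIs; lemmy-ui’s own stack differs, so this shows only the shape of the idea, not the project’s actual code.

    // Sketch of hybrid rendering: the server sends real HTML for crawlers and a
    // fast first paint; the client then hydrates the same markup.
    import { createElement } from "react";
    import { renderToString } from "react-dom/server";

    // Trivial component standing in for the real page.
    const App = () => createElement("h1", null, "Hello from the server");

    // Server side: produce renderable HTML before any client JavaScript loads.
    const html = renderToString(createElement(App));
    console.log(html); // "<h1>Hello from the server</h1>"

    // Client side (in the browser bundle), the same component hydrates that markup
    // instead of re-rendering it from scratch:
    //   import { hydrateRoot } from "react-dom/client";
    //   hydrateRoot(document.getElementById("root")!, createElement(App));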

@masterspace@lemmy.ca

Yeah, it will give you the best of both worlds, but at a fundamental level I still hate that I have to pay for an always-running server just for SEO. If I can get away with it, I’d much prefer a purely static site whose content pages are rebuilt only when they change.

Totally; pretty much all browsers include a way to simulate network conditions, and Chrome also includes a way to simulate CPU slowdown.
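
The same simulation can also be scripted so it runs by default in tests rather than depending on someone flipping DevTools toggles. Here is a sketch using Puppeteer and the Chrome DevTools Protocol; the latency and throughput values are illustrative, roughly “slow 3G”.

    // Throttled smoke test: open a page with simulated slow network and CPU.
    import puppeteer from "puppeteer";

    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    const cdp = await page.createCDPSession();

    // Network throttling via the Chrome DevTools Protocol.
    await cdp.send("Network.emulateNetworkConditions", {
      offline: false,
      latency: 400,                         // ms added to each round trip
      downloadThroughput: (400 * 1024) / 8, // ~400 kbit/s down
      uploadThroughput: (400 * 1024) / 8,   // ~400 kbit/s up
    });

    // CPU throttling: pretend the CPU is 6x slower than the dev machine.
    await cdp.send("Emulation.setCPUThrottlingRate", { rate: 6 });

    await page.goto("https://example.com", { waitUntil: "load" });
    await browser.close();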

@o11c@programming.dev

And yet the very fact that you have to go out of your way to enable them means people don’t use them like they should.
