Assume “mainstream adoption” means being used by around 7% of all GitHub projects.

Personally, I’d like to see Nim get that growth.

I would LOVE for Nim to get more web stuff

Malbolge

R

I know it’s not considered a “proper” programming language by some, but I’ve been working with it for years for scientific data analysis and I love it.

silas

I think we can all agree on JavaScript
/s

javascript

Factor!

I mentioned it in a reply but it deserves its own top-level answer.

Haskell. I think that more people being familiar with Haskell concepts would be good for programming culture, and it would increase the odds of me being able to write Haskell professionally, which is something I enjoy a lot when writing hobby code, at least. Having more access to tooling and a bigger ecosystem would be nice as well.

I’m not 100% sure about my answer, though. For one, I might grow to resent Haskell if I had to use it at work, and there’s also a risk that it would be harder to do cool, innovative stuff with the language once more big companies depend on it.

Crystal. The language is killer but there’s a real lack of libraries for it.

I’d be interested in hearing what it is about the language that has gotten you so excited about it.

The TL;DR is that it’s compiled Ruby.

I am primarily a Ruby on Rails developer using Docker images to deploy. If you know anything about the Ruby on Rails ecosystem, you know that for the past few years Webpack, Node, and Friends™ were pretty standard.

I’ve managed to rip out the entire JavaScript runtime from production (thank God), but Ruby containers are still pretty sizable on their own.

What I’m excited about with Crystal is that it’s still a high-level language but it brings in static typing, NULL checking at compile-time, etc.

When it comes to Docker I can compile my app in a build step then completely gut out the container and only ship what I need: the binaries and assets.

Memory is another huge advantage since I’m not shipping an entire interpreter.

Until recently I had never heard of Crystal. There’s a Humble Bundle for programming that includes a Crystal book; that was the first time I came across it.

I’ll have to take a look at the language. Who knows when knowledge of Crystal will come in handy.

I Cast Fist (creator)

What could be the “killer app” for Crystal is an equivalent of Rails, since its syntax attempts to be very similar to Ruby. Even supposing it maintains all of Rails’ inefficiencies, if it “just works” and has a very small learning curve for RoR veterans, adoption could grow steadily

I think Lucky framework as well as the other one (can’t think of the name now) are pretty solid. What gets me is the ORM learning curve simply because I don’t have tons of time to dedicate to learning it.

It’s also limiting not having lots of libraries (shards). Basically if you need to do anything outside the framework you’ll have to write it all yourself.

Obligatory shoutout

!crystal_lang@lemmy.ml

Nice! 😎

Many here will hate this, but I’m looking forward to just using the English language to program and letting AI handle all the minutiae.

@UraniumBlazer@lemm.ee

I don’t know why you’re being downvoted, but this could truly be the future of programming languages. We don’t have to manually compile everything to assembly today, do we? Imagine simply using English for pseudocode, with an AI compiler that writes the most performant code… How much would that speed up development time? No one would need to know different languages… The learning curve for programming relatively basic shit would be low.

I dunno, but I’ve seen a lot of unnecessary hate for AI in the left-leaning communities…

A compiler has mostly fixed rules for translation. The English language is often ambiguous, and there are many ways to implement something based on a verbal description.

Programming by using the AI as a “compiler” would likely lead to many bugs that will be hard or impossible to trace without knowing the underlying implementation. But hitting compile again may lead to an accidentally correct implementation, and you’d be none the wiser as to why the test suddenly passes.

It’s ok as an assistant to generate boilerplate code, and warn you about some bugs / issues. Maybe a baseline implementation.

But by the time you’ve exactly described what and how you want it you may as well just write some higher level code.

@UraniumBlazer@lemm.ee

A compiler has mostly fixed rules for translation.

Some compilers are simple, while some are complicated. An AI compiler would of course be very complicated. However, it would still have “fixed rules”; it’s just that these rules would be decided by the AI itself. If you’re a software dev, you’re also an English-to-xyz-language compiler. You do what your client tells you to do more or less correctly, right? Junior devs do what senior devs tell them to do kinda correctly, right? An AI compiler would be the same thing.

Programming by using the AI as a “compiler” would likely lead to many bugs that will be hard or impossible to trace without knowing the underlying implementation.

Bugs would be likely if your AI compiler was dumb. The probability of bugs would drop drastically if your AI compiler was trained more, and on better data.

It’s ok as an assistant to generate boilerplate code, and warn you about some bugs / issues. Maybe a baseline implementation.

That is the state of AI today. What you are describing are the capabilities of current AI models. However, I cannot see how this is a criticism of the idea of AI compilers themselves.

But by the time you’ve exactly described what and how you want it you may as well just write some higher level code.

Again. The smarter your model, the more you can abstract your stuff.

Syntax has never really been the issue. The closest thing to plain-English programming is legal documents and contracts. As you can see, they are horrible to understand, but that’s the only way to correctly specify exactly what you want, and code is much better at it. Another data point is visual languages like Lego Mindstorms or LabVIEW: it’s quite easy to do basic things, but it doesn’t scale at all.

@UraniumBlazer@lemm.ee

Syntax has never really been the issue.

But it has tho… For example, I don’t know Rust. I want to add notifications functionality to Lemmy. Lemmy is in Rust. To implement this relatively simple API, I need to learn Rust to a degree. Then I need to look at Lemmy’s file structure to understand the project further so I can actually do what I want to do. What if all of this could be abstracted away by me simply saying “post xyz to the expo-notifications server whenever someone messages someone”? An AI English-to-Rust interpreter could easily do this.

The closest thing to plain-English programming is legal documents and contracts. As you can see, they are horrible to understand, but that’s the only way to correctly specify exactly what you want.

This is what would define the smartness of the AI, wouldn’t it? Your project manager doesn’t tell you exactly what they want. You have the brains to interpret what they mean and do stuff accordingly, correct?

This requires many assumptions that you, or any computational system, have no formal reason to make. Having an interpreter that just guesstimates exactly how you want the program structured is going to run into problems when you, say, want to extend the program.

Judging by the comments, the downvotes seem to come from denial.

“Hi computer! Write me a program that makes money. I just have to run it and I become rich.”

I Cast Fist (creator)

Computer makes a blockchain. “Here you go! Keep running it in the background then just sell the mined coins!”

CIAvash

Raku

Ithkuil

I was going to say lojban but this works too

I’m obsessed with an extremely little-known language called Grain. It’s not quite ready for production, but it has an insanely intuitive functional syntax that I want to use now.

Could you give some examples of what you like so much?

One of the most exciting things about Grain is that it compiles to WebAssembly.

That’s a cool feature.

What is the particularity you talked about?
From my point of view it looks like JS/TS with arrow functions. 😁

it looks like JS/TS with arrow functions.

JS/TS already has arrow functions.

davawen

Interesting!
I see OCaml with Rust syntax, for the web, which matches the project’s goal of bringing functional patterns to everyday programmers.

Esperanto.

tun

I think OP means programming languages, not the languages humans use to communicate with each other.

What disrupted the fun for me:

  • the rules for articles before languages, countries and their people
  • everything sounds the same / is easy to misunderstand
  • not nearly as internationally approachable as it could be, though obviously that’s almost impossible

I too wish Esperanto would gain mass adoption but my only qualm with it is the consonant clusters that aren’t friendly for non-European language speakers.

Alas, there are enough serious problems to fill a book.

Given that Esperanto was created before most of modern linguistics, this isn’t all that surprising. Programmers don’t much write in Plankalkül either.


Match!!

As a regular person who speaks a non-Indo-European language, yeah, I thought that was obvious.

I’d love to read more about that! Normally, I’d just do my own searching, but since you have actual expertise in the area, is there someone in particular I should search for who explains this?

I also want to clarify that I’m not skeptical; on the contrary, I can think of three reasons off the top of my head, as a layman who knows virtually nothing about Esperanto, just based on you identifying colonialism as an issue, but I was hoping to get an educated take on it.

Match!!

Toki Pona

@lolcatnip@reddthat.com

A language that’s hard to say much in even if you know 100% of the vocabulary.

maegul (he/they)

Sorry to say, but once I realised how Euro-centric and, to my ear/eye, Latin-centric Esperanto is, I completely lost interest.

I don’t know if anyone has tried, but something which similarly draws influences from the languages that the vast majority of the world speak would be wonderful.

You made me think of that xkcd about standards.

Anyway, the eurocentrism argument, while perhaps true due to the Latin root, seems to be a little bit of a savior complex, don’t you think? China itself pushed for Esperanto to be used as a business language internally late last century, as I recall.

maegul (he/they)

savior complex

I don’t see that at all.

It’s about making a language that the maximum number of cultures can see themselves in, can have at least some familiarity with, and can feel they’ve been acknowledged in the making of a global language… all of which is intended to get maximum buy-in around the world to establish a truly international language rather than a lingua franca derived from hegemony.

Maybe China was interested in Esperanto for a bit, but I’m betting like most stories like that it’s heavily exaggerated or outright bogus.

@spiderplant@lemm.ee

Someone already said it: either the created language takes from too few source languages and alienates speakers of languages with no common characteristics, or it takes from every language family and becomes a horrible mess that’s hard for everyone to speak.

So if a world language is a bad idea no matter what languages you use as a source, why not have Esperanto or something similar for Europe and the English-speaking world, then a different language for Asia, and another one for Africa? You’ve reduced the number of translators needed and left most people with a language close to their mother tongue. You could also break the suggested regions into smaller sections, e.g. give Germanic Europe a common Germanic language, west/south Europe Esperanto, and east Europe a common Slavic language. You still get languages that don’t neatly fit, like Hungarian, but it’s better for most language learners than the last example.

Personally I’d not propose universal languages as a utopian idea; instead I’d promote indigenous languages such as Catalan, Breton, and Irish, and promote learning many languages in a post-work society.

Yeah we can invent yet another language, and go through the motions of including everyone. But by god make sure you don’t forget anyone. Let’s throw in Chamicuro, Warlpiri, Liki, Tanema, Ongota, and Dumi, just to make sure. Don’t want to upset anyone….

Or we could stop inventing new ways to accuse things of not being inclusive enough. It’s getting bonkers… Not saying Esperanto is the best language, and it has its flaws as others have so vehemently stated, but if inclusivity is the primary motive when designing a language, then I can almost certainly guarantee that new language will be much worse.

I mean English is basically the world language. It’s used by pilots, scientists, global finance, and diplomatic efforts. I’m gonna assume that almost no one would classify English as inclusive in its vocabulary. Unless you’re German, Dutch, or French of course. Esperanto is at least more accessible and easy to learn and carries Latin roots… shared with lots of languages. And it was invented by a member of a repressed minority in the old Russian Empire. What’s not to love?

My problem is not with inclusivity but with promoting uptake. If you are familiar with the grammar or phonetic sounds or some of the vocab, you are more likely to find that language easier to learn.

Both English and Esperanto share the same problems of universal languages that I mentioned. English does have the advantage in number of speakers, but it is a mess of a language for people to have to learn.

Again, to reiterate my counter to universal languages: why not learn, and potentially help revive, your local indigenous languages? In a world where universal translation exists on our phones, everybody being able to speak the same language matters less.

Neo-Indo-European?

The issue is that modern languages are so diverse, you would wind up with a horrid, unusable patchwork.

Who cares if it’s European-sounding? It’s still an interesting language that is relatively easy to learn, even for people from non-Romance backgrounds.

I actually bought the second edition of Mastering Nim paperback the other day! Should be arriving tomorrow, hopefully.

I had fun dogfooding my interview problem in it; I feel like it’s basically a step forward as far as modern syntaxes go.

I’ve heard of one (I don’t know the name) that is trying to make it so you can just write the program in natural English, kind of like how AI works off of prompts. Having grown up watching Star Trek and seeing how they would “write” holodeck programs by just giving the computer a detailed explanation of the program they wanted to run, I’ve always wished we could do that IRL.

Inform 7 is the closest I’m aware of, for creating text adventure games.

Honestly, I prefer the control that comes with a more syntactically consistent grammar, but I definitely see a use-case for a higher level tool for non-programmers, or for prototyping.

If AI ever solves the DWYM problem, we’re in trouble. Fortunately, it’ll probably solve it the way a programmer does.

Hey, we’d still need programmers to keep the AI wrangled, wouldn’t we? Like how robot workers would still need maintenance people. 😟

hallettj

Just a guess: I think Inform fits your description

Rust! Memory-leak-free code would make our world a better place!

Rust doesn’t guarantee the absence of memory leaks any more than Java/C++ does, so sadly I’m not sure it would help here. :)

We can go further: I think it’s impossible to prevent memory leaks in a general-purpose language.

Not without a super fancy type system that has yet to be invented. I think the key issue is cyclic data structures (e.g. a doubly linked list). The language somehow needs to have strong/weak pointers, and automatically determining which is which is a very complex research question…
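
For illustration only (the type names here are made up), this is what the strong/weak split looks like in today’s Rust, where the programmer, not the compiler, decides which edge of a potential cycle is the weak one:

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// A child holds only a Weak pointer back to its parent, so the
// parent -> child -> parent loop does not keep both alive forever.
struct Parent {
    children: RefCell<Vec<Rc<Child>>>,
}

struct Child {
    parent: RefCell<Weak<Parent>>,
}

fn main() {
    let parent = Rc::new(Parent { children: RefCell::new(Vec::new()) });
    let child = Rc::new(Child { parent: RefCell::new(Rc::downgrade(&parent)) });
    parent.children.borrow_mut().push(Rc::clone(&child));

    // The weak pointer can be upgraded while the parent is alive...
    assert!(child.parent.borrow().upgrade().is_some());

    drop(parent);
    // ...and simply yields None afterwards, instead of leaking a cycle.
    assert!(child.parent.borrow().upgrade().is_none());
}
```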

@Rin@lemm.ee

Help me understand your point of view. How does Rust not prevent memory leaks?

There are built-in functions to leak memory that are perfectly safe. You can also do it really trivially by making a reference-count cycle: https://doc.rust-lang.org/book/ch15-06-reference-cycles.html

Rust only prevents memory unsafety, and memory leaks are perfectly safe. It’s use-after-frees, double frees, etc. that it prevents.

And here you’re only talking about one subset of memory leaks: memory that becomes inaccessible. You can also leak memory by pushing new elements into a channel while never reading them, for example.
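
To make both kinds of leak concrete, here is a minimal sketch in entirely safe Rust (the type and variable names are made up for illustration): a reference-count cycle like the one in the Rust book chapter linked above, the explicitly safe Box::leak, and a channel that is written but never drained:

```rust
use std::cell::RefCell;
use std::rc::Rc;
use std::sync::mpsc;

// A node that may point at another node: enough to build a cycle.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

fn main() {
    // a -> b and b -> a: both strong counts stay above zero forever,
    // so neither node is ever dropped. A leak in 100% safe Rust.
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(Rc::clone(&a))) });
    *a.next.borrow_mut() = Some(Rc::clone(&b));
    println!("strong count of a: {}", Rc::strong_count(&a)); // prints 2

    // Leaking is even offered as an explicit, safe standard-library API.
    let leaked: &'static mut String = Box::leak(Box::new(String::from("never freed")));
    println!("{leaked}");

    // The other kind of leak mentioned above: memory that stays reachable
    // but is never consumed, e.g. a channel that is written but never read.
    let (tx, _rx) = mpsc::channel();
    for _ in 0..1_000 {
        tx.send(vec![0u8; 1024]).unwrap(); // buffered indefinitely
    }
}
```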

You are absolutely correct that Rust’s safety features don’t extend to memory leaks, but it’s still better than most garbage-collected languages unless you abuse Rc or something, and it does give you quite fine-grained control over lifetimes, copying, and heap allocations, which in practice means that Rust is fairly good about memory leaks compared to most languages.

Reference counting is a GC, though?

It’s a bad one, sure, and it will leak memory in the case of a cycle, which most tracing GCs are able to collect.

Its main advantage is that there are no GC pauses.

https://en.m.wikipedia.org/wiki/Reference_counting

I think you know what I mean when I contrast Rust with GC’d languages; we can call it opt-in garbage collection if we’re being pedantic.

How would Rust fare any better than a tracing GC? Realistically I’d expect them to use more memory and also have worse determinism in memory management, but I fail to really see a case where Rust would prevent memory leaks and GC languages wouldn’t.

If you just Rc everything (which I’d count as “abusing Rc”), Rust is significantly worse than a language with a good GC. The good thing about Rust is that it forces you to acknowledge and consider the lifetimes of objects. By default things are allocated on the stack, but if you make something global or dynamically handled (e.g. through Rc), you have to do so explicitly. In Rust the compiler has more compile-time information about when things can be freed, which means you need less runtime overhead to check things, and if you want to minimize the number of potentially long-lived objects, you can more easily see how long objects might live by reading the code, as well as get help from the compiler to determine whether a lifetime-based refactoring is sound.
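
As a tiny sketch of what “the compiler knows when things can be freed” means in practice (a hypothetical type, just standard Drop semantics): drop points fall out of scopes and moves at compile time, with no runtime collector involved.

```rust
// A small type that announces when it is freed, to show that Rust
// decides drop points at compile time rather than via a runtime GC.
struct Noisy(&'static str);

impl Drop for Noisy {
    fn drop(&mut self) {
        println!("dropping {}", self.0);
    }
}

fn main() {
    let long_lived = Noisy("long-lived");
    {
        let short_lived = Noisy("short-lived");
        println!("inner scope still running ({})", short_lived.0);
    } // `short_lived` is freed right here, deterministically.
    println!("outer scope still running ({})", long_lived.0);
} // `long_lived` is freed here.
```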

@philm@programming.dev

At this point I think it’s almost mainstream, and it’s still growing fast (and it’s getting better: rust-analyzer is really awesome these days; I was there at the beginning, and there’s no comparison to today…).

I may be biased, but I think it’ll be the next big mainstream language, probably leaving other very popular ones behind in the coming decade (the entry barrier and ease of use have gotten much better over the last couple of years, and the future sounds exciting with stuff like this).
