• 10 Posts
  • 27 Comments
Joined 9M ago
Cake day: Feb 10, 2024


Not with this grammar. There's this parser-generator front-end called BNFC that uses its own flavor of BNF (Labelled BNF) to generate Yacc/Lex (or ANTLR when it can), an abstract syntax tree, etc., but I don't like it. There are no EBNF parser generators AFAIK. One could, possibly, feed this to ChatGPT and ask for a Yacc/Lex pair in return, or even a manual parser! I may do that, but I first have to clean this up and add stuff that isn't there.

ChatGPT has changed langdev a lot for me. I automate a good portion of the process with it. But one needs solid specs to feed to it.

As I said, I wish to implement the frontend myself, basically the lexer/parser. But I kinda get bored with lexing and parsing because it's too time-consuming. Plus, LR(1) parsers can realistically only be generated; it's LL(1) that can be hand-written. I have not decided yet. I wish to focus more on the backend, because that is where you can do innovative shit and perhaps write a paper on it.
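To show what I mean by LL(1) being hand-writable, here is a minimal recursive-descent sketch in C for a toy expression grammar. It is just an illustration of the technique, nothing to do with my actual C frontend:

```c
/* A toy hand-written LL(1) (recursive-descent) evaluator for:
 *   expr   -> term   { "+" term }
 *   term   -> factor { "*" factor }
 *   factor -> NUMBER | "(" expr ")"
 * One character of lookahead (*p) is enough to pick every rule, which is
 * exactly what makes LL(1) pleasant to write by hand.
 */
#include <ctype.h>
#include <stdio.h>

static const char *p;            /* cursor into the input */

static long expr(void);

static void skip(void)
{
    while (isspace((unsigned char)*p))
        p++;
}

static long factor(void)
{
    skip();
    if (*p == '(') {
        p++;                     /* consume "(" */
        long v = expr();
        skip();
        if (*p == ')')
            p++;                 /* consume ")" */
        return v;
    }
    long v = 0;                  /* NUMBER */
    while (isdigit((unsigned char)*p))
        v = v * 10 + (*p++ - '0');
    return v;
}

static long term(void)
{
    long v = factor();
    for (skip(); *p == '*'; skip()) {
        p++;
        v *= factor();
    }
    return v;
}

static long expr(void)
{
    long v = term();
    for (skip(); *p == '+'; skip()) {
        p++;
        v += term();
    }
    return v;
}

int main(void)
{
    p = "2 * (3 + 4) + 1";
    printf("%ld\n", expr());     /* prints 15 */
    return 0;
}
```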

Also, I'm going to leave C23 to people who have years of experience. ANSI C is the lowest common denominator of C. I am using the C99 standard, which should be able to compile a good portion of code bases. C99 is the last version of C that POSIX requires. (C itself went under ISO back with C90.)

Thanks.


EBNF Grammar for ANSI C (+ Guide on reading EBNF)
This is an EBNF grammar for ANSI C (C99) and it contains almost every rule. It may be missing stuff; please tell me if you notice something missing. I am writing a C compiler, with my own backend and hopefully my own frontend, in OCaml. That is why I wrote this grammar. I have also written an AWK grammar, but it's not uploaded anywhere. Tell me if you want it. Thanks.



Free accessible endpoints for info on movies and TV shows? (Writing a script [or a program, haven’t decided yet] for my brother)
Hey. My brother asked me if I could write him something that iteratively scans his directories for movies and scrapes the web for data on each movie. I know there probably exist dozens of such things; the aim is to be minimal here. I need publicly exposed endpoints that allow for scraping of their data. I just need the endpoints, I don't need anything else. I am writing the script in Ruby but I may switch to C if it's too slow. It's pretty IO-bound anyways, so it doesn't matter what I use. So if any of you knows a good publicly exposed, free API to grab data about movies and TV shows, tell me. I will share the program/script later on for everyone to use. Thanks.

Since I don't have a blog and I don't know how to make one, I will post my way of defining a grammar using EBNF and regular definitions in Gist form. I have used my method with ChatGPT to generate Lex and Yacc files before. One language I have been implementing for several months now is AWK. I am now making an AWK-to-C translator (the more uninformed would call it a 'transpiler', but this term is not correct at all; it's not very theoretical, and transpilers are just [syntax-directed] translators). So anyways, please read this document if you wanna learn about grammars, Chomsky type 2 and type 3, lexical and syntactic grammar, etc. I also have a table that teaches you how to read EBNF. For the lazy, I will cite the table here:

| Enclosed Right | Enclosed Left | Meaning |
|----|----|----|
| Single Quote | Single Quote | Single character |
| Double Quote | Double Quote | Byte-sequence string |
| Right Bracket | Left Bracket | What's within is optional |
| Right Curly | Left Curly | What's within is repeatable |
| Right Paren | Left Paren | What's within is grouped |

Thanks.



Consider this: it's much easier to parse S-Expressions than JSON! Plus, in statically typed languages, this script could excel, because it declares the type first. Remember that you can always use my S-Expression parser in C to parse them: https://gist.github.com/Chubek/d2f0ac9067521716d2ab31c93948e885 PS: Is it just me, or 'JSON to Sexp' sounds like 'Jason 2: Sex Pest'?
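To make the first claim a bit more concrete, here is a toy S-expression walker in C (not the parser from my Gist, just a sketch): the whole surface syntax is atoms and parentheses, whereas a JSON reader has to handle string escapes, numbers, objects, arrays, and the bare literals before it can do anything useful.

```c
/* Toy S-expression walker: the whole "grammar" is parens and atoms,
 * which is why it fits in a couple of functions.
 */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

static void walk(const char *s)
{
    int depth = 0;
    while (*s) {
        if (*s == '(')      { depth++; s++; }
        else if (*s == ')') { depth--; s++; }
        else if (isspace((unsigned char)*s)) { s++; }
        else {
            /* an atom: everything up to whitespace or a paren */
            size_t n = strcspn(s, "() \t\n");
            printf("depth %d: %.*s\n", depth, (int)n, s);
            s += n;
        }
    }
}

int main(void)
{
    walk("(define (square x) (* x x))");
    return 0;
}
```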


I do believe other people have already pointed out what caused this to happen. This thread was a joke, but I did learn a lot from it.


This makes so much sense! The other guy said they were planning an S-Expression language like Scheme… I think, had Netscape supported Scheme, the trajectory of the craft would have changed. At least we would not get so many 'durr parentheses' memes. Just how hard is it to use [Neo]Vim when you're writing S-Expressions? It keeps highlighting the parenthesis and bracket balance as I write. What text editor do people who hate S-Expression LISP-like languages use, Emacs? Lol.


Frankly, one can learn any imperative language once one has learned one. It's the standard library of a language and the quirks of that library that are the real challenge. The syntax of the language doesn't boggle anyone.


I used to use IE when I was like 12~13 — I think I switched to FF when I was 13.5 and never looked back. Just the tabs, man. I use Pop!_OS these days and a few months ago I accidentally enabled tiling, then I realized it has tabs. I am as happy as I was back then. Tabs are a concept that was thought of too late.

BTW, this document mentions JScript. I don't know WTF that is, but when I google it, normal JS comes up.


Good info dump. Can't imagine S-Expressions in web dev today, really. Also, I did not mean it maliciously; this is a stupid thing after all.


You can use ASDL to describe the tree grammar of your language. After doing that, translate the code to C by running it through my program. Now you have a C file that contains a bunch of constructors, typedefs, macros, etc., that describe the AST of your language. Two examples, a basic one and one for regex, have been provided. There's a simple one for m4 in the man page.

Some people may need further explanations, so let's go ahead and do it: when you pass a program to the compiler or the interpreter, or another application does that for you, it first gets 'lexed', that is, every token is scanned and categorized. Then these tokens are used to 'parse' the program. That is, the chunks of tokens are used to define the structure of your program, based on a pre-conceived grammar. For example, in my ASDL implementation, you can view the grammar for ASDL in both Yacc format and EBNF format (in `companions/GRAMMAR.ebnf`).

Now, the parser takes 'semantic' actions whenever it successfully parses a chunk of code. In the early days of computing, people just printed Assembly code! But now, with optimizing compilers and the like, there was a need to represent the language in the form of a tree. Parse trees were used first, but parse trees are dense, so the 'abstract' form was adopted, which is what this program generates (I'll put a tiny sketch of what that looks like in C at the end of this post). You can see the abstract structures that represent ASDL itself in `absyn.c`.

After you translate your code to an AST, you can translate it to DAGs, or Directed Acyclic Graphs, to get the flow of the program. Then a control flow graph. If your language is interpreted, you can translate it to your VM bytecode.

So this is what ASDL is: a tool for constructing compilers and interpreters, and other DSLs perhaps? I explained why I created this. Basically, the original implementation was useless. Thanks.
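As promised, here is a hand-written sketch of the kind of C that an ASDL definition boils down to: a tagged union plus one constructor per alternative. This is my own toy example for a made-up `expr = Num(int) | Add(expr, expr)`, not the actual output of my tool:

```c
/* Hand-written sketch of ASDL-style C for:
 *   expr = Num(int v) | Add(expr lhs, expr rhs)
 * i.e. a tagged union plus one constructor per alternative.
 */
#include <stdio.h>
#include <stdlib.h>

typedef struct Expr Expr;

struct Expr {
    enum { EXPR_NUM, EXPR_ADD } kind;
    union {
        struct { int v; } num;
        struct { Expr *lhs, *rhs; } add;
    } u;
};

static Expr *Num(int v)
{
    Expr *e = malloc(sizeof *e);
    e->kind = EXPR_NUM;
    e->u.num.v = v;
    return e;
}

static Expr *Add(Expr *lhs, Expr *rhs)
{
    Expr *e = malloc(sizeof *e);
    e->kind = EXPR_ADD;
    e->u.add.lhs = lhs;
    e->u.add.rhs = rhs;
    return e;
}

static int eval(const Expr *e)
{
    switch (e->kind) {
    case EXPR_NUM: return e->u.num.v;
    case EXPR_ADD: return eval(e->u.add.lhs) + eval(e->u.add.rhs);
    }
    return 0;
}

int main(void)
{
    Expr *tree = Add(Num(1), Add(Num(2), Num(3)));  /* 1 + (2 + 3) */
    printf("%d\n", eval(tree));                     /* prints 6 */
    return 0;                                       /* leaks; it's a sketch */
}
```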

I give it half-baked code and ask it to complete it. Like, say, a few days ago I wanted to implement NFAs and Thompson's Construction. So I wrote this:

struct Transition {
   // implement this
  struct Transition *next;
};

struct NFA {
  // implement this
};

// and so on and so forth

This is how you get good results from it. Do half the work.
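For reference, a filled-in version of those skeletons might look roughly like this. The field names and the single-accept-state layout are my own guesses at the usual Thompson-construction setup, not what ChatGPT actually produced:

```c
/* One possible way to fill in the skeleton above for Thompson's
 * construction: each state carries a linked list of transitions,
 * labeled either with a character or with epsilon (EPS).
 */
#include <stdio.h>
#include <stdlib.h>

enum { EPS = -1 };               /* epsilon-transition marker */

typedef struct State State;

struct Transition {
    int symbol;                  /* a character, or EPS */
    State *to;                   /* target state */
    struct Transition *next;     /* next transition out of the same state */
};

struct State {
    int is_accepting;
    struct Transition *transitions;
};

struct NFA {
    State *start;
    State *accept;               /* Thompson fragments have one accept state */
};

static State *new_state(int accepting)
{
    State *s = calloc(1, sizeof *s);
    s->is_accepting = accepting;
    return s;
}

static void add_transition(State *from, int symbol, State *to)
{
    struct Transition *t = malloc(sizeof *t);
    t->symbol = symbol;
    t->to = to;
    t->next = from->transitions;
    from->transitions = t;
}

/* Thompson fragment for a single literal character. */
static struct NFA nfa_literal(int c)
{
    struct NFA n = { new_state(0), new_state(1) };
    add_transition(n.start, c, n.accept);
    return n;
}

int main(void)
{
    struct NFA a = nfa_literal('a');
    printf("start accepts 'a'? %d\n",
           a.start->transitions->symbol == 'a');   /* prints 1 */
    return 0;
}
```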


Wow are you from the future? Because I just had this exact same thought, that JS is just that ‘process’, so I read the ECMA-262 standard and I posted the new thread about something funny I found in it. In fact I said something that closely resembles what you said. It’s just freaky!


Turns out, Java and JavaScript are not ‘car’ and ‘carpet’ — ECMA-262 2025: ‘ECMAScript syntax INTENTIONALLY resembles that of Java’
I linked to the anchor where it says that, right at the bottom of section 4.3. Will people just STOP saying JavaScript was 'never intended' to have anything to do with Java? They clearly meant JavaScript to be to Java what AWK is to C, at least syntax-wise. I was born one year after JS was conceived (the standard says 'invented', invented my ass! Who 'invents' a language?) so I am too young to have been around in the early days of the web. But it seems like people back then wanted Java to be the lingua franca of the web, a bit like PostScript in the thread I posted a few hours ago. They named it JavaScript to assure people that it's the interpreted, scripting form of Java. Now don't say 'JS and Java are like car and carpet'; you will look like an idiot. Also, if you are wondering why I am reading the standard, it serves two purposes. First, I wanna implement it one day in the future. Second, I know shit about web scripting and I wanted to make myself a blog and I miserably failed. So I am learning it. I know nobody asked, but one person might be wondering why someone would do this to himself.

It's a term that generally refers to the more 'mathematical' side of programming, as opposed to the more 'practical' side of things. I believe it means 'Data Structures and Algorithms', so it's a generalization of 'meaty' programming. Think: someone who writes shell scripts does not need 'DSA' to oil his grind, but someone needing to write a compiler does.

Now we live in an age where you can generate a shell script with a simple prompt, and hell, you could piece-meal a compiler, but it's not as straightforward. If one wishes to make moola in the post-LLM world, one needs to have a strong theoretical and constitutional foundation.

At this point, any employer who hires more than one person for the 'simple tasks' is doing charity. And charity ain't what employers are known for!

That is not to say don't use AI in your work. I believe AI is the BEST way to learn DSA. In fact I straightened out a lot of my misconceptions using ChatGPT. Like, I have written 2 compilers and abandoned them because I never meant to finish them; they were an excuse to prompt ChatGPT with more complex requests and take stock of my knowledge. I managed to do 'basic' SSA with ChatGPT, and anyone who has read a compiler book or taken a class knows SSA is not easy. I generated the SSA and confirmed it against the SSA book that I had. It was very decent. But the SSA book was very 'crude' and ChatGPT's example was really, well, 'uncrude'? So it was a GIANT help in me understanding SSA.
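For anyone who hasn't seen SSA before, here is the textbook idea in miniature (my own sketch, not ChatGPT's output): every variable is assigned exactly once, and where two control-flow paths join you insert a φ-function.

```c
/* Before SSA:                      After SSA renaming:
 *   x = a + b;                       x1 = a0 + b0;
 *   x = x * 2;                       x2 = x1 * 2;
 *   y = x + 1;                       y1 = x2 + 1;
 *
 * With a branch, the join point needs a phi-function:
 *   if (c) x = 1; else x = 2;        if (c0) x1 = 1; else x2 = 2;
 *   y = x;                           x3 = phi(x1, x2);
 *                                    y1 = x3;
 */
```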

So if you need to learn DSA, I can recommend these three steps:

1- Learn a functional language or a LISP-like language: Scheme, Racket, OCaml, SML, Haskell, etc. These languages are extremely fluid and scientific.

2- Read books. Just go on Libgen or Z-Library and read books. I recommended Sipser's above (or was it in another post?) but there are dozens. I remember a book called "Grokking Algorithms" which was really good. There's also Steven Skiena's "The Algorithm Design Manual", which I read when I was sick with Covid, and it seems like everyone just LOVES this book. I mean it, try it, it's the best. If you can, buy it, because it's very precious. Also, keep a copy of CLRS (or however you order the letters) as a reference on your desk, your PC, everywhere.

3- Piece-meal ChatGPT into designing you a complex application. It really helps if you've got an aim. One thing I recommend is genealogy software. Make a genealogy DSL perhaps. Or a simple compiler.

My advice may not be sound, but some aspects of these were helpful for me.


I think you are being irresponsible towards your future if you are a gainfully employed self-taught programmer and don't invest in formal education. If you say 'I don't have time!', well, consider this: even night classes at junior colleges teach you stuff you don't know. Go to them, get your associate's. I am in the process of getting into a contract where I do some thankless jobs for someone I know, and he in exchange pays me to go to college. I am 31 – as I said in the other thread, THERE IS NOTHING WRONG WITH BEING A LATE-COLLEGER!

I have been to college; I have studied 3 subjects for a total of 9 semesters, and I have no degree to show for any of them :( I quit English lit, French lit and "Programming", all after 3 semesters. But when I was studying French lit, there was a guy in our class who was SIXTY-FIVE YEARS OLD! He wanted to learn French to open up a commerce consulting office, focusing on import/export from France.

What I wanted to do was to 'write', keep in mind, 'write', not 'draw', bande dessinée! But now that I am older and hopefully wiser, I have a set goal in mind. I am going to go to this 'boutique' college near our home to study Electronics Engineering, and when push comes to shove and China makes its move, start a chipset engineering firm with a production wing.

Just like how electronics is math plus physics, programming is the virtual aspect of it; it's 'applied math'. I understand enough discrete math, because I studied enough of it both in college and in high school (since I was in the math/physics track), so I have managed to understand some very rough and old papers.

You can always self-study if you haven’t got the time. Here’s a book which is kind of a meme, but it’s still very good: https://fuuu.be/polytech/INFOF408/Introduction-To-The-Theory-Of-Computation-Michael-Sipser.pdf

This is the 2nd edition though, 3rd is out — I think 4th is coming. The best goddamn book, regardless of its meme status.


I remember that from ages ago. It was very, very cringe-worthy. I think it's the fault of whatever shit bootcamp he attended to learn his 'craft', for using imperative languages to teach DSA, if they taught him DSA at all. That is why in real universities they mostly teach DSA with functional languages. I studied 'programming' (not CS, 'programming') in a college which by American standards would be considered 'junior', for 3 semesters, and they, too, did not touch on functional languages. It was not until I had dropped out that I learned how to do FP using SML, OCaml, Haskell, Scheme, etc. That was when I 'understood' DSA for real.

Truly, this person is an embarrassment. I am currently working on an implementation of Appel et al's ASDL (github.com/Chubek/ZephyrASDL, check the new_version branch!) and it now emits decent C code, so it is time to write the man page (it currently has a sucky one) and the Texinfo manual. I COULD potentially have ChatGPT make both; it is very decent at doing so. But see, I am not a moron. So I write them in Markdown and have Pandoc make the final markups.

Truth is, if you know your tools, if you know how to use a DSL-driven approach to programming, you won't need to use AI to generate your code. You don't need to go to Harvard to know DSA, nor to a junior college like mine. Just grab CLRS and read it in your spare time!

I think this person is the guy behind Homebrew. I don't know why he thought Homebrew is an accomplishment that would give him immunity from knowing basic DSA — which, in the real world, is what programming is about. Package managers have existed since the dawn of time. Linux has like dozens of package managers. His package manager being 'unique' on macOS does not mean it's an accomplishment; it means macOS sucks!

Basically, when you start a project, think: can I turn this project into a paper? If not, then it's worthless. He could have just strapped Pacman onto macOS and called it a day!

These days, employers are looking for people with theoretical, constitutional knowledge. Any monkey can code. I let ChatGPT aid me with a lot of my C code, it’s very decent at generating C. Like I need a Makefile, it makes it for me. I need a print help function, it makes it for me. There’s nothing wrong with using AI to ‘help’ you, but don’t use it in vital shit!

But, if the code that AI generates is as good as the vital backbones of your software, maybe save up some money and go to college if you haven’t, a 2-year degree at a junior college will be good. I am going back to college next year and I am 31! There’s nothing wrong with being an old student.

Just don’t expect people to suck your dick when you don’t know how recursion works!


Good point, me lad. The plain-text-based approach indeed makes scraping much easier. And plus, if we send a 'process', the process can easily be malicious, even if we don't elevate its access.

Like, imagine today: I tell you to wget a shell script and pipe it to a shell to install my software from my remote (FTP, Git, etc.). This almost always needs sudo. I can't imagine how many 12-year-olds would be fooled into running sudo rm -rf *.

That is, provided a 12-year-old would even know how to do that. I know several people who began their UNIX journey when they were as young as 7~8, but there's a reason these people earn 500k a year when they are 30! I can't imagine your normie aunt really feeling like using a UNIX pipeline to check her email.

HTTP 'just werks'. Derpcat told me this back in 2010 when I told her I hate HTTP. IT JUST WERKS. Kay's solution, although extremely unbaked, would not allow my mom to read her Instagram feed.

Besides money, the computation cost is also high. Kay used to use minicomputers; us poor people used micros (well, I would have, had I been around when the mini/micro distinction existed; today it's just clusters vs. Jimmy's gaming rig. Oh, where art thou, DEC?).

But again, nobody has given it a thought. THAT IS THE ISSUE. Academic text on alternatives to the web, AFAIK, is rare. Part of it is the 'just werks' thing, but also, academia just does not care about the web.

I think if people who are smarter than me gave this a 'thorough' thought, they would come up with a good solution. The Web won because it was 'open'; it was easy to navigate, as opposed to pesky newsgroups and such. You can still go to the first website to see this: http://info.cern.ch/ (browse it with Lynx or W3M, it's the best way to do it! Don't use FF or Chrome).

I dunno!


Note: I call the scientists in this post by last name, not because I think I am their 'peer' but because that's how the English language works, and if I put 'Mr' before every last name, I'll sound like Consuela asking for Lemon Pledge! I am 30, and will turn 31 in less than 20 days. I am the same age as the year of the Eternal September. I was born with the web, but I hope the Web dies before me! Also, if you don't know who Alan Kay is, don't be distraught or feel like you're an 'outsider' (especially if you are not much into the 'science' side of programming). Just think of him as a very important figure in CS (he is, you could look him up perhaps?).

Now, let me explain what Kay told me. Basically, Kay thinks the WWW people ignored the work of people like Engelbart and the NLS system, and that it was a folly. Doug Engelbart, before UDP was even thought of and TCP was a twinkle in the eyes of its creators, before Ethernet was created, back when switches were LARGE minicomputers and ARPA was not yet DARPA, tried his hand at sending media across a network (the aforementioned ARPA, you may know it from /usr/include/arpa); he even managed to do video conferencing. That was in **the 1960s**! He came up with a 'protocol', and this 'protocol' was not the TCP/IP stack we know today; it was completely different. I don't think he even called it that. The name of this 'protocol' was the 'oN-Line System', or NLS. Engelbart's NLS was different from the 4-layer abstraction we know and love today. It was different from the web. It was like sending 'computations' across. Like a Flash clip!

Kay believes the WWW people should not have sent a 'page'; they should have sent a 'process'. He basically says "You're on a computer, goddammit, send something that can be computed! Not plaintext!" Full disclosure: yes, Kay is too brutal to Lee in this answer. I don't deny that. And his 'Doghouse' analogy is a bit reductive. But I digress. Kay believes the TCP/IP stack is sound. I think anyone who has read a network theory book (like *Computer Networking: A Top-Down Approach*, which I have recently perused) doesn't dispute this. But he believes people are misusing it, abusing it, or not using it right. In the speech I am referring to in the question title, Kay said:

[Paraphrasing] "This is what happens when we let physicists play computer [...] If a computer scientist were to create 'web', he would do a pipeline, ending at X"

X refers to the X Window System used in UNIX systems; it's a standard like the Web is. The implementation of X11 on my system is Xorg. It's being slowly replaced by Wayland. So what does this mean? Well, Kay says, 'send a process that can be piped'! Does it sound dangerous and insecure? WELL, DON'T ELEVATE ITS ACCESS!

Imagine if this process-based protocol was, too, called web, and the utility to interface with it was called 'wcomm', just like wget is. To view a video: imagine PostScript was strong enough to describe videos with. So we could get a video from YouTube, render it, and watch it:

```
$ ~ wcomm youtube.com/fSWmufgTp6EQ.ps | mkmpg | xwatch
```

So what is different here? How is it different from using a utility like ytdl and piping it to VLC? Remember, when we do that, we are getting a binary file. But in my imaginary example, we are getting a 'video description' in the form of PostScript.

So anyways, as I said, I am not a super expert myself. But I think this is what Kay means. As Kay says himself, PostScript is too weak for today's use! But I guess, if the Web was not this 'hacky', there would be a 'WebScript' that did that!
Thanks.

You don’t need any garbage 3rd party packages. Just use ctypes.

Read the documentation for ctypes. You can use any native object in Python the way you do in C using ctypes.

Whatever you do, don't trust some god-forsaken 3rd party package. I think people have forgotten how to be clever and resourceful these days. A binary object is itself a package; why would you use another package to communicate with it?

Here’s how you can import libc into Python:

from ctypes import CDLL

# Passing None is like dlopen(NULL): you get the symbols already loaded
# into the current process, which includes libc on Linux.
libc = CDLL(None)

Just pass it the path to the shared object that holds the symbols for GPIO instead.

Some tricks. Here’s how you can use libc to make a syscall in Python:

libc = CDLL(None)

# Syscall number for time(2) on your architecture
# (for instance, it is 201 on x86-64 Linux).
TIME_NR = <syscall nr for time(2) on your arch>

libc.syscall(TIME_NR)



So I am going to explain the concept of macro preprocessors to you — m5.awk is a macro preprocessor, so is m4, so is GPP, so is C's CPP, so is my Ekipp, and so is my AllocPP.pl. They all work like this:

1- Frank Frankis (hereby referred to as FF) creates a file called my-file.randext, randext here meaning that macro preprocessors work on all kinds of files, be it C code, Python code, an HTML template, your tax files, your notes to your GF, your angry letter to your boss, etc.;

2- There are dozens and dozens of uses for a macro preprocessor, but let's say FF wants to obliterate two birds with one sniper shot: he wishes to write a manual for his new Car XAshtray™, in both HTML and Markdown, but he wants it contained within one single file, and he wishes certain fragments of text, certain HTML tags, certain Markdown syntaxes, to be 'reusable', just like a 'function' is a reusable piece of code in an imperative language (but not a functional language!), or how a template works in C++, etc.

3- FF enters the file with his favorite text editor. He defines these ‘functions’, which are the ‘macro’ part of a macro preprocessor. Think what ‘macro’ means. It means ‘big picture’ basically. I think the better term here is ‘meta’. These two words have a close relationship in the English language, don’t they?

Now let’s see what macro preprocessor FF uses. Since GPP is really similar in syntax to C’s preprocessor (at least with default settings) let’s use GPP. I would use my Ekipp here but I honestly have forgotten the syntax (believe it or not, creating a language does not mean you are good at it).

#ifdef __HTML__
#define TITLE <h1>My Car XAshtray Manual</h1>
#define SUBTITLE <h5>Throw your ash on me</h5>
#define BOLDEN(text) <b>text</b>
#elif __MARKDOWN__
#define TITLE \# My Car XAshtray Manual
#define SUBTITLE \#\#\#\#\# Throw your ash on me
#define BOLDEN(text) **text**
#else
#error "Must define a target language"
#endif

FF keeps defining these. Now comes writing the actual manual.

Keep in mind that GPP stands for 'Generic Preprocessor'; it's a text macro preprocessor and not a language preprocessor like CPP (C's preprocessor) is. m4 and Ekipp are like that. My AllocPP.pl is a language preprocessor: it preprocesses C. So this means FF can freely treat my-file.randext as a text file, with a 'language', the macro preprocessor language, defining the output (I'll talk about what I mean by 'output' soon).

So he writes his manual:

TITLE
SUBTITLE

Hello! Are you okay? Why did you buy this ashtray? BOLDEN(ARE YOU OKAY?). In this manual I will teach you how to use my ashtray...
...

Replace the last ellipsis with the rest of the text, the Car Ashtray manual.

Now, FF needs to ‘preprocess’ this text file. This is, well, the ‘preprocessing’ part of a macro preprocessor. It’s like compiling a program with a compiler, except you are compiling text-to-text instead of text-to-binary.

gpp -D__MARKDOWN__ my-file.randext > my-manual.md
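If the macros expand the way I intend, the Markdown that comes out the other end should look roughly like this (hand-expanded by me as an illustration, not actual gpp output):

# My Car XAshtray Manual
##### Throw your ash on me

Hello! Are you okay? Why did you buy this ashtray? **ARE YOU OKAY?**. In this manual I will teach you how to use my ashtray...
...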
But what happened with `-D__MARKDOWN__`? I think you have already guessed. In the 'program' we asserted that if `__HTML__` is defined, the macros expand to HTML; otherwise, if `__MARKDOWN__` is defined, they expand to Markdown; and if neither is defined, it errors out. We can also define a macro with a value:

gpp -DMyMacro=MyValue my-file.randext > my-manual.md


Now, GPP has more built-in macros like `#ifdef`; they are called 'meta-macros' (as opposed to the macros you yourself define). There's `#include`, which includes a file. There's `#exec`, which executes a shell command. Etc., etc.

Now, you can read more about GPP on its GitHub. I was in touch with its maintainer, Tristan Miller, very recently when I showed him my Ekipp. He has made a new version of GPP, so don't install it from your package manager like apt; install it from source, because the release is very recent and these packages take ages to be updated. GPP is just one C file, very neat and clean. Read the man page (`man 1 gpp`) for more info.

m4 and m5 and Ekipp, etc., as I said, are also generic text preprocessors. My Ekipp has this feature where you can treat any file the way PHP works:

#! delimexec $ ‘‘awk “{ print $1; }”’’ | <== foo bar ==>


This will run the AWK program in the file.

You can install my Ekipp using these commands:

sudo apt-get install libreadline-dev libgc-dev libunistring-dev
wget -qO- https://raw.githubusercontent.com/Chubek/Ekipp/master/install.sh | sudo sh


Bring up `man 1 ekipp` to learn about it.

Keep in mind that Ekipp has some bugs. I will have to completely rewrite it, honestly, but I am busy making an implementation of ASDL (github.com/Chubek/asdl), and I am working on an implementation of AWK and later C, so a macro preprocessor is not really a pressing need for me.

Thanks.





I forgot to mention, tools like Bison and Autoconf use m4. Andrew Appel calls languages that need preprocessing 'weak'. The specs for D also call C 'weak' for needing to be preprocessed. Preprocessing is almost an abandoned practice, at least in the world of programming languages, because Scheme-like 'hygienic macros' have taken its place. Some languages like Rust and Zig are attempting to reinvent the wheel by re-inventing macros, but Scheme had it right in the 70s. Truly, C and Pascal's preprocessors were outdated by the 70s, let alone now.
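For anyone wondering what 'hygienic' buys you, the classic pitfall with C's textual macros (my go-to example, not one taken from Appel or the D spec) is that a macro re-evaluates its arguments, which Scheme-style macros rule out by construction:

```c
/* The classic unhygienic-macro pitfall: MAX evaluates its arguments
 * twice, so any argument with a side effect misbehaves.
 */
#include <stdio.h>

#define MAX(a, b) ((a) > (b) ? (a) : (b))

int main(void)
{
    int i = 1;
    int m = MAX(i++, 0);   /* i++ is evaluated twice: m == 2, i == 3 */
    printf("m = %d, i = %d\n", m, i);
    return 0;
}
```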

Here's another preprocessor that is neat: GPP.

I have made my own preprocessor, it’s very nasty and has a lot of bugs because I was an idiot when I made it: https://github.com/Chubek/Ekipp

I have half a mind to remake Ekipp. I don't know what purpose it would serve. But it's just neat; I think preprocessors are neat, as useless as they are.


It does not really need an explanation of what it 'is', because:

1- a paper has been released on ACM explaining it, you can download it for free from here: https://dl.acm.org/doi/pdf/10.1145/353474.353484

2- Its name is m5; most people in the UNIX world know what m4 is. If you have Linux, bring up info m4.

Imagine this: when you use C, there is a preprocessor that does all the #includes and #defines. m4 is like that, and m5 is like that too. Preprocessing used to be big, but now it's not. If you wanna see an example of preprocessing, look at this preprocessor I made for C: https://gist.github.com/Chubek/b2846855e5bb71a67c7e3effc6beefd6


It's a preprocessor, like m4 and GPP. I thought that was clear from the name?


Well this is neat…
This is a script called m5.awk that I randomly found after reading a paper about it on the ACM website. The paper's worth reading too, but the script is just a piece of work. Commented to the brim, clean, and most importantly, useful as hell. It could be a replacement for m4. It allows you to embed AWK in text files, and besides that, it allows you to use several preprocessing features that it offers via macros. Give it a try.

I hate this word arrangement in the title. It reminds me of painfully oblique ‘breadtube’ videos. “Getting naked near an elementary school and eating whipped cream off your boyfriend’s ass is good, ‘actually’!” I think the first person who used it might have been the chick who got hit by hotdogs in that old GIF.


As I said in the other thread, it's still buggy. But it does work on the examples. Zephyr ASDL is a domain-specific language for making abstract syntax trees. You may use it to make any sort of tree, really. Thanks.



I kinda like that I am as far away from the web world as possible. The only time I deal with anything web-related is making static blog software for myself. As Alan Kay puts it, the web is ill, because it has been a hack from day -1. Every time you make a web application, you are using a mule to carry a city's worth of cargo on its back.

The web was created for static pages. Use it for static pages. The only website I use that does not do this is YouTube. Beyond that, I only visit HackerNoon and Lemmy instances, both of which come out of the Web's tube of shit proud.

I wish I could post my conversation (barely a conversation; what do I have to converse about with someone who's worked with magnetic core memory? I just agreed) with Alan Kay on Quora, but I tried to log in, and it was so bloated I could not find the conversation.

I apologize if it hurts your feelings that your 'lil protocol sucks. Read about Doug Engelbart's NLS.

Thanks.


I subconsciously knew this. I am currently making a simple data exchange format to use with a program, and I am using PEG to create the parser. The chances of errors happening in this DXF are really low, but if the parser can't parse it, it's invalid.


Apologies that this is on GitHub; I am currently working on my own software to make a git host with indexing. So, Mukette is a Markdown pager. Like nroff that pages troff, this pages Markdown. Right now it has issues with GitHub Markdown because I could not find specs for it. It works fine with the kind of MD in example-document.md. Generally, when lexing and parsing are involved, you have to test all the aspects and generalize. I hope you enjoyed this little program. I spent about 10 days on it. I initially wanted to parse it with PEG, but then I realized MD is fully type 3 and not type 2, so I can just lex to 'larse' it. lol. (See the toy sketch below for what I mean.) Thanks.
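For anyone wondering what I mean by lexing instead of parsing, here is a toy sketch (far simpler than what Mukette actually does; the token names are made up) that just scans a line for heading and emphasis markers, with no grammar behind it:

```c
/* Toy "lex, don't parse" pass over a line of Markdown: it only
 * recognizes flat tokens (heading marker, bold marker, text),
 * which is the sense in which you can get away without a real parser.
 */
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *s = "## Hello **world**";
    while (*s) {
        if (*s == '#') {
            size_t n = strspn(s, "#");
            printf("HEADING(level %zu)\n", n);
            s += n;
        } else if (s[0] == '*' && s[1] == '*') {
            printf("BOLD-MARKER\n");
            s += 2;
        } else {
            size_t n = strcspn(s, "#*");
            if (n == 0)
                n = 1;           /* a lone '*' etc.: emit it as text */
            printf("TEXT(%.*s)\n", (int)n, s);
            s += n;
        }
    }
    return 0;
}
```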