AI will blow my mind when it solves an unsolved mathematical/physics/scientific problem, e.g.: "AI, give me a proof for (or against) the Riemann hypothesis"
Actually, it happened _long_ before that - 2018 is when I became aware of this technique, but I'm sure there's prior art: https://nullprogram.com/blog/2018/07/31/ ("Prospecting for Hash Functions", for those who already know).
That said, this is really brute forcing, not what the OP is asking for, which is producing a novel proof as the response to a prompt. Here the novel result is instead one of thousands of candidate responses, each of which can be graded by a fitness function.
For C and C++* projects under ~100k lines I wouldn't bother with incremental builds - I have a 70k-line C project with a single translation unit that builds in under 1s on my machine.
* C++ requires some discipline to keep build times from exploding, but it can be done if you don't go nuts with templates and standard headers.
C doesn't have "alias analysis" in the standard. It has an (informally specified) memory model with "memory objects" that each have an effective type, and accessing an object through an lvalue of an incompatible type is undefined behavior.
This enables security analysis tools like Valgrind/ASan and secure hardware like MTE/CHERI, so it's very important and you can't get rid of it.
However, it's not possible to implement malloc() in C because malloc() is defined as returning new "memory objects" and there is no C operation which creates "memory objects" except malloc() itself. So it only works as long as you can't see into the implementation, or if the compiler gives you special forgiveness somehow.
C++ has such an operation called "placement new", so you want something like that.
You can definitely implement malloc in C. It does nothing special in its most basic form but cough up void pointers into its own arena.
It gets complicated when you have virtual memory and an OS involved but even then you can override the system malloc with a simple implementation that allocates from a large static array.
No, returning parts of an array does not implement malloc as described in the standard. That's not a new memory object, it's a part of an existing one.
The standard is written to accommodate obsolete tagged memory architectures that require special support. They aren't relevant today and data pointers are fungible regardless of where they originate.
But in practice it's not always true on Apple A12 or later, because they support PAC (so pointers of different types to the same address can compare unequal bit-wise), and it's even less true on the very latest Android devices, which support the really big gun, MTE. And MTE is great; you don't want to miss out on it. No explainer here because there's no Wikipedia article for it(!).
Also becomes not true on any system if you use -fbounds-safety or some of the sanitizers.
Yeah I never got the aversion to operator overloading either.
"+ can do anything!" As you said, so can plus().
"Hidden function calls?" Have they never programmed a soft-float target, or a microcontroller without a div instruction? There's a function call for every floating point op.
The problem is not that + calls a function. The problem is that + could call one of many different functions, i.e. it is overloaded. Zig does not allow overloading plus() based on the argument types. When you see plus(), you know there is exactly one function named “plus” in scope and it calls that.
operator + is overloaded even in plain C: it generates different instructions for pointers, floats, integers, _Complex, _Atomic, and the quasi-standard __float128. Sometimes it even generates function calls.
Not if `plus` is a pointer. Then `plus()` is an indirect branch whose target can be set arbitrarily far away in space (dynamically scoped) and time. That's why I think invisible indirection is a mistake. (C used to require `(*plus)()`.)
I hate annoying distractions, be it popups, beeps, notifications, alerts, auto brace/quotes/etc and autocomplete prompts. I turn all of that off in every program because I want to be in total control of the computer.
When it comes to autocomplete specifically, I initially turned it off because I was mainly a C++ programmer, and C++ autocomplete has just never been good enough. A half-working autocomplete is worse than nothing at all, because you end up stalling and waiting for the prompt.
But eventually I just grew to hate it for all languages because it interrupts flow and "pipelining".
> I hate annoying distractions, be it popups, beeps, notifications, alerts, auto brace/quotes/etc and autocomplete prompts.
This was also my first reaction when I saw someone else using an IDE. My way of writing code is to get my ideas written into the editor first. Only after I have written the code do I run the compiler to see where things don't fit. The people who were using IDEs were really astonished that I could understand the output from the compiler on the command line.
Programming is a lot more fun when you sketch out your ideas first before you make sure that all the details are correct.