Hacker News

There are many arguments to be made in favor of C++, such as talent availability, experience from other porting projects, the possibility of doing incremental porting, etc.

My impression is that some projects are already experimenting with or using C++ in their C code bases, so C to C++ is quite likely.



Incremental porting is possible in Rust. There are a few ways to go about it, and it is pretty easy to link Rust and C code together.


Most languages have some C interop, and that satisfies the minimal definition of "incremental".

However, using C and C++ together in a project is especially easy. If I were to do this, I would first get the code base compiling with a C++ compiler, which already brings some extra type safety and is not particularly difficult.

Then I'd start replacing C code blocks with safer C++ code. This could mean changing a function, some parameter-passing conventions, replacing char* with std::string, etc.

I feel this has the biggest chance of success, and there are already success stories and strategies available that describe this method, e.g. GCC.


Rust <-> C interop is pretty strong. AFAIK, calls between Rust and C have the same overhead as calls between C and C++ (i.e. none). You're right that incrementally porting to Rust would involve a little more overhead than C++, simply because you'd need two compilers, and instead of rewriting function definitions in place you'd need to write a fresh function in Rust and delete it in C. But at link time everything is sane, and once you set up the build it's really not hard.

I'm not suggesting that C -> Rust is easier than or quite as easy as C -> C++, just that it is much easier than C -> most other languages, and that it is close enough to C -> C++ that it is worth investigating. It is definitely more robust than the minimal definition of "incremental."


But if the goal of the rewrite is safer tooling then Rust wins.


Or Python, or Ada, or Ruby, or Go, or Java, or Lisp, or any of the myriad other memory-safe programming languages.

When it comes to memory safe languages, your choices do not boil down to "Rust or nothing".


No, in this case it pretty much does boil down to Rust or C++. This is a low-level tool, not a webapp.


That still gives you Lisp, Ada, Go, Haskell, and Java, if you make the (imo incorrect) assumption that "low level" tools can only be written in languages which compile down to bytecode.

Of course, Mercurial gives the lie to this assumption.


Java? You won't get any performance out of it compared to something like C.


I don't like Java, but don't underestimate the performance of the JVM. Unless you're a serious perf wizard, your average C code won't beat your average Java code. It's fast enough for short processes, and for long processes the JIT is pretty darn good. Also note that a JIT can make runtime assumptions and optimize code based on what is currently the case, which an AOT compiler cannot.

It takes a lot more than a toolchain to write fast code.


At what scale?

What matters is whether it is fast enough for the use case being targeted.

As a side note, I remember when C compilers for home computers generated worse code than junior Assembly programmers.


A command line tool like Git can be written in any programming language that has implementations capable of generating native code.

Even then, Mercurial is implemented in Python and is quite usable.


Git is not a "low-level" tool.

The fact that a Java rewrite of Git (JGit) actually exists demonstrates the falsity of this statement.


@falcolas -- I think you mean machine code, not bytecode. And w.r.t. Mercurial, AFAIK the project's hot paths are all written in Cython extensions, and there's ongoing work to improve the Python part by working with the PyPy folks. So there are definite technical advantages in using Python for greater developer productivity, but there's also a cost.


I was not implying it did. I was responding to the C++ suggestion specifically.


The goal is writing safer code, but that also comes with some costs. The interesting question is which costs can the project afford.

Otherwise yes, it could even be rewritten in Ruby, as falcolas suggested...


My impression is that projects that mix C and C++ tend to use the C-like subset of C++ (with classes), so they wouldn't gain any safety advantage from moving to C++.

Many other languages support C linkage in a way that's comparable to C++.


Using std::vector, std::string and smart pointers would already bring large benefits, so even a C-like subset could work fine.

The point is that code like in that function is a nightmare to write correctly and test in plain C.


I agree with you; the only problem is that even in 2016 there are shops that prohibit the use of the STL.

The real way out is to eventually change to a language where safety is opt-out rather than opt-in, as it is in C++.


C++ is far safer than C precisely for the reasons listed. Standard C++ is safe by default (it's not opt-in). If you want or need backward compatibility with C, then you can use the more error-prone C constructs for that. Otherwise, pure C++ code is safe.


It is opt-in, because it depends on the STL, which many shops forbid.

I have always been on the C++ side of the C vs C++ wars, but I am also aware of all those developers who just write C with a C++ compiler, hence opt-in.


I'd hope people are not using the STL nowadays, in favor of the C++ Standard Library which is part of the C++ standard.


People like myself, who have known C++ since "The Annotated C++ Reference Manual", tend to keep using STL to designate the standard library, but I guess you already knew that.

If it makes you happy I can use the ANSI C++ section number instead.


Wasn't sure if that's where you were going. So, who in this day and age forbids the use of the standard library? That seems pretty bizarre. What would be the motivation behind such a policy?


The arguments against it are usually that it is too bloated, too slow, and makes use of templates and exceptions.

So any place that is against templates and exceptions, usually rules out the standard library on those arguments.

Then you have the software houses whose C++ code is actually C with a C++ compiler, who use the bloat and slowness arguments against the library.

I don't remember them by heart, but there were a couple of CppCon 2014 talks where these kinds of arguments were discussed.


Whenever someone complains to me that the standard library is slow or "bloated" (whatever that means), I ask if they've ever profiled their code vs. the standard library version. 9 times out of 10 they have not, and are operating out of mythology rather than measurement.


That is my feeling as well.

I do like to use C++ a lot on personal projects, but there I can make full use of C++ best practices.

At work, I tend to avoid using it, because most C++ developers I have met in my career actually use it as a Better C, keeping all the safety loopholes from C.

I had my share of spending weeks tracking down memory corruption issues.




