I remember reading an article a few years ago from, if I remember correctly, Coverity. One of their challenges for customer acceptance was that many organizations write invalid code that is preprocessed in custom ways, which means Coverity can't parse the code in order to analyze it. In such cases, the customer's default reaction is to declare the tool useless. Doubly so if the invalid bit is buried deep in the system where few of the developers go.
To compensate, they had to build a robust capacity to recognize invalid syntax, and work around it to analyze the parts that they could analyze.
The described block rewriting extension is a perfect example of what threw them.
Another amusing challenge they encountered: they could correctly report on complex race conditions, but the developers who created those race conditions tended not to understand the problem reports. The result was that the engineers would wrongly conclude that the tool was crap. The "solution" was to deliberately not report provable bugs!
If this is enough to trigger anyone else's memory of an article that was going around 4 or 5 years ago, I'd love to read it again. :-)
Kudos for getting this to work, but it seems like a huge amount of effort and voodoo when one could have used a different language in the first place.
Thanks. That said, I'm not sure the amount of effort is that huge: getting a working rewriter was a matter of days, which makes it far less expensive than rewriting a codebase of several hundred thousand lines of C in a new language.
What language would you suggest that has both ridiculous portability (you can use it from any language) and performance? Sounds like a fairy tale to me.
Go doesn't embed at all, so right there that's out.
Rust is still very unstable, and I wouldn't recommend it for production use.
I'm not sure why you would use Swift unless you're targeting iOS; it seems like a terrible idea to base your code on a proprietary language without ridiculous, .NET levels of support. I also don't believe Swift is easily embedded, given its huge runtime.
I'm not sure that Swift belongs on that list. It was announced back in June and there's still no compiler for any non-Apple platform. It's safe to say, at this point, that portability isn't one of Swift's goals.
Interesting question. I'm not intimately familiar with D, but it does seem like it's designed as a saner alternative to C++. How many C++ alternatives are out there? From my memory, D seemed like one of the first major attempts to get away from C++. Maybe they didn't go far enough to meet people's expectations. Or maybe they were too far ahead of their time.
Anyway, when you said that the first thing that popped into my mind was Ewan McGregor going all "you were the chosen one" and actually I think that kind of describes my feeling about D. At first it sounded very exciting, but then it seemed like there really wasn't a payoff and I ended up feeling kind of bleh about it. I wonder how many other people feel similarly. Maybe D's problem is kind of an emotional one?
Extensions are how new features come to an established language... the C11 standard mostly standardised things that were already supported as extensions by most compilers. Most of the time, extensions are created because a feature is missing from the language (alignment requirements, for example), because the compiler could do a better job of optimising the code with better hints (noreturn, restrict, strict aliasing, ...), or because we could simply make the developer's job a bit simpler/safer (nested functions, _Generic, blocks, ...).
IMHO, the main benefit lost by using bleeding-edge extensions is portability, which may or may not be an issue.
I don't get it: they are interested in closures and have a codebase in C, so why not simply use C++11 with lambdas? That would make the code portable to GCC/MSVC...
Arguably they started before C++11 was officially released, but clang and gcc supported lambdas before the official release, and since then they could have used a rewriter to transform their blocks into lambdas to help with the conversion.
First, we originally chose the C language because it is much, much simpler. One can master the language without too much pain, and you end up with much more control over what you are actually doing (there's little chance a line of code doesn't do what you read from it... while in C++ there may be hidden behavior behind even the simplest operations, such as + or *).
Secondly, and this is probably a matter of taste, C++11's lambdas are just awfully designed. Their syntax overloads tokens such as [] with a totally different meaning. As in many situations, the C++ design committee tends to choose the most complicated possible design, without taking readability into account (maybe conciseness is the main goal of their syntax choices?). On the other hand, the blocks syntax makes it very clear that you are dealing with a function-like object, with syntax very similar to a function's. The choice was made to have a clean and readable syntax.
Third point: C++11 just didn't exist in 2009. There were drafts, but compiler support was just nascent. RHEL in 2009 was at version 6 (a very young release), which ships with GCC 4.4 (and GCC 4.7 as an experimental toolchain). The most common RHEL was version 5, with GCC 4.1. RHEL has officially supported C++11 since RHEL 7, which ships with GCC 4.8 and was released in June 2014.