
> I remember one person who responded to me about a DSP that had sizeof(char)==32, and how it was wonderful that the C Standard accommodated that so code could be portable.

Support for a 32-bit char doesn't automatically make code portable between architectures with different char sizes. But it's what makes it possible to write a C compiler at all for those weird architectures. It certainly doesn't make code that bakes in such assumptions portable to architectures where they do not hold.
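
As an aside, the quantity that actually varies on such DSPs is CHAR_BIT, since sizeof(char) is 1 by definition on every conforming implementation. A minimal probe, using only standard headers:

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* On word-addressed DSPs, CHAR_BIT can be 16 or 32; code that
           assumes CHAR_BIT == 8 silently breaks there, even though
           sizeof(char) still reports 1. */
        printf("CHAR_BIT = %d, sizeof(int) = %zu\n",
               CHAR_BIT, sizeof(int));
        return 0;
    }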

> The C and C++ standards would do the programming world a favor by standardizing on: 1. 2's complement 2. fixed sizes for char, short, int and long 3. dumping any character sets other than Unicode

This would be a great favour to embedded developers everywhere, because it would finally free us from the C legacy. Rust and Zig look promising, and at some point I even thought D might work but now I understand why it wouldn't. I wonder if you're aware of the standard sized type aliases int8_t, int16_t etc and what's your opinion of them.



> it's what makes it possible to write a C compiler at all for those weird architectures.

Only if one is pedantic. There's no practical problem at all with customizing C compiler semantics for weird architectures. After all, everybody did it for DOS C compilers.
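
For instance, the 16-bit DOS compilers (Borland, Microsoft, Digital Mars, and others) all grew memory-model keywords that were never part of the Standard. A representative sketch — spellings varied between compilers, some using leading underscores like _far and _interrupt:

    /* Non-standard pointer qualifiers common to 16-bit DOS compilers: */
    char near *np;   /* 16-bit offset into the default data segment */
    char far  *fp;   /* 32-bit segment:offset pair */
    char huge *hp;   /* like far, but normalized so pointer arithmetic
                        can cross 64KB segment boundaries */

    /* Hardware interrupt calling convention -- also an extension: */
    void interrupt isr(void);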

> I even thought D might work but now I understand why it wouldn't

People do use it for embedded work. I don't know why you wouldn't think it would work.

> I wonder if you're aware of the standard sized type aliases int8_t, int16_t etc

I am. After all, I wrote stdint.h for Digital Mars C.

> and what's your opinion of them.

There are three of each:

    typedef long int32_t;        /* exactly 32 bits */
    typedef long int_least32_t;  /* smallest type with at least 32 bits */
    typedef long int_fast32_t;   /* "fastest" type with at least 32 bits */
1. Too many choices. I have experience with what happens when programmers are given too many choices whose differences are slight, esoteric, and likely not substantive: they blindly pick one.

2. People have endless trouble with C's implicit integral conversions. These aliases make it combinatorially worse (see the sketch after this list).

3. int32_t makes sense for the first hour. Then it becomes annoying, and looks ugly. `int` is much better.

4. `int` is 32 bits anyway. No point in bothering with stdint.h.
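
To make point 2 concrete, here is a minimal sketch of how the fixed-width types interact badly with the implicit promotions. The undefined behavior is assumption-dependent — it needs a platform where int is 32 bits:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint16_t a = 0xFFFF;
        /* Both operands are promoted to (signed) int before the multiply.
           Where int is 32 bits, 0xFFFF * 0xFFFF overflows int: undefined
           behavior, even though every declared type here is unsigned. */
        uint32_t bad  = a * a;
        uint32_t good = (uint32_t)a * a;  /* forces unsigned arithmetic */
        printf("%u %u\n", (unsigned)bad, (unsigned)good);
        return 0;
    }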


> Only if one is pedantic. There's no practical problem at all with customizing C compiler semantics for weird architectures. After all, everybody did it for DOS C compilers.

Resorting to non-standard extensions locks you into that specific compiler. That is exactly why standards exist in the first place.

> People do use it for embedded work. I don't know why you wouldn't think it would work.

The lead developer seems to have strong knee-jerk reactions to things that he does not understand, and a limited understanding of what microcontrollers are or what embedded software does. Who really cares about porting diff to a system that doesn't have a filesystem or a text console?

I agree that the integral promotions are a great way to shoot oneself in the foot, but the other explanations are not really convincing. If you read the comments you are responding to, you should already know that int is not always 32 bits.
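
If code is going to bake in the 32-bit-int assumption anyway, one defensive idiom is to fail the build where it doesn't hold. A sketch using C11's _Static_assert:

    #include <limits.h>

    /* Refuse to compile on platforms where the "int is 32 bits"
       assumption is false, rather than miscomputing silently. */
    _Static_assert(sizeof(int) * CHAR_BIT == 32,
                   "this code assumes a 32-bit int");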


You're already locked in with a specialized compiler for those unusual architectures, and even if that compiler is Standard conforming, the code still isn't remotely portable.

> you should already know that int is not always 32 bits.

I also wrote a 16-bit compiler for DOS, with 16-bit ints. I know all about it :-) I've also developed 8-bit software for embedded systems I designed and built. I've written code for 10-bit systems, and 36-bit systems.

> Who really cares about porting diff to a system that doesn't have a filesystem or a text console?

I infer you agree the software is not portable, despite being standard conforming. As a practical matter, it simply doesn't matter if the compiler is standard conforming or not when dealing with unusual architectures. It doesn't make your porting problems go away at all.

I went through the Great Migration of moving 16-bit DOS code to 32 bits. Interestingly, everyone who thought they'd followed best portability practices in their 16-bit code found they had to do a lot of rewrites for 32-bit code. The people who were used to moving between the 16- and 32-bit worlds had little trouble.

C++ is theoretically portable to the 16-bit world, but in practice it doesn't work. Supporting exception handling and RTTI consumes the entire address space, leaving no room for code. Even omitting EH and RTTI leaves one with a crippled compiler unless extensions are added to support the segmented memory model.

How do I know this? I wrote one. I lived it.


> I also wrote a 16-bit compiler for DOS, with 16-bit ints. I know all about it :-) I've also developed 8-bit software for embedded systems I designed and built. I've written code for 10-bit systems, and 36-bit systems.

I'm not sure how this shows that int is 32 bits.

> I infer you agree the software is not portable, despite being standard conforming. As a practical matter, it simply doesn't matter if the compiler is standard conforming or not when dealing with unusual architectures. It doesn't make your porting problems go away at all.

I guess we could venture into arguing what "the software" and "portable" mean here. What I mean is that I was working on a standard-conforming C codebase that worked correctly on both architectures I mentioned above. That is what I consider portable. Having a standard-conforming compiler for both does not make the problems go away, but it makes things much easier than having two non-conforming, almost-but-not-exactly-C compilers, or two totally proprietary languages.

I know that DOS was bad. It's been more than 20 years now. Let's get over it.



