>>976628 (OP)
Integer overflow is as much of an error as out of bounds indices and addresses.
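To put it in code: both lines below are undefined behavior in standard C, and neither is checked by default. (Minimal sketch of my own, not from the OP; build with -fsanitize=undefined on gcc/clang and both get flagged at run time.)

#include <limits.h>
#include <stdio.h>

int main(void) {
    int a[4] = {0, 1, 2, 3};
    int i = INT_MAX;

    /* Undefined behavior #1: out-of-bounds index. */
    printf("a[4] = %d\n", a[4]);

    /* Undefined behavior #2: signed integer overflow. The standard puts it
       in exactly the same bucket: anything may happen, nothing has to notice. */
    printf("INT_MAX + 1 = %d\n", i + 1);
    return 0;
}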
>>976648
>Because c only has "unsafe", it is designed to be as safe as possible.
Bullshit. Null-terminated strings, array decay, and lack of error handling make C less safe, as well as slower. The whole problem of "security holes" and having to constantly stay updated so you don't get exploited is caused by C. What really sucks is that most of these problems were solved in the 60s and have very simple solutions, but C can never use them.
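Concretely (a throwaway sketch, the names are mine): the [16] in the parameter means nothing, the callee just gets a pointer, and the only way to find the end of a string is to rescan it byte by byte.

#include <stdio.h>
#include <string.h>

/* "char buf[16]" decays to plain "char *": the callee has no idea
   how big the caller's buffer actually is. */
static void show(char buf[16]) {
    printf("sizeof inside callee: %zu\n", sizeof buf);   /* pointer size, not 16 */
}

int main(void) {
    char name[16] = "anon";
    printf("sizeof at caller:     %zu\n", sizeof name);  /* 16 */
    show(name);

    /* No stored length: strlen has to walk the bytes until it hits '\0'. */
    printf("strlen = %zu\n", strlen(name));
    return 0;
}

That's also where part of the "slower" comes from: every strlen/strcat is a full rescan of the string, where a length-prefixed string gives you the answer in O(1).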
>This includes ergonomics (rust shills find a dictionary), compiler warnings, and external tools.
All of that sucks compared to real languages. The only way you could possibly think C is "ergonomic" is if you're comparing it to C++.
http://archive.adaic.com/pol-hist/history/holwg-93/holwg-93.htm
>When Bell Labs were invited to evaluate C against the DoD requirements, they said that there was no chance of C meeting the requirements of readability, safety, etc., for which we were striving, and that it should not even be on the list of evaluated languages. We recognized the truth in their observation and honored their request.
>Why am I retraining myself in Ada? Because since 1979 I have been trying to write reliable code in C. (Definition: reliable code never gives wrong answers without an explicit apology.) Trying and failing. I have been frustrated to the screaming point by trying to write code that could survive (some) run-time errors in other people's code linked with it. I'd look wistfully at BSD's three-argument signal handlers, which at least offered the possibility of providing hardware-specific recovery code in #ifdefs, but grit my teeth and struggle on, having to write code that would work in System V as well.
>There are times when I feel that clocks are running faster but the calendar is running backwards. My first serious programming was done in Burroughs B6700 Extended Algol. I got used to the idea that if the hardware can't give you the right answer, it complains, and your ON OVERFLOW statement has a chance to do something else. That saved my bacon more than once.
>When I met C, it was obviously pathetic compared with the _real_ languages I'd used, but heck, it ran on a 16-bit machine, and it was better than 'as'. When the VAX came out, I was very pleased: "the interrupt on integer overflow bit is _just_ what I want". Then I was very disappointed: "the wretched C system _has_ a signal for integer overflow but makes sure it never happens even when it ought to".
>It would be a good thing if hardware designers would remember that the ANSI C standard provides _two_ forms of "integer" arithmetic: 'unsigned' arithmetic which must wrap around, and 'signed' arithmetic which MAY TRAP (or wrap, or make demons fly out of your nose). "Portable C programmers" know that they CANNOT rely on integer arithmetic _not_ trapping, and they know (if they have done their homework) that there are commercially significant machines where C integer overflow _is_ trapped, so they would rather the Alpha trapped so that they could use the Alpha as a porting base.
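For anyone who hasn't run into the distinction he's describing, here's a minimal sketch (mine, not part of the quote). The unsigned result is guaranteed; the signed one is undefined, and gcc/clang will give you the trapping behavior he wanted with -ftrapv or -fsanitize=signed-integer-overflow.

#include <limits.h>
#include <stdio.h>

int main(void) {
    unsigned int u = UINT_MAX;
    u = u + 1;                  /* required to wrap around: u is now 0 */
    printf("unsigned: %u\n", u);

    int s = INT_MAX;
    s = s + 1;                  /* undefined: may wrap, may trap, may do anything */
    printf("signed:   %d\n", s);
    return 0;
}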
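And on the three-argument signal handlers he mentions: the modern POSIX spelling is sigaction with SA_SIGINFO, which hands the handler the signal number, a siginfo_t describing the fault, and the machine context. A rough sketch (mine, not his; on typical x86/Linux the integer divide by zero below raises SIGFPE):

#define _POSIX_C_SOURCE 200809L
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

/* Three-argument handler: signal number, fault description, machine state. */
static void on_fpe(int sig, siginfo_t *info, void *ucontext) {
    (void)sig; (void)info; (void)ucontext;
    const char msg[] = "caught SIGFPE\n";   /* only async-signal-safe calls in here */
    write(2, msg, sizeof msg - 1);
    _exit(1);
}

int main(void) {
    struct sigaction sa = {0};
    sa.sa_sigaction = on_fpe;
    sa.sa_flags = SA_SIGINFO;
    sigaction(SIGFPE, &sa, NULL);

    volatile int zero = 0;
    printf("%d\n", 1 / zero);   /* divide by zero: trap, handler runs, exit(1) */
    return 0;
}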