
/tech/ - Technology


 No.868312>>869399 >>869586 >>870395 >>870462

If you compile this with gcc -O2 (at least with versions 6.3 and 6.4, which I've tested), for some reason it optimizes away setting the variables to 0 and gives you a garbage result from adding up uninitialized stack memory. clang, at least 3.8.1, doesn't do that. gcc -O1 also doesn't do that. Is this a compiler bug or some bullshit rulefaggotry where they've decided to break code because the spec says they can somewhere?


#include <stdint.h>
#include <stdio.h>

struct Foo {
    uint32_t a;
    uint32_t b;
    uint32_t c;
    uint32_t d;
} __attribute__((packed));

uint16_t foo() {
    uint32_t sum = 0;

    Foo f;
    f.a = 0;
    f.b = 0;
    f.c = 0;
    f.d = 0;

    for(size_t i = 0; i < sizeof(f) / sizeof(uint16_t); ++i) {
        sum += ((uint16_t *) &f)[i];
    }

    return sum;
}

int main() {
    uint16_t sum = foo();
    printf("%u\n", sum);

    return 0;
}

If you're curious as to why I'm looking at this, the pseudocode in RFC 1071 for calculating checksums does this, and I've run into several programs at work that are now broken because of gcc fucking it up. It's 30-year-old code, it worked fine until now, and it's literally RFC compliant.

 No.868331

>is this a compiler bug

It's just the optimizer fucking up your day. It's probably assuming some undefined behavior can't possibly happen, and as a result the assignments are dead code that gets eliminated. Probably because of your cast to uint16_t *. Try casting to char *, which is guaranteed to be able to alias any type.
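
To make that concrete, here's a minimal sketch of what that rewrite could look like for OP's loop (assumed code, not tested against those exact gcc versions; it reuses OP's struct Foo and assumes a little-endian target):

#include <stdint.h>
#include <stddef.h>

uint16_t foo_bytes(const struct Foo *f) {
    const unsigned char *p = (const unsigned char *) f;  /* char * may alias anything */
    uint32_t sum = 0;
    for (size_t i = 0; i + 1 < sizeof *f; i += 2) {
        /* assemble each 16-bit word by hand; little-endian byte order assumed */
        sum += (uint16_t) (p[i] | (p[i + 1] << 8));
    }
    return (uint16_t) sum;
}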


 No.868336>>868339 >>868343

The struct is already packed by definition, you fucking moron; invoking undefined behaviour using compiler extensions (why are you doing this anyway, you subhuman) is part of your mistake. You could've googled this and found the answer in a minute, tops.


 No.868337

I don't know why, but gcc 6.4 says

#‘target_mem_ref’ not supported by expression#

5.5 says nothing, but doesn't work either. I'm guessing a bug. I couldn't locate it on the bugtracker but didn't try much.


 No.868339

>>868336

Packed isn't the reason genius.


 No.868341>>868344

Also, try compiling with -fno-strict-aliasing.

https://gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html


 No.868343

>>868336

It's derived from something where packing matters, qt. This is a simplified version of a TCP pseudoheader checksum.
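
For reference, a minimal sketch of the RFC 1071-style sum being talked about, written with memcpy so it doesn't type-pun the buffer through a uint16_t pointer (the function name and layout are illustrative, not OP's actual code):

#include <stdint.h>
#include <string.h>

static uint16_t inet_checksum(const void *data, size_t len) {
    const unsigned char *p = data;
    uint32_t sum = 0;
    while (len >= 2) {
        uint16_t word;
        memcpy(&word, p, sizeof word);  /* well-defined 16-bit read, no aliasing games */
        sum += word;
        p += 2;
        len -= 2;
    }
    if (len)                            /* odd trailing byte */
        sum += *p;
    while (sum >> 16)                   /* fold the carries, as RFC 1071 describes */
        sum = (sum & 0xFFFF) + (sum >> 16);
    return (uint16_t) ~sum;
}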


 No.868344>>868347 >>869210 >>870404

>>868341

To add further, I recommend that everyone disable this strict aliasing horseshit. Just add the flag to your makefiles and forget about it. It's just a moronic optimization based on retarded features of the C standard and everyone will be better off without it. It causes more harm than good.

By doing this we join the ranks of Linus Torvalds:

https://www.mail-archive.com/linux-btrfs@vger.kernel.org/msg01647.html

http://lkml.org/lkml/2003/2/26/158

>Why do you think the kernel uses "-fno-strict-aliasing"?

>The gcc people are more interested in trying to find out what can be allowed by the c99 specs than about making things actually work.

>The aliasing code in particular is not even worth enabling, it's just not possible to sanely tell gcc when some things can alias.

>I tried to get a sane way a few years ago, and the gcc developers really didn't care about the real world in this area.

>I'd be surprised if that had changed, judging by the replies I have already seen.

>Type-based aliasing is stupid. It's so incredibly stupid that it's not even funny. It's broken.

>And gcc took the broken notion, and made it more so by making it a "by-the-letter-of-the-law" thing that makes no sense.

>I know for a fact that gcc would re-order write accesses that were clearly to (statically) the same address.

>Gcc would suddenly think that

unsigned long a;

a = 5;
*(unsigned short *)&a = 4;

>could be re-ordered to set it to 4 first (because clearly they don't alias - by reading the standard), and then because now the assignment of 'a=5' was later, the assignment of 4 could be elided entirely!

>And if somebody complains that the compiler is insane, the compiler people would say "nyaah, nyaah, the standards people said we can do this", with absolutely no introspection to ask whether it made any SENSE.

GCC BTFO


 No.868347>>868348

>>868344

I'll probably disable it. The bugs the strict aliasing optimization creates are non-obvious and are security risks. The code I was fixing broke in a way that rejected every packet, but it could just as easily have accepted every packet, allowing session hijacks without anyone noticing the code's behavior had changed.


 No.868348>>868354 >>868803

>>868347

Yeah, it's scary as fuck. Compilers simply can't be trusted anymore; one bad move and they start deleting side effects from code like it was nothing. I also force integer overflows to be well-defined two's complement behavior by default because there's no actual reason they shouldn't all work that way in 2018.

I'm not actually sure if the strict aliasing is the cause of your problem but it looks like it to me. Please confirm if it works with the flag when you are able.


 No.868354>>869664

>>868348

Yeah, it works with the flag and the assembly shows it's not eliding the assignments.


 No.868803>>869403

>>868348

>because there's no actual reason they shouldn't all work that way in 2018

There is an actual reason for that. Enforcing well-defined results on overflow prevents many optimisations that are only valid because overflow is assumed never to happen. Don't do that unless you actually need the wrapping behavior, or you'll needlessly hurt performance.
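
To make that concrete, here's a sketch of the kind of optimisation meant (a generic textbook case, not code from this thread; the function name is made up):

#include <stdint.h>

int64_t sum_first(const int32_t *a, int32_t n) {
    int64_t s = 0;
    /* With signed overflow undefined, the compiler may assume n < INT32_MAX,
       conclude the loop always terminates after at most n + 1 iterations, and
       vectorise it. With wrap-around defined (e.g. -fwrapv) it also has to
       handle i wrapping when n == INT32_MAX, which blocks some of that. */
    for (int32_t i = 0; i <= n; i++)
        s += a[i];
    return s;
}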


 No.869194>>869403

>dereferencing a pointer that aliases another of an incompatible type is undefined behavior

You have only yourself to blame OP.


 No.869210>>869403

>>868344

>the compiler is allowed to assume that incompatible types don't alias

>write code that aliases incompatible types

>get undefined behavior

<WOOOOW BUGGED C MECHANICS


 No.869213

If you want to bitbang, you use assembly, not high level languages.


 No.869279

Personally I think it's good that compilers break code. If they produced code that did what you expected that would be insane.

And who the fuck are you to expect anything. You think there's some kind of standard here? Well there is and it says compiler writers can do whatever and users should stfu.


 No.869385>>869396

The C language and the compiler were made by 'shit-for-brains weenix unie "programmers"'. That's why your code doesn't work.

Pointers are supposed to be "hard" because of C and all the bullshit that comes with it. Their problem is trying to learn pointers using a language that can't do pointers correctly. Ctards regurgitate the "what's really going on" and "portable assembly" memes so people wind up believing C bullshit is how computers actually work. The only way to "fix" that problem other than real education is by making a language that sucks more than C and blaming all of its bullshit on the machine. Pointers also do not have to work in a way that conforms to C standards. You won't get a Lisp machine from the C way.

>inb4 blame the programmer, not the tool

I "blame the tools" because the makers of these tools are the source of the problem. There is such a thing as defective and shitty products. A calculator that makes 00011 equal to 00009 is a defective calculator. AT&T is a phone company that was doing UNIX to make extra money on the side, so they didn't care. The switch statement is more bullshit.

These stupid shit-for-brains weenix unie "programmers" have
managed to break that mainstay of Western enlightenment and
education, the dictionary. And I'll bet they excuse their
behavior by saying, "Well, it's all Greek to me"!

I suppose it's only appropriate that the invading hordes
scraped the librarian at Alexandria to death with shells.
They must have had a premonition that UNIX was coming.

I feel compelled to submit the following piece of C code:

switch (x)
    default:
        if (prime(x))
    case 2: case 3: case 5: case 7:
            process_prime(x);
        else
    case 4: case 6: case 8: case 9: case 10:
            process_composite(x);

This can be found in Harbison and Steele's "C: A Reference
Manual" (page 216 in the second edition). They then remark:

This is, frankly, the most bizarre switch statement we
have ever seen that still has pretenses to being
purposeful.

In every other programming language the notion of a case
dispatch is supported through some rigidly authoritarian
construct whose syntax and semantics can be grasped by the
most primitive programming chimpanzee. But in C this highly
structured notion becomes a thing of sharp edges and loose
screws, sort of the programming language equivalent of a
closet full of tangled wire hangers.

And I hacked the renderer code to throw cpp the proper
"-DFRAME=%05d" to spit out numbers the way I wanted
them. Why did I want the leading zeros? I don't know, I just
thought it was cleaner.

So I fired up the animation and let it run for a while
(days).

Well, the output was quite amusing (or at least it would
have been if I didn't need the results for my thesis defense
a week later). The object would go down for a few frames,
then jump up and go down a little, then go back to where it
maybe should have been, then jump up....

After a little headscratching, I realized that the leading
zeros in my frame numbers were causing cpp to treat them as
octal values. How precious.

But still, if I say "#define FRAME 00009" then "#if
FRAME==00009" should still fire (or it should at least whine
at me). Well, 00009==00009 does trigger, but so does
00009==00011.

Huh?

Well, some C library thinks that the nine in 00009 isn't
octal, so it interprets it as 9 decimal. And 00011 is a fine
octal rep of 9 decimal. So, both "#if FRAME==00009" and
"#if FRAME==00011" fired and I applied two translate calls
to my object geometry. And(!), it's not that having a
decimal digit makes the whole number decimal. The string
00019 gets interpreted as 00010 octal plus 9 decimal = 17
decimal. Lovely, not.


 No.869389

9 is a perfectly acceptable digit in a formally defined octal system. It has a clear meaning. A 9 on the nth position in the representation adds 9•8ⁿ⁻¹ to the value of the number, same as any other digit.

It means you can represent the same number in multiple ways, but that's nothing new. 00.777… = 01, after all.


 No.869396>>869415

>>869385

>A calculator that makes 00011 equal to 00009 is a defective calculator.

The problem wasn't with `bc`, and a C compiler is not a calculator. Your UNIX hater is too stupid to understand octal.

>First quote

A complete nothing-burger

>Second quote

A poorly constructed switch statement, but solid code nonetheless. Can you actually explain the problem? No? Keep copy-pasting, buddy.

>Third quote

As before. You got completely fucking BTFO in the last thread you copy-pasted this garbage, and you just ignored it to move on.

Can you stop trying to derail and/or derailing every thread? Can you fuck off back to OpenVMS or wherever it is you came from? We're working with technology people actually want to use, it's not something you'd be used to.


 No.869399>>869403

>>868312 (OP)

>bullshit rulefaggotry where they've decided to break code because the spec says they can somewhere?

if it's UB then it's fucking UB.

if you don't like it, pick a different language.


 No.869403>>869406 >>869407 >>869411 >>870282 >>870349

>>868803

Correct behavior is far more important. The optimizations you're talking about are of the dangerous, security hole-generating nature:

>number can't possibly overflow

>therefore some condition is always true

>therefore I can kill some code the programmer was counting on

What's the point of optimizing programs if the code ends up broken? When the first step towards debugging something is disabling optimizations, you know your compiler is doing it wrong.
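
A sketch of that chain in code (a generic example, not taken from any of the broken programs mentioned in this thread): the guard relies on overflow, overflow is assumed impossible, so the guard becomes dead code.

#include <stdio.h>

void add_one_checked(int x) {
    /* Intended as an overflow guard, but since signed overflow is undefined
       the optimizer may treat x + 1 < x as always false and drop the branch. */
    if (x + 1 < x) {
        fprintf(stderr, "overflow\n");
        return;
    }
    printf("%d\n", x + 1);
}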

>>869194

>>869210

>>869399

>hurr two pointers of different types that refer to the same address can't alias no matter what

Just because it's undefined behavior doesn't mean the compiler has to go full retard. In fact, if you supply the flag that makes the compiler define what was previously undefined, you will find that it's much more reasonable. All of a sudden, the compiler starts generating perfectly acceptable working code again.


 No.869406>>869418

>>869403

>Correct behavior is far more important.

You are using a language that's full of UB because otherwise it wouldn't be efficient enough when ported to other ass-backwards architectures. Correctness never mattered, only muh fast.


 No.869407>>869418

>>869403

You seem to be under the impression that C is a general-purpose language. It's not. It's specifically designed to run fast while providing the bare minimum of high-level functionality to warrant using it over assembly. If you want a language that's correct, use Ada or Fortran, but don't complain if you can't do certain things or it can't run quite as fast.


 No.869408>>869410 >>869415 >>869433

Remember children, write your language standards like the Forth one rather than like C.

C: this is 'undefined behavior' which means gcc devs are permitted to break into your house and rape your daughter and it's your fault, you fucking failure. Bad things happen because of original sin by which I mean nasal demons. The purpose of 'undefined behavior' is to punish bad and sinful people via utterly senseless and stupid 'optimizations' that might theoretically improve code automatically generated by a retard. Nasal demon worshippers pretend that this is where C's speed comes from.

Forth: this is 'system-dependent behavior' which means compilers should document the target-architecture-appropriate thing that happens. The purpose of 'system-dependent behavior' is to help programmers know when their code's behavior might vary from target to target, and when it won't.


 No.869410>>869414

>>869408

I wish this board had IDs, then I could just filter your posts. I mean it's entirely obvious that it's your posts without any IDs, but I'd appreciate the luxury of not reading any of that horseshit.


 No.869411>>869418

>>869403

>Just because it's undefined behavior doesn't mean the compiler has to go full retard

It is allowed to do so.

If you don't understand this fact and its implications, who is the retard here? :^)


 No.869414>>869420

>>869410

IDs are per-thread. That was my first post in this thread.

C is not fast because of dead code elimination. It's not 'low level, with lots of machine-specific behaviors' so that people can write code in it whose intent compilers then ignore. In a low-level language, if I write an assignment I want a fucking assignment to happen. You can use Mercury if you want the order of your operations to not matter except for data dependencies (fake data dependency in the case of I/O).


 No.869415>>869420 >>869421 >>869433

>>869396

Just so you know, the 0009 stuff doesn't compile on GCC now. It's an obvious error and has nothing to do with octal.

Also not sure how you think that switch statement is "solid".

>>869408

Eh, isn't that exactly the same idea?

>this code *could* work differently on some architectures

>hurr durr therefore we can make it work differently whenever we want

I'm not seeing how Forth would escape the same fate.


 No.869418>>869422 >>869471 >>870282 >>870364

>>869406

>>869407

Well, that's just not the case anymore. The code will still be amazingly fast even with no strict aliasing and two's complement integers. That's why I recommend that everyone use those flags by default; less undefined behavior in the language is a good thing.

If, after much profiling, you determine that some function can benefit from those optimizations, you can put it in a separate translation unit and then adapt your makefile to compile it separately without the flags. Easy.

Broken "optimized" code shouldn't be the default.

>>869411

>hurrr you can therefore you should

I can understand it just fine, moron. That's why I pointed out the solution to OP's problem and to the general problem of "compiler people live in their own idiotic dream world where completely broken code is considered correct because some standard says so". The flags force the compiler to define behavior the standard left undefined.

The compiler could just as easily have made those flags the default while hiding the aggressive optimization options that break stuff behind flags. It chose to generate broken code by default instead. It's imprudent, irresponsible and indefensible.


 No.869420>>869432

>>869415

In C, a switch statement is a prettified goto, and you should treat it as such. It does, however, come with the ability to compile into a computed jump table. That code is a perfectly valid way of using it.
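
To illustrate that reading, here's the Harbison/Steele switch from earlier hand-desugared into gotos (illustrative only, not compiler output; the label names are made up and the helpers are assumed to be declared as in the quoted code):

int prime(int x);
void process_prime(int x);
void process_composite(int x);

void classify(int x) {
    if (x == 2 || x == 3 || x == 5 || x == 7) goto known_prime;
    if (x == 4 || x == 6 || x == 8 || x == 9 || x == 10) goto known_composite;
    /* default: fall into the runtime test */
    if (prime(x)) {
known_prime:
        process_prime(x);
    } else {
known_composite:
        process_composite(x);
    }
}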

>>869414

Compiler optimization is by definition restructuring your code and removing unnecessary parts. If you don't want it, you should disable optimization.


 No.869421>>869423

>>869415

I blame the better-written standard for Forth escaping this fate in practice, but there's also

1. Forth isn't a popular intermediate target for other languages that generate lots of dead code because that simplifies their own compilers.

2. It isn't as prestigious to have commits to a Forth project (there's only one good open source Forth, and its devs already hate how gcc keeps jerking them around with performance regressions related to threaded code), so CS grads who think computers don't have bit-widths don't get brownie points for adding optimizations that will only ever A) do nothing at all, or B) break code, but that's OK because nasal demons. That's why C is fast, so deal with it.

3. Some massive faggot didn't make a joke about nasal demons in the Forth community.


 No.869422>>869424 >>869433

>>869418

It's literally their job to produce a compiler that follows the specs (the standard) to the letter. And there you are, blaming them for doing exactly what they're supposed to be doing, because it stops you from violating the specs.


 No.869423

>>869421

>nasal demons

>these shitty pastas

This is how I can tell you're samefagging.


 No.869424>>869429

>>869422

What I still don't get about this relentless stupidity is where it comes from. You come off like a shill, but nobody's paying you.

One theory: some fag knows C. Knowing C isn't impressive, to the point that it's assumed: if you know anything, you should at least also know C. This fag then, rather than try to impress through other knowledge, instead argues that C is actually difficult af.

Another theory: these people are all larpers.


 No.869429>>869431 >>869438

>>869424

Actually correct theory: you're a faggot autist sperging at everyone who doesn't scold C the way you do.

The C language has a standard, and compilers follow the standard. And that's that. If you don't like what compilers do, take it to the C committee or whoeverthefuck, because compiler makers won't do shit for you, not if it would violate the C specification.


 No.869431>>869444

>>869429

>complaining about nonsensical code generation is asking compilers to violate the standard

Even high nasal demon theory doesn't mandate that gcc devs rape my daughter. It just says they "can". I prefer that they compile my code properly like they did when RMS ran things.


 No.869432>>869437

>>869420


if(prime(x) || x == 2 || x == 3 || x == 5 || x == 7)
    process_prime(x);
else
    process_composite(x);

Isn't that the exact same thing?


 No.869433>>869440 >>869449

>>869415

>>869408

The idea of undefined behavior is not that it varies between implementations, it's that it's not even part of the language. Writing a program with undefined behavior in it just makes it a semantically invalid program.

It would be reasonable if C compilers generated code that did something reasonable in the face of undefined behavior, but that's not what happens. What they do is pretend that undefined behavior is literally impossible and optimize based on that. The compiler sees something that's undefined behavior, decides it's impossible and can't happen, and then deletes it on that basis. For example, it might optimize away your own check for integer overflow because the integer overflowing is undefined behavior. Then, after that code is gone, it might realize there's no point to the surrounding code and delete that too. And so on, and so forth.
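
Another common flavour of the same cascade, as a sketch (again generic, not from this thread): dereference first, check second, and the check plus everything it guards is allowed to disappear.

#include <stddef.h>

int read_len(const int *p) {
    int len = *p;      /* undefined behavior if p is NULL */
    if (p == NULL)     /* "can't happen" after the dereference above, so the
                          optimizer may delete this branch entirely */
        return -1;
    return len;
}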

That's why optimizers are regarded with healthy suspicion. They're trying to prove things about your program and optimize based on that. When you give them a general statement of fact like "X can't possibly happen, ever", it's like giving a chimp a machinegun.

>>869422

Nobody really cares about the specs. Specs don't compile or run code. The standard is just some platonic ideal that doesn't actually exist. GCC introduced a lot of extensions to the language, which are enabled by default. This "GNU C" is actually what a lot of people consider to be the C language, and it actually has a lot of useful features such as the ability to take the address of a label. Microsoft C compilers are in such a sad state it's not even funny. Plan 9 C compilers are also not shy about the fact they're compiling their own little derivative of the C language.

You need to literally and explicitly tell GCC to follow the standard rather than its own custom C language; other compilers might not even support the standard correctly. Not even GCC fully supports all the standards, as the feature matrices reveal. So don't come to me talking about how compilers follow the letter of the standard. They couldn't care less. The real reason they like to pretend they care is that it allows them to win their little benchmark competitions at the expense of your code.


 No.869437

>>869432

nvm I forgot about side effects in the if statement.


 No.869438

>>869429

>If you don't like what compilers do, take it to the C committee or whoeverthefuck, because compiler makers won't do shit for you, not if it would violate C specifications.

OR... OR... you can pass in a bunch of flags and it will kill off much of the optimizer-induced brokenness. The compiler programmers know they're breaking your shit on purpose; that's why they introduced those flags in the first place.


 No.869440>>869441

>>869433

Undefined behaviour != unspecified behaviour.


 No.869441>>869446

>>869440

Yeah? What's your point?


 No.869444

>>869431

Undefined behavior is when the compiler doesn't promise anything specific about the thing you're trying to do. So it can go about it one way or another, arbitrarily. Having edge cases like that undefined is part of the reason C is fast. And you, as a programmer, should avoid such edge cases.


 No.869446>>869456

>>869441

why are you in this thread?


 No.869449>>869456

>>869433

The behavior is undefined in the first place because it's impossible to resolve it reasonably. i = i++ + ++i, for instance. That's not to mention that writing "reasonable" code generators for all those edge cases would bloat the compiler immensely and would never cover 100% of them anyway.


 No.869456>>869476 >>869612

>>869446

Because I wanted to help people not get fucked by compilers? Do you actually have anything to say or do you just want me to go away? OP's problem is solved anyway, and not thanks to any of you pedantic standards-quoting "hurr its your fault" fucks.

>>869449

>autism over sequence points

Nobody actually writes code like that. People do write code with pointers aliasing other pointers regardless of types; it's a generally useful concept that gets a lot more use than cramming 5 increments into one sequence point. It's a much simpler problem too, and it can be solved by, for example, simply not assuming that multiple pointers to the same address can't alias. They're obviously aliasing each other; the compiler can see that but it ignores the fact because the standards allow it. That's indefensible.


 No.869471>>869542

>>869418

>The compiler could just as easily have made those flags the default while hiding the aggressive optimization options that break stuff behind flags. It chose to generate broken code by default instead. It's imprudent, irresponsible and indefensible.

It won't win compiler benchmarks then.


 No.869476>>869478 >>869542

>>869456

The compiler can't know if it's the same address, because to figure that out it would have to run the program it's trying to compile, which besides being an idiotic notion bites firmly into the halting problem; and even then there's no guarantee that it aliases in case A but not in cases B through Z. Doing runtime checks would destroy pointer performance. Smart people designed this shit; they thought of it already. Crack open a fucking math book, nigger.


 No.869478>>869542

>>869476

Also: if you need to alias incompatible types so badly, you can use char * as an intermediary. At compile time this shuffling is free, and char is automatically allowed to alias everything.


 No.869542>>869558 >>869573 >>869632 >>869638 >>869669

>>869471

But it will prevent security holes from being unknowingly introduced into perfectly good code that used to work but now doesn't because of the shiny new optimizations present in new compiler versions.

>>869476

How the fuck can it not know that the pointer you cast to another type aliases the original pointer?

unsigned long a;
a = 5;
*(unsigned short *)&a = 4;

How the fuck can it not know that the newly-created unsigned short * points to a? That's fucking stupid. It's statically verifiable, compilers already keep track of data much more complex than that. You expect me to believe the compiler can't do simple static analysis a human could do just by eyeballing the code? I'd rather believe it's because, as you said, it would "bloat" the compiler with "inelegant" case-specific solutions. You can't solve stuff like the halting problem in the general case with a general algorithm, but it turns out you can get pretty close in the real world.

Whatever you say Mr. "Math Wiz". Everyone should still disable strict aliasing anyway, "nigger". Just like the kernel people do. I seriously doubt you're smarter than Linus Torvalds.

>>869478

The types are not "incompatible". Every single type out there is a multiple of char by definition. Only packed structures can have odd sizes. Why is casting to char * OK but dereferencing an int8_t * cast undefined, even though CHAR_BIT == 8 on any system that touches the internet? Why can't you cast to multiples of char? Because some standard says only char * can alias other types? That's idiotic. Casting to char * just stops the compiler from assuming the pointers can't alias, precisely because char * is the only exception to the strict aliasing rule. Disabling strict aliasing does that for all types. It significantly improves the language.

Just disable the autistic compiler behavior and start programming in a better version of C that does exactly what you expect it to do instead of playing autistic games with the compiler writers to see who knows the standards best.


 No.869558>>869587

>>869542

That's a trivial scenario. Things more complex than that will quickly exhaust its ability to keep track. Then you'll have a situation where it works half the time and doesn't the other half, which is worse.

Char * was added as an exception because you javaniggers can't write code that doesn't violate aliasing rules. Also, char is not the same as int8_t, just like int is not the same as int64_t. Char is a type that's exactly one machine word; int8_t is a type that's exactly 8 bits. Just because most machines in practice use 8-bit words doesn't mean the distinction should simply be dropped. There are plenty of machines that use 16-bit words, and a number of odd ones that use 10-bit words and such. This isn't about you bitching and moaning that writing proper code is too hard for your brainlet self, it's about the principle that C as a language is not tied to specific hardware with specific quirks.


 No.869573>>869587

>>869542

Also, just to be clear: aliasing rules exist because this shit is wholly dependent on hardware quirks such as endianness. At which point you're jacking off hardware ports and not doing data processing, so you should be using assembly anyway, not C.


 No.869579

sum += ((uint16_t *) &f)[i];
should be
uint16_t tmp;
memcpy(&tmp, ((uint16_t *) &f) + i, sizeof(tmp));
sum += tmp;


 No.869586

>>868312 (OP)

GCC developers' decisions aside, you're a fucking moron, as evidenced by the fact that you didn't even bother trying to compile the code you're posting.


 No.869587>>869590

>>869558

>which is worse

Something that works many times is worse than giving up and saying "oh well we just can't solve this generally, let's just pretend it's impossible"? My sides.

>Char * was added as an exception because you javaniggers can't write code that doesn't violates aliasing rules

The rule sucks and programming is better, easier and safer without it. The restrict keyword is a far better solution to strict aliasing and its optimization problems just by virtue of being opt-in.

>Char is a type that's exactly one machine word

>implying CHAR_BIT == 64 on x86_64

God damn, you are confused. Char is not a word; it's the smallest addressable unit of memory, which is 8 bits in the vast majority of relevant architectures today. Everything else is literally expressed as multiples of those 8-bit chars. Saying the types are "incompatible" is just plain ignorant.

>There are plenty of machines that use 16 bit words, and a number of odd ones that use 10 bit words and somesuch.

So what? I seriously doubt this is the case here. Just because some dead architecture from the 80s used retarded shit like 7 bit chars doesn't mean we should still be making concessions because of that architecture in 2018. Nobody gives a single fuck, and if it's not x86_64 or ARM, chances are whatever architectural quirks unique to those platforms are irrelevant to the vast majority of C programmers today.

>This isn't about you bitching and moaning that writing proper code is too hard for your brainlet self, it's about the principle that C as a language is not tied to specific hardware with specific quirks.

Yeah you can fuck off with your "principles". Nobody really cares about supporting niche architectures they don't even have hardware for to begin with. This is the real world, not the platonic ISO world with its decades of deliberation.

>>869573

>Aliasing rules exist because the shit is wholly dependant on hardware quirks such as endianness

Unless the OP specifies otherwise, I assume he's targeting x86_64. Should he say otherwise, we'll discuss architecture-specific requirements like alignment and endianness.


 No.869590>>869602

>>869587

If you don't intend on writing portable code, use assembly, dipshit.


 No.869602>>869607

>>869590

>hurr just make things needlessly harder on yourself for literally no reason

Fuck off.


 No.869603

If you intend to write code, don't


 No.869607>>869609

>>869602

>i want to write bare metal code

>i don't want to use assembly

>the language i want to use won't let me write bare metal code


 No.869609>>869634

>>869607

>making some reasonable assumptions means your code is bare metal

>better switch to assembly and take an order of magnitude hit to productivity and maintainability

>all to please one pedantic standards-quoting autist on /tech/ who can't handle it when "javatards" disable strict aliasing and enjoy a better, safer language


 No.869612>>869664

>>869456

hey, the reason i had no input here is cause i dont know that much, but what that anon pointed out i understood. maybe try to be more clear next time


 No.869632>>869664

>>869542

>But it will prevent security holes from being unknowingly introduced into perfectly good code that used to work but now doesn't because of the shiny new optimizations present in new compiler versions.

It was, obviously, not perfectly good code. If it were perfectly good code, it would have followed the C standard. GCC would not mess things up if you followed the C standard.

Your code relied on GCC implementation details and it fucked you over.

>Everyone should still disable strict aliasing anyway, "**". Just like the kernel people do. I seriously doubt you're smarter than Linus Torvalds.

You are acting like people who do something just because Google does it. If something works for one project (the Linux kernel), it does not mean that a wholly different project would see the same benefit. Breaking the standards of your chosen language should not be taken lightly, because you might end up being dependent on just one compiler.


 No.869634>>869664 >>869672


>>869609

>i can't write software that's safe so i need to disable compiler optimizations


 No.869638

>>869542

>I seriously doubt you're smarter than Linus Torvalds.

I'm not, but I'm not smarter than the GCC developers either. You can find an intelligent informed believer for almost any opinion.

I agree with you otherwise.


 No.869664>>870361

>>869612

Are you OP? >>868354 pointed out the code works as expected with aliasing turned off.

>>869632

>muh standard

Perfectly good code works and makes sense. It's got nothing to do with a piece of paper. The only concern is getting the compiler to generate the code you want. The only reason the compiler failed to do so here is some arbitrary rule that says only char pointers can alias other pointers. We disabled that and it worked.

>You are acting like people who do something just because Google does it.

I referenced and quoted the mail where Linus explains _why_ he thinks type-based aliasing is garbage. I read the mail, considered it and decided I agree with him. No strict aliasing really should be the default for all compilers.

It's not just "hurr Linus said it so it must be true"; he presented good reasons why it's the case.

>Breaking standards of your chosen language should not be taken lightly because you might end up being dependent on just one compiler.

Are you seriously worried about depending on a GCC flag? GCC, the free software compiler that supports everything under the sun? This is not msvc we're talking about here.

>>869634

>letting the compiler delete your code based on unreasonable assumptions

You call this "safe code"? Many a security hole has been exploited due to these unexpected and unauditable optimizations.

>hurr but disabling strict aliasing disables optimizations

So what? You think the "portable" memcpy version or whatever other contortions you have to perform in order to maintain standards compliance are going to somehow be better optimized?


 No.869669>>869678

>>869542

>But it will prevent security holes from being unknowingly introduced into perfectly good code that used to work but now doesn't because of the shiny new optimizations present in new compiler versions.

You are free to create your superior C compiler. The whole world is waiting for you to save us.


 No.869672

>>869634

Kill yourself, dumb cuckchanner.


 No.869678

>>869669

I don't need to. I'm quite happy with GCC with the stupid-by-design feature turned off, thanks.


 No.870224>>871066

OP here. If you want to see how fucked this gets, consider this. Even if you try to communicate to the compiler, via a union, that types at the same memory address actually are at the same memory address so the optimizer should fuck off, it can still break the code if you pass the address between functions in the same translation unit and it inlines the code (which is completely up to the compiler, as "inline" is only a suggestion). Try changing the "#if 0" in this code to use the function that gcc will probably inline vs the one it's been forced not to inline. Totally different result.


#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#if 0
static uint32_t __attribute__ ((noinline)) calc(uint8_t *data, size_t size) {
#else
static uint32_t calc(uint8_t *data, size_t size) {
#endif
    uint32_t sum = 0;

    uint16_t *x = (uint16_t *) data;
    for(size_t i = 0; i < size / 2; ++i) {
        sum += x[i];
    }

    return sum;
}

struct Foo {
    union {
        struct {
            uint32_t a;
            uint32_t b;
            uint32_t c;
            uint32_t d;
        };
        uint8_t u8[16];
    };
};

uint16_t foo() {

    uint32_t sum = 0;

    Foo f;
    f.a = 0;
    f.b = 0;
    f.c = 0;
    f.d = 0;

    sum = calc(f.u8, sizeof(f));

    return sum;
}

int main() {
    uint16_t sum = foo();
    printf("%u\n", sum);
}


 No.870282>>870351 >>870794

>>869403

>Correct behavior is far more important.

>>869418

>less undefined behavior in the language is a good thing.

Then what the fuck are you doing using C in the first place? This language has literally the opposite design goals. Use the right tool for your job, faggot.


 No.870349>>870794

>>869403

>The optimizations you're talking about are of the dangerous, security hole-generating nature

No, not really.


int32_t x, y;
//...some other code...
if(x>0)
{
    y = (x+10)/4;
}

>but wait, compiler-kun! Divisions are expensive, can I count on you to optimise

(x+10)/4

>to

(x+10)>>2

>?

<but programmer-kun, that only rounds le result correctly when le dividend is positive! Surely, x is guaranteed to be positive inside that block, but that doesn't mean le x+10 is! Here, have this optimisation instead, it rounds correctly when le overflow happens!


if(x>0)
{
    int32_t t = x+10;
    y = t+(t>>31&3)>>2; //arithmetic shift, shift in the sign bit
}

>FFFFFFFFFFUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUU!!!!!!!!!!'

Don't enable well-defined integer overflows, it turns your compiler into a redditor.


 No.870351

>>870282

C was fine for decades before some faggot language lawyers decided that they could wreck a corner case via an autismal reading of the spec. Rather than doing what we did in the '90s, which was fix the compiler to work around perceived mistakes in the standard and then roll the fixes into the standard, they're implementing the mistakes as rigorously as they can. It's wrong and dumb.

That example code is literally 30 years old and only now has become a problem and only on one compiler.


 No.870361

>>869664

no, i am not OP. im just some loser who found this thread. i dont know a lot, but like i said, what he said made sense. your comment read like it was from someone who had very little to say, so i just thought you didnt know what you were talking about... but i see i was wrong. you should have added more to your comment to explain... sorry


 No.870364>>870794

>>869418

>Broken "optimized" code shouldn't be the default.

It isn't. You explicitly asked the compiler for -O2.


 No.870369>>870383 >>870794

I'm glad the gcc team doesn't work on the kernel. "We NEED to implement this security hole in IPv6 because it's in the standard!"


 No.870383>>870393 >>870493 >>870794

>>870369

The right thing to do is implement the standard security hole and also include a switch that will let the user turn it off.


 No.870393

>>870383

You obviously work at Microsoft.


 No.870395>>870400 >>870794

>>868312 (OP)

get ready... are you ready? are you sure?

https://www.rust-lang.org/en-US/

>mic drop

>walks away


 No.870400

>>870395

poor b8 tbh


 No.870404>>870408 >>870794

>>868344

And yet the Linux kernel still doesn't compile properly with clang because of all the gcc-specific shit he put in there...


 No.870408>>870432

>>870404

That's hardly intended though.


 No.870432

>>870408

failure to plan is planning to fail


 No.870462>>870494 >>870794

>>868312 (OP)

Give me a rundown of this thread.

What do I have to do to my CFLAGS to save my Autism Box™?


 No.870493

>>870383

I'm glad you don't work on the kernel, either. But thanks for sharing your bad opinions.


 No.870494

>>870462

-fboku-no-pico


 No.870794>>870797 >>870997 >>871026 >>871179

>>870282

>hurr why use C

Because it's good and has nice compilers with flags that let you turn off the retarded language defects. Without strict aliasing getting in the way, I can finally use things like uint8_t to safely alias every other type and be sure that it's unsigned, unlike char, whose signedness is implementation-defined.

>>870349

What? Doesn't that mean the code operates correctly? What's your point?

My experience with compiler overflow behavior involves loops: it assumed a condition was always true due to the lack of overflow or something, then deleted it and the surrounding code.

>>870364

So if you turn on optimizations you should expect the compiler to break your program? You think this is reasonable? How long until security best practices start banning optimization techniques? Some mobile platforms already banned dynamic recompilers. Is this the future? We can't have anything nice because of the autism of compiler writers and their broken standards?

>>870369

Thank god. GCC and glibc people also break binary compatibility between minor compiler/library versions all the time. If they were kernel developers, nobody would ever upgrade to new kernel versions. Those people simply don't give a single fuck about you or me or any of us. Torvalds also made hilarious comments on the matter. Wish I could find the emails.

GNU and its developers are cancer. The only reason their programs are still widely used is nothing better exists.

>>870395

Nah.

>>870383

Holy fuck, just NO. Standards suck, fuck them. Just do your own thing, make it good and make it compatible.

>>870404

So? Clang will support it eventually. One of their goals is compatibility with GCC. Besides, those are free software compilers. It's not like they're depending on msvc features.

>>870462

Should probably disable all optimization if you like security. Other than that, -fno-strict-aliasing, especially for software you're writing.


 No.870797>>870808 >>870820

>>870794

>Wish I could find the emails.

Found them.

http://yarchive.net/comp/linux/gcc_vs_kernel_stability.html

>The gcc people very much have a "Oh, we changed old documented behaviour - live with it" attitude

>together with "That was a gcc extension, not part of the C language, so when we change how gcc behaves, it's _your_ problem" approach

>So yes, there's a huge attitude difference.

>The gcc people have a BAD attitude.

>When the meaning of "inline" changed

>from a "inline this" to "hey, it's a hint"

>the gcc people never EVER said "sorry"

>They effectively said "screw you"

>I know this is why I don't trust gcc wrt inlining

>It's not so much about any technical issues

>as about the fact that the kernel tends to be a lot heavier user of gcc features than most programs

>and has correctness issues with them

>AND THE GCC PEOPLE SIMPLY DON'T CARE.

>Comparing it to the kernel is ludicrous

>We care about user-space interfaces to an insane degree

>We go to extreme lengths to maintain even badly designed or unintentional interfaces

>Breaking user programs simply isn't acceptable

>We're _not_ like the gcc developers

>We know that people use old binaries for years and years

>and that making a new release doesn't mean that you can just throw that out

>You can trust us

>THAT is what makes me worry

>I don't know if this is why Andrew doesn't trust inlining

>but I suspect it has similar roots

>Not trusting it because we haven't been able to trust the people behind it

>No heads-up, no warnings, no discussions

>Just a "screw you, things changed, your usage doesn't matter, and we're not even interested in listening to you or telling you why things changed"

>There have been situations where documented gcc semantics changed

>and instead of saying "sorry", the gcc people changed the documentation

>What the hell is the point of documented semantics if you can't depend on them anyway?

This is how the GNU faggots treat the goddamn Linux kernel developers. Our problems mean nothing to them.


 No.870808>>870819

>>870797

It's high time that Linux move away from GNU for good.


 No.870819>>870820 >>870843 >>872091

>>870808

I agree. "GNU/Linux" retards act as if glibc is some kind of gatekeeper for the kernel system calls. Picture related. It's all lies.

I often talk about scrapping glibc and coming up with a liblinux that provides access to all Linux system calls in the most low-level manner possible, enabling applications to simply target Linux directly with 0 dependencies.

It's not a new idea either:

https://lwn.net/Articles/711013/

https://lwn.net/Articles/711053/

>maybe the kernel developers should support a libinux.a library that would allow us to bypass glibc when they are being non-helpful

https://lwn.net/Articles/655028/

>he suggested creating another library specifically for them

>It would be called something like "libinux-syscalls" (so that one would link with "-linux-syscalls")

>Functions relegated to this library should be simple wrappers, without internal state, with the idea that supporting multiple versions of the library would be possible

The only reason a liblinux doesn't exist is apparently nobody stepped up and volunteered to do the work. It is quite possible though; here's a simple Linux-specific program that works without libc:

https://ideone.com/Oe5wuP
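
To give an idea, here's a minimal sketch of what such a libc-free, Linux-only program can look like on x86_64 (an assumed example, not the ideone snippet or my unreleased library; syscall numbers are from the x86_64 ABI, write = 1 and exit = 60, and you'd build it with something like gcc -static -nostdlib):

static long raw_syscall3(long nr, long a1, long a2, long a3) {
    long ret;
    /* x86_64 Linux syscall convention: number in rax, args in rdi, rsi, rdx;
       rcx and r11 are clobbered by the syscall instruction. */
    __asm__ volatile ("syscall"
                      : "=a"(ret)
                      : "a"(nr), "D"(a1), "S"(a2), "d"(a3)
                      : "rcx", "r11", "memory");
    return ret;
}

void _start(void) {
    static const char msg[] = "hello without libc\n";
    raw_syscall3(1, 1, (long) msg, sizeof msg - 1);  /* write(1, msg, len) */
    raw_syscall3(60, 0, 0, 0);                       /* exit(0) */
    __builtin_unreachable();
}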

I have a small liblinux library but it's very incomplete and only works on x86_64 and ARM. I won't release it in this state. Anyway, I'll probably be developing these ideas in the kill unix thread (>>870218). Here are some more resources in case anyone else is interested:

http://www.muppetlabs.com/~breadbox/software/tiny/teensy.html

https://filippo.io/linux-syscall-table/

http://man7.org/linux/man-pages/man2/syscalls.2.html

http://man7.org/linux/man-pages/man2/syscall.2.html

https://lwn.net/Articles/604287/

https://lwn.net/Articles/604515/

https://lwn.net/Articles/604406/

https://lwn.net/Articles/615809/

http://nullprogram.com/blog/2016/09/23/

http://nullprogram.com/blog/2015/05/15/


 No.870820>>870827

>>870797

>>870819

>29 Dec 2005

You all glow too much.


 No.870827>>870833

>>870820

If you have more recent articles/mails that either support or disprove my ideas, I welcome them.


 No.870833>>870853

>>870827

>If you have more recent articles/mails that either support or disprove my ideas, I welcome them.

I don't have time to waste on decade-old threads.

It's 2018; Torvalds would have ditched GCC a long time ago if it was that much of a problem, or if they hadn't changed/compromised.


 No.870843>>870853

>>870819

"liblinux instead of libc" to make your code unportable. wtf


 No.870853>>870926

>>870833

Who knows? Maybe they haven't switched because it's the best compiler available right now; clang, the only real alternative, doesn't support the GCC extensions. Maybe the GCC people cleaned up their act. Maybe one day Linus will get pissed off so much he'll make his own compiler and it will become the git of compilers. Or not. I don't know.

All I know is GCC, especially G++, still breaks ABI and they still have ludicrous program-eating optimizations, and glibc is still a major pain in the ass to upgrade to the point distros keep multiple versions because it's easier.

Doesn't seem things have changed. Maybe people just adapted. Just like us, with -fno-strict-aliasing.

>>870843

>muh portability

Portable software is lowest common denominator software. It supports only the features common to all platforms. Programs come out wrong and aren't really native to any one platform. They feel weird in the hands of users. Graphical programs suffer most from this. Only companies looking to cheapen software development could possibly like this.

Portability is overrated and a relic from the olden Unix days where there were tens of architectures and different commercial Unix vendors, all subtly incompatible with each other. That's the context GNU was born in. Now it's mostly just Linux and a couple dominant hardware architectures. You wouldn't lose much by targeting Linux directly; chances are you were going to use Linux anyway. Take advantage of all the great things Linux has to offer.

>but muh BSD

>but muh macintosh

>but muh Windows

If you actually care about those operating systems, go ahead and make specific versions of your programs for those operating systems. Use their system calls and features. Make use of everything they have to offer to make the application as great as possible. Don't stand on top of some lowest common denominator portable runtime just to make things easier for yourself. That's just lazy, and it means you don't actually care enough to make your program a first class citizen of those operating systems.

I cringe hard every time I see some crappy mingw port of some random Unix software. I often have to install an entire mingw/cygwin/msys2 environment just to compile that garbage. It doesn't even link to the correct msvcrt.dll. Only GNU faggots who don't _really_ care about Windows could possibly think these half-assed abominations are acceptable.


 No.870926>>870987

>>870853

Industry targets Linux directly. Almost all advanced networking functionality comes to Linux first and gradually is copied by the BSDs (mainly funded by Apple since they can't use Linux).


 No.870987

>>870926

Yes. That's exactly how things should be. If industry cares about Linux, then it should make Linux software and improve the Linux kernel. If what Linux has is so good that the BSD/mac OS/Windows people are interested in it, they can come and study the code and implement it themselves for their users.

The GPL gives everyone the freedom to study the software for a reason. Linux programmers shouldn't have to waste time and braincells figuring out a shitty way to support all 3 platforms simultaneously. The same is true for Windows developers or BSD developers. Mac OS developers already think this way and the applications there are better for it.


 No.870997>>871001

>>870794

>x is fine and good and I like it!

>x needs to become the opposite of what it is to be inclusive of my autismal opinions

literal SJW mindset, guys


 No.871001>>871026


>>870997

>using a compiler flag makes me a SJW

You're trolling me, right?


 No.871026>>871059

>>870794

>I can finally use things like uint8_t to safely alias every other type and be sure that it's unsigned, unlike char whosed signedness is implementation-defined.

What's stopping you from using unsigned char you insufferable faggot?

>What? Doesn't that mean the code operates correctly? What's your point?

The point is that you made the compiler produce bloated, sub-optimal code because of a retarded overflow corner case that will never happen in reality.

>So if you turn on optimizations you should expect the compiler to break your program? You think this is reasonable?

If your program invokes platform-specific undefined behavior, then yes, of course. If you rely on specific quirks of the platform you're targeting, it is your responsibility to nudge the compiler not to break your brittle kludges.

>How long until security best practices start banning optimization techniques?

I believe that's already the case. The most risky optimisations are only enabled starting at -O3 since the consensus is that you don't enable that unless you need every inch of speed and are willing to deal with the fallout. It would actually make sense IMHO to introduce an -Osec optimisation level that would enable every -O2 optimisation except those with a significant risk of widening bugs into exploitable security holes. We already have -Og with a similar reasoning behind it.

>>871001

Not using it; demanding that it be the default when it goes against the spirit of the language.


 No.871059>>871126

>>871026

>hurr why not use unsigned char

Because it's not the correct type, moron. When one wants to access data as if it was an array of octets, the correct type to use is pointer to uint8_t by definition and it's sad that one must go the extra mile and disable strict aliasing for this to work correctly. <inttypes.h> was created for a reason, and that reason is the regular implementation-specific data types of C had become a problem:

>This difference in int size can create some problems for users who migrate from one system to another which assigns different sizes to integer types

>because the ISO C standard integer promotion rule can produce silent changes unexpectedly.

>The need for defining an extended integer type increased with the introduction of 64-bit systems.

Char, short, int, and long are pretty much deprecated. For the love of god don't write new code using this shit in 2018. Even freestanding environments have access to <inttypes.h> so fucking use it. OP had the good sense to use these types and wrote perfectly good code but his reward was punishment in the form of some language-lawyering compiler "optimizing" his code.

Also, nobody actually uses unsigned char. That's the kind of garbage that people ask confused stackoverflow questions about. Then after some language lawyer shows up with his standards citations, everybody shares a moment of understanding then goes right back to dealing with all the code, functions and interfaces that have already been written using unqualified char and will never, ever be changed. Nothing can be done about this.

>bloated, sub-optimal code because of a retarded overflow corner case

So you're telling me the compiler was turned into a "redditor" because the code it generated works correctly in all cases? It's a "redditor" compiler because it generates correct but suboptimal code with a few extra instructions? And this is supposed to be bad? It's not bad, it's great. That's the exact kind of effect I was looking for when I enabled the flag. It should be the default. If, after much profiling, you find a hot segment of code where you _know_ overflow isn't a problem, you can compile it separately with the flag disabled and reap your speed gains.

>hurr signed overflow never ever happens in reality

You're full of bullshit.

>hurr undefined behavior means the compiler can rape your wife

It could just as easily do the safe and sensible thing instead.

>platform-specific undefined behavior

>specific quirks of the platform you're targeting

Undefined behavior is not platform-specific behavior. It'd be great if it was, because (1) something very specific would happen when you invoke it, and (2) the compiler would have to document exactly what the fuck happens in every supported platform. If that was the case, this thread might not even have existed. Even Wikipedia gets this right:

>Documenting an operation as undefined behavior allows compilers to assume that this operation will never happen in a conforming program.

There's even a simple example of signed overflow being assumed to be an impossible occurrence, the compiler deleting the overflow check and eliminating a conditional function call.

>The most risky optimisations

>fallout

Optimizations shouldn't involve risk calculation. What the hell is up with that? It should be absolutely safe to enable optimizations. Most people do as a matter of course, they want the sanic speed; if they didn't, they'd be writing Javascript.

"Consensus?" Only among compiler people. Having the behavior of your program change under your feet when you increase some arbitrary optimization level is ludicrously stupid, doubly so when it ends up inserting security holes into your program. There shouldn't even be different optimization levels each with some nebulous associated "risk", optimization should just be safe by default and that's it.

>-Osec optimisation level

This is a great idea. Good luck getting it into GCC.

>hurr it shouldn't be safe by default

>people should blacklist shitty compiler behavior instead of whitelisting the good behavior they do want

>MUH SPIRIT OF THE LANGUAGE

You're the reason people blame C for security holes.


 No.871066

>>870224

Yeah, I agree it's fucked. My guess is the compiler gives approximately zero fucks about your union because the function is supposed to take an uint8_t * rather than an actual struct Foo. So it's probably still optimizing based on the assumption that x can't possibly alias data even though you literally cast data to uint16_t * and assigned it to x. It probably deleted x[i] and all surrounding code because dereferencing x invokes undefined behavior and therefore it "can't happen".


 No.871126>>871172

>>871059

Why write int_fast32_t when you can write long?


 No.871165

Does anyone use -fno-strict-aliasing, -fwrapv, -fno-delete-null-pointer-checks, or other flags systemwide on Gentoo? What is the performance like?


 No.871172>>871329

>>871126

long isn't guaranteed to be 32 bits. It's guaranteed to be at least 32 bits.


 No.871179

>>870794

>So?

just saying its pretty ironic


 No.871204

You're invoking undefined behavior by casting to an incompatible type; mask and bitshift instead.



#include <stdint.h>
#include <stdio.h>

uint16_t foo() {
    uint32_t sum = 0;

    uint32_t f[4] = { 0, 0, 0, 0 };

    for(size_t i = 0; i < 4; i++) {
        sum += f[i] & 0xFFFF;
        sum += f[i] >> 16;
    }

    return sum;
}

int main() {
    uint16_t sum = foo();
    printf("%u\n", sum);

    return 0;
}


 No.871329>>871358

>>871172

same for int_fast32_t, moron


 No.871358>>871378

>>871329

Not him, but int_fast32_t at least tells you exactly what it is.


 No.871378

>>871358

It doesn't tell you it's long, though.


 No.872091>>872235

>>870819

Stop the autism and just use any other libc, like Alpine does with musl. The GCC thing is a completely different problem: Linux loves to depend on non-standard compiler behavior, so the roots of that Linus rant are also its curse; they can't move away from GCC since they depend heavily on its non-standard features.

I agree with the sentiment of moving away from GNU though.


 No.872235>>872280

>>872091

>Stop the autism and just use any other libc, like Alpine does with musl.

I like musl, especially the well-written source code, and think it would be an improvement if everyone used it. I just don't agree with this "kernel interface is libc" thing on principle.

The actual Linux kernel-to-user-space interface is the set of assembly entry points; they're architecture-dependent and programming-language-agnostic. They shouldn't be hidden away behind some libc whose maintainers don't even like adding Linux-specific system calls to its own API because of muh Unix compatibility. Also, why C? What about other languages? It's a polyglot world out there; they could easily use JIT compilation to dynamically generate code to perform system calls. Why must they be prevented from doing that just to avoid screwing up glibc's global state?

I think this deserves more attention.

>I agree with the sentiment of moving away from GNU though.

That's just the thing. If I ask for the manual pages about the Linux system calls, they give me the glibc pages that document the system call wrappers. It's insane if you ask me. It's about as insane as giving me documentation on systemd when I ask for the init system documentation. Why isn't there a page detailing the requirements Linux imposes on all init systems? I'm accessing the Linux kernel manuals yet it gives me user space manual pages.

These things are for some reason deeply entrenched in Linux. So entrenched that the kernel people apparently assume you're using them. It's kinda hard to move away from GNU when people program on top of glibc rather than Linux.


 No.872280>>872397

>>872235

People use GCC, glibc, and GNU extensions because they actually like these features. If they didn't think they were useful, they would obviously not be using them. Nobody is writing software in the C language without the background knowledge that GNU extensions to the C language are not officially ratified by the ISO C language body.


 No.872397

>>872280

I don't mind GNU extensions to the language. Actually I think they're extremely useful. I don't really care very much what C standards have to say, I care about what compilers do.


 No.875305

https://stackoverflow.com/q/48731306

>Signed integer overflow has undefined behaviour.

>The optimizer was able to prove that any value of i greater than 173 would cause UB, and because it can assume that there is no UB, it can also assume that i is never greater than 173.

>It can then further prove that i < 300 is always true, and so the loop condition can be optimized away.

Undefined behavior claims yet another unsuspecting victim.
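
For reference, a sketch of the kind of loop being described there (the shape is inferred from the quoted explanation, not copied from the linked question):

#include <stdio.h>

int main(void) {
    for (int i = 0; i < 300; i++) {
        /* 12345678 * 174 overflows a 32-bit int, so with signed overflow
           undefined the optimizer may assume i never exceeds 173, conclude
           i < 300 is always true, and drop the loop condition at -O2. */
        printf("%d\n", i * 12345678);
    }
    return 0;
}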



