
/tech/ - Technology


No.1039291>>1039302 >>1039316 >>1039609 >>1039855 >>1052686

Countering "Trusting Trust"

schneier.com/blog/archives/2006/01/countering_trus.html

web.archive.org/web/19961220000813/http://www.acm.org/classics/sep95/ (this is an archive of the dead link at the top of the article)

>Way back in 1974, Paul Karger and Roger Schell discovered a devastating attack against computer systems. Ken Thompson described it in his classic 1984 speech, "Reflections on Trusting Trust." Basically, an attacker changes a compiler binary to produce malicious versions of some programs, INCLUDING ITSELF. Once this is done, the attack perpetuates, essentially undetectably. Thompson demonstrated the attack in a devastating way: he subverted a compiler of an experimental victim, allowing Thompson to log in as root without using a password. The victim never noticed the attack, even when they disassembled the binaries -- the compiler rigged the disassembler, too.

>This attack has long been part of the lore of computer security, and everyone knows that there's no defense. And that makes this paper by David A. Wheeler so interesting. It's "Countering Trusting Trust through Diverse Double-Compiling," and here's the abstract:

>An Air Force evaluation of Multics, and Ken Thompson's famous Turing award lecture "Reflections on Trusting Trust," showed that compilers can be subverted to insert malicious Trojan horses into critical software, including themselves. If this attack goes undetected, even complete analysis of a system's source code will not find the malicious code that is running, and methods for detecting this particular attack are not widely known. This paper describes a practical technique, termed diverse double-compiling (DDC), that detects this attack and some unintended compiler defects as well. Simply recompile the purported source code twice: once with a second (trusted) compiler, and again using the result of the first compilation. If the result is bit-for-bit identical with the untrusted binary, then the source code accurately represents the binary. This technique has been mentioned informally, but its issues and ramifications have not been identified or discussed in a peer-reviewed work, nor has a public demonstration been made. This paper describes the technique, justifies it, describes how to overcome practical challenges, and demonstrates it.

There have been cases of this out in the wild, a high-profile one only a couple of years ago, but I'll have to dig it up. Most people, even in the security industry, don't know about this type of attack, and almost no one even tries to mitigate it. It's tough enough to get people, even developers, to compile their own code; but now they have to compile it from multiple, possibly new, compilers and then do extra work beyond that.
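For anons who haven't read the lecture, the whole trick boils down to two pattern checks buried inside the compiler. Here's a toy sketch in C; every name and detection string is made up for illustration, and the part where the patch reproduces itself quine-style is hand-waved as a comment, since that's the fiddly bit Thompson actually solved:

#include <stdio.h>
#include <string.h>

/* Toy model of the trusting-trust trick. "Compiling" here is just echoing
   the source through, but the two pattern checks are the whole idea.
   All names and detection strings below are invented for illustration. */

static int mentions(const char *src, const char *needle)
{
    return strstr(src, needle) != NULL;
}

static void compile(const char *src)
{
    fputs(src, stdout);                          /* the honest part */
    if (mentions(src, "check_password"))         /* looks like login.c */
        puts("/* injected: also accept the attacker's password */");
    if (mentions(src, "static void compile("))   /* looks like the compiler */
        puts("/* injected: re-insert both of these checks, quine-style */");
}

int main(void)
{
    static char buf[1 << 16];
    size_t n = fread(buf, 1, sizeof buf - 1, stdin);
    buf[n] = '\0';
    compile(buf);
    return 0;
}

Once you throw the patched source away and ship only the binary, every clean compiler you later build with it gets re-infected, which is why auditing source code alone never catches it.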

 No.1039301>>1039304 >>1039790 >>1055546

There was a Red Hat repo compromise about a decade ago where just the GCC packages were targeted. Everyone suspected it was a Thompson backdoor attack.


 No.1039302

>>1039291 (OP)

Even pretending this article has a shred of evidence, couldn't you just make your own compiler and assemble it by hand or with an assembler, use it to compile a modern compiler, then proceed to compile from that one?


 No.1039303>>1039750

How your compiler may be compromising application security

Researchers at MIT develop a tool to identify code that your compiler may inadvertently remove, creating vulnerabilities

archive.fo/NNele

>Compilers are great at taking your hand crafted human-readable program, translating it into machine code and, in the process, optimizing it so it runs as efficiently as possible. Sometimes, though, as new research from MIT points out, in their zeal to optimize your code, compilers can go too far and remove code that they shouldn’t, which can make the system or application more vulnerable.

i.e. compromised compilers can deliberately make software insecure, but with plausible deniability.

>Four researchers in MIT’s Computer Science and Artificial Intelligence Laboratory, in a paper which is to be presented next week at the ACM Symposium on Operating Systems Principles, looked at the problem of optimization-unstable code, which is code that gets removed by a compiler because it includes undefined behavior. Undefined behavior is code which can behave unpredictably, such as dividing by zero, null pointer dereferencing and buffer overflows. Unlike other code, compiler writers are free to deal with undefined behavior however they wish. In some cases, they choose to eliminate it completely, which can lead to vulnerabilities if the code in question contains security checks.

>The MIT researchers studied a dozen common C/C++ compilers to see how they dealt with undefined code. They found that, over time, compilers are becoming more aggressive in how they deal with such code, more often simply removing it, even at default or low levels of optimization. Since C/C++ is fairly liberal about allowing undefined behavior, it is more susceptible to subtle bugs and security threats as a result of unstable code.

>"As compilers improve their optimizations, for example, by implementing new algorithms… or by exploiting undefined behavior from more constructs (e.g., library functions), we anticipate an increase in bugs due to unstable code."

>The good news is the researchers have developed a model and a static checker for identifying unstable code. Their checker is called STACK, and it currently works for checking C/C++ code. The idea is that it will warn programmers about unstable code in their applications, so they can fix it, rather than have the compiler simply leave it out. They also hope it will encourage compiler writers to rethink how they can optimize code in more secure ways.

>STACK was run against a number of systems written in C/C++ and it found 160 new bugs in the systems tested, including the Linux kernel (32 bugs found), Mozilla (3), Postgres (9) and Python (5). They also found that, of the 8,575 packages in the Debian Wheezy archive that contained C/C++ code, STACK detected at least one instance of unstable code in 3,471 of them, which, as the researchers write, “suggests that unstable code is a widespread problem.”

>They’ve made STACK available for you to download and try. Go give it a try and see how much unstable code your application or system contains.
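A minimal example of the kind of check being described (my own illustration, not lifted from the paper): pointer arithmetic that overflows is undefined behavior in C, so the compiler is allowed to assume it never happens and silently drop the test.

#include <stddef.h>

/* Hypothetical bounds check of the sort STACK flags. Because buf + len
   wrapping around is undefined behavior, the compiler may assume it
   cannot happen and delete the first test entirely, even though the
   programmer clearly wrote it as a security check. */
int safe_to_copy(char *buf, size_t len, char *buf_end)
{
    if (buf + len < buf)        /* overflow check: may be optimized away */
        return 0;
    if (buf + len > buf_end)    /* ordinary range check: this one stays */
        return 0;
    return 1;
}

The robust version compares len against the space actually left, i.e. len > (size_t)(buf_end - buf), instead of doing arithmetic that can wrap.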


 No.1039304

>>1039301

Maybe that is what zeromq is for then. Backdoored the compilers decades ago to insert zeromq/TANGO code into everything. IDK though and have no proof.


 No.1039305>>1039310 >>1039322 >>1039350 >>1039469 >>1039799 >>1040578

> Simply recompile the purported source code twice: once with a second (trusted) compiler

Fucking genius. That's why these guys are working for the Air Force, and plebs like you and me are stuck here.

No seriously, a real solution would be to have a series of bootstrapping programs. The first would be a simple assembler written in machine code. The second would be a simpler compiler written in assembly. The third would be the complete language written in the simple language. To verify the program, you would first check that the machine binary operates as expected, then check that the assembly operates as expected, then check that the compiler works as expected.
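To make the bottom of that chain concrete: bootstrapping projects tend to start even lower than an assembler, with a hex loader small enough to check against the CPU manual by hand. Something like this (my own sketch, not any particular project's tool) is about as complicated as stage zero should ever get:

#include <stdio.h>
#include <ctype.h>

/* Minimal hex loader: turns a human-auditable hex listing on stdin into
   raw bytes on stdout. '#' starts a comment that runs to end of line;
   anything that isn't a hex digit is ignored. */
int main(void)
{
    int c, hi = -1;
    while ((c = getchar()) != EOF) {
        if (c == '#') {                       /* skip comment to end of line */
            while ((c = getchar()) != EOF && c != '\n')
                ;
            continue;
        }
        if (!isxdigit(c))
            continue;
        int v = isdigit(c) ? c - '0' : tolower(c) - 'a' + 10;
        if (hi < 0)
            hi = v;                           /* first nibble of the byte */
        else {
            putchar((hi << 4) | v);           /* second nibble: emit it */
            hi = -1;
        }
    }
    return 0;
}

Each stage up from there (the hand-written assembler, then the small compiler) only has to be checked against the stage below it.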

Actually, there was a blogpost I saw on hn that talked about an attack that subverted literally everything: the compiler, assembler, linker, then wireshark, so you couldn't see the packets it was sending out, then the router, so you couldn't check with that. They dragged out an old router from before the botnet and saw the packets being transmitted on it. Then they wondered why the botnet had given itself away in the first place, making them do all this debugging, and they realized it was using them for penetration testing: now it would patch the exploit they found and try again. Spooky


 No.1039309>>1039613 >>1040578

More on mitigation

dwheeler.com/trusting-trust/

David A. Wheeler’s Page on Fully Countering Trusting Trust through Diverse Double-Compiling (DDC) - Countering Trojan Horse attacks on Compilers

>Here’s information about my work to counter the “Trusting Trust” attack. The “Trusting Trust” attack is an incredibly nasty attack in computer security; up to now it’s been presumed to be the essential uncounterable attack. I’ve worried about it for a long time, essentially since Ken Thompson publicly described it. After all, if there’s a known attack that cannot be effectively countered, should we be using computers at all? Thankfully, I think there is an effective countermeasure, which I have named “Diverse Double-Compiling” (DDC).

>Fully Countering Trusting Trust through Diverse Double-Compiling (PDF version, HTML version, OpenDocument text version) is my 2009 PhD dissertation explaining how to counter the “trusting trust” attack by using the “Diverse Double-Compiling” (DDC) technique. This dissertation was accepted by my PhD committee on October 26, 2009.

Anyone have anything more recent than this decade old paper? I'm learning about mitigating this attack, which has bothered me severely for a long time. Anything on Diverse Double-Compiling being used, or anything on improved methods?


 No.1039310>>1039311 >>1039319

>>1039305

>all that effort

>not just using an oscilloscope and manually going through serial data on the line between the processor and RAM

y


 No.1039311>>1039314

>>1039310

1. it would be easier to use fancy tooling

2. they didn't expect things to be subverted so deep

also, how do you know the oscilloscope isn't botnetted too?


 No.1039314>>1039317 >>1039318

>>1039311

Because you can build the oscilloscope with an appropriate sampling rate yourself, dumbass. Then you can either decode the binary data on the line manually or make an SoC to do it that is separate from the scanner part of the scope.


 No.1039316>>1039350

>>1039291 (OP)

>everyone knows that there's no defense

Huh?


 No.1039317

>>1039314

You are essentially building a line of trust from the oscilloscope on up to the compiler and so on. Who the fuck is going to go through that effort when they can just write the software themselves for hardware that hasn't ever touched botnet software or can be nuked of botnet software? Or at least, who would go through that effort except 1337 hax0rs for ISIS and/or GCHQ and/or some other secret service.


 No.1039318>>1039322

>>1039314

>you can build the oscilloscope with an appropriate sampling rate yourself

Modern CPUs run at billions of hertz. Building such a thing at home is non-trivial.

>you can either decode the binary data on the line manually

Modern CPUs run at hundreds of millions of operations per second. A task that takes a tenth of a second is tens of millions of operations, which you propose decoding manually.

>make an SoC to do it that is separate from the scanner part of the scope

If you're using an off-the-shelf SoC, then you can't guarantee it isn't broken either. If you're making it at home... good luck.


 No.1039319

>>1039310

You are looking at $thousands, probably a lot more, to accurately measure at GHz bandwidth.


 No.1039322

>>1039318

You don't have to go through billions of instructions. Only use it for simple programs on a simple OS/software stack, which is still a lot of instructions, but you can underclock/undervolt a low-power processor to narrow it down and get repeatability. That simple software would have to be something like what >>1039305 described, eventually working up to something like LLVM/Clang or GCC.


 No.1039350>>1039453

>>1039316

Sorry for posting controlled people, but I was hoping someone would call me a faggot and post better articles. I think it was good enough to get the thread started, though.

So which compilers to use?

>>1039305

How to implement your techniques?


 No.1039453

>>1039350

>implement

Literally write your own compiler and stop being a nigger


 No.1039469>>1039750

>>1039305

For this to work you would also need to trust your firmware (leaving you back at stage 1), and your hardware (leaving you somewhere in the late 1970s)

A PDP-11 would probably be a good choice if we're talking about C code as in the Thompson hack for the following reasons:

>extremely well documented with full schematics, earlier models are implemented in 7400 series TTL so are open down to the gate level

>early UNIX tools would fit your requirements, especially those that were still written in assembly

>cross compilers exist for x86 and such

With ~12MHz CPUs, 4MB RAM and HDDs in the 10s of MBs, you could probably cross-compile a minimal *NIX for other architectures and slowly bootstrap the rest. We still have no way of checking any newer commercial CPUs for back doors though


 No.1039487

I read that article a long time ago and if I remember correctly, their solution is basically to use a trusted compiler. The only serious approach to a defense against the Thompson attack I know of is Ada, which gives you pretty massive control over the compiler to make binary audits easier. However I never really used that so a more knowledgeable adafag is probably better suited to explain the details of this.


 No.1039491

>>1039488

>waaah stop contributing to threads

>everyone I don't like is the same person

Get help.


 No.1039496

>>1039495

Seriously, get help.


 No.1039609>>1039616 >>1039781

File: matrixback2.jpg (1.22 MB, 1152x864)

>>1039291 (OP)

OP a good payload runs completely in memory and never touches the hard drive unless you instruct it to. A good payload will leave as little forensic evidence as possible.

Altering programs on the target machine will eventually bring attention to what the attacker is doing. Here are some thoughts though.

Why not just insert a few lines of code into the source code?

Going to the trouble of backdooring a compiler is not a bad idea. It suggests you already have code execution on the target machine. Sure, if the target is a developer/maintainer, you can backdoor an entire software repository and its entire userbase.

But if you have code execution on the target machine why not just replace the kernel?

I think what I'm getting at is that hacking can be a lot of work and there are methodologies that make it easier.

Here's the thing with a backdoored program. There's a cryptographic checksum of the legit program and people check them, or at least they should check them.

Heuristic detection should pick up programs that are doing what they are not supposed to be doing. Things like svchost running as user on Windows. Things that are a dead giveaway.

Anyhow there's a cool tool for backdooring binaries called Shellter. You can use this to backdoor a Windows binary. What it will do is read through about the first 100,000 instructions in the program and insert raw output from a metasploit payload.

https://www.shellterproject.com/

But really it's a matter of getting your instructions to execute on the target machine.

Linux Mint repositories were hacked a few years back and basically the entire distro was compromised. Not sure how the attacker got in or what kind of tricks he was using.


 No.1039613

File: ackerman.jpeg (10.86 KB, 300x168)

>>1039309

The 1337ness of this.


 No.1039616>>1039712 >>1039750

>>1039609

You are a faggot and a script kiddie. How do you know this all-encompassing malware didn't change the hash output of the *insert hash library here* because the compiler that compiled it was backdoored? You don't; it could change the hash output to be the same and you would never know the difference, short of examining every bit going across the line and checking the difference using a physical tool that also didn't report false output.

>I think what I'm getting at is that hacking can be a lot of work and there are methodologies that make it easier.

Think about this methodology. You hack a compiler to insert a backdoor running in 64KB of memory that can be accessed over every interface of every OS of that time. You did this decades ago, when no one cared about compiler security or looked for rogue programs running in RAM. From there you just kept the malware up to date with the latest and greatest of everything it could be accessed with, using your own computer that doesn't have that malware on it.

That's decades of computers hacked just because a compiler kept spreading your 64KB malware into everything it compiled, from one hack. All you had to do was make sure you could update that malware for a new OS (e.g. OpenBSD), a new architecture (e.g. NEON instructions in ARM), or up-and-coming detection software (e.g. antivirus or Wireshark). That malware is called TANGO/CORBA/ZeroMQ and very possibly has been in software for years.


 No.1039694

Fascinating topic OP. I'm reading through your paper and all the other articles anons have posted on this now.


 No.1039712>>1039713

>>1039616

>You are a faggot and a script kiddy.

No u.

You're the one imagining an attack that requires every single compiler in the world to be compromised in an inter-compatible way.


 No.1039713

>>1039712

>Missing the point this much.

Look at this glow-in-the-dark.


 No.1039750>>1039774 >>1051910 >>1051916 >>1062746 >>1062859

This kind of hack is another example of the monopolistic mentality and Bloatware is Proprietary. Weenies want one kernel (Linux), one instruction set (RISC-V), and one compiler (GCC) for everything. Unlike academic researchers or 1970s computer companies, there are no plans for the future beyond that. This monopolistic mentality creates a monoculture that lets these kinds of hacks spread to every computer without detection with no way to prevent these backdoors in the future. The culture of UNIX, which rewards shoddy hacks, also makes them easier to hide because any exploit can be considered a "clever" hack, just like ncurses has "clever" memory leaks.

Bloatware is Proprietary allows backdoors to hide in multi-million line programs without anyone noticing that the source code and object code don't match. Backdoors can be hidden in plain sight in the source code itself. Even worse is that shitty "languages" like C are so broken that nobody can tell whether a backdoor was an intentional "optimization" (another respectable CS term these weenies took a big shit on). How would you know if a debugger had a backdoor or just an "honest" bug?

>>1039303

The problem is that the C standards committee doesn't know what C is because it's too poorly specified. It's both underspecified in the sense that basic meaningful operations are undefined for no reason (well, it's because the standards committee was afraid to make anyone rewrite code, even though ANSI C needed a lot of changes anyway) and overspecified in the sense that it is designed specifically for a PDP-11 memory model and is difficult to run on segmented and tagged architectures, preventing 60 years of CS research from being used.

>>1039469

That hack was actually implemented in PDP-11 UNIX, which is just another reason to avoid anything UNIX. "V7" tools are also so shitty that you would be better off making your own OS. They actually give the PDP-11 a bad name, like UNIX does for all hardware it runs on. The weenies who copied the vacuum cleaner slogan "nothing sucks like a VAX" for DEC's computer weren't running VMS.

>>1039616

The solution is simply to have a variety of compilers and architectures based on a variety of real standards, not weenie "standards" based on whatever one OS, compiler, or browser does. This kind of hack wouldn't have been possible in 60s FORTRAN and COBOL simply because there were so many implementations.

Nice theory, but I'm afraid you are too generous. When I was
porting a Scheme compiler to the RT, I managed to make adb
-- nothing like a fancy, source-level debugger, or anything,
just dumb ol' adb -- dump core with about 6 or 7
keystrokes. Not arcane keystrokes, either. Shit you would
tend to type at a debugger.

It turned out that the symbol table lookup code had a
fencepost error that barfed if a particular symbol happened
to be first in the list. The C compiler never did this,
so... it's good enough for Unix! Note that the RT had been
around for *years* when I found this bug; it wasn't raw
software.

The RT implementation of adb also had the interesting
feature of incorrectly printing the value of one of the
registers (r0). After I had spent a pleasant, relaxing
afternoon ineffectively trying to debug my code, and
discovered this, I remarked upon it to my RT-hacking
friends. They replied, "Oh, yeah. The debugger doesn't print
out r0 correctly." In order to use adb, it seems, you just
had to know this fact from the grapevine.

I was much amused at the idea of a debugger with large,
obvious bugs.

I recently managed to wedge our laserwriter server by
queueing up a large number of short files to it. My
officemate flamed me for doing so. I replied, which led to
the following illuminating interchange:

>> How was I to know the idiot system would vapor-lock on me?

>> Because it's Unix?

And that about sums it up for Unix.

When the revolution comes, and the mob drags Bill Joy
screaming from his Ferrari at the factory gates, I, for one,
will not be there to urge clemency.


 No.1039774>>1039779 >>1039867 >>1052850

>>1039750

>someone makes a thread about a compiler/toolchain exploit that could happen on basically any operating system

>an exploit that could be defeated by reproducible builds (which already exist) and comparing the hash using a program from a different compiler (which also exist, there's more FOSS compilers out there than GCC and LLVM)

>IT'S UNIX I TELL YOU

>AND C

>AND NOT HAVING DIFFERENT COMPILERS FOR EVERY ARCHITECTURE

This post is so stupid I'm actually wondering if someone is collecting this fag's quote collection and impersonating him for laughs.


 No.1039779>>1039780

>>1039774

>This post is so stupid I'm actually

No, you're just too stupid to understand it.


 No.1039780

File: (you).png (67.62 KB, 635x627)


 No.1039781>>1039785 >>1039789

>>1039609

>developer gets compiler compromised

>checksums always look correct

That's the point of this thread, and you've missed it entirely. Unless the developer is trying to mitigate this attack (he's not), he's likely already been compromised. It's not standard practice yet to compile code using multiple compilers, which is why this threat happens to be so dangerous right now. Now, what if the software is proprietary? You literally have no recourse of any kind because you can't compile the code yourself. This means this attack is likely extremely widespread. This makes the checksum worthless in detecting that attack. How the fuck did you not get that all this was about needing more than checksums?

What happens if the developer of the compiler is a spook, or is compromised? Everyone using that compiler will be unknowingly aiding the spooks. For whatever reason, you're thinking in terms of targeted attacks against random nobodies, which literally never happens. The attacks that concern us are those trying to reach millions of people.

>but my antivirus will protect me

AV software works mainly on virus signatures, and is notoriously bad at anything but that. If the malware has not been detected by an actual person so its signature can be added to the AV, the infection will likely never be detected.


 No.1039785

>>1039781

>It's not standard practice yet to compile code using multiple compilers

It's fairly common for software targeting multiple platforms, but that's usually limited to GCC, LLVM, and (((MSVC))). Smaller compilers like TCC and QBE are less likely to be compromised but there's a performance tradeoff.


 No.1039789

>>1039781

>That's the point of this thread

And it's bullshit, because binaries will calculate the checksum correctly, any program compiled before the attack will checksum correctly, and because it's impossible to compromise all possible implementations of a certain function in a consistent way.

The entire thread is bullshit.


 No.1039790

>>1039301

>Red Hat

Figured easily.

(((it glows)))


 No.1039799

>>1039305

How was the router even subverted?

I think Wireshark could be, if it's on the same host, but if you have a logging router then that's where you can check, unless the malware is capable of checking the MAC vendor and suppressing the leak for the MAC vendors suspected of having modified firmware.

also that is some spooky shit considering how every communications commission around the world literally does stuff like this

remember the guys at defcon who hacked sim cards? they're gone now

they wouldn't want you looking at the shit inside so they put all these organized ecosystem for id-ing all citizens and technologies with beacons for human goyim

not even talking about jew bullshit here, just the entire mankind possibly was only meant for gold mining and nothing else. planted originally by alien archons because somehow they think earth is dirty and full of diseases that they might catch while also a spiritual energy loosh farm because spirits don't just vanish - it travels like any other particle

now you know why life is so frail and shit here

bioweapon makers are based


 No.1039855

>>1039291 (OP)

Your DNA is backdoored.


 No.1039857

>There have been cases out in the wild of this, a high profile one only a couple years ago, but I have to dig it up.

muh trusting trust isn't a groundbreaking attack. it's already an obvious consequence of running malicious code

>Most people, even in the security industry, don't know about this type of attack,

then why have I heard about it every fucking day for the last 15 years?


 No.1039867>>1039887 >>1040216

>>1039774

How does having reproducible builds close this vulnerability? I am a brainlet so no bully, but aren't the compilers themselves compiled? Potentially by another compromised compiler? Wouldn't the only solution be to have a chain of custody that deblobs everything to do with the compilers upstream?


 No.1039887

>>1039867

You have the original person who bootstrapped the compiler tell you what the hash should be. Now your problem is making sure your hardware can be trusted.
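In practice the check is nothing fancier than this (a sketch assuming OpenSSL's libcrypto, built with -lcrypto; sha256sum does the same job): hash the compiler binary you just bootstrapped and compare it against the hash the original bootstrapper published. And as anons said above, the hashing tool itself has to come from outside the suspect toolchain, or it can simply lie to you.

#include <stdio.h>
#include <openssl/evp.h>

/* Print the SHA-256 of a file so it can be compared against a published
   reference hash, e.g. the one given by whoever did the trusted bootstrap. */
int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) {
        perror(argv[1]);
        return 1;
    }

    EVP_MD_CTX *ctx = EVP_MD_CTX_new();
    EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);

    unsigned char buf[1 << 16];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        EVP_DigestUpdate(ctx, buf, n);

    unsigned char md[EVP_MAX_MD_SIZE];
    unsigned int mdlen;
    EVP_DigestFinal_ex(ctx, md, &mdlen);
    EVP_MD_CTX_free(ctx);
    fclose(f);

    for (unsigned int i = 0; i < mdlen; i++)
        printf("%02x", md[i]);
    putchar('\n');
    return 0;
}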


 No.1040086>>1040093

>modern era

>still falling for the compiler meme

just learn to compile in your head and write down the hex code bro


 No.1040093>>1040261

>>1040086

>reflections on trusting hex editors


 No.1040095>>1040110

What do you even compile the compiler with? Now they just do it with GCC, but they couldn't do that for the first version.


 No.1040110

>>1040095

You use an existing compiler. Generally the first compiler is written in an existing language. Once you have a working compiler, you can write a new compiler in the new language, and compile it with the old compiler.

See for example the history of T, which recounts the first version being written in maclisp: http://www.paulgraham.com/thist.html


 No.1040216>>1040314

>>1039867

If worst comes to worst, you could always write an assembler in machine code, then a C compiler in assembly, and go from there.


 No.1040261>>1040474

>>1040093

>reflections on trusting your own senses, brain, reality, and the universe


 No.1040314>>1040325

>>1040216

>then a C compiler in Assembly

The vulnerabilities are in C itself. You don't even need a backdoor when you have a language where the most basic standard functions are exploitable.
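Case in point, a toy example of my own: strcpy() (and friends like gets() and sprintf()) takes no buffer size at all, so the most ordinary-looking C is one long argument away from a stack smash, no backdoor required.

#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    char name[16];
    if (argc > 1) {
        /* strcpy() has no idea how big name[] is: any argument longer than
           15 characters overruns the buffer and smashes the stack. Nothing
           in the language or the standard library stops you. */
        strcpy(name, argv[1]);
        printf("hello, %s\n", name);
    }
    return 0;
}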


 No.1040325>>1040329 >>1040344

>>1040314

C is as exploitable as the programmer is retarded. But if you're so afraid, write a Rust compiler instead, or whatever meme language of the week it is.


 No.1040329

>>1040325

>C is as exploitable as the programmer is retarded.

So you're saying everyone who has ever used C is retarded, especially its creators. Why would I want to use a language created by retards?


 No.1040344

>>1040325

You have never written a nontrivial correct C program.


 No.1040474

>>1040261

Calm down, Descartes. Remember your cogito.


 No.1040578

>>1039305

>>1039309

> recompile the purported source code twice: once with a second (trusted) compiler

uhhh why not just compile it once with the trusted compiler?


 No.1041111

At some point you have to trust something; determining who/what you're placing that trust in, and what vulnerabilities/attack vectors could abuse that trust, is very important.


 No.1051873

bumping for interest


 No.1051890>>1051931

Maybe I misunderstand the paper, but for this to work doesn't your payload first have to spread to most compilers in use? Until most people's main compiler is actually compromised, there's a pretty big chance of being detected by programs from compilers that haven't been subverted yet. For example, people who go ages without updating their compiler.

If this attack had really happened it must have taken over the whole planet a long time ago, otherwise somebody would have detected it. But how come there's been no damage done by such a wide-scale subversion, no money missing, no networks mysteriously compromised? If you pwned the whole world would you really not do something big with it?

As others have pointed out, it's technically possible to write your own compilers from scratch and researchers actually do it. Some of them could have stumbled on the virus by accident. News of such widespread infection would be huge so we would all hear about it. Especially people working on low level stuff in esoteric systems where the payload isn't very compatible are likely to notice it. And in any case, doesn't the idea rely on some hacker decades ago coming up with a compiler virus that never screws up and manages to stay compatible with all new technology? Seems a bit much.

If one person tried this attack then they couldn't be the only one. The virus could subvert your software to prevent detection of itself, but would it also prevent detection of a rival virus? How would the programmer even know about the rival virus? What if the rival virus comes out after he started his infection? Just generally this seems like a very long term attack where a lot could go wrong and lead to detection. If it's such a great attack a lot of people would be doing it. Somebody would eventually screw up and get caught. Then security researchers would go hunting it in the wild. It doesn't sound that hard to detect if you know what to look for and if the virus hasn't completely infected every compiler out there yet. Plus you would expect there to be multiple viruses so you're implying they somehow maintain a perfect pact of silence with each other. How would they even communicate? It's not like there's a virus writer conference all hackers go to where everybody announces their latest virus.


 No.1051897

>Most people, even in the security industry, don't know about this type of attack

What the fuck are you talking about, everyone knows this, even fucking winfags, even a whole bunch of non-programmers know.

>no one even tries to even mitigate it.

You linked to people who were working on how to mitigate it...

>but now they have to compile it from multiple possibly new compilers and then do extra work beyond that.

No they don't as your solution isn't implemented yet.

I don't get your stupid whiny bitch post. Is this a very elaborate shill to turn nerds into defeatist console faggots doing mundane smalltalk on reddit and counting upvotes?


 No.1051906

...and whoever can hack the GCC devs/repos can probably also hack my desktop environment's, which also gets them root once I reboot.

Still something worth looking into.

Another mitigation method is what was invented (or made popular?) by the bitcoin community and is now also available for Tor Browser, and that is deterministic compilation.

Meaning you build something and check the hash and if the hash is good then you can start distributing what you got.


 No.1051910


 No.1051916>>1051923 >>1052850

>>1039750

tl;dr:

>This kind of hack wouldn't have been possible in 60s FORTRAN and COBOL simply because there were so many implementations

What makes you think that if FORTRAN and COBOL were still useful programming languages that hadn't died out, they wouldn't be as monopolized as the others are now?

And also a few years ago Apple was shilling their shitty clang compiler everywhere so it's not like there is only one.

There's also the trusted, reliable and verified Microsoft® Visual Studio™ and Borland C++ (if that's still alive).

It's also not like everyone compiles the very same version of GCC, so I don't know how much that attack even scales.

Imho there are way worse issues, like how Linus did nothing but talk shit when the deep state basically killed pax/grsecurity for normal users.

Or how Linus insists that broken sha1 is fine for git because, as he says, nobody puts binary data (images, etc.) in git repositories.


 No.1051923>>1051932

>>1051916 [continued]

In fact I think OP's "problem" might not even be an issue if Linus got his shit together.

Git commits can be PGP signed so if the latest commit is signed by NSA instead of GCC then I know I shouldn't compile it.

With sha1 however, NSA can modify any commit and keep a valid tree because git signatures sign the hash, not the diff.

All you get is some binary data in the modified commit, otherwise bruteforce would be too expensive.

Also: The demoscene will notice when their 4k files are 400k.


 No.1051931

>>1051890

It's a cautionary tale about trusting your compiler/chip architecture/OS. It's not referring to any contemporary attack of this nature. Its purpose is to get you to think about this kind of attack.


 No.1051932>>1051939

>>1051923

Then you just have to trust that only the right person has the key. The NSA could have a copy of it and you would never know.


 No.1051939

>>1051932

Yes, this is all nothing new. Snowden brought those thoughts into mainstream culture.

All sorts of obscure attacks are possible.

OpenVAS told me to firewall certain types of ping packets because people could get my exact system time which makes it easier to bruteforce pseudorandom numbers.

I still don't understand what you want to tell us.


 No.1051940>>1052200

For people who are interested in it.

A processor is not a trusted black box for running code; on the contrary, modern x86 chips are packed full of secret instructions and hardware bugs. In this talk, we'll demonstrate how page fault analysis and some creative processor fuzzing can be used to exhaustively search the x86 instruction set and uncover the secrets buried in your chipset.

https://www.youtube.com/watch?v=KrksBdWcZgQ


 No.1052200>>1052206

>>1051940

We could all go back to 68k...


 No.1052206

>>1052200

Of course it's possible but people just don't care. People are too addicted to their high performance processors to downgrade to lower speed chips. I've got a Lemote Yeelong 8101B for the sake of my freedom and a Thinkpad x240 as my low priority shitposting machine.


 No.1052686>>1052697

>>1039291 (OP)

Terry solved this backdoored-compiler problem.

You simply roll your own. You write your own compiler, and your own OS.

Problem solved. QED.


 No.1052697>>1052703

>>1052686

The trusting trust problem also includes the possibility of the processor itself being malicious to the owner. This scenario is referenced in the paper.


 No.1052703>>1052738 >>1052873

>>1052697

Which is probably why Terry never implemented networking.

Building your own CPU is tough though. Sub-micron lithography isn't as cheap and easy as 3D printing.

Universities and others do have lithographic systems that can be used by students however.

They are usually terribly outdated, but they are probably a more secure option than multi-project wafers, although the latter is still better than buying mass-produced crap.

https://en.wikipedia.org/wiki/Multi-project_wafer_service


 No.1052738

>>1052703

Terry didn't implement networking because God told him not to do it. God also told him to use 16 colors and a very low screen resolution.


 No.1052850>>1052861 >>1063210

>>1039774

It really is a UNIX and C problem. POSIX had 1100 "functions" plus dozens of commands and tons of other bullshit. It's a random pile of shit AT&T got standardized in order to collect licensing fees. It was successful at that purpose, since all sorts of non-UNIX companies like DEC and IBM started selling UNIX to comply with the "standard," but it sucked for users. With a good OS or programming language, you can read the history and papers and discover why they chose to do things one way over another, but with UNIX all the reasons are brain damaged bullshit like putting binaries with the home directories in /usr because /bin ran out of space. What really sucks about UNIX is that most of the bad things in UNIX were originally good things in another OS, but the UNIX implementation is so bad that even non-weenies think the whole idea is bad. It's not just OOP or binary files. I learned from the Microsoft article on "fork" that it was originally from Project Genie, which did it better. The one thing I thought was a real innovation of UNIX (albeit a bad one) turned out to be yet another misimplementation of something in a better OS.

>Although the term originates with Conway, the first implementation of a fork operation is widely credited to the Project Genie time-sharing system [61]. Ritchie and Thompson[70] themselves claimed that Unix fork was present “essentially as we implemented it” in Genie. However, the Genie monitor’s fork call was more flexible than that of Unix: it permitted the parent process to specify the address space and machine context for the new child process [49,71]. By default, the child shared the address space of its parent (somewhat like a modern thread); optionally, the child could be given an entirely different address space of memory blocks to which the user had access; presumably, in order to run a different program. Crucially, however, there was no facility to copy the address space, as was done unconditionally by Unix.

That's another problem with UNIX and the "minimalism" philosophy. Everything is less flexible than the operating system it was originally from. It used to be that new versions of something were more flexible and more powerful. Now things get less powerful and remove features but still get more bloated.

>>1051916

>What makes you think that if FORTRAN and COBOL were useful programming languages that hadn't died out they wouldn't be as monopolized as is the case with others now?

Because the languages were simpler.

If a vendor decides to do something about the crass
inadequacies of UNIX we should give them three cheers, not
start a flame war about how the DIRECTORY command *must*
forever and ever be called ls because that is what the great
tin pot Gods who wrote UNIX thought was a nice, clear name
for it.

The most threatening thing I see in computing today is the
"we have found the answer, all heretics will perish"
attitude. I have an awful lot of experience in computing, I
have used six or seven operating systems and I have even
written one. UNIX in my view is an abomination, it has
serious difficulties, these could have been fixed quite
easily, but I now realize nobody ever will.

At the moment I use a VMS box, I do so because I find that I
do not spend my time having to think in the "UNIX" mentality
that centers around kludges. I do not have to tolerate a
help system that begins its insults of the user by being
invoked with "man".


Apollo in my view were the only UNIX vendor to realize that
they had to put work into the basic operating system. They
had ACLs, shared libraries and many other essential features
five years ago.


What I find disgusting about UNIX is that it has *never*
grown any operating system extensions of its own, all the
creative work is derived from VMS, Multics and the
operating systems it killed.


 No.1052861>>1063197

>>1052850

Why wouldn't you put system resources for UNIX in UNIX System Resources? You seem to be the one with brain damage.


 No.1052873>>1052876

>>1052703

>Building your own CPU is tough though. Sub-micron lithography isn't as cheap and easy as 3D printing.

Use 7400 series TTL or CPUs that have been extensively reverse-engineered like the 6502.


 No.1052876>>1052890 >>1055547

>>1052873

What are you going to do? Post to 8chan from an NES? Couldn't really do anything contemporary with an old processor like that, but I am not saying demoscene stuff isn't cool and technically challenging.


 No.1052890>>1052894

>>1052876

I'm sure there are some autists out there using a text browser to post from an NES.

>you can't really do anything with an old processor like that

Yes you can, you can play games on it for example. What you mean to say is that your selection of modern-ish software to use on it would be extremely limited, since most programs are bloated and unoptimized pieces of shit.


 No.1052894

>>1052890

Do you not know what the word 'contemporary' means? Pull your head out of your ass.


 No.1053549

bumpy


 No.1055546

>>1039301

>be government spy contractor

>hack yourself so when somebody discovers your botnetted warez you can blame the hackers

classic


 No.1055547

>>1052876

I can post with Seamonkey on OpenBSD 6.4 and a Thinkpad X21, it's not even slow or anything.


 No.1062746>>1063197

>>1039750

this is completely disconnected from UNIX or C, you schizo, it affects everything

take meds


 No.1062859>>1063197

>>1039750

Those kinds of bugs predate UNIX, you MIT reject

https://www.multicians.org/security.html


 No.1063197>>1063210 >>1063302 >>1063367

>>1052861

Because that's not what it means. It's revisionist bullshit to pretend /usr wasn't really a disk for user home directories. One of the things I hate about UNIX culture is that they shit on real history and hate when people know the truth. It's not enough that they made a shit OS and a shit language, they have to pretend better ones didn't actually do what they really did. That's why they say the hierarchical file system came from UNIX and not Multics, because it would get people looking into Multics and comparing it with UNIX. They made up this myth that Multics died in the 60s and didn't do anything besides influence UNIX.

http://www.unix.org/what_is_unix/history_timeline.html

>Computer aided design, manufacturing control systems, laboratory simulations, even the Internet itself, all began life with and because of UNIX systems.

That's all bullshit, and that's the official UNIX site. For some reason, only UNIX weenies do this, then they say truth and historical accuracy don't mean anything when you call it out. Another thing they do is only talk about popularity and how widespread UNIX is. Never once do they mention technical aspects, like how some innovation in C or UNIX improved programming or made things easier or more reliable. Their whole "argument" is that UNIX is "good" because it's popular and replaced operating systems and languages that were originally used for the same things, then they complain when anyone wants to replace C and UNIX with something better.

>>1062746

Monopolies and monocultures are bad for security, but especially when they're based on an insecure OS written in a known bad language. Computer companies used to have multiple hardware systems each with multiple operating systems. It would be a lot harder to "take down the Internet" when people aren't all running the same thing.

>>1062859

I posted that myself once to show that the "Thompson" hack was before Ken Thompson. That's why I mentioned multiple compilers and FORTRAN and COBOL. Real standards are about having multiple implementations and not designing a language that would prevent hardware innovation. Weenie "standards" are about making multiple implementations harder and forcing everyone to use one single implementation. It's happening for Linux and Chrome.

       Raise your hand if you remember when file systems
had version numbers.

Don't. The paranoiac weenies in charge of Unix
proselytizing will shoot you dead. They don't like
people who know the truth.

Heck, I remember when the filesystem was mapped into the
address space! I even re<BANG!>


 No.1063210>>1063367


 No.1063302>>1063367

>>1063197

There's nothing better.


 No.1063367

>>1063197

based

>>1063210

based

>>1063302

unbased and LARP



