/tech/ - Technology


File: steve klabnik.jpg (384.3 KB, 1400x933)


 No.981391>>981432 >>981679 >>982107 >>982723 >>982788

Should you learn C to “learn how the computer works”?

>I’ve often seen people suggest that you should learn C in order to learn how computers work. Is this a good idea? Is this accurate? I’m going to start with my conclusion right upfront, just to be crystal clear about what I’m saying here:

>C is not “how the computer works.”

>I don’t think most people mean this phrase literally, so that is sort of irrelevant.

>Understanding the context means that learning C for this reason may still be a good idea for you, depending on your objectives.

>I plan on making two follow-up posts as well, exploring more implications of this idea, but this is already quite a lot. I’ll update this post with links to them when I make them.

https://words.steveklabnik.com/should-you-learn-c-to-learn-how-the-computer-works

 No.981409>>981419

C is shit


 No.981410>>981413

Haha I saw that on Hacker News too OP ;) what's your username?


 No.981413

File: steve klabnik 7.jpg (119.92 KB, 1024x683)

>>981410

I'm Steve Klabnik


 No.981415>>986752

based steveposter fighting the echo chamber


 No.981419>>981423

>>981409

For string handling, yes. Otherwise gas yourself.


 No.981423

>>981419

What about memory handling?


 No.981429>>981431

Steve will chop his dick off and xis suicide commit will be pushed to master.


 No.981431

>>981429

>master

It's mistress now.


 No.981432>>981435 >>981454

>>981391 (OP)

>Should you learn C to “learn how the computer works”?

What fucking retard ever said "learn C to learn how a computer works"?


 No.981435>>981544 >>981574

>>981432

Literally all the LARPers on here. That's why I wrote this.


 No.981438>>981441 >>981442 >>981523 >>981605 >>981924 >>982013 >>982014 >>982087 >>982093 >>982270 >>982331 >>984436 >>986752 >>986773 >>986784 >>986818

File: SleepingTerryDavis.png (173.55 KB, 548x410)

HolyC is the only language that one should bother learning, so fuck off with all your retarded useless languages


 No.981441

>>981438

Sleep Tight Terry


 No.981442

>>981438

Sleep Tight Terry


 No.981443>>981574

>Which programmers don’t understand how computers work?

>There’s just one problem with this: C also operates inside of a virtual machine.

>If C is like a version of assembly language that works on multiple computers with multiple architectures, then it cannot function exactly how each of those computers work simultaneously.

There is such a thing as anti-knowledge.


 No.981454

And here I was thinking C started as a joke among mates

>>981432

Never heard that either, avoiding idiots has its perks


 No.981455>>981490 >>981499 >>981513 >>981522 >>981606 >>982708

OP didn't even read the fucking article apparently. Here are some snippets, because OP can't fucking focus for shit.

> From the C99 spec, Section 5.1.2.3, “Program Execution”:

> "The semantic descriptions in this International Standard describe the behavior of an abstract machine in which issues of optimization are irrelevant."

> In my opinion, this is the most important concept to understand when learning C. C does not “describe how the computer works,” it describes how the “C abstract machine” works. Everything else of importance flows from this concept and its implications.

Note that when he says "C operates in the context of a virtual machine", he doesn't mean a sandbox, and he doesn't mean a bytecode interpreter. An execution model specified independently of hardware specifics (like the ISA) is a "virtual machine" in this sense. Every non-assembly language whose specification describes execution is describing such a machine.

> What do people actually mean? In the context of “should a Rubyist learn C to learn about how computers work”, this desire to drop “down to the metal,” as it were, is an interest in understanding not only their program, and how it works inside the VM, but to understand how the combo of their program + their VM operates in the context of the machine itself.

>

> Learning C will expose you to more of these kinds of details, because the abstract machine maps so much more closely to hardware, as well as abstractions provided by operating systems. C is very different than these sorts of languages, and so learning C can teach you a lot.

> But it’s also important to remember that C is fundamentally an abstraction of hardware, and abstractions are leaky. Don’t conflate what C does or how it operates with the machine itself.

Steve Klabnik is a fag, but OP is a retard who can't read. The entire point of the post is that learning C will not teach you how computers work. It will teach you more about how computers work than a higher-level language will, but C is still heavily abstracted from the hardware. C doesn't teach you shit about registers, handling interrupts, setting up stack frames by hand, manipulating the program stack directly, or anything like that. C will not teach you how the computer works.

Steve is right in this case.
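
A minimal sketch of the abstract-machine point (standard C; how aggressively the check gets removed depends on the compiler and flags, so this is illustrative, not guaranteed):

#include <limits.h>

/* On the C abstract machine, signed overflow is undefined, so an
   optimizing compiler is allowed to assume x + 1 > x always holds
   and reduce this whole function to "return 1" -- regardless of what
   the two's-complement hardware underneath would actually do. */
int will_not_overflow(int x)
{
    return x + 1 > x;
}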


 No.981470

>"learn how the computer works"

>spends the whole time talking about hardware

When people talk about "how the computer works", they mean the operating system and operating-system-level components. This sometimes means the hardware, insofar as operating systems are designed to work on a small number of architectures (sometimes only one). It also means OS-level APIs, e.g. graphics, I/O, IPC, etc. In higher-level languages, these concepts are abstracted away into generic OS-agnostic interfaces. In C, you are encouraged to use the low-level concept directly, allowing you to learn how the high-level interfaces are implemented, and how the operating system really works.
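
For instance, a minimal sketch assuming a POSIX system (the path is just an example; higher-level languages wrap these same calls in something generic):

#include <fcntl.h>     /* open              */
#include <unistd.h>    /* read, close       */
#include <stdio.h>

int main(void)
{
    char buf[256];
    int fd = open("/etc/hostname", O_RDONLY);    /* OS-level file descriptor */
    if (fd < 0)
        return 1;
    ssize_t n = read(fd, buf, sizeof buf - 1);   /* raw read(2) system call  */
    if (n > 0) {
        buf[n] = '\0';
        printf("%s", buf);
    }
    close(fd);
    return 0;
}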


 No.981490

>>981455

how about looking into what computers even are.


 No.981499>>981513 >>981520 >>981584

You should learn C to learn how modern computers with flat untagged memory work.

>>981455

A very insightful contrarian comment about the nature of C appears in an interview with Alexander Stepanov (designer of the C++ STL) by Al Stevens in Dr. Dobb's Journal (3/1995) (http://www.sgi.com/tech/stl/drdobbs-interview.html):

"Let's consider now why C is a great language. It is commonly believed that C is a hack which was successful because Unix was written in it. I disagree. Over a long period of time computer architectures evolved, not because of some clever people figuring how to evolve architectures---as a matter of fact, clever people were pushing tagged architectures during that period of time---but because of the demands of different programmers to solve real problems. Computers that were able to deal just with numbers evolved into computers with byte-addressable memory, flat address spaces, and pointers. This was a natural evolution reflecting the growing set of problems that people were solving. C, reflecting the genius of Dennis Ritchie, provided a minimal model of the computer that had evolved over 30 years. C was not a quick hack. As computers evolved to handle all kinds of problems, C, being the minimal model of such a computer, became a very powerful language to solve all kinds of problems in different domains very effectively. This is the secret of C's portability: it is the best representation of an abstract computer that we have. Of course, the abstraction is done over the set of real computers, not some imaginary computational devices. Moreover, people could understand the machine model behind C. It is much easier for an average engineer to understand the machine model behind C than the machine model behind Ada or even Scheme. C succeeded because it was doing the right thing, not because of AT&T promoting it or Unix being written with it." (emphasis added)


 No.981500>>983224 >>983226

>hmm i wonder how i can get all these bigots to stop rebelling against sjws?

>I KNOW I'LL TRASH THEIR CULTURE AND FUCK WITH THEM

we'll never stop.

you'll lose.

get used to it.


 No.981513

>>981455

>>981499

You're both right. C arose at a time when multiple machine architectures existed, but there was no easy way to port code from one architecture to another. C was designed as a light abstraction layer above this differing hardware, with the compiler being responsible for translating from the abstract C machine to the target machine.

Then we went and built different C libraries for the different OSs, and reinvented yet another problem from our professional past.

> inb4 C wasn't first...

Yes I know. I'm deliberately skipping them to keep this short and focused.


 No.981520>>981541 >>981549

>>981499

>machine model of C is easier to understand than Ada and Scheme

Scheme and other functional languages are by design more abstract than languages like C. If you already "know how a computer works" then something like Scheme or SML is going to look alien at first, and it's difficult to reason about how the machine executes your code.

Ada is also more abstract than C, with things like parameter modes, thick pointers and array indexes. But in contrast to Scheme-like languages it is still fairly easy to reason about how the machine would execute it; e.g. iterating over an array and updating values should compile down to almost the same assembly, even though with Ada you can use


for x of foo loop
   x := x + 1;
end loop;
-- or
for I in foo'Range loop
   foo(I) := foo(I) + 1;
end loop;

Now the question is what the supposed benefit is of C where you do pointer arithmetic instead of working with indexed arrays, given that they compile to near identical code.
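
For comparison, a minimal C sketch of the same loop both ways (function names made up for illustration); with optimization on, a modern compiler will typically emit near-identical code for either form:

#include <stddef.h>

void bump_indexed(int *foo, size_t n)
{
    for (size_t i = 0; i < n; i++)
        foo[i] += 1;                 /* array indexing */
}

void bump_pointer(int *foo, size_t n)
{
    for (int *p = foo; p != foo + n; p++)
        *p += 1;                     /* pointer arithmetic */
}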


 No.981522>>981536 >>981606

File: thus always to tyrants.png (315.88 KB, 618x948)

>>981455

Basically right. Steve is also basically right, but phrased his argument in a way designed to induce controversy, not to facilitate understanding (he probably did this deliberately because he's a fag.)


 No.981523

>>981438

F

Sleep Tight Terry


 No.981536>>981538 >>981584

>>981522

>Steve is also basically right

For Steve to be right, his point has to be so fucking retarded that it's mind boggling, something like "computer languages are for humans, not machines". Y'know why? Because this is typical lefty-style propaganda.

>but phrased his argument in a way

<java java java virtual machine C is definitely bad java java maybe C is not that bad java JVM virtual machine C runs on a virtual machine java java C might not run on an actual virtual machine but java virtual machine java

It's like you've never seen propaganda before.


 No.981538>>981551

>>981536

>For Steve to be right his point is so fucking retarded that it's mind boggling,

No, he's just stating an obvious, boring truth using deliberately controversial language. Contrived controversy is a great way to promote your faggot blog and act like you're really smart and great. You having such a strong negative reaction to it is exactly what he wanted.


 No.981541>>981556

>>981520

You can use array subscripts in C too, you know. But the point is that pointer arithmetic isn't black magic the way array indexing can look, since it's what's really happening when you use those convenient brackets.

But I really find pointer arithmetic neat.
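
A minimal sketch of what the brackets expand to (standard C defines a[i] as *(a + i), so all of these touch the same element, even the silly i[a] form):

#include <assert.h>

int main(void)
{
    int a[4] = {10, 20, 30, 40};
    int i = 2;
    assert(a[i] == *(a + i));
    assert(a[i] == i[a]);    /* addition commutes, so this is legal too */
    return 0;
}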


 No.981544

>>981435

Literally who?


 No.981546>>981547 >>981617 >>982758 >>983191

Alright riddle me this /g/. In most modern languages you can turn off safety features like array bounds checking and null checks using compiler flags. So why not use a modern language like swift or rust instead of C? Both of them support imperative programming and have nicer features. You can be just as close to the metal in swift and rust. C has an ancient build system, it's bloated, and you don't even have RAII. It seems you should only use C if you're working on legacy code.


 No.981547

>>981546

Use whatever you want to for your own projects. And be assured that everybody else is going to use whatever they want to. Stop worrying so much about it. If you spend your time trying to get everybody on the same page, you are wasting your time.


 No.981549>>981561

>>981520

Anybody who's confused by scheme probably has brain damage. It's one of the easiest languages to learn ever made. Certainly easier than C (not that C is very hard either.)


 No.981551>>981554

>>981538

>controversy

lemme check the niggersphere. Nope upboats everywhere. You think upboats come from people who think he's actively deceitful or from people who socially cuck themselves into thinking he's genuine?


 No.981554

>>981551

The "upboats" come from people who think he's saying something novel or interesting instead of banal, due to the way he deliberately presented the banal as something other than banal.


 No.981556>>981562 >>981574

>>981541

>array indexing is black magic.

Not really since the behavior, at least in Ada, is quite well defined.

I don't understand why people accept abstractions like if/then/else and loop constructs without second thought, but things like indexed arrays and strings are seen as the spawn of satan.


 No.981561>>981574

>>981549

You didn't understand the point. With C it's a lot easier to guess what assembly will be produced, even in complex cases. For a dynamically typed garbage-collected language? Fuck no.


 No.981562>>981613

>>981556

>I don't understand why people accept abstractions like if/then/else and loop constructs without second thought, but things like indexed arrays and strings are seen as the spawn of satan.

Or switch, which will often work in a way that's completely unexpected to novice programmers (who expect that it always compiles down to an if-else-if chain).

Basically it's because people are dumb. They don't recognize their own biases, and frequently aren't aware of how little they know.


 No.981574>>981999

>>981435

I haven't seen anybody recommend C for that purpose here; I say this because I shill assembly/emu dev for that purpose. The norm here is Scheme for that, are you sure you are on the right imageboard?

>>981443

Web developers, and people involved in that entire ecosystem sure don't.

>>981556

People accept if/else/etc because of >>981561.


 No.981584>>981595 >>981620 >>981936 >>981999

AT&T shills knew that C sucked, so they made up bullshit about it being "portable assembly" and this is one of the reasons why people think computers and software have to be so complicated and that bugs and exploits are normal. AT&T shills blamed everything C couldn't do on the hardware. C doesn't have nested functions because the PDP-11 didn't. C doesn't have array bounds or multi-dimensional arrays or slices because the PDP-11 didn't. C doesn't have garbage collection because the PDP-11 didn't. Lisp machines have all this, but the PDP-11 didn't. Then UNIX companies like Sun created RISCs which are designed around C and UNIX, so they could blame their new hardware for why C sucks and C for why their hardware sucks.

>>981499

>Over a long period of time computer architectures evolved, not because of some clever people figuring how to evolve architectures---as a matter of fact, clever people were pushing tagged architectures during that period of time---but because of the demands of different programmers to solve real problems. Computers that were able to deal just with numbers evolved into computers with byte-addressable memory, flat address spaces, and pointers.

That's revisionist bullshit. The evolution into the PDP-11 style of hardware happened before C and it was just one kind of machine. String handling does not need byte-addressable memory. More advanced hardware was motivated by more powerful languages and more possibilities for speeding up programs and simplifying software.

>C, reflecting the genius of Dennis Ritchie, provided a minimal model of the computer that had evolved over 30 years. C was not a quick hack. As computers evolved to handle all kinds of problems, C, being the minimal model of such a computer, became a very powerful language to solve all kinds of problems in different domains very effectively.

This is more revisionism. C was a quick hack and it also sucked even more before ANSI. Most C weenies idolize C because they don't know anything else, or if they do, it's a UNIX language like Java or awk.

https://www.bell-labs.com/usr/dmr/www/chist.html

>This is the secret of C's portability: it is the best representation of an abstract computer that we have. Of course, the abstraction is done over the set of real computers, not some imaginary computational devices.

Tagged and segmented architectures aren't imaginary. Too many computer companies went out of business and everyone left has a monopolistic mentality. Everything now has to be "compatible" which means running 50 million line browsers and bloated C++ compilers.

>Moreover, people could understand the machine model behind C. It is much easier for an average engineer to understand the machine model behind C than the machine model behind Ada or even Scheme.

C and C++ are so complex and poorly defined that everyone has to oversimplify and misunderstand everything. Even comp.lang.c and the standards committee get confused.

>C succeeded because it was doing the right thing, not because of AT&T promoting it or Unix being written with it.

To me, "doing the right thing" means working correctly, signalling errors, and so on. UNIX weenies think the "right thing" is popularity and popularity is the only measure of value, which sucks. They need to defend everything wrong with C because losing popularity would mean C is worse.

>>981536

>something like computer languages are for humans not machines.

That's absolutely true. Computer design used to be about how humans use computers, and it always should be. Programming languages are for people to read. Tagged architectures were built to make computers better for people. UNIX's tape drive emulators (pipes and byte sequence files) have almost nothing to do with how we use computers today and lead to enormous waste in code, memory traffic, CPU usage, and so on, but they fit the PDP-11 and how it was used. Programs have tens of millions more lines of code than they need because they're based on "cobbled together gunk" that has nothing to do with how we use computers.

   Hey. This is unix-haters, not RISC-haters.

Look, those guys at berkeley decided to optimise their
chip for C and Unix programs. It says so right in their
paper. They looked at how C programs tended to behave, and
(later) how Unix behaved, and made a chip that worked that
way. So what if it's hard to make downward lexical funargs
when you have register windows? It's a special-purpose
chip, remember?

Only then companies like Sun push their snazzy RISC
machines. To make their machines more attractive they
proudly point out "and of course it uses the great
general-purpose RISC. Why it's so general purpose that it
runs Unix and C just great!"

This, I suppose, is a variation on the usual "the way
it's done in unix is by definition the general case"
disease.


 No.981586

First half of the article is only about staying true to the clickbait. Don't waste my time and get to the point, everyone knows C is not asm.


 No.981595>>982690

>>981584

>speeding up programs and simplifying software.

Read "fixing my broken language for lazy bums who want an overly complex machinery doing their own malloc/free at runtime". Once you accept that only compile time garbage collection is okay, you'll be liberated from your ivory tower of lambda calculus retardation.


 No.981605

>>981438

Stealing this

Sleep Tight Terry


 No.981606>>982690

>>981455

>Steve is right in this case.

He's stating the obvious by attacking a strawman in a quite obnoxious way. Even when I started learning C and knew nothing, I knew what an assembler was and I knew that I wasn't writing machine code directly; I had to compile the stuff, which automatically made me think about that. No one can be that stupid. C, though, is a good compromise between being an autist and starting with assembler, and using Java and dealing with object orientation and all that modern bullshit that is really useful in software development but not that interesting when you're starting out. Having pointers and memory management in C will indeed teach you something about 'how a computer works'. Or actually about how a computer doesn't work when you fuck things up.
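
A minimal sketch of the kind of thing C makes you handle by hand (assumes a hosted implementation; the values are made up):

#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    int *p = malloc(4 * sizeof *p);   /* ask the allocator for 4 ints      */
    if (p == NULL)
        return 1;                     /* allocation can fail; check it     */
    for (int i = 0; i < 4; i++)
        p[i] = i;
    printf("%d\n", p[3]);
    free(p);                          /* forget this and you leak;         */
    /* p[0] = 7;                         touch p after free() and it's UB  */
    return 0;
}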

>>981522

Accurate.


 No.981613>>981623

>>981562

>Or switch, which will often work in a way that's completely unexpected to novice programmers (who expect that it always compiles down to an if-else-if chain).

Hmm, if I use a switch statement I expect it to be fast by not staggering through that chain. It should be a lookup table. I wonder if bytecode languages handle it the same or if it all gets either optimized or mangled to whatever in the process.


 No.981617

>>981546

>swift or rust

yeah sure, I'll use the apple language. Not too sure about who's behind rust either. C on the other hand is ancient. There's the gcc we've all known and used for years. And there's a C compiler for everything, whereas until a few weeks ago I didn't even have a Rust compiler on my machine. (Some package I installed needed it to be compiled.) Same thing with PHP when it comes to web stuff. It runs basically everywhere, it can do everything I'd ever need a website to do, and it's pretty open, whereas I'm always a little suspicious when it comes to Oracle.


 No.981620>>981930

>>981584

somebody screencap this, it doesn't fit on my poorfag screen.


 No.981623>>981632 >>981643

>>981613

>Hmm, if I use a switch statement I expect it to be fast by not staggering through that chain. It should be a lookup table.

Most novice programmers get this wrong. Ask around and see for yourself.


 No.981632>>981643

>>981623

Yeah, I think at some point even one of my professors said that. I wouldn't know either if it wasn't for Terry.


 No.981643>>981657 >>982751

>>981623

>>981632

You know that not all switches get compiled into jump LUTs, right? A sparse one like


switch (i)
{
case 0:
    func1();
    break;

case 100000:
    func2();
    break;
}

won't, for example. But yeah, you're supposed to use it when you know the compiler will not produce a chain of branches.
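
And a dense one that a compiler can typically turn into a jump table (same fragment style as above; func3/func4 are made-up placeholders like func1/func2, and whether a table is actually emitted is still up to the compiler and optimization level):

switch (i)
{
case 0: func1(); break;
case 1: func2(); break;
case 2: func3(); break;   /* dense 0..3 range: a jump table is likely */
case 3: func4(); break;
}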


 No.981657

>>981643

>Or switch, which will often work in a way that's completely unexpected to novice programmers (who expect that it ALWAYS compiles down to an if-else-if chain).

It helps to read.


 No.981666>>982758

I remember I liked this book.


 No.981679>>981731 >>982690

>>981391 (OP)

I'm with Linus Torvalds on this: "even if the choice of C were to do *nothing* but keep the C++ programmers out, that in itself would be a huge reason to use C."

http://harmful.cat-v.org/software/c++/linus

C++ and object-oriented languages do more to teach you bad habits on how to avoid programming properly than help you in any way. Once you try to write a proper efficient program, you're frequently back to doing things the C way. If you take a C programmer but make him do OOP bullshit, the C programmer will still manage a decent job. But if you take a OOP programmer and ask him to write C, it will be a fucking disaster and you will realize what a clueless monkey you are dealing with. And when you want a OOP programmer to write efficient software, he's gonna have to resort to doing things the C way more instead of leaning on all those layers of abstraction and bullshit frameworks (and those layers of abstraction and frameworks also contribute wonderfully to bugs and unreliable behavior). And that's a big reason why people should learn to program in C instead of C++ and other BS.


 No.981731

>>981679

baste and redpilled


 No.981757

ok ok, learn C and how GCC works, as well as assembly/machine language.


 No.981855>>981915

You should learn Forth and assembly for your arch, tbh. And start with something simple, like a Z80.


 No.981889

No. C is too high level, it'll teach you nothing.


 No.981915>>981940 >>982758

>>981855

Why the Z80? I don't even think PolyFORTH was ported to the Z80. Stop LARPing and pretending you're a forther, and just use gforth on <your os>.


 No.981924

>>981438

Sleep Tight Terry


 No.981930>>982121

File: I'm the smartest programme….png (150.99 KB, 1472x847)

>>981620

Here you go.


 No.981936>>981971 >>982690

>>981584

>Everything now has to be "compatible" which means running 50 million line browsers and bloated C++ compilers.

Imagine being exactly as retarded as UNIX weenies, with the same disdain for backwards compatibility, aka "making software that stays useful and reliable"


 No.981940>>982018

File: front_300.jpg (3.8 MB, 2839x1406)

>>981915

Dumb nigger, every 80's computer had several Forth implementations. Z80 also has the advantage of tons of software via CP/M.


 No.981971

>>981936

>making software that stays useful and reliable

If they actually did this you wouldn't need backwards compatibility.


 No.981999

File: steve klabnik a.jpg (61.78 KB, 1280x720)

>>981574

>I haven't seen anybody recommend C for that purpose here

That's because you are a newfag.

>The norm here is scheme for that, are you sure you are on the right imageboard?

This is /tech/, not /g/. (((Scheme/LISP))) niggers are not allowed here.

>>981584

based


 No.982013

>>981438

Sleep Tight Terry


 No.982014

>>981438

Sleep Tight King Terry ;_;


 No.982018>>982100

>>981940

CP/M isn't what I think of when I think Forth. The only Forth I know for the Z80 is CamelForth. If you want to learn Forth and a CPU, learn the 6502.


 No.982087

>>981438

Sleep Tight Terry


 No.982093

File: DjI_ZtuWwAgpoRJ.jpg (92.38 KB, 977x543)

>>981438

Sleep Tight Terry

Press F


 No.982100>>982277

>>982018

Well that's one of them, but you didn't look very hard. That CDROM cover I posted is a hint of the vast amount of tools you can get from CP/M (in addition to whatever "native" software is available for any given Z80 computer, like say the Amstrad CPC/PCW series). And it just so happens, those systems have an 80-column screen, which is quite a bit more comfy for programming than a VIC-20 and its 22 columns. But otherwise it doesn't really matter which 8-bit computer you use, because they're all simple and a good starting point to learn the hardware.


 No.982107

>>981391 (OP)

I know javascwipt Im basicwy a pwofwessional pwoggrammaw I'm going to wite evewything in nodejs even my computer apps in node-webkit :)


 No.982121

>>981930

thanks fren


 No.982270

>>981438

Sleep tight Terry.


 No.982277

>>982100

I didn't look, I was going off memory.


 No.982331

>>981438

Sleep tight Terry


 No.982690>>982692 >>982728 >>982885 >>983071

>>981595

>Read "fixing my broken language for lazy bums who want an overly complex machinery doing their own malloc/free at runtime".

Tagged architecture has many serious advantages. All data structures can be used no matter where they came from because they share the same type system. That's great because it means you don't have to care about what language or garbage collector other people use. Not having to malloc/free is just one of the advantages.

>>981606

>No one can be that stupid.

UNIX weenies are brain damaged.

https://www.quora.com/Why-is-the-C-language-taught-if-the-future-is-Java

>C is though a good compromise between trying to be an autist and starting with assembler or using Java and dealing with object orientation and all that modern bullshit that is really useful in software development but really not that interesting when you start out.

C teaches how C works. C shits on the last 60 years of compiler technology (even though it's not that old), so you will have no idea what compilers and programming languages can actually do.

>Having pointers and memory management in C, that'll better indeed teach you something about 'how a computer works'. Or actually about how a computer doesn't work when you fuck things up.

Understanding how a GC moves objects and pointers teaches you a lot more about memory management than malloc and free. There are also a lot of languages that have pointers and the equivalent to malloc and free but don't suck.

>>981679

>C++ and object-oriented languages do more to teach you bad habits on how to avoid programming properly than help you in any way. Once you try to write a proper efficient program, you're frequently back to doing things the C way.

That's because C++ sucks. The "C way" is the shitty way. People in the 80s and 90s were calling C++ "weakly object-oriented" but AT&T shills blamed OOP because blaming C for why C++ sucks would be bad for their business. OOP that was good meant different languages like Smalltalk and CLOS in Common Lisp. Common Lisp actually does have all the code reuse benefits that OO proponents said OOP has. Java is an improvement over C and C++, which is all it was meant to be.

>And when you want a OOP programmer to write efficient software, he's gonna have to resort to doing things the C way more instead of leaning on all those layers of abstraction and bullshit frameworks (and those layers of abstraction and frameworks also contribute wonderfully to bugs and unreliable behavior).

Those are C++ problems, not OOP problems. OOP is about not needing frameworks and reducing bugs. Computer scientists used to say that most design patterns and frameworks are to make up for flaws and deficiencies in the language.

>And that's a big reason why people should learn to program in C instead of C++ and other BS.

There are more languages than C and C++.

>>981936

>Imagine being exactly as retarded as UNIX weenies, with the same identical disdain of backwards compatibility aka "making software that stays useful and reliable"

Browsers and Linux are not useful and reliable. They constantly need more and more millions of lines of code because they're badly designed, but they still can't do what mainframe OSes did in the 60s, like file versions and proper error handling instead of panic. UNIX linkers still can't handle anything more than PDP-11 assembly could. UNIX "design" is based on bullshit like running out of disk space on a PDP-11 (/usr) and using a tape drive archive for code libraries ("ar" format), so UNIX weenies assume that everything is only used because of popularity and "backwards compatibility" and not because it's actually good, or even that it can be good.

The fundamental design flaw in Unix is the asinine belief
that "programs are written to be executed by computers
rather than read by humans." [Now that statement may be
true in the statistical sense in that it applies to most
programs. But it is totally, absolutely wrong in the moral
sense.]

That's why we have C -- a language designed to make every
machine emulate a PDP-11. That's why we have a file system
that forces every file to be viewed as a sequence of bytes
(after all, that's what they are, right?). That's why
"protocols" depend on byte-order.

They have never separated the program from the machine. It
never entered their tiny, pocket-protectored with a
calculator-hanging-from-the-belt mind.


 No.982692>>982803

>>982690

>They have never separated the program from the machine. It

>never entered their tiny, pocket-protectored with a

>calculator-hanging-from-the-belt mind.

And thank god it didn't! Because otherwise it wouldn't be fun to do anymore.


 No.982708>>982803

>>981455

>The entire point of the post is that learning C will not teach you how computers work. It will teach you more about how computers work than a higher level language, but C is still very abstracted from the hardware. C doesn't teach you shit about registers, dealing with interrupts, dealing with manual stack frames, manual manipulation of the program stack, or anything like that. C will not teach you how the computer works.

The reason C is so ubiquitous is that it allows for manipulating exactly what you blatantly say it cannot. C is also considered dangerous for this very reason: extreme freedom.
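
In practice this is done through compiler extensions and system-specific headers rather than the ISO language itself. A minimal sketch (assumes GCC or Clang on x86/x86-64; __asm__ is a compiler extension, not standard C):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t lo, hi;
    /* rdtsc: read the CPU's time-stamp counter into EDX:EAX.
       Inline assembly like this is a GCC/Clang extension. */
    __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
    printf("%llu\n", ((unsigned long long)hi << 32) | lo);
    return 0;
}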


 No.982711>>982727

Why does nobody recommend learning Verilog to learn how computers work?


 No.982718>>982724

So what is the alternative to C? I have yet to find a programming language that is not an over-engineered piece of cancer that never stops growing. Literally everything is bloated with (((enterprise))) anti-features or has a shit ecosystem. Show me a simple programming language that is at least comparable to C in simplicity, speed and ecosystem size, and I will never write a line of C again.


 No.982723>>982803

>>981391 (OP)

>C is not “how the computer works.”

C closely resembles algebra. If people understand algebra, they'll pick up C more quickly than OOP languages.

Captcha:plebQ.


 No.982724>>982726

>>982718

> ever again

Times change. C wasn't so popular in the 80's, and probably won't be in 30 years.


 No.982726>>982734 >>982803

>>982724

I see no language existing today replacing C as the de facto lingua franca of programming and computers in general. If anything, programmers will revert/resort to programming assembly to the specific make and model of processor. I don't know if Ritchie or Brian W. Kernighan would object to a student learning the electronic switches necessary to operate a computer effectively.


 No.982727

>>982711

...because Verilog has nothing to do with how computers work?


 No.982728

>>982690

Quite frankly, any object system short of CLOS is just pathetic.


 No.982734>>982743 >>982755 >>982758

>>982726

Yeah but nobody knows what's going to happen in 30 years. Maybe the world will have moved to JavaScript machines or whatever (not necessarily Lisp machines, but the same concept).


 No.982743>>982758

>>982734

JavaScript machines? There's even more that can go wrong than in C!


 No.982751>>982789

>>981643

It could be a LUT where the index is divided. If you had a lot of cases with constant distance like 100 200 300 400...


 No.982755

>>982734


> nobody knows what's going to happen in 30 years.

Because nobody knows, you cannot argue that point. I say that most programmers know C, and that leads to little cost. I do not think a job programming C would be a poorly paid job.


 No.982758>>982779

>>981546

C and C++ have a lot of tools and third-party libraries, unlike fag languages like rust and swift, or some meme languages that nobody uses. C and C++ also have better performance than rust and swift.

>>981666

That's a good book!

>>981915

Z80 is popular and simple.

>>982734

>>982743

>tfw computers themselves become based on shitty frameworks, like (((electron)))


 No.982774>>982803

It might be a stretch to say C makes you "understand computers", but I think it helps to understand other languages. I started out with C++, and my understanding was surface level. Like, what the hell does #include <iostream> even mean? Why does std::cout have such weird syntax? What even is std::? Don't even think about what a string is, just make a variable of the string type and hope it works.

Eventually I ended up playing with C because that's what the devkitpro compiler for Wii homebrew uses. Later on I dabbled in HolyC which legitimately helped. It really helps you to understand better when you don't have all these classes and weird operator overloads confusing you when you're just trying to learn what's actually going on. People who start with C++ or java tend to become the "mindlessly copy/paste from stack overflow" type of programmer. In other words, pajeet.


 No.982779>>982795

>>982758

Z80 is fun too! Z80 assembly was actually the first language I ever learned.


 No.982788

>>981391 (OP)

I think people who say 'learn C to learn how computers work' should be saying 'learn C to learn how computer software works'. Computers are much more than software.

Software is just a layer of abstraction on computers. The most incomprehensible stuff, at least to normies, is in the circuitry.


 No.982789

>>982751

I doubt compilers do these kinds of checks, though.


 No.982795>>982800 >>986868

File: 692.jpg (53.71 KB, 500x478)

>>982779

Some people tell beginners to start with x86 ASM; however, I DISAGREE. The Z80 doesn't have any dumb crap that gets in your way and the ASM code is readable. If you like games (L-Liking to play games every now and then doesn't make me a man-child, right?), then learning GameBoy ASM might be fun. Play around with arbitrary code execution bugs in Pokemon Red (https://glitchcity.info/wiki/Arbitrary_code_execution#In_Generation_I), or something. The CPU used in the GameBoy is a bit like a synthesis of the Intel 8080 and the Z80. Read the Pan Docs (http://gbdev.gg8.se/wiki/articles/Pan_Docs) if you are interested in GB ASM.

Learn Z80 ASM

http://sgate.emt.bme.hu/patai/publications/z80guide/

http://www.chibiakumas.com/z80/multiplatform.php

http://www.z80.info/lesson1.htm


 No.982800

>>982795

I strongly agree. In my case I started out programming games on the TI-83+ (which is a Z80 calculator).


 No.982803>>982811 >>982902 >>983071

>>982692

>And thank god it didn't! Because otherwise it wouldn't be fun to do anymore.

That quote is exactly right. The UNIX weenie idea of fun is writing tens of millions of lines of code to do simple things in complex ways. Linux has hundreds of system calls and there are tens of thousands of kernel developers and 15 million lines of code. Pipes are fragile and depend on the bit encodings of text which is supposed to be human readable. The UNIX way is to pretend that programs communicate through virtual PDP-11 tapes, making everything even more complicated. C bit operators are based on the PDP-11. UNIX weenies "have never separated the program from the machine." They "make every machine emulate a PDP-11."

>>982708

>The reason why C is so ubiquitous is because C allows for manipulating what you blatantly say it cannot. C is also considered dangerous for this very reason -- extreme freedom.

He's absolutely right. C does not do anything about registers, interrupts, stacks, or many other things you can do in assembly and with a compiler, like garbage collection, coroutines, overflow checks, arbitrary precision arithmetic, lazy evaluation, closures, and so on. That's because C is not your computer. C is just a language with a compiler that doesn't do as much as other compilers. Weenies do not respect compilers that do things like bounds checking and string handling because AT&T shills have convinced them that worse tools are better.

>>982723

>C closely resembles algebra.

High level languages resemble algebra. This goes back to "autocodes" designed for scientists and mathematicians in the 50s, which led to Fortran and Algol. C does stupid bullshit like the literal 00011 being equal to 9 (a leading zero makes it octal) and "hello" + 2 meaning "llo" (but you can also subtract 2 again) because it's pointer arithmetic.

http://www.homepages.ed.ac.uk/jwp/history/autocodes/
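
A minimal sketch of both quirks (standard C; the output shown in the comment assumes a typical hosted implementation):

#include <stdio.h>

int main(void)
{
    int n = 00011;                 /* leading zero: octal, so n == 9            */
    const char *s = "hello" + 2;   /* pointer arithmetic: s points at "llo"     */
    printf("%d %s %s\n", n, s, s - 2);   /* prints: 9 llo hello                 */
    return 0;
}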

>>982726

>I see no language existing today replacing C as the de facto lingua franca of programming and computers in general.

That's monopolistic thinking. UNIX weenies believe there should be one language used for everything even if it sucks because that's the AT&T culture. The whole idea of a "de facto lingua franca" sucks because different languages have different ways of doing things. Different data types, concurrency, object systems, and so on. Even worse is a language like C that is designed for flat memory PDP-11 hardware because it prevents better hardware from being built.

>I don't know if Ritchie or Brian W. Kernighan would object to a student learning the electronic switches necessary to operate a computer effectively.

They're AT&T employees. AT&T employees didn't do anything to correct these weenies who say UNIX is the first OS written in a high level language and all that other bullshit, so they probably don't care much about you learning anything.

>>982774

>Why does std::cout have such weird syntax? What even is std::? Don't even think about what a string is, just make a variable of the string type and hope it works.

That weird std:: syntax comes from Lisp packages, like cl::setf means setf in the common-lisp aka cl package. Most languages that use . for methods and structs/records use . for that too, like Ada, Java, and OCaml. I don't know why C++ copied Lisp instead of something that would fit better with the rest of the language. Then again, C and C++ suck, so something that actually makes sense would stick out.

>People who start with C++ or java tend to become the "mindlessly copy/paste from stack overflow" type of programmer. In other words, pajeet.

C, C++, and Java are all UNIX languages. A lot of things in UNIX languages make no sense and have no reasoning besides being whatever the compiler or interpreter did. They copy/paste because they don't know what the code does.

    But it's much worse than that because you need to invoke
this procedure call before entering the block.
Preallocating the storage doesn't help you. I'll almost
guarantee you that the answer to the question "what's
supposed to happen when I do <the thing above>?" used to be
"gee, I don't know, whatever the PDP-11 compiler did." Now
of course, they're trying to rationalize the language after
the fact. I wonder if some poor bastard has tried to do a
denotational semantics for C. It would probably amount to a
translation of the PDP-11 C compiler into lambda calculus.


 No.982811

>>982803

real mvp of the thread


 No.982885>>982902 >>982945

>>982690

>That's why we have a file system that forces every file to be viewed as a sequence of bytes (after all, that's what they are, right?)

You've piqued my interest, what else would a file be viewed as?

(I'm a non-techie)


 No.982902>>982945

File: disabled_1.png (109.64 KB, 263x326)

>>982803

>C doesn't resemble your hardware

>please use these other languages that resemble your hardware even less

I'd call you a cheap whore for sucking off anything which claims it isn't Unix, but that's too kind. You've abandoned any idea of useful payment long ago and your throat is looser than Nausicaa's valley of the wind, so sometimes you vomit up a mixture of everything you've swallowed lately and call it a post.

>>982885

If his track record is any indication, he'll either forget to respond or give you some retarded filesystem concept from the 70s with such obvious downsides no one has tried it since.


 No.982945>>982979 >>983071

>>982885

>You've piqued my interest, what else would a file be viewed as?

Typed files or data structures. PL/I has keyed files which can be accessed in a random order by key because a disk is not a tape. Files could also be part of an object store or a database. None of these require any changes to hardware because they're just ways to access data on the disk. UNIX weenies who think C is how the computer works also think UNIX file systems are how the disk works.

https://www.kednos.com/pli/docs/reference_manual/6291pro_025.html

https://en.wikipedia.org/wiki/Persistent_object_store

https://en.wikipedia.org/wiki/WinFS

>>982902

>>please use these other languages that resemble your hardware even less

Hardware is designed for programs, so it's actually the other way around. You're using hardware that resembles the languages less, which means they could be faster on different hardware. RISC does this too, except it's designed for C and UNIX.

>I'd call you a cheap whore for sucking off anything which claims it isn't Unix, but that's too kind.

If you've been following my posts, you would notice I've always had the same list of things I like.

>If his track record is any indication, he'll either forget to respond or give you some retarded filesystem concept from the 70s with such obvious downsides no one has tried it since.

There are no downsides. Not being supported by the C standard library is a flaw in C, just like not having strings that don't suck. Typed files don't prevent you from using sequences of bytes and pretending your SSD is really a PDP-11 tape, but it would be stupid.

>In another article WD writes:
>|> [of VMS]
>|> I sure hope so. Any o/s which puts file types in the o/s
>|> instead of the program is really creating problems for the
>|> user, and I find that more of a problem than the user
>|> interface. If programs really want a "$ delimited left
>|> handed nybble swapped hexadecimal" file type, let it be
>|> done in the program or shared library, and not at a level
>|> when all user-written file transfer and backup programs
>|> have to deal with it. As my youngest says "yucky-poo!"

Huh? Let's think about this.

Tighter integration of file types to the OS are not a
problem. In my experience, UNIX offers the weakest file
maintenance offerings in the industry, save for MS-DOS. In
using Tandem Guardian and VMS I've found that ultimately,
one could:

* Back up files.

* Transfer files.

* Convert files.

...much more easily and safely than with UNIX. Yes, it was
between Guardian or VMS systems but instead of going into an
"open systems" (whatever THOSE are) snit, read on.

As a result:

* Each RDBMS has its own backup and restore facility of
varying functionality, quality, and effectiveness,
complicating support for sites adopting more than one
RDBMS.

* All existing UNIX backup and restore facilities are highly
dysfunctional compared to similar facilities under the
aforementioned operating systems. They can make only the
grossest assumptions about file contents (back it up or
not, bud?) and thus cause vast redundancy in backups; if
you change a single byte in the file, you back up the
whole thing instead of changed records.

* Transferring files from system to system under UNIX
requires that all layers of functionality be present on
both sides to interpret files of arbitrary form and
content. Embedded file systems ensure that file transfer
is enhanced because the interpretation and manipulation
facilities will be there even if the highest layers
aren't (ie: you can at least decompose the file). Find
me one person who guarantees they can decompose an Oracle
or Ingres file (ie: someone who has a product that will
always do it and guarantees it'll work for all successive
releases of these packages).

Once one strips away the cryptology, the issue is control.
UNIX is an operating system that offers the promise of
ultimate user control (ie: no OS engineer's going to take
<feature> away from ME!), which was a good thing in its
infancy, less good now, where the idiom has caused huge
redundancies between software packages. How many B*Tree
packages do we NEED? I think that I learned factoring in
high school; and that certain file idioms are agreed to in
the industry as Good Ideas. So why not support certain
common denominators in the OS?

Just because you CAN do something in user programs does not
mean it's a terribly good idea to enforce it as policy. If
society ran the same way UNIX does, everyone who owned a car
would be forced to refine their own gasoline from barrels of
crude...


 No.982979>>983023

>>982945

>PL/I has keyed files which can be accessed in a random order by key because a disk is not a tape.

Hard drives are not solid state drives, they still read sequential data faster than non-sequential data. Why else do you think defragmentation is a thing? There's even daemons which place your most-used programs closer to each other to increase system responsiveness.

>Hardware is designed for programs, so it's actually the other way around. You're using hardware that resembles the languages less, which means they could be faster on different hardware.

And yet cheaper general purpose microprocessors outperformed Symbolics machines at running Lisp code. Sure, hardware is designed for programs but the reverse is also true and taking either for granted wastes time and system resources. This is likely the largest reason for C's grip on system programming: for all its flaws, its structure and quirks make reasoning about the compiler's assembly output easier than more complicated or abstract languages.

>If you've been following my posts, you would notice I've always had the same list of things I like.

False, your taste has changed over time. It took you several months to begin answering basic questions like "if Unix sucks so much, what operating system should I use instead?" and you go through phases where you ramble about certain ideas and OSes more than others or silently drop particular things.

>There are no downsides.

Another lie. In this case, sacrificing HDD performance for muh non-sequential files and muh SSDs when HDDs are still in widespread use with notable reliability advantages is fucking retarded. You also ignore the memory overhead of tagged memory or version-numbered filesystems, then blame the blasted Unix weenies for designers and users of memory-constrained systems in the past not wanting this extra overhead.


 No.983023>>983052 >>983181

>>982979

>sacrificing HDD performance for muh non-sequential files

Being able to look at files as something besides being a stream of bytes does not imply the hard disk has to do random access all the time, nor that it is slower.

Take text files as an example. In Unix a text file is just a stream of bytes, with byte 0x0A doing double duty as newline and line terminator. Note that this already causes inconsistent behavior between nano and vi: nano shows an additional empty line at the end of your text file, and vi doesn't.

Now if the OS were to handle text as a sequence of bounded strings, where the length of the string is stored at the start, you can easily think of a situation where this is faster than the Unix way.

Take a text file with three lines. The first line contains 100 megabyte worth of random letters, the second is "hello world!", and the third line is again 100 megabyte of random letters.

Now say you want to open the file in an editor. Unix will have to read through 200 megabytes of data to figure out it is a 3-line file.

Our hypothetical system can read the first 80 characters of the first string, skip ahead to the second line, read it, read the first part of the third line, and skip ahead to the EOF.

An added bonus is that the number of lines is unambiguously defined. You could take it even further and prepend your text file with the line count and pointers to the start of each line.

A design choice like this would cascade down through the userland and libraries, so please don't respond with "but c allows you to do that" because that is not the point.
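
A minimal C sketch of reading such a hypothetical format (the record layout is made up purely for illustration: a 4-byte little-endian length followed by that many bytes of text):

#include <stdio.h>
#include <stdint.h>

/* Count lines without reading their contents: read each length header,
   then seek past the payload instead of scanning it for newlines. */
long count_lines(FILE *f)
{
    long lines = 0;
    unsigned char hdr[4];
    while (fread(hdr, 1, 4, f) == 4) {
        uint32_t len = hdr[0] | (hdr[1] << 8)
                     | ((uint32_t)hdr[2] << 16) | ((uint32_t)hdr[3] << 24);
        if (fseek(f, (long)len, SEEK_CUR) != 0)
            break;
        lines++;
    }
    return lines;
}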


 No.983052

>>983023

Then you enable line wrapping and it becomes retarded again


 No.983071

File: steve klabnik 9.jpg (51.53 KB, 1280x720)

>>982690

>>982803

>>982945

based UNIX hater weenie is doing a perfect job at bumping my shitty low effort bait thread.


 No.983181>>983237

>>983023

If your text file has 100 megabytes of data on one line you have much bigger problems than muh Unix or muh C. As usual, your proposal's main benefit is increased performance in an extreme edge case where someone does something absolutely fucking retarded, in exchange for greatly increased memory consumption and larger filesizes for sanely-written programs and documents.

The big reason your kind is disliked isn't because you present alternatives, but because you're dishonest and pretend they're free features with no tradeoffs or downsides. Then you throw a fit when hardware and OS designers decide against implementing your favourite features, blaming it on the dastardly Unix weenies and swearing they would love everything you propose if only Rob Pike stole them from other OSes written by the primordial, infallible saints of compsci.


 No.983191

>>981546

Trying to get all programmers to use one programming language is like trying to get all humans to use one human language: good fucking luck.


 No.983224

>>981500

This. I remove librust and gimp rust executables. Can we get a list of other faggy languages that we can gimp?


 No.983226

>>981500

sudo apt-get remove librust

vi <rust app>

>delete a few lines

save


 No.983237>>983255

>>983181

>If you have 100 megabytes of data on one line you have much bigger problems.

This crossed my mind of course, and you are right.

It was just the easiest way to disprove the argument that sequential is always best or that random access is retarded.

I'm sure there are other cases where not handling everything as a sequence of bytes has merits.

>greatly increased memory consumption

It is obvious that that scheme would incur an additional few bytes per line.

> sanely-written programs suffer

Since we can't even get consistent behavior for a trivial example between two well known editors I think you overestimate our ability to write sane software with the current system.

>Dishonest

Meh. Your arguments are only about some measure of efficiency, mostly size related, and ignore everything else. You say I present a stupid edge case. I say your focus on file size or memory consumption is an edge case as well.


 No.983255>>983561

>>983237

>disprove the argument that sequential is always best or that random access is retarded

Nice strawman, fag. My point was that unless you're exclusively targeting solid state storage, sequential access is the better choice overall outside rare edge cases. Designing your system around these edge cases is a recipe for bad performance in general.

>I say your focus on file size or memory consumption is an edge case as well.

>performance is an edge case

Nice webdev logic. Performance and resource consumption are always important, especially when you can't afford to throw more hardware at problems, and ignoring "little" things like sequential memory access is a big reason why modern software (including vidya) runs like shit.

>but it's only a few bytes per line!

That shit adds up quickly, especially since most normal text files are made of many short-ish lines and there's loads of these files on any given system. Sure, you can count the lines of that one file with two 100MB lines faster and you've reduced nano's line count by one but is it really worth slowing down and bloating everything else? You'd probably gain more by yelling at the idiot who generated that file until he makes something better.


 No.983561>>986762

>>983255

>Nice webdev logic

The overall idea is that you make a trade-off.

If the OS does more, everything on top has to do less. Not saying it's a good idea, but let's say the OS can handle something akin to CSV in functionality. That saves everybody the work of building their own.

And because your OS understands something akin to CSV, maybe other things can be done more easily.


 No.984436

>>981438

Sleep tight Terry

You are missed.


 No.986752

>>981415

>mfw steve klabnik secretly hates niggers and is working behind frienemy lines to kill every last one

>gleekitty.tif

>>981438

Sleep Tight Terry


 No.986762>>986863

>>983561

That's what libraries are for, retard. No need to put everything in kernel space.


 No.986773

>>981438

Sleep tight, Terry


 No.986784

>>981438

Sleep Tight Terry


 No.986818

>>981438

Sleep Tight, Terry


 No.986863

>>986762

> kernel space

>OS is just the kernel

If I may unironically interject for a moment...


 No.986868

>>982795

>L-Liking to play games every now and then doesn't make me a man-child, right?

If the only thing you do with vidya is play it, then yes, yes you are.


 No.986869

Do people actually say this? I think this is attacking a straw man; who actually thinks this? Maybe someone wrote that one time on a forum or something. I hope he gets run over by a car. Communist retard fuck.



