/tech/ - Technology

 No.919757>>919768 >>919771 >>919776 >>919797 >>919806 >>919854 >>919993 >>920000 >>920121 >>920122 >>920219 >>920327 >>920874 >>922055 >>925751 >>926574

Considering that C++ can be just as fast as C, there is probably no need for C anymore, as it's just too difficult to use right even for experienced programmers.

 No.919758>>919776

Just write everything in Rust


 No.919764

Pascal/Lisp should have been adopted over Cee.


 No.919767>>919783

>C is more difficult than C++

What are you smoking? C will go away once there is a better alternative available. For a large number of tasks C is still the right tool for the job.


 No.919768

>>919757 (OP)

The question is do we really need C++ when all it does is increase compile times and create more OOP pajeet shitters.


 No.919771

>>919757 (OP)

No other language's compiler can give you the same performance (other than Ada, I think).


 No.919776>>919783

>>919757 (OP)

>>919758

kill yourself rustfag


 No.919783>>919787 >>919794

>>919767

>For a large number of tasks C is still the right tool for the job.

such as?

face it, even in "muh embedded shitstems" c++ is used

>>919776

nice sage


 No.919785>>919788 >>919806

In all seriousness, I'm not the only one who finds object-oriented programming overcomplicated, right?


 No.919787>>919789 >>919856

>>919783

>even in ... c++ is used

Javascript must be a great language, then, since it's used by so many people.


 No.919788

>>919785

It's overused, probably because it was hyped so much during the 90's, like it was going to magically fix all software problems.


 No.919789>>919804

>>919787

Javascript is not good, but it's absolutely the right tool for a lot of jobs. You can't really get around it, except by avoiding those jobs.


 No.919794

>>919783

>such as?

Stop pretending that you don't know.


 No.919797

>>919757 (OP)

>implying C++ is easier to use right even for experienced programmers

kek


 No.919804>>919811

>>919789

Which job is Javascript the "right tool" for?


 No.919806>>919812 >>919821 >>920792

>>919785

No, you're not. In my humble opinion OOP is a clever way to obfuscate code for job security (at its worst). With OOP, you no longer have plain functions that transform input data into output data; now you have "objects" (blobs of code + internal state) which have relationships and dependencies on other objects. But of course the machine doesn't think in OOP, it still thinks in code+data. So OOP is for the people, not the computer. People thought OOP would help with code reuse. They were well-intentioned but wrong. You know what helps code reuse? Backward compatibility (not only the ability to compile old stuff as-is but also the ability to run old stuff as-is). Libraries and engines are also examples of code reuse; and surprise, they have nothing to do with OOP either.

>>919757 (OP)

>Do we really still need C?

Yes. C persists because C++ is a nastier and more complex language. Consider the difficulty of learning all of C++17 from scratch. Fun experiment: try writing a tutorial for this for beginners. Things will get even fuckier with C++2? when they introduce Concepts. Don't get me wrong, C++17 is nice and all, finally adding std::filesystem 20 years too late, but don't you have the feeling that C++ is ever more quickly becoming a bloated, unlearnable mess since C++11? I do, but then again I'm just a dumbass who has yet to master move semantics, naively believing that sort of optimization shit is for the machine to figure out, not me, the supposedly high level programmer.
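
To make the functions-vs-objects point concrete, here's a toy sketch (mine, not from any post here; both versions compute the same thing):

```cpp
// Plain input->output function vs. an object carrying hidden internal state.
#include <numeric>
#include <vector>

// Plain function: all data flow is visible at the call site.
int sum(const std::vector<int> &xs) {
    return std::accumulate(xs.begin(), xs.end(), 0);
}

// OOP flavor: the running total becomes hidden state inside an object.
class Accumulator {
    int total_ = 0;
public:
    void feed(int x) { total_ += x; }
    int total() const { return total_; }
};

int main() {
    std::vector<int> xs{1, 2, 3};
    Accumulator acc;
    for (int x : xs) acc.feed(x);
    return (sum(xs) == acc.total()) ? 0 : 1;  // both yield 6
}
```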


 No.919808>>919813

>Do we really need C?

Good question!

>Considering

Allow me to

>that

talk about a

>C++

... ugh.

>as it's just too difficult

...

>even for experienced programmers.

C++ is too difficult, even for experienced programmers. What you can't even begin to understand, you certainly can't secure or build well, except by accident. As hard as these theorem-proving languages are, I've never once thought "man, I bet C++ isn't as hard as this", because, although that's probably actually true, it's like saying that a 1kg bag of dicks is easier to eat than 5kg of steak. Absolutely true. Now give me the steak. I will pass on the bag of dicks.


 No.919811>>919820

>>919804

Scripting on web pages.


 No.919812>>919814

>>919806

>So OOP is for the people, not the computer.

All programming languages are. Otherwise we would be writing in assembly.


 No.919813

>>919808

>Absolutely true.

we bow down to your superior experience


 No.919814

>>919812

Assembly is also for the people, especially on CISC processors.


 No.919820>>919824

>>919811

That's only because it's the only language that browsers support, not because it's the best language for the job.


 No.919821>>919825 >>919835 >>919947 >>920000

>>919806

Isn't C++ like 95% the same as C, minus the stuff they added for classes? Can't you technically run C code in C++?


 No.919824>>919826

>>919820

It's the best language for the job because it's the only language for the job. It's not the best language in the full possibility space of how computing could have turned out, but it's the only choice in this world.


 No.919825

>>919821

no. C++ has a ton of shit, and if you're doing C++ right you're not writing anything remotely like C. And no, you can't write a lot of C in C++. C has continued to evolve and C++'s grandfathered version of C has not. The grandfathered version is actually quite old, and still requires practices like casting malloc()'s return, because C++ refuses to implicitly convert the void* it returns, while in C simply assigning it to a variable of the appropriate type does what you want.

C++'s ability to work with C is inferior to ATS's ability to do that.
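
For reference, the malloc() difference looks like this (minimal sketch):

```cpp
#include <cstdlib>

int main() {
    // In C this is idiomatic and cast-free:
    //     int *p = malloc(10 * sizeof *p);   /* fine in C, error in C++ */
    // C++ refuses the implicit void* -> int* conversion, so a cast is needed:
    int *p = static_cast<int *>(std::malloc(10 * sizeof *p));
    std::free(p);
    return 0;
}
```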


 No.919826>>919828

>>919824

Well now you're just arguing semantics, but fine.


 No.919828>>919833

>>919826

It seems like the only reasonable meaning in the context of talking about whether we still need languages.

We definitely need javascript, even though it's a bad language. We also need C, and FORTRAN, for somewhat similar reasons.


 No.919833

>>919828

>in the context of talking about whether we still need languages

The context was C vs C++ and/or what is the best tool for the job. The argument wouldn't exist if C++ was the only choice.


 No.919835>>919844

>>919821

>Isn't C++ like 95% the same as C, minus the stuff they added for classes?

Umm sure if you count exception handling as "stuff they added for classes", so that you can signal an error from a failing constructor or overloaded operator but curiously not a failing destructor... I mean you could because it compiles fine but really, really shouldn't. C is 95% simpler though.


 No.919836

You don't even need C++, there is Rust.


 No.919844>>919858

>>919835

>but curiously not a failing destructor...

That's because destructors are called during cleanup when an exception happens. You really, really, really don't want exceptions to except your exceptions *insert yo dawg meme here*, bad things would happen from that.
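
A minimal sketch of that failure mode, for anyone following along (destructors are implicitly noexcept since C++11, so the throwing one has to be marked noexcept(false) to even compile):

```cpp
#include <stdexcept>

struct Bad {
    ~Bad() noexcept(false) { throw std::runtime_error("dtor threw"); }
};

int main() {
    try {
        Bad b;
        throw std::runtime_error("original error");
        // Unwinding runs ~Bad(), which throws a second exception while the
        // first is still in flight -> std::terminate() is called.
    } catch (...) {
        // never reached: the program dies during unwinding
    }
}
```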


 No.919854

>>919757 (OP)

Stop trying to make excuses for your incompetence and lack of critical thinking.


 No.919855

yet another rust shill thread


 No.919856

>>919787

It was never claimed C++ is better because more people use it.

Learn to read, you brainlet.


 No.919858>>919860 >>919867

>>919844

>trying to reason with rabid C fanboys that never wrote any C or C++ in their life

lul


 No.919860

>>919858

>there's a good reason for this feature having this wart

>therefore this language is well-designed

my daycare's floor is covered, three feet high, with plastic balls. it's a big ball-pit, like children's fast-food stores used to have.

it's a little irritating to walk through, but it's critical that we keep the floor densely covered in balls.

why?

because there's broken glass under the balls. What if a kid fell over? without the balls, he'd be immediately horribly injured. look at this mfer right here, hates kids, thinks that stone-age "buildings with clean uncluttered floors" is some kind of model that'll carry us into the future.


 No.919867>>919869

>>919858

bumping shit thread


 No.919869>>919873

>>919867 (Me)

>forgetting to greentext

brb, have to kill myself


 No.919873

>>919869

Would've been funny if you also forgot to sage.


 No.919884

<lang X is better than Y

pajeet's first post


 No.919947

>>919821

There are certain small parts of C that are different in C++. It's been a while since I've written anything in C++, but I can remember there being some problems with some C habits that I still used. Again, it was mostly small, obscure stuff that I had to google around to see why g++ didn't like it.


 No.919993

>>919757 (OP)

Is this bait? I can't even tell anymore. C++ is too hard to implement and nobody in the world actually understands it (they don't even understand C but at least they can somewhat get away with it).


 No.920000>>920008 >>920019

>>919757 (OP)

C++ takes a lot longer to compile because C++ compilers are worse than C compilers, even in 2018. If C++ is used for embedded systems, then you have to use a C-like subset; you can't use OOP stuff, because it uses too much memory. If you are looking for a simple language, then C++ is a bad choice because it's so huge and inconsistent. There is no point in using C++ for low-level stuff, as one of the main points of C++ is to hide low-level details.

>>919821

>Isn't C++ like 95% the same as C, minus the stuff they added for classes?

no, as others said. but usually a C-like subset of c++ is used when c++ is used in embedded systems

>Can't you technically run C code in C++?

if it is C89


 No.920005>>920009 >>920019

As an actual embedded systems guy, I can tell you C++ is a terrible replacement for C: everything useful in C++ would be disabled due to requiring far too complex a runtime or non-portable magic (e.g. no zero cost exceptions), the standard library wouldn't be present due to it relying on template bloat to get the job done, and the absolute dogshit support for debugging C++ would make your life hell when you're faced with debugging with serial debuggers and crash dumps a la crashkernel. Have you ever tried to even get the right line a C++ program that uses templates crashed at, let alone dump specific symbols and set breakpoints?


 No.920008>>920019

>>920000

>C++ takes a lot longer to compile because C++ compilers are worse than C compilers even in 2018.

No, they're no slower; the speed difference is because C++ code depends on massive reams of template expansion and ends up asking the compiler to effectively compile far more code. Metaprogramming was a huge mistake, and I was disappointed to see Rust repeat mistakes we recognized as mistakes 20 years ago, leading to the same unsolvable compile performance problems.


 No.920009

>>920005

I noticed Xilinx seems to be pushing C++ for their High level synthesis track, I do wonder if industry is taking to it though.


 No.920019>>920096 >>920179

>>920000

>C++ takes a lot longer to compile because C++ compilers are worse than C compilers even in 2018.

C++ takes longer to compile because the standard library headers liberally abuse template metaprogramming warts that are extremely slow to evaluate, and you include heaps of them with every compilation unit. Try compiling a C++ unit that doesn't use any standard headers. Blazing fast, just like C.
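
A quick way to check this yourself (illustrative; exact timings vary by machine and compiler):

```cpp
// Compare a unit with no standard headers against one that drags in
// template-heavy ones:
//
//   $ printf 'int main() { return 0; }\n' > empty.cpp
//   $ time g++ -c empty.cpp     # typically a few hundredths of a second
//
//   $ printf '#include <regex>\n#include <algorithm>\nint main(){}\n' > heavy.cpp
//   $ time g++ -c heavy.cpp     # commonly an order of magnitude slower or more
```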

>If C++ is used for embedded systems, then you have to use C-like subset. you can't use OOP stuff, because it uses too much memory.

Bullshit. There is a freestanding subset of the language that does away with some features to get rid of the runtime, but the core language is left intact. The only real problem with OOP in embedded is overuse of dynamic memory allocation, and that can be fixed with custom allocators and careful choice of algorithms (see the sketch below).

When people use a C-like subset of C++ it's usually either:

a) they're old C hands who can't bother to learn better (very common in embedded), or:

b) the compiler available for the platform in question implements an ancient dialect of C++ without all the useful constructs, so there is not much point in going above the basics, and/or:

c) the compiler available for the platform has a half-assed implementation of C++ bolted on top of C constructs and produces awful code whenever C++-only features are used (again, common on embedded platforms).

There is nothing wrong with the language itself for embedded usage.
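
For the custom-allocator point, here's a minimal sketch of the usual embedded pattern: a fixed static arena plus placement new, so class-based code never touches the heap (names are illustrative):

```cpp
#include <cstddef>
#include <new>

// Fixed-size arena: all storage is one static buffer, no heap anywhere.
template <std::size_t N>
class Arena {
    alignas(std::max_align_t) unsigned char buf_[N];
    std::size_t used_ = 0;
public:
    void *allocate(std::size_t n) {
        const std::size_t a = alignof(std::max_align_t);
        n = (n + a - 1) & ~(a - 1);          // round up to max alignment
        if (used_ + n > N) return nullptr;   // no growth, by design
        void *p = buf_ + used_;
        used_ += n;
        return p;
    }
};

struct Sensor { int id; };

int main() {
    static Arena<1024> arena;
    void *mem = arena.allocate(sizeof(Sensor));
    Sensor *s = mem ? new (mem) Sensor{42} : nullptr;  // placement new
    return s ? 0 : 1;
}
```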

>>920005

>everything useful in C++ would be disabled due to requiring far too complex a runtime or non-portable magic

That's a bit of an overstatement. Sure, exceptions are out, but they're cancer anyway, even on desktop; plain OOP doesn't require any runtime.

>the standard library wouldn't be present due to it relying on template bloat to get the job done

...most of which is instantiated on use or inlined. You only get what you need/use compiled in. The freestanding subset of C++ specifically says which parts of it are available without a runtime, and it's most of the useful parts.

>>920008

Metaprogramming is a very useful feature and it can be fast, when done the right way (hello LISP). However it needs to be designed into the language from the very start --- not bolted on after accidentally noticing that some constructs evaluated at compile time are Turing-complete.
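
To illustrate the bolted-on-vs-designed-in contrast, here are both styles of compile-time computation side by side (sketch; both evaluate entirely at compile time):

```cpp
// The "accidentally Turing-complete" way: template recursion.
template <unsigned N>
struct Fact { static const unsigned long long value = N * Fact<N - 1>::value; };
template <>
struct Fact<0> { static const unsigned long long value = 1; };

// The designed-in way (C++11 constexpr): an ordinary function the
// compiler can evaluate at compile time.
constexpr unsigned long long fact(unsigned n) {
    return n == 0 ? 1 : n * fact(n - 1);
}

static_assert(Fact<10>::value == fact(10), "same result either way");

int main() { return 0; }
```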


 No.920096>>920567

[attached: herbs.jpg]

>>920019

A voice of reason among all the FUD


 No.920121>>920126

>>919757 (OP)

Any language that relies on lambdas and delegates instead of coroutines for pervasive multithreading is not long for this world. C++ header files destroy locality and force you to write pajeet code. C++ has so few structured multithreading constructs that it forces you to take unmaintainable shortcuts. No two "multithreaded" parts of your app will use the same lines of code. You are always so scared to write anything elaborate that you pray every API includes a context pointer so they can handle architecture instead of you.

All C++ scalability comes from minimalism instead of thorough design. You can't get to the future with minimalism. Scaling down is scaling for death.

Java was a practice run for C#. You can't create god-tier shit in Java either.

C++ threading libraries are heavy on blocking threads and polling at worst, and force app developers to jump between lily pads of unmaintainable callbacks at best. Open Source should be called Open Boilerplate because that's all it is: the love, study, and proliferation of worthless boilerplate and the languages that require it.


 No.920122>>920126 >>920136

>>919757 (OP)

Even if C++ was as fast as C at run time (which it's not), it still takes longer to compile.


 No.920126>>920129

>>920122

Even if you knew what you are talking about (you don't), you still haven't written anything in either C or C++.

>>920121

>no arguments


 No.920129

>>920126

>No code anyone would allow to be checked in.


 No.920136

>>920122

>(which it's not)

it is, if you disable exception support
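
For reference, the usual flags for that on GCC/Clang (illustrative build line, not from this thread):

```cpp
//   g++ -O2 -fno-exceptions -fno-rtti prog.cpp
//
// -fno-exceptions drops unwinding/exception support and -fno-rtti drops
// runtime type information -- the two runtime features usually cited in
// C-vs-C++ speed arguments.
```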


 No.920160>>920792

Pascal, Lisp dialects, and Ada - C is an ugly bondage and discipline language.


 No.920175

We have this "debate" every week.

I give everyone permission to shit-post ITT.


 No.920179

>>920019

>but plain OOP doesn't require any runtime.

No one in embedded wants OOP, though. It doesn't even make sense in embedded. Look at something like the skb struct in Linux - how would you translate that to OOP?

>exceptions are cancer

You're retarded. We'd love to have exceptions, the problem is they can never be relied on to be implemented efficiently or at all.

>standard library instantiated on use or inlined

That level of bloat is rarely acceptable.

>metaprogramming is good

>LISP did anything right

How's undergrad?


 No.920219

>>919757 (OP)

>as it's just too difficult to use right even for experienced programmers.

Are you scared of pointers and printf or something? Yes you are because you are a little bitch.


 No.920327

>>919757 (OP)

I'm so fucking tired of kids on /tech/ too afraid or too dumb to use an address register of a CPU. If you can't even understand indirect addressing modes and arrays, why are you here? Shitpost on /pol/ or cuckchan with all the other children.


 No.920436>>920468

>/tech/ accuses people of being too dumb for pointers

>/tech/ has never written C/C++ therefore proving they're too stupid for pointers

a board full of brainlets


 No.920468>>920471

>>920436

if OP thinks that pointers are too hard for humans to use reliably

truth is, they're hard like "clean uncluttered floors" is hard. even an idiot can do it and over time even an expert will fuck up. the solution is vigilance, not giving up and putting ball-pits over broken glass and AIDS needles

then it would be odd for OP to also be good at pointers, no?

I don't get this /tech/ meme of everyone here being LARPers to the point of literally not having written anything, ever. Anyone you could ever single out could look up a tutorial and get back to you in a day, from a cold state of "programming? will that help me cheat at WoW?"


 No.920471>>920792

>>920468

if it's so easy to avoid problems when using pointers then why are there so many security problems related to them?

checkmate faggotface


 No.920567>>920574

>>920096

>There is nothing wrong with the language itself for embedded usage.

This is true, but C++ is not actually good for programming in general, even before getting into performance, footprint, portability, or memory usage. But feel free to read the latest spec to find a workaround whenever you run into a philosophical problem.


 No.920574>>920581

>>920567

C++ is great for programming in general. You're just a brainlet.


 No.920580>>920601

C++ does a great job of fixing what isn't broken by bolting several object-oriented nightmares onto it in order to satisfy several awful use cases.

The only reason C++ is as popular as it is, is because it's actually 5 or 6 languages interpreted by the same compiler - so it has the popularity of that many languages.

If anything, this is a prime evidence that people badly need Scheme, Haskell and Perl6.


 No.920581>>920586

>>920574

OOP is the worst for programming, and you're the brainlet for not learning a proper functional language.

https://wiki.haskell.org/Learning_Haskell


 No.920586

>>920581

>a proper functional language

>picks the one with lazy semantics

any beauty you ever saw in Haskell was a reflection of some ML feature.


 No.920601>>920792

>>920580

Why? Just use Go. Go solves all of C's problems without fully embracing complicated software design paradigms.


 No.920792>>920813 >>920839 >>921388

>>919806

>So OOP is for the people, not the computer.

All programming is for people. "The fundamental design flaw in Unix is the asinine belief that 'programs are written to be executed by computers rather than read by humans.'"

>People thought OOP would help with code reuse. They were well intended but wrong.

That's more UNIX weenie fearmongering. OOP does help with code reuse. C++ and Java suck because they copied brain damage from C and weenies blame OOP instead of the real problem, C.

>You know what helps code reuse? Backward compatibility (not only the ability to compile old stuff as-is but also the ability to run old stuff as-is). Libraries and engines are also examples of code reuse; and surprise, they have nothing to do with OOP either.

Backwards compatibility is only good for "code reuse" for large corporations which are or can push around hardware companies. It doesn't help reduce bloat, which is the main reason for code reuse. If you can "reuse" code that is slow and bloated and increases the amount of code you have to write, you're better off not using it at all.

>>920160

>C is an ugly bondage and discipline language.

C is also bondage for your brain and your hardware. It creates a mentality where you can't imagine anything not sucking.

>>920471

>if it's so easy to avoid problems when using pointers then why are there so many security problems related to them?

UNIX weenies blame pointers, strings, and arrays but the real problem is that the C implementations of them suck. Pointers, strings, and arrays shouldn't cause any security problems. All of these problems were solved in the 60s.

>>920601

The only thing good about Go is that it's better than C, but so are C++, PHP, and JavaScript.

The worst disadvantage of C++ at the moment is political:
accepting C++ as the standard OO language de facto tends to
kill other existing languages, and stifle the development of
a new generation of essentially better OO languages.

I've been waiting for an appropriate time to use this quote.
This is as good a time as ever.

Programs are written to be executed by computers rather
than read by humans.

This complicates program comprehension, which plays a
major role in software maintenance. Literate
programming is an approach to improve program
understanding by regarding programs as works of
literature. The authors present a tool that supports
literate programming in C++, based on a hypertext system.

- abstract to an article in the Jan 1992 issue of the Journal of
Object-Oriented programming

The fundamental design flaw in Unix is the asinine belief
that "programs are written to be executed by computers
rather than read by humans." [Now that statement may be
true in the statistical sense in that it applies to most
programs. But it is totally, absolutely wrong in the moral
sense.]

That's why we have C -- a language designed to make every
machine emulate a PDP-11. That's why we have a file system
that forces every file to be viewed as a sequence of bytes
(after all, that's what they are, right?). That's why
"protocols" depend on byte-order.

They have never separated the program from the machine. It
never entered their tiny, pocket-protectored with a
calculator-hanging-from-the-belt mind.


 No.920799

I'm convinced you're taught all these useless languages in Jew school to keep you from being productive. Self-taught programmers don't end up like you. Meanwhile, almost everything substantial (not webshit) you use is written in C, C++, java, or .NET.


 No.920813>>920839

>>920792

>All programming is for people.

Missing the point, the point being that OOP is a tarpit between the programmer and the machine. OOP doesn't help the programmer talk to the machine, it does the opposite. And no, I'm not advocating for coding in assembly language, although doing so makes one appreciate the straightforwardness of C.

>"The fundamental design flaw in Unix is the asinine belief that 'programs are written to be executed by computers rather than read by humans.'"

So that's where the "all programming is for people" truism comes from, reeking of socialist idealism. Programs are not written to be read by humans. Oh look at this highly optimized game executable, Imma open it in Notepad and read it! Oh look at this random JavaScript code, Imma read it. I may not be a developer, but Imma read it just the same!

Your true pursuit is that of high-quality, non-obfuscated, well-documented code which is straightforward, elegant, and preferably not object oriented. Or it would be, if it wasn't for OOP apologetics wasting your brain space.


 No.920839>>920966

>>920792

C++ and Java are not even object oriented languages, nor are the common coding styles in them OOP. Most people still program procedurally with them.

>>920813

>Programs are not written to be read by humans

>reeking of socialist idealism

Wrong. It's all about maintainability. If you write some obscure code, optimized for the compiler on the machine you're writing for, that no human can understand, then that code is unmaintainable. That might be fine for some small utility functions, but once you reach the size of a large application, having everything be so obscure makes it a nightmare to work with.

>Oh look at this highly optimized game executable

We are not talking about executables, but rather the source code.

Having to pretend to be the processor 100% of the time wastes human time that could be spent thinking on a more abstract plane.


 No.920846>>920852 >>920890

Load and use a C++ library from Python. I dare you.


 No.920852>>920860 >>920870

https://www.boost.org/doc/libs/1_67_0/boost/python/module_init.hpp
 No.920860

>>920852

>using boost

We all make that mistake once.


 No.920870>>920886 >>920890

>>920852

>https://www.boost.org/doc/libs/1_67_0/boost/python/module_init.hpp

>extern "C"

>the solution to C++'s ABI problems is to use the C ABI

Every time.


 No.920874>>920881

>>919757 (OP)

to avoid other programmers using the boost library out of nowhere


 No.920881

>>920874

you literally cannot build opengl/other graphical modules in opencv libraries without using boost.

>inb4 dont use it


 No.920886>>920888 >>920890

>>920870

It's a stupid complaint. Do you complain that we use text-based serialization formats for interchange between languages?


 No.920888>>920889

>>920886

The fact is the only way to write C++ programs that can interface with anything else is to limit yourself to a C interface. You're not really talking to a C++ program, you're talking to a C wrapper that has none of the benefits of C++.
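
Concretely, the wrapper pattern looks like this (minimal sketch; the Counter/counter_* names are made up for illustration):

```cpp
#include <cstdint>

// The C++ side: an ordinary class.
class Counter {
    std::int64_t n_ = 0;
public:
    void add(std::int64_t d) { n_ += d; }
    std::int64_t value() const { return n_; }
};

// The flat C facade everything else actually talks to (Python via ctypes,
// etc.). The handle is opaque; none of the C++ ABI leaks through.
extern "C" {
    Counter *counter_new()                   { return new Counter(); }
    void     counter_add(Counter *c, long d) { c->add(d); }
    long     counter_value(const Counter *c) { return static_cast<long>(c->value()); }
    void     counter_free(Counter *c)        { delete c; }
}
```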


 No.920889>>920903

>>920888

Again, it's a stupid complaint. Do you complain you can't natively embed javascript in python? What about ML? Haskell? No. You reduce the interface to something everything can speak. C is very close to what the hardware speaks so interfaces between languages are almost always roughly C.


 No.920890>>920895 >>920899

>>920846

>Load and use a C++ library from Python. I dare you.

On a Lisp machine, you could mix any languages because of tagged memory. Multics and VMS do this with data descriptors. If you wanted to pass a data structure from Lisp to a Pascal or Fortran program and back, you could do that without needing a "foreign" language like C in the middle or converting to text. That's real compatibility. Today, it might be something like combining Python, JavaScript, and Ruby into one program.

>>920870

>>the solution to C++'s ABI problems is to use the C ABI

C++ originated on UNIX, is based on C, was a preprocessor for C, and was made by the same company as C, but they can't even get their OS to work with it after decades of running C++ programs, and they have no interest in fixing the problem. That's how incompatible everything is on UNIX.

>>920886

I certainly complain about that. In languages not designed for UNIX, including BASIC, you can pass an array or other data structure to another program. Multics and Lisp machines are designed around that kind of sharing. UNIX text serialization is a huge source of bloat and waste, not only in terms of cycles and memory, but also debugging and development time.

Generally when slamming C++ all you need to do is find
something C does badly, point and jeer and say, "lookit how
DUMB that is good golly!" and then pause, ask the audience,
"can you imagine something worse?" and while they're all
still shaking and puking and they finally quaver "nonono!"
you point out that C++ in fact *DOES* something worse!!

Gets 'em every time.


Let's do the exercise:

C's notion of how data objects are stored to disk is
pretty primitive. This can be excused to a degree because C
wasn't *designed* to solve that problem, it's a lower-level
language. However, it's clear that any reasonable
higher-level language needs to support some kind of not
entirely brain-crippled persistent storage.

C++ is designed as an "object oriented programming
language" but, if you look at how object persistence can be
done..... uh.... It *CAN* be done, right? You mean you
need to write some kludgy stuff that does a C-style write()
and assigns a type flag and then does a *cast* or something
down in its guts?? And if you cast an object to another
type of object it forgets what it is? Waitamminit!!!! You
mean every application that wants to do persistence has to
kludge its own application-specific tricks together to make
it work?

Yes. C++. At better compiler stores near you.


 No.920895

>>920890

>I certainly complain about that

Then you're a LARPer. No one's interested in handling every language's special snowflake binary serialization format, or wrestling shitty languages like LISP into dealing with unsigned machine words.


 No.920899>>920903 >>920912 >>921061

>>920890

>>>the solution to C++'s ABI problems is to use the C ABI

>C++ originated on UNIX, is based on C, was a preprocessor for C, and was made by the same company as C, but they can't even get their OS to work with it after decades of running C++ programs, and they have no interest in fixing the problem. That's how incompatible everything is on UNIX.

Read the quote again - they are compatible, you just have to use the lower level ABI. How do you export or import classes into C? You convert them to a struct, then pass that to C. The same problem, BTW, exists between higher level languages: JS uses utf-16, Python uses ascii/utf16/utf32, depending on the largest codepoint in a given string (include one emoji, quadruple the size of your string), and Ruby uses utf-8. You can't just pass the memory between languages, you need to convert between formats first. The easiest way to do this is to convert to a standard conversion format, then convert back. This conversion format is called JSON.

>I certainly complain about that. In languages not designed for UNIX, including BASIC, you can pass an array or other data structure to another program. Multics and Lisp machines are designed around that kind of sharing. UNIX text serialization is a huge source of bloat and waste, not only in terms of cycles and memory, but also debugging and development time.

The problem is that the data format is inexorably tied to the machine. If the language is tied to the machine this isn't a problem. But nowadays we care about P O R T A B I L I T Y, so the data has to be represented the same on every machine. MShit has a common data representation for .NET applications, which you can use between C#,F#,PS,VB, etc. (probably, I don't touch windows). On the JVM you can mix and match Java,Kotlin,Scala,etc freely. Both of those do this by implementing a custom VM, which are highly compatible internally, but incompatible externally. If you want python on either VM, you need to use a custom implementation which is probably still on 2.6, because fuck the last decade.


 No.920903

>>920899

It's not just about data structures. It's also about calling conventions. Function calls have very specific and rigid requirements regarding how they're supposed to be called at the assembly level; they expect specific arguments in specific places and have cleanup requirements and other such business. With C, Ada, Fortran, Pascal and other such languages, it is simple to follow a simple standard ABI such as the SysV ABI and produce a polyglot system that interoperates correctly.

C++ doesn't care about any of that. Using C++ begets more C++ and the only code that will ever touch your C++ interfaces is more C++ code, preferably code compiled with the same compiler and the same version.

So your solution to this problem is to serialize data and communicate via sockets. That's seriously slow. It's slow as fuck. It's what a dynamic language would do.

>>920889

Those are virtualized languages, moron. They don't pretend to be OS-loadable programs. They don't compile to ELF executables. They don't export symbols and entry points. They're up front about the fact they need their own virtual machine in order to run.

Who said you can't embed JS? It's pretty much designed for that. In fact, you can load up the JS, Lua, Scheme, Java and C# virtual machines into any program and talk to them and interoperate with virtualized programs.

Good luck performing a method call on a C++ object from another language. You can't even get at the required symbol because of how C++ implements overloading, member functions and other such features. The symbol mangling probably isn't consistent even within the same compiler family; I've seen people get burned when they tried to reuse old binaries compiled with old ass compilers. Not even C++ can reliably reuse its own code.


 No.920912>>920921

>>920899

>nowadays we care about P O R T A B I L I T Y

Oh fuck off with that bullshit.

>MShit has a common data representation for .NET application, which you can use between C#,F#,PS,VB, etc. (probably, I don't touch windows).

>On the JVM you can mix and match Java,Kotlin,Scala,etc freely.

>Both those do this by implementing a custom VM, which are highly compatible internally, but incompatible externally

It's just a high-level instruction set implemented by a virtual machine. Obviously, if your language compiles to Java bytecode, it becomes possible to directly interoperate with other languages that also generate the same code and use the same high-level JVM data structures.

If C++ was implemented on top of the JVM, it would generate code for its own implementation of classes with ridiculously fucked up symbols that the other languages simply can't interface with; the C++ code would be able to call Java code, but the reverse isn't true. C++ projects are parasites that don't even try to play nice with anything but other C++ projects.


 No.920921>>920925 >>920972

>>920912

>If C++ was implemented on top of the JVM, it would generate code for its own implementation of classes with ridiculously fucked up symbols that the other languages simply can't interface with; the C++ code would be able to call Java code, but the reverse isn't true.

What you're talking about here isn't an intrinsic C++ problem, i.e. a problem with the language itself, but rather an issue of standardisation between compilers. The standards committee elected to leave many details of compiler implementations flexible, as this has typically been "the C++ way". This is not without cost, and has led to the symbol inconsistency nightmare you've described. But this could be easily corrected in a future revision of the standard, if so desired.


 No.920925>>920927

>>920921

It can't be corrected as some platforms lack the hardware required for some of C++'s features.


 No.920927>>920929

>>920925

Such as?


 No.920929>>920941 >>920972

>>920927

Ask reddit if you need spoonfed.


 No.920941

>>920929

In other words you don't know of any, and were just being flippant. My reason for pressing you is that it is likely that any C++ feature dependent on the presence, or lack of, particular hardware would have little to do with the ability of a compiler to generate symbols in a standardised manner. The problems are orthogonal; and if not, the interaction would be very minor, limited to those particular features. Thus it would be possible to standardise symbol generation across compilers if the committee wanted to, whilst keeping those hardware dependent bits platform specific.


 No.920966>>920994 >>921061

>>920839

>C++ and Java are not even object oriented languages, nor are the common coding styles in them OOP. Most people still program procedurally with them.

Communism was never tried before. Real communism. Pure communism. Forgive me, I'm just trying to be funny because your OOP fanboyism invites ridicule.

So tell me, what are the "real" object oriented languages? Preferably, languages which you actually use? Which you write code in, for other humans to read, because they'll magically understand your obscure language of choice by virtue of them being human and your style being OO?

>Having to pretend to be the processor 100% of the time is wasting human time in which they could be spending on thinking on a more abstract plane.

But they're not pretending to be the processor or they'd be writing machine code directly, or at least assembly code. Your mistake is thinking that you're doing yourself a favor by alienating yourself from the machine. Actually who knows, your fantasy of "thinking on a more abstract plane" may yet come to fruition with the advancement of AI... future computers may be programmable by voice commands and facial expressions. Would that be abstract enough for you? Or is it not the "correct" plane?


 No.920972>>921057 >>921061

>>920929

Kill yourself, retard. Make detailed posts or fuck off.

>>920921

The usual target architectures and ABIs simply don't support C++ in any reasonable form. Linux loads ELF executables, and that format is made out of various sections and tables. The symbol table is just a C string to address map that associates an exported symbol to the relevant data or code. It's very simple and languages that export functions this way integrate very well with the rest of the system.

In ELF there's no notion of exported classes, per-class function name spaces, virtual method tables, nothing. Anything that isn't a name -> value map is gonna be implemented as an abstraction on top of ELF. The C++ compiler implements this by encoding this information in-band with the ELF data. There's no ELF data structure that allows disambiguation between A::func and B::func, so it just exports two different symbols with the namespace information encoded into them. Same thing happens with overloads: the parameter information is encoded into the symbol. Obviously, this means other code can't simply load up your binary and start referring to symbols within it -- they'd have to know the compiler's metadata scheme in order to interface with it.

Linux uses the SysV ABI for the architecture in use, so when you load some binary you know how to call its functions. When you load a C++ binary, you don't know jack shit. You can't even get at the functions themselves, and even if you manage that, how the hell do you call them? What's the calling convention? How do things like this pointers get passed? There just isn't a C++ equivalent of SysV. Using C++ libraries from other programs is about as sane as trying to talk to the Go runtime from another program. It's so ridiculous, compilers have broken compatibility even with themselves.
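
You can watch the mangling happen with any object file (sketch; the encoded names shown are from the Itanium ABI used by GCC and Clang):

```cpp
// Compile with `g++ -c mangle.cpp`, then inspect with `nm mangle.o`.
namespace A { int func(int x)    { return x; } }       // exported as _ZN1A4funcEi
namespace B { int func(double x) { return (int)x; } }  // exported as _ZN1B4funcEd

// The escape hatch: this symbol is exported literally as "plain_func".
extern "C" int plain_func(int x) { return x; }
```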


 No.920994>>921044

>>920966

>So tell me, what are the "real" object oriented languages?

Two off the top of my head are smalltalk and eo.

>Preferably, languages which you actually use?

I don't use OO languages. I think it's an inferior way to think about computing.

>voice commands and facial expressions

Those are slow input methods, which makes it tough to communicate your ideas to the computer. Those ways also seem prone to not thinking about edge cases / other complications that your design might need.


 No.921044>>921047 >>921061

>>920994

>smalltalk

That's since before C++. (Checked Wikipedia, 1972 with a stable release 38 years ago.) Thanks for the history lesson.

>eo

Never heard of it, ever. (Checked Wiki, nothing there but Esperanto...)

>I don't use OO languages. I think it's an inferior way to think about computing.

OK. What languages do you use that aren't "an inferior way to think about computing"?

>Those are slow input methods, that makes it tough to communicate your ideas to the computer.

I just knew someone was going to say that. How about direct access to the programmer's brain waves?

>Those ways also seem prone to not thinking about edge cases / other complications that your design might need.

That's the point, being high-level means relying on the machine to do the low-level things on its own. The higher level you are, the less control you have. C is the best balance between low-level and high-level hence its longevity, but that's just my opinion.


 No.921047>>921243

>>921044

Not that anon, but "eo" was probably a typo: https://en.wikipedia.org/wiki/Io_(programming_language)


 No.921057>>921218

>>920972

>Obviously, this means other code can't simply load up your binary and start referring to symbols within it -- they'd have to know the compiler's metadata scheme in order to interface with it.

I think standardising the metadata scheme for symbol export would go a long way toward resolving these compatibility issues. Rather than the ad-hoc system in place today where every compiler just does as it likes, a compiler would be required to generate them in well-formed manner in order to be considered standards-compliant. You'd have to have glue to map it into each binary target (ELF, Windows, etc.), but that's mostly a job of translation; the heavy-lifting would have already been done in deciding on a standard naming scheme in the first place.

>There just isn't a C++ equivalent of SysV.

True, and this is ideally where a standardisation effort should be focused, imo - at least in establishing common basics like calling convention and vtable/virtual dispatch. I would be surprised if most compilers hadn't converged on a common implementation these days anyway, just copying best practices. As for a full C++ ABI, though, there are good arguments against passing "objects" across application boundaries, most notably validation: the receiver of an object can never trust an object to be well-formed, despite its assurance that it validates on construct. You simply don't have these issues when working with built-ins and PODs, because with plain old data and no implied "intelligence" there is no false assurance of correctness. There's a lot to be said for the simplicity of the C ABI for that reason, and I don't view full support for objects to be that much of a win.

>It's so ridiculous, compilers have broken compatibility even with themselves.

Yup. It's a complete mess at the moment, but I wouldn't rule out the possibility of standardisation. The language has changed enormously in the last decade, particularly in how it is governed, so it's not completely outside the realm of possibility. And one of the primary arguments for permitting complete flexibility in compiler implementation - competition in compiler development space - has been more-or-less nullified these days with the availability of mature, free-software compilers.


 No.921058>>921070 >>921218

unsurprisingly this thread is full of people who have no idea what they're talking about

no one on /tech/ has programmed in any professional capacity but everyone here loves to spread FUD


 No.921061>>921070 >>921103 >>921218

>>920899

>Read the quote again - they are compatible, you just have to use the lower level ABI.

Bullshit. Having to use assembly or a foreign language like C in the middle means it's not compatible. If they were compatible, you would use the same native formats.

>How do you export to import classes into C? You convert it to a struct, then pass that to C.

Lisp machines have standard classes, structures, numbers, strings, and arrays. You don't have to serialize or convert anything at all. On a Lisp machine, you really do "just pass the memory between languages" because there's one address space.

>The problem is that the data format is inexorably tied to the machine. If the language is tied to the machine this isn't a problem. But nowadays we care about P O R T A B I L I T Y, so the data has to be represented the same on every machine.

This isn't true. A binary data format is no more "inexorably tied to the machine" than a text data format.

>>920966

>Communism was never tried before. Real communism. Pure communism. Forgive me, I'm just trying to be funny because your OOP fanboyism invites ridicule.

That's a bullshit comparison. OOP has been tried before and it was shown to work a long time ago. That's why everyone wanted to add objects and classes to their language, but a lot of them did it wrong and it doesn't work as well. It's more like claiming airplanes don't work because someone built one out of sticks and it doesn't fly.

>>920972

>The usual target architectures and ABIs simply don't support C++ in any reasonable form. Linux loads ELF executables, and that format is made out of various sections and tables.

These are UNIX/ELF design mistakes. ELF is incompatible with anything but C and something with types like C, which sucks. Everything else integrates very poorly with the rest of the system.

>In ELF there's no notion of exported classes, per-class function name spaces, virtual method tables, nothing. Anything that isn't a name -> value map is gonna be implemented as an abstraction on top of ELF. The C++ compiler implements this by encoding this information in-band with the ELF data.

Lisp machines support an object system and packages, so this is a UNIX flaw. It was a flaw in the 80s too. Other systems could do it correctly.

>>921044

>That's the point, being high-level means relying on the machine to do the low-level things on its own. The higher level you are, the less control you have.

This is true in a sense, but higher level gives you more control over the implementation. On a Lisp machine, numbers are usually automatically promoted to bignums and replaced by a pointer to the bignum because everything is a tagged word. UNIX's anti-user philosophy leads to distrust of OS developers, so users are brainwashed to believe anything that makes things better has to have an ulterior motive.

>C is the best balance between low-level and high-level hence its longevity, but that's just my opinion.

C is a horrible balance. Its longevity comes from its flaws. C still uses null-terminated strings because a better string data type would not be compatible. C arrays are still broken because they decay to pointers so fixing them would break every C program. C's longevity comes from extreme incompatibility, where you are not even allowed to pass an array from one function to another in the same program. You have to use a pointer to the first element instead.
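
The array complaint, concretely (minimal sketch):

```cpp
#include <cstdio>

// The "10" below is decoration: the parameter is really just int*.
void take(int a[10]) {
    std::printf("in take: %zu\n", sizeof(a));   // sizeof(int*), e.g. 8
}

int main() {
    int arr[10];
    std::printf("in main: %zu\n", sizeof(arr)); // 40: the full array type
    take(arr);  // decays to &arr[0]; the size never crosses the call
}
```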

	The crowning glory of it all is C++, the birthplace
of function prototypes. Function prototypes must have been
implemented the way they were because it was easier for the
compiler writer to get all the parameters before starting to
parse the function, rather than having to figure them out in
the ensuing lines. When a language is bent, twisted, and
malformed to save the compiler writer a bit of effort,
you've got to wonder why the guy's writing compilers in the
first place. Now, C++ could really use a decent compilation
environment, with link time name resolution. But
NnnnnoooooOOOOOOO, rather than "do it right" some late night
caffeine and taco fever nightmare scheme was concocted
whereby symbols get transformed into DES-encrypted dog vomit
and back.

C++ is heading for standardization. All of this
happened because nobody wanted to put a few hooks for type
checking in their linker and object file format. I think
there's a lesson in all this, but I'm damned if I can see
it.


 No.921070

>>921061

>OO has been tried ... shown to work

'OO' is a bunch of stuff. You'll benefit from some of that stuff compared to someone writing FORTRAN in an ancient manner.

Today, other people are using the good stuff and the difference is now that you're spinning your wheels or committing self-harm with the bad stuff.

>>921058

>I keep seeing BTFO's therefore you're all LARPers!

yeah kid, I had this experience too, as a 12-year-old watching US Senators debate each other on CSPAN2.

Since it was a back-and-forth, and since I was persuaded by each speaker in turn, I realized pretty quickly that I was too clueless to evaluate the arguments I was hearing.

Another thing you'll eventually notice is that snake-oil salesmen are much more versed in the virtues of snake-oil than are its detractors in the vices of snake-oil. The side that eschews the stuff naturally comes to have less experience with it and be caught in 'gotchas' like its actual color, whether such-and-such snake's oil is often used, etc.


 No.921103>>921148 >>921216

>>921061

>OOP has been tried before and it was shown to work a long time ago. That's why everyone wanted to add objects and classes to their language, but a lot of them did it wrong and it doesn't work as well.

My anti-OOP sentiment I draw in part from personal experience with C++ but also from reading "OOP Oversold" (may it rest in peace) and Ben Lynn's C craft*.

With that out of the way, I'm skeptical about your claim that "[OOP] was shown to work a long time ago" while at the same time, apparently, languages such as C++ and Java "did [OOP] wrong" while being the most successful.

As far as I'm concerned, naively, OOP is a combination of three things:

1. Put the noun before the verb: "data.func()" instead of "func(data)"

2. Have internal (hidden) state associated with a group of functions

3. Juggle with relationships between classes, hoping it makes the source code easier to understand and to reuse (the cargo cult of design patterns)

Now explain to me how can OOP help me when the machine still thinks in terms of code working with data? To rephrase this, explain to me what's the benefit in distancing myself from the "natural" way the machine works.

* https://crypto.stanford.edu/~blynn/c/


 No.921148

>>921103

>Now explain to me how can OOP help me when the machine still thinks in terms of code working with data? To rephrase this, explain to me what's the benefit in distancing myself from the "natural" way the machine works.

OO is useful in situations where the problem domain can be naturally decomposed into objects that interact with each other, e.g. modelling and simulation. OO is a burden when a programmer is forced to build a "tokenizer class", for example, rather than simply writing a free function tokenize(string). In all cases, OO distances the programmer from the machine, but in some cases it can be worth it.

When OO fits and I opt to use it, internally my objects are written in a procedural style: free functions defined in the implementation file that mutate data members in clearly defined ways. I rarely use private member functions, because they encourage sloppy programming. One thing I can't stand, and that OO-adherents seem to love, is treating private state as if it were global within the context of the object: i.e. mutable from any member function, at any time, with no rules for when and how it is accessed. It's a nightmare to maintain.
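
For example, the tokenizer case above as a free function, no class required (minimal sketch):

```cpp
#include <string>
#include <vector>

// Input in, output out; no hidden state, nothing to construct.
std::vector<std::string> tokenize(const std::string &s) {
    std::vector<std::string> out;
    std::string cur;
    for (char c : s) {
        if (c == ' ') {
            if (!cur.empty()) out.push_back(cur);
            cur.clear();
        } else {
            cur += c;
        }
    }
    if (!cur.empty()) out.push_back(cur);
    return out;
}

int main() {
    return tokenize("do we still need C").size() == 5 ? 0 : 1;
}
```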


 No.921157>>921216

OO is like metaprogramming; just a shitty way to "fix" C's lack of real generics.


 No.921216

>>921103

>My anti-OOP sentiment I draw in part from personal experience with C++ but also from reading "OOP Oversold" (may it rest in peace) and Ben Lynn's C craft*.

I'm not "anti-operating systems" just because UNIX sucks or "anti-programming languages" because C sucks.

>With that out of the way, I'm skeptical about your claim that "[OOP] was shown to work a long time ago" while at the same time, apparently, languages such as C++ and Java "did [OOP] wrong" while being the most successful.

UNIX, C, and JavaScript are successful, but they do a lot of things wrong. They would still be wrong if they weren't successful.

>1. Put the noun before the verb: "data.func()" instead of "func(data)"

You are supposed to understand data.func as selecting the func member/method out of data, like ['a','b','c','d']['concat']('e','f') in JavaScript. In Lisp, methods are (func data) like other functions.

>2. Have internal (hidden) state associated with a group of functions

This is a good thing. UNIX has virtual byte-oriented tape drives (pipes/UNIX files) as the only "object" it supports, which sucks. On an OS with objects, pipes would be objects that probably wouldn't be used much because we're not using a PDP-11.

>3. Juggle with relationships between classes, hoping it makes the source code easier to understand and to reuse (the cargo cult of design patterns)

Inheritance done right can save a lot of code and make things easier to understand.

>Now explain to me how can OOP help me when the machine still thinks in terms of code working with data? To rephrase this, explain to me what's the benefit in distancing myself from the "natural" way the machine works.

The machine is made up of physical objects that work the way OOP works.

http://userpage.fu-berlin.de/~ram/pub/pub_jf47ht81Ht/doc_kay_oop_en

> - I thought of objects being like biological cells and/or individual

>computers on a network, only able to communicate with messages (so

>messaging came at the very beginning -- it took a while to see how to

>do messaging in a programming language efficiently enough to be

>useful).

>>921157

>metaprogramming and OO were created to fix C

This is your brain on UNIX.

Subject: Re: Smalltalk Versus C++ (No Flame Wars, Please!)

In some article RH writes:

> A core dump is C++'s way of "delaying the objection until
> run-time"

Putting it another way:

`Method not found' is Smalltalk's way of spelling `core dumped'


 No.921218>>921667

>>921057

Until this standardization happens, I simply won't consider C++ stable. It's insane how people have to wrap their C++ code in a C interface in order to have a working system, and even that's way too hairy due to complete and utter garbage features like exceptions and, like you said, objects and their validation. The fact is C++ is an internal implementation language and it's a C world out there; when people try to turn it into a C++ world, it brings nothing but pain. C++ is OK if it's fully contained behind a C header file, but that implies C++'s features are only useful inside one project and can't be exported to other users of your code.

>>921058

>hurr

Kill yourself.

>>921061

>ELF is incompatible with anything but C and something with types like C, which sucks.

>Everything else integrates very poorly with the rest of the system.

This is only partially true. The fact is ELF is a simple format. It's not designed for C's type system at all. Rather, it fits C well because C is a simple language. It also fits Pascal and Fortran.

ELF simply doesn't support many of the constructs available in higher level languages. There is no "class table" or separate symbol tables per name space or anything like that.

Scheme is also a simple language. Guile uses ELF as its object file format:

http://wingolog.org/archives/2014/01/19/elf-in-guile

>Lisp machines support an object system and packages,

Look, you're comparing Unix which ran pretty much everywhere to a custom Lisp-focused architecture that had support for Lisp constructs in hardware. Not just some executable format. In hardware.

IN HARDWARE

It's obviously going to be exceptionally well integrated. Just like the code we write for virtual machines of today like the JVM and CLR.


 No.921243

>>921047

>probably a typo

No. I was talking about https://github.com/yegor256/eo

Io is probably better than eo though.


 No.921247

[attached: LinusToNvidiaWithLove.png]


 No.921388>>921447

>>920792

>OOP does help with code reuse

Modularization is what helps with code-reuse. Somewhere along the line, modules and packages were conflated with objects.


 No.921424

Questioning C is a heresy.


 No.921447>>921512

>>921388

That's part of it, but the OOP approach adds non-trivial possibilities.

If there's a huge amount of code already written to support Foo, and I subclass Foo into Bar (in a reasonably Liskov-compliant way), then all that code automatically supports Bar as well.

Code that might otherwise be hard-wired to call the old functions now seamlessly calls my functions.
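
In code, the point looks like this (illustrative Foo/Bar, matching the names above):

```cpp
#include <iostream>

struct Foo {
    virtual ~Foo() = default;
    virtual const char *name() const { return "Foo"; }
};

// Liskov-compliant subclass: usable anywhere a Foo is expected.
struct Bar : Foo {
    const char *name() const override { return "Bar"; }
};

// Stand-in for the "huge amount of code already written to support Foo":
void existing_code(const Foo &f) { std::cout << f.name() << '\n'; }

int main() {
    Bar b;
    existing_code(b);  // prints "Bar": the old code now serves the new type
}
```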


 No.921512>>921513

>>921447

OO is shit and so are you

C above all


 No.921513>>921532 >>921560

>>921512

Enjoy your win32api NULL NULL NULL NULL NULL


 No.921532>>921534 >>921560

>>921513

That's just Windows being shit. C is fine in Linux.


 No.921534>>921571

>>921532

cee and leenucks are the best amirite? XD


 No.921553>>921581

There aren't many people who can actually teach C++ or any kind of programming whatsoever; even Bjarne Stroustrup stressed that many times.

I guess those fuckers in academia are at fault as well. Everything beyond C++98 is too hard to lecture, and a great deal of low-grade IT schools are spitting out JS brainlets like crazy...


 No.921560>>921571

>>921513

Win32 API is fine; how would OO improve its design? I can't see it actually making it better to use: it doesn't fit into a hierarchy and it's not always clear what "owns" which function.

>>921532

If you try to do what win32 is supposed to do with X11 you will quickly find that it's total shit; win32 is a giant improvement on what Unix is still stuck with for its GUIs. I will admit the file handling is nicer, although I try to use the C standard library for all file I/O.


 No.921571>>921579

>>921534

If you'd ever done C on Linux and win32 you'd understand the difference being talked about and could make a useful comment.

>>921560

>I try and use the C standard library for all file I/O.

I hope you don't mean you do that on Windows, as that's a mistake if so. None of those functions are supposed to be used. Most of the C standard library on Windows has basically been deprecated for 30 years.


 No.921579

>>921571

I am doing this on Windows; how is it a mistake? It's fully cross-platform and works fine. I'm sure such an API doesn't have as many features as its win32 counterpart, but I just don't need those features. Yes, I could use nothing but win32 functions and completely forget about the standard library, like all the code examples on MSDN do, but I am writing cross-platform programs, so if I can write some routine in a cross-platform way with no real drawback (like using standard file I/O), I don't see a reason not to.

Also, "the C standard library is deprecated" on Windows just isn't true. Even if it were true for the MSVC compiler, I don't use that compiler; I use another compiler with its own libc implementation, which implements all of the C11 standard, including the optional parts that glibc doesn't implement, as well as some of the technical reports that you can read here: http://www.open-std.org/jtc1/sc22/wg14/www/projects


 No.921581>>921585 >>921653 >>921667

>>921553

>Everything beyond C++98 is too hard to lecture

But even C++98 is hard to lecture, if you go in depth.

I hope I'll be forgiven for exercising my brain a little in the arcana of C++, even if it means going off topic a bit.

>C++98

<constructors that require only a single param (except copy ctors) should be marked as "explicit" or else implicit conversions may happen

<check for self-assignment in operator=

<don't throw exceptions in destructors, and hopefully never call a func that will throw, or else terminate() is called

<when using dyn allocated mem, watch out for shallow copy in copy ctr and operator=

<base classes must have a virtual destructor even if the destructor has no code

<when using multiple inheritance, inherit virtually to avoid diamond problem

<member functions that don't change member data must be marked as const

<member data that should be changeable by const member functions must be marked as mutable

>C++03

<not even Stroustrup cares about this one, he sees it as C++98 with bugfixes

>C++11

<yay, now you can use the natural brace init for stuff like std::vector, no more adding each element individually with push_back() or resize()+at()

<yay, now you can use std::strings as filenames for std::fstream

<yay, use std::unique_ptr to replace std::auto_ptr which you never used

<yay, you must write move constructor and move operator= now, nigger!

<now you can use lambdas in the things from <algorithm> the way God intended; fun exercise: how to capture by const ref instead of plain ref?

<use std::array instead of the evil C arrays

<yay, move semantics which you shouldn't care about in a high level language, and rvalue refs designated by && (logical AND) for maximum confusion, also the std::move() FUNCTION to help with your confusion, nigger!

<use std::tuple instead of structs, because we'll be damned if we, the C++ guys, copy compound literals and designated initializers from evil C!

<variadic templates

<range for

>C++14

<std::make_unique and some lambda bugfixes but fuck me if I remember anything else

>C++17

<yay concepts! actually no concepts!

<here's std::filesystem 20 years too late and even then we had to nick it from Boost

<fold expressions

I'd say C++ is possibly one of the most difficult languages to teach. Not just because it's big and has lots of features but also because it's riddled with pitfalls... pitfalls being things that you should never-ever do, yet when you do them they compile fine and without warning. So naturally you have to teach those TOO, paradoxically.
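
One concrete pitfall, since the point deserves an example (my own sketch, not from any real codebase): this compiles silently everywhere.

#include <cstdio>
#include <vector>

struct Buffer {
    Buffer(int size) : data(size) {}   // pitfall: single-arg ctor not marked explicit
    std::vector<char> data;
};

void send(const Buffer &b) { std::printf("sending %zu bytes\n", b.data.size()); }

int main() {
    send(42);   // compiles fine: 42 silently converts to Buffer(42);
                // marking the constructor explicit turns this into an error
}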


 No.921585>>921623

>>921581

What's a better high level language than C++?

>inb4 C

apparently it's just as complicated considering how many people get things wrong with it


 No.921619>>921626

Smart pointers are literally a zero cost abstraction


 No.921623>>921641

>>921585

No trolling but if you consider C and C++ to be medium-level languages instead of high-level, then C++ really is one of the better choices... inb4 Rust.

If however you want a high-level language similar to C++, I recommend D.


 No.921626

>>921619

Vector is faster than plain old data array tbh


 No.921629>>921651 >>921667

If you guys hate NULL terminated strings, you will love Plan 9, from the manual[1]:

[...] Text strings in 9P messages are not NUL-terminated: n counts the bytes of UTF-8 data, which include no final zero byte. The NUL character is illegal in all text strings in 9P, and is therefore excluded from file names, user names, and so on.

Remember, Plan 9 (and also Inferno) were developed to fix the mistakes of Unix and to improve its strong points.

And, unlike Lisp Machines, you can actually use both today.

[1]http://man.cat-v.org/plan_9/5/intro
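
For the curious, reading a counted string in that style looks roughly like this (my own sketch based on the manual's description, not actual Plan 9 code; I'm assuming the usual two-byte little-endian count that 9P uses):

#include <cstdint>
#include <cstdio>
#include <string>

// s[n] in 9P: a two-byte count n, then n bytes of UTF-8, no trailing NUL.
std::string read_counted(const uint8_t *msg)
{
    uint16_t n = static_cast<uint16_t>(msg[0] | (msg[1] << 8));
    return std::string(reinterpret_cast<const char *>(msg + 2), n);
}

int main()
{
    const uint8_t msg[] = {5, 0, 'h', 'e', 'l', 'l', 'o'};
    std::printf("%s\n", read_counted(msg).c_str());
}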


 No.921641>>921651

>>921623

What if you want a language that people actually use?


 No.921651>>921665 >>921667

>>921641

Stick with C++ then. Or if you're implying people don't use C++, try C#. Plenty of people still use it, last I checked.

>>921629

I too was attracted by the design of Plan9 but once I got there I realized it has an alien interface with a very steep learning curve for humans. Furthermore it is underdeveloped to the point of being unusable. Even downloading a 10 year old Knoppix ISO yields a more usable OS. So... why would you even recommend Plan9? It's dead.


 No.921653

>>921581

Re-reading my fagtext I realized compound literals are more akin to constructors than tuples, so I take back that one point, because they're not comparable. Tuples still annoy me though. Something's not quite right about them.


 No.921665

>>921651

Indeed, Plan 9 is not useful as an everyday OS for most people, though it depends on your case, since there are people using it that way.

I recommend it though, as a learning tool. There's an excellent book by Francisco Ballesteros called "Introduction to Operating Systems Abstractions Using Plan 9 from Bell Labs". I'm just at the beginning, but since Plan 9 is so simple I'm not having trouble understanding it.


 No.921667>>921671 >>921832

>>921218

>It's insane how people have to wrap their C++ code in a C interface in order to have a working system, and even that's way too hairy due to complete and utter garbage features like exceptions and, like you said, objects and their validation.

Exceptions are good, but C++ exceptions are broken.

>The fact is C++ is an internal implementation language and it's a C world out there; when people try to turn it into a C++ world, it brings nothing but pain. C++ is OK if it's fully contained behind a C header file, but that implies C++'s features are only useful inside one project and can't be exported to other users of your code.

That's because UNIX sucks and C++ came from UNIX, so most implementations of C++ inherited that brain damage. The same incompatibility affects Lisp, Python, Java, JavaScript, Ada, and so on; C++ is just expected to work better because it comes from the same company, while nobody expects anything else to be compatible at all. Mainframe OSes tried to make every language compatible through OS conventions and compilers. Lisp machines did it in hardware.

>This is only partially true. The fact is ELF is a simple format. It's not designed for C's type system at all. Rather, it fits C well because C is a simple language. It also fits Pascal and Fortran.

ELF is designed for assembly language on machines similar to a PDP-11. Some computers do more in hardware than ELF does in software.

>Look, you're comparing Unix which ran pretty much everywhere to a custom Lisp-focused architecture that had support for Lisp constructs in hardware. Not just some executable format. In hardware.

ELF is software and software should be easier to fix than hardware, and it usually is, except on UNIX because everything sucks so much that it can't be fixed.

>>921581

>I'd say C++ is possibly one of the most difficult languages to teach. Not just because it's big and has lots of features but also because it's riddled with pitfalls... pitfalls being things that you should never-ever do, yet when you do them they compile fine and without warning. So naturally you have to teach those TOO, paradoxically.

C is like that too, except the ratio of pitfalls to features is even higher. C classes have to teach bullshit like the broken switch scoping, why types have dumb syntax, why 00011 == 00009 in old compilers (leading zeros mean octal, and early C even accepted 8 and 9 as "octal" digits, making both sides equal to nine), and other crap that was bad when C was designed and could have been fixed in a couple of minutes.
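
To make the octal one concrete (a trivial sketch of my own; the 9-as-an-octal-digit form is at least rejected by standard compilers today):

#include <cstdio>

int main()
{
    int a = 00011;   // leading zeros mean octal: this is 9, not 11
    int b = 11;
    std::printf("%d %d\n", a, b);   // prints "9 11", with no warning anywhere
    // int c = 00009;   // ill-formed in standard C and C++: 9 is not an octal digit
}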

>>921629

Plan 9 uses null-terminated strings because it's written in C.

>Remember, Plan 9 (and also Inferno) were developed to fix the mistakes of Unix and to improve its strong points.

Bullshit. Plan 9 is still much worse than Multics and even worse than Linux and System V. Everyone outside the UNIX weenie culture already knew lengths were a good idea in the 60s.

>>921651

>Furthermore it is underdeveloped to the point of being unusable.

That's the point. You, the user, are supposed to fix all their problems. It worked for UNIX, so they expected it to work for Plan 9 too.

https://pdos.csail.mit.edu/~rsc/plan9.html

>This is not Plan 9 hacking per se, but it would be interesting to fix dot-dot in Unix. There might even be enough issues involved to make it a good senior or master's thesis. I don't use Unixes enough to do it myself, but if someone were interested I'd be glad to talk through problems and lend help.

It would take less time to make a new OS than it would to "fix dot-dot in Unix" and there's nothing "interesting" about fixing someone else's bugs.

    Continuing in the Unix mail tradition of adding
tangential remarks,

Likewise,

I've always thought that if Lisp were a ball of mud,
and APL a diamond, that C++ was a roll of razor wire.

That comparison of Lisp and APL is due to Alan Perlis - he
actually described APL as a crystal. (For those who haven't
seen the reasoning, it was Alan's comment on why everyone
seemed to be able to add to Lisp, while APL seemed
remarkably stable: Adding to a crystal is very hard, because
you have to be consistent with all its symmetry and
structure. In general, if you add to a crystal, you get a
mess. On the other hand, if you add more mud to a ball of
mud, it's STILL a ball of mud.)

To me, C is like a ball. Looked at from afar, it's nice and
smooth. If you come closer, though, you'll see little
cracks and crazes all through it.

C++, on the other hand, is the C ball pumped full of too
much (hot) air. The diameter has doubled, tripled, and
more. All those little cracks and crazes have now grown
into gaping canyons. You wonder why the thing hasn't just
exploded and blown away.

BTW, Alan Perlis was at various times heard to say that
(C|Unix) had set back the state of computer science by
(10|15) years.


 No.921671>>921673

>>921667

>ratio of pitfalls to features is even higher. C classes have to teach

A lot less. Who cares what the ratio is when you're talking about a molehill instead of a mountain?


 No.921673>>921694 >>921791

>>921671

C is a molehill of features and a mountain of pitfalls. Some things take a long time to master because they're powerful and have a lot of capabilities, like CLOS and Lisp macros, and some things take a long time to master because they suck and are badly designed, like null-terminated strings and C switch.

Students used to learn more about programming because they didn't have to deal with all the bullshit. Good type syntax can be taught in a few minutes, but C type syntax sucks and makes everything hard to understand, so it takes a lot longer even for experienced programmers. C switch is much harder to understand than pattern matching even though pattern matching is a lot more powerful because C switch has so many pitfalls. Strings are really easy to understand, but C mixes in bullshit like null terminators that are just an artifact of a bad implementation of strings and have nothing to do with processing text.

I feel compelled to submit the following piece of C code:

switch (x)
    default:
        if (prime(x))
    case 2: case 3: case 5: case 7:
            process_prime(x);
        else
    case 4: case 6: case 8: case 9: case 10:
            process_composite(x);

This can be found in Harbison and Steele's "C: A Reference
Manual" (page 216 in the second edition). They then remark:

This is, frankly, the most bizarre switch statement we
have ever seen that still has pretenses to being
purposeful.

In every other programming language the notion of a case
dispatch is supported through some rigidly authoritarian
construct whose syntax and semantics can be grasped by the
most primitive programming chimpanzee. But in C this highly
structured notion becomes a thing of sharp edges and loose
screws, sort of the programming language equivalent of a
closet full of tangled wire hangers.


 No.921694>>921726 >>921886

>>921673

NUL-terminated strings are garbage and I agree with you on that point. However, there is nothing wrong with the switch statement. Complaining about the switch means you simply don't understand what it is supposed to be. Obviously it isn't going to work like ML's pattern matching.


 No.921726

>>921694

I think he's complaining that the switch statement is badly designed because of its warts; Duff's Device* comes to mind.

So you have to manually break each case or else it falls through; why not design switch the opposite way, falling through only when a continue statement is used and breaking by default? Then there's the wart that you can't declare variables in a case without enclosing its body in curly braces. This one can be forgiven, because declaring variables wherever you want wasn't standard until C99. And finally, the wart that makes Duff's Device possible: cases are like goto labels, and the switch needn't be their immediate parent. This definitely reeks, but I'm not sure if it reeks of bad design per se, or of a "template metaprogramming" kind of easter egg.

* https://en.wikipedia.org/wiki/Duff%27s_device
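
A minimal sketch of the default-fallthrough wart (my own example, not from the book):

#include <cstdio>

int main()
{
    int grade = 'A';
    switch (grade) {
    case 'A':
        std::puts("excellent");   // missing break: control falls through...
    case 'B':
        std::puts("good");        // ...so 'A' also prints this line
        break;
    default:
        std::puts("try harder");
    }
}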


 No.921791>>921800 >>921886

>>921673

Null terminated strings are fine. It's much easier and more natural to move around a simple pointer to data of any length.
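
To illustrate the claim (trivial sketch of my own): the terminator is in-band, so one pointer is the whole interface.

#include <cstdio>

void print_cstr(const char *s)   // no length parameter anywhere
{
    while (*s)
        std::putchar(*s++);
    std::putchar('\n');
}

int main() { print_cstr("hello"); }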


 No.921800>>921870

>>921791

never mind that LISP lists are "nil" terminated.


 No.921832>>921871 >>921886

>>921667

>Plan 9 uses null-terminated strings because it's written in C.

I provided a source in my post to prove I'm not talking from my ass, where's yours?

>Bullshit. Plan 9 is still much worse than Multics and even worse than Linux and System V.

What is worse about it, and how is it worse compared to those?

I'm not a fan of ad hominem, but you sound like a little bitch. You probably never used Multics, because it only runs on an extremely small set of mainframe computers. Emulating it is not the same as using it every day in a working environment. You just like to bitch, that's it.


 No.921838>>921846 >>921871 >>921886

No, we really don't. C is essentially "performance over everything": Ada is only around 10% less performant, but it is safer, more reliable, and more readable. I used to have Stockholm syndrome for C; now (when I haven't programmed in it for months) I realize it is a very, very ugly language.


 No.921846>>921858 >>921871

File (hide): 43ff04d7803959f⋯.png (31.7 KB, 512x288, 16:9, benchmarks.png) (h) (u)

>>921838

C makes it really easy to create bugs. To check that you are safely accessing a buffer, you need to check that your index is within the bounds of the buffer, that the pointer is not null, and that no overflows occurred in that calculation. It sounds easy to just check that every time, but in C people end up writing functions that merely specify that a parameter must be non-null, that the arguments can't cause an out-of-bounds write, and other very specific restrictions (memcpy, for example). I don't know if there's a solution to this, apart from programming your applications in other languages. Ada's runtime checks can stop all integer overflow and buffer overflow attacks. Fortran is another language that is better than C, and it is also highly performant.
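
Roughly the checks being described, gathered into one place (a sketch of my own; the names are invented, not a real API):

#include <cstddef>
#include <cstdio>

// The index must be in range and the pointer non-null before touching buf;
// using size_t for the index means the bounds comparison itself can't overflow.
bool safe_get(const unsigned char *buf, std::size_t len,
              std::size_t index, unsigned char *out)
{
    if (buf == nullptr || index >= len)
        return false;
    *out = buf[index];
    return true;
}

int main()
{
    unsigned char buf[4] = {1, 2, 3, 4};
    unsigned char v;
    if (!safe_get(buf, sizeof buf, 7, &v))
        std::puts("rejected out-of-bounds read");
}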


 No.921858>>921871

>>921846

A correct program is infinitely faster than an incorrect program. These benchmarks will look better for other languages when the people doing benchmarks stop considering programs that allow for arbitrary code execution as "correct".


 No.921870>>921871

>>921800

Even so, let's take a look at two programs which print out the contents of a list. First, one in C.

#include <stdio.h>
#include <stdlib.h>

struct node {
    int data;
    struct node *next;
};

void myprint(struct node **list)
{
    if (*list == NULL) {
        return;
    }

    printf("%d\n", (*list)->data);
    myprint(&(*list)->next);
}

int main(int argc, char **argv)
{
    struct node *head = malloc(sizeof(struct node));
    struct node *next = malloc(sizeof(struct node));
    head->data = 1;
    head->next = next;
    next->data = 2;
    next->next = (struct node *)0xDEADBEEF;  /* deliberately not NULL */

    myprint(&head);

    free(head);
    free(next);
    return 0;
}
This code creates a list that isn't null-terminated and then tries to print it out. When run, it correctly prints 1 and 2, but then goes on to segfault. It is almost impossible for our myprint function to guard against this: it treats a null pointer as the end of the list and any other pointer as a valid node, so it has to assume that the caller is doing everything correctly. Now let's take a look at what we can do in LISP to protect against this.
(defun myprint (list)
  (when (consp list)
    (print (car list))
    (myprint (cdr list))))

(myprint (cons 1 (cons 2 #xDEADBEEF)))
For those that can't read LISP: I am constructing a list that doesn't end with NIL, but rather ends with the hexadecimal value DEADBEEF. When this program is run it prints out 1 and 2, then finishes successfully, unlike the C program, which segfaulted. You should first note that in C it was trivial to create the list incorrectly: all we had to do was set a next pointer to 0xDEADBEEF instead of NULL. You can also imagine something bad happening while a next pointer is being changed during list manipulation. In LISP, on the other hand, we had to go out of our way to create something that merely looked like a list; the regular way is just (list 1 2), unlike what we had to do above. In the LISP version of the program, our myprint function could notice that its input wasn't quite right and gracefully handle the problem. In this case, we made our function not care about what the terminating value was.

While LISP lists are NIL-terminated, it is hard to accidentally turn a valid list into an invalid one. And even if we have a "list" which doesn't end with NIL, LISP gives us the tools to handle it, unlike C.


 No.921871

File (hide): 2e893b1392c406a⋯.mp4 (611.29 KB, 1280x720, 16:9, shut-up-richard.mp4) (h) (u) [play once] [loop]


 No.921886

>>921694

>However, there is nothing wrong with the switch statement.

Except for the fact that it's broken. Case dispatch is very simple and easy to use and understand. C switch adds in all this bullshit that also makes everything more complicated because switch can jump into any control structure and past variable declarations. ML pattern matching also adds a lot of things, like variable binding and nested matching, but that makes it more powerful so there is a benefit to the complexity.

>>921791

>It's much easier and more natural to move around a simple pointer to data of any length.

That doesn't have anything to do with null-terminated strings. If you copy a reference to a string in Lisp, Python, or JavaScript, it's just a pointer.

>>921832

>I provided a source in my post to prove I'm not talking from my ass, where's yours?

Plan 9 documentation sucks, so it's hard to understand that it uses null-terminated strings. A look at the source shows that Plan 9 uses null-terminated strings and the Plan 9 C string literals are null-terminated.

https://github.com/0intro/plan9/blob/7524062cfa4689019a4ed6fc22500ec209522ef0/sys/lib/man/permind/ptx1.c

>>921838

>C is essentially "performance over everything"

Your post is correct except for this part. C weenies don't care about performance because they are using a language that prevents people from making faster hardware. Ada can be faster on different hardware because a lot of things can be done by the machine, which is also what Lisp machines do, but C only becomes faster when the hardware is designed to run everything else slower, like on RISC.

      OK.  How about:

switch (x)
    default:
        if (prime(x)) {
            int i = horridly_complex_procedure_call(x);

    case 2: case 3: case 5: case 7:
            process_prime(x, i);
        } else {
    case 4: case 6: case 8: case 9: case 10:
            process_composite(x);
        }

Is this allowed? If so, what does the stack look
like before entry to process_prime() ?



I've been confusing my compiler class with this one for
a while now. I pull it out any time someone counters my
claim that C has no block structure. They're usually
under the delusion that {decl stmt} introduces a block
with its own local storage, probably because it looks
like it does, and because they are C programmers who use
lots of globals and wouldn't know what to do with block
structure even if they really had it.

But because you can jump into the middle of a block, the
compiler is forced to allocate all storage on entry to a
procedure, not entry to a block.

But it's much worse than that because you need to invoke
this procedure call before entering the block.
Preallocating the storage doesn't help you. I'll almost
guarantee you that the answer to the question "what's
supposed to happen when I do <the thing above>?" used to be
"gee, I don't know, whatever the PDP-11 compiler did." Now
of course, they're trying to rationalize the language after
the fact. I wonder if some poor bastard has tried to do a
denotational semantics for C. It would probably amount to a
translation of the PDP-11 C compiler into lambda calculus.


 No.921888>>921907

Reminder:

Ken and Ritchie decided to build a bare-bones version of MULTICS (called UNICS), focusing on simplicity and performance above everything. They stripped out or changed most features relating to safety/security, maintenance, and consistency. Around the same time, Thompson was trying to get BCPL to work on his similarly awful PDP-7; he had to trim it further to make it fit. He then changed the syntax to suit his personal style, which included long-time headaches like going from ALGOL's := assignment and = equality to = assignment and == equality. This resulted in the B language.

The PDP-11 was byte/character-oriented instead of word-oriented, so B didn't work well on it. Ritchie added limited typing to support this, plus a few other changes. The result? C. They then rewrote parts of UNICS in C; others were too hard. Those difficulties led to adding structs, which allowed the rest to be rewritten in C. Neither C nor UNIX was meant to be portable, despite the claim that C was a "cross-platform assembler." It's actually the simplicity of C and UNIX which allowed easy porting, and that started about a year later, done by other people. K&R added some types and standard I/O; these remained for years the baseline for portable C.

Fast forward, and a man named Bjourne loves programming in Simula, an ALGOL60 superset with classes and concurrency, so he creates C with Classes to bring C closer to Simula. Bjourn also adds features from ALGOL68, Ada, CLU, and ML to produce C++: an extension to C that reduces freedom where sensible but mostly adds features for productive or maintainable programming. The ANSI C standard created a superset of C features, including some from C++, and ANSI style later became the standard style for C programming. Tons of code was written this way.

The worst aspects of UNIX and C were intentional design decisions that had nothing to do with what a good systems language should look like and everything to do with the limitations of the EDSAC, PDP-7, and PDP-11. We should've ditched C, or modified it to look more like ALGOL68, a long time ago. C and UNIX should be avoided where possible because they're Bad by Design, for reasons that stopped applying sometime in the 80s or early 90s. However, if your device is a PDP-11 equivalent, then there is a language and OS that will work: consider C and UNIX 1.0.


 No.921907>>921910

>>921888

>Bjourne

>Bjourn

You managed to misspell his name in two different ways.


 No.921910

>>921907

I didn't write it.


 No.922055

>>919757 (OP)

>Do we really still need C?

Yes.

>Do we really still need assembly?

Also yes.


 No.925626>>925629

Strings? In C, we make you work so hard that it's not even worth using strings.

- It's easier to do advanced graphics rendering in C than it is to use strings.

- Appending and concatenating strings (i.e. working with text) is a complete waste of your time and no one uses this every day in programming.

- String types (and heck even boolean types) are useless. No one uses true or false or text in programs. Come on, be realistic - when was the last time you had to combine two strings together! Never! Never in my entire programming career.

"Want to concatenate some strings in the C language?" Barney asks. I politely respond "yes."

First you have to include stdio.h

Then you have to include string.h

Then you have to include stdlib.h

Finally, now that we've included all those atrocious include files, let's concatenate some strings. Or maybe we should just forget about it, because real programmers don't use strings or deal with text daily, according to the C language.


#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int main() {
    char *str1 = "Hello ";
    char *str2 = "World";
    char *str3;

    str3 = (char *)malloc((strlen(str1) + strlen(str2) + 1) * sizeof(char));
    /* str3 = (char *)calloc(strlen(str1) + strlen(str2) + 1, sizeof(char)); */

    strcpy(str3, str1);
    strcat(str3, str2);

    printf("str3: %s\n", str3);

    free(str3);

    return 0;
}

Couldn't we have just done something like this, Barney?

S = 'something1' + 'something2';

Guess not.
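
For the record, the one-liner does exist next door in C++ (a minimal sketch; whether that still counts as "C" is this whole thread):

#include <cstdio>
#include <string>

int main()
{
    std::string s = std::string("something1") + "something2";
    std::printf("%s\n", s.c_str());   // allocation and cleanup handled for you
}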


 No.925629>>925637

>>925626

Go back to your toys. Leave the real work for men.


 No.925637>>925652 >>925796

>>925629

>It's a "real" language because it makes everything harder for me!

"In 1969, AT&T had just terminated their work with the GE/Honeywell/AT&T Multics project. Brian and I had just started working with an early release of Pascal from Professor Nichlaus Wirth's ETH labs in Switzerland and we were impressed with its elegant simplicity and power. Dennis had just finished reading 'Bored of the Rings', a hilarious National Lampoon parody of the great Tolkien 'Lord of the Rings' trilogy. As a lark, we decided to do parodies of the Multics environment and Pascal. Dennis and I were responsible for the operating environment. We looked at Multics and designed the new system to be as complex and cryptic as possible to maximize casual users' frustration levels, calling it Unix as a parody of Multics, as well as other more risque allusions. Then Dennis and Brian worked on a truly warped version of Pascal, called 'A'. When we found others were actually trying to create real programs with A, we quickly added additional cryptic features and evolved into B, BCPL and finally C. We stopped when we got a clean compile on the following syntax:

for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2);

To think that modern programmers would try to use a language that allowed such a statement was beyond our comprehension! We actually thought of selling this to the Soviets to set their computer science progress back 20 or more years. Imagine our surprise when AT&T and other US corporations actually began trying to use Unix and C! It has taken them 20 years to develop enough expertise to generate even marginally useful applications using this 1960's technological parody, but we are impressed with the tenacity (if not common sense) of the general Unix and C programmer. In any event, Brian, Dennis and I have been working exclusively in Pascal on the Apple Macintosh for the past few years and feel really guilty about the chaos, confusion and truly bad programming that have resulted from our silly prank so long ago."


 No.925652

>>925637

not using code blocks anymore?


 No.925751>>925775

>>919757 (OP)

"We" of course still need C. Because security researchers need their money, too. Also NSA needs it to plant backdoors more easily and with more plausible deniability.

This is also the main reason of trolling in Rust threads. NSA doesn't want anyone to switch from C, so they troll against Rust everywhere where it's possible.

Same applies for C++ --- but to a bit lesser degree.


 No.925775>>925891 >>925894

>>925751

No, the reason people troll Rust threads is because of how anal the people behind it are. Ada is also just plain better. That said, both are better than C.


 No.925796


 No.925891>>925892

>>925775

>how anal the people behind it are

oh really?

every time I've interacted with them, it was pretty much okay.

perhaps you just need to stop calling people niggers and kikes left and right?


 No.925892>>925901

>>925891

You are a perfect example.


 No.925894>>925899 >>925902

>>925775

>troll

>>>/reddit/

>Ada is also just plain better.

proof?


 No.925899>>925905

>>925894

"Troll" is certainly an antiquated term today, but I was reply to him (he used "trolling") so I felt it was apropos. As for Ada, you can check this thread

>>924106


 No.925901

>>925892

Elaborate.


 No.925902>>925905

>>925894

>>troll

did you forget to say something?


 No.925905>>925908 >>925911 >>925916

>>925899

I don't see any proof for Ada's superiority in the linked thread. Only a LARPer linking to various websites with ada in the URL.

>>925902

?


 No.925908>>925915

>>925905

>?

you quoted some text and added nothing


 No.925911>>925915

File (hide): fda8a9621a5f8e7⋯.png (27.33 KB, 1114x338, 557:169, Screen Shot 2018-06-06 at ….png) (h) (u)


 No.925915


 No.925916>>925917 >>925919

>>925905

It has really nice type safety and support for modular design. For example, if you were handling a temperature sensor, you could declare Celsius and Fahrenheit types as subtypes of natural numbers. If you had a variable of type Celsius and tried to use it in place of a variable of type Fahrenheit, the compiler would detect it. It's a verbose and robust language. The problem with Rust's "safety" model is that it makes it impossible to write modular code that is amenable to change.
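
A rough C++ analogue of that Ada idea, for anyone who hasn't seen it (my own sketch, not Ada syntax; every name is invented for illustration):

#include <cstdio>

// Distinct wrapper types the compiler refuses to mix, loosely mimicking
// Ada's derived numeric types.
struct Celsius    { double value; };
struct Fahrenheit { double value; };

void report(Celsius c) { std::printf("%.1f C\n", c.value); }

int main()
{
    Fahrenheit f{98.6};
    // report(f);            // compile error: Fahrenheit is not Celsius
    report(Celsius{37.0});   // the conversion has to be spelled out
}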


 No.925917>>925920

>>925916

>It has really nice type safety and support for modular design. For example, if you were handling a temperature sensor, you could declare Celsius and Fahrenheit types as subtypes of natural numbers. If you had a variable of type Celsius and tried to use it in place of a variable of type Fahrenheit, the compiler would detect it. It's a verbose and robust language.

That's Rust.

>The problem with Rust's "safety" model is that it makes it impossible to write modular code that is amenable to change.

Proof?


 No.925919

>>925916

>The problem with Rust's "safety" model is that it makes it impossible to write modular code that is amenable to change

A sufficiently advanced bullshit is indistinguishable from bullshit.


 No.925920>>925921 >>925927

>>925917

Borrow checker.


 No.925921>>925923


 No.925923>>925925

>>925921

Good job, you just exposed yourself as a LARPer. Not even surprised, I guess all Rust fags are like this.


 No.925925>>925926

>>925923

>hurr durr you are a LARPer

>says the Ada LARPer

fucking LOL. I asked you multiple times for proof but you can't provide any.


 No.925926>>925928 >>925929

>>925925

The borrow checker rejects perfectly fine programs - the fact that you don't know this means you're just fellating the language without ever using it.


 No.925927

>>925920

Not a problem if you know what you're doing.

As a last resort there's always Arc.


 No.925928>>925930

>>925926

>muh borrow checker

I see. I'm wasting my time here. You can't provide proof for Ada's superiority because you don't know anything about Ada. You are just a LARPer shitposting on /tech/.

Have fun. I'm out.


 No.925929

>>925926

They are convertible to equally fine programs that do the same thing and are accepted by the borrow checker.

Why don't you say that a static type checker rejects perfectly fine programs too? The complaint is just as relevant for ordinary statically typed languages like Java.


 No.925930

>>925928

>I was exposed as LARPer. I'm out of here.

I did give you an example of why Ada is good, you fucking knuckle-dragging ape. If you knew a thing or two, you'd understand that Ada is used in highly critical systems. It was a project of the DoD.


 No.926574

File (hide): cf4b9fbf0cb8d52⋯.png (39.49 KB, 450x600, 3:4, 9front__9noose01.png) (h) (u)

>>919757 (OP)

Bjarne Stroustrup Did Absolutely Nothing Wrong -> http://harmful.cat-v.org/software/c++/I_did_it_for_you_all

Mister Stroustrup is a true patriot. HE is trying to secure our jobs.

Also the C++ stream redirections are cool

 std::cout << "BEAUTIFUL" << std::endl; 



