▶ No.913318>>913327 >>913335 >>913400 >>913557 >>914241 >>914529 >>915609
There are too many programming languages. Back in the day when most computing breakthroughs were made there were four. People invested in computing could focus on what they were doing instead of bumbling to keep themselves up to date on the latest developments to maintain employability.
▶ No.913327>>913557
>>913318 (OP)
Languages aren't enough. We have to have multiple new frameworks per language every month.
▶ No.913329>>913334 >>913337 >>913375 >>913413 >>913557 >>913665 >>913690 >>915639
I think all of these languages that people do use are 15+ years old. Most jobs are for established languages.
Creating new languages is a good way to progress. That's true even if the languages don't end up being used all that much. Barely anybody used ALGOL 68, but it was very influential.
Most computing breakthroughs were made when there were only a few languages because most computing breakthroughs were made early on, and there were only a few languages early on. Remember, correlation is not causation. If A causes B and A causes C then that doesn't mean B causes C.
▶ No.913334>>913343
>>913329
If the early people had had to keep in touch with 20+ languages, do you think they would have had time to come up with even a doubly linked list?
It is a problem.
▶ No.913335>>913337
>>913318 (OP)
So? You don't have to use every programming language. There's no reason not to keep using C in 2018.
▶ No.913337>>913695
▶ No.913343
>>913334
People don't have to be in touch with 20+ languages even now.
▶ No.913375
>>913329
lol shut the fuck up. nobody who jumps to the latest web PL and web framework every 5 minutes matters, nor does anything they make
▶ No.913400>>913424 >>913447 >>914063 >>914566
>>913318 (OP)
You're entirely right. We need to make a new programming language to replace all others.
I'll get started on the logo, you do the rest.
All you really need are C/C++ and some Assembly
▶ No.913408>>913424 >>914091
>Back in the day when most computing breakthroughs were made there were four.
There were over 60 new languages of note developed in the 1970s alone. Stop LARPing, /pol/nigger.
▶ No.913412
Everyone should have their own language and write their own software.
▶ No.913413>>913414 >>913592
>>913329
Correlation is not necessarily causation. If A causes B and B causes C, A must cause C.
▶ No.913414
>>913413
I don't see how that's relevant here in the first place, but it's also not necessarily true. If A directly causes a decrease in C that may balance out the indirect increase via B.
▶ No.913424>>913446
>>913408
Heh.
>>913400
Or use existing ones. Scheme and Fortran. That should do it. The rest can be memoryholed.
▶ No.913446
>>913424
Really, just take everything from every language, reinvent it, and add it to the standard library of a LISP or Scheme-like language.
You can write documents in Scribble in Racket. Do it right, and LaTeX, MD, HTML/CSS/JS, PS all get collapsed into it.
Then make this nulang compilable to machine code and C, C++, Rust, D, and Go collapse into it.
The same fate awaits Python, Ruby, shell, Perl, Java... all will be dissolved and absorbed into ((((The (International (Language)))))).
▶ No.913447>>913477
>>913400
C++ has literally no purpose
▶ No.913477>>913559 >>913563
>>913447
It's C+classes+destructors, which is pretty damn handy.
▶ No.913557>>913626
>>913318 (OP)
There are still basically 4: C++, Java, Python and Javascript. You might need to pick up one or two more for a specific job (e.g. Swift, Ruby, TypeScript) but these are all variants on the above anyway. If you want to explore and expand your mind you can check out Haskell, Coq, Lisp or Rust but these are all strictly optional.
>>913327
This is the much bigger problem. Ironically it's why bloated languages like C++ and Python are better. Whatever the flaws of the language and standard library, once you know them you don't have to keep relearning frameworks.
>>913329
>Creating new languages is a good way to progress. That's true even if the languages don't end up being used all that much.
This is true, e.g. ideas from more esoteric languages like Haskell influenced C#.
>Most computing breakthroughs were made when there were only a few languages because most computing breakthroughs were made early on, and there were only a few languages early on.
Mostly agreed, although for a counterpoint see Peter Thiel's ideas on how innovation has stalled due to academia and other institutions favoring people with a less adventurous (and less optimistic) outlook.
▶ No.913559>>913566
>>913477
>is pretty damn handy
it's pretty much bloat
pure C is the only way
▶ No.913563>>913566
>>913477
oh okay, better get a 3000 page spec that changes every year for that then
▶ No.913566>>913572 >>913676 >>913731 >>913779
>>913559
>>913563
nope. C++ is ugly and bloated but it's better than C for everything except perhaps embedded programming. Some people can't handle the ugliness of C++ but that's something they have to get over.
C++ gives you much stronger guarantees than C. I'm not a language lawyer so I can't say it with the right terminology, but any instance of a class can only exist in a valid state, i.e. between calling the constructor and the destructor. This is a powerful guarantee that helps prevent errors, because you can say "as long as you have an instance of this class, I can promise certain things about its internal state". This allows RAII and smart pointers.
C++ also has templates which allow type-safe generic programming.
C++ does take time to learn, but once you get familiar with it, you don't feel uncomfortable with its complexity. Even though the spec is complex, you have an intuitive feel for what is "correct" in any situation.
▶ No.913572
▶ No.913579>>913676
Language wars are for /tech/ autists only. Real coders just fucking code.
▶ No.913582
Programmers have big egos.
▶ No.913583>>913584 >>913600 >>913685 >>913691
watch this, OP: https://www.youtube.com/watch?v=lvclTCDeIsY
in only minutes you will feel a great weight leave your shoulders, forever. You will no longer react to programming languages with an immediate desire to learn them. I envy you--I only gained this freedom after learning Haskell well enough to absolutely despise it.
▶ No.913584>>913685
>>913583
I got to 2:35. I heard "cis-white men". I stopped.
▶ No.913592>>913595
>>913413
>Correlation is not necessarily causation
JFC this is not an argument. Go find some counter data or a reasoned argument. Correlation absolutely is evidence of causation.
▶ No.913595>>913596 >>913597
>>913592
It depends on the context.
▶ No.913596>>913598
>>913595
the context is that one anon said "here's a graph. older stuff is more important. don't take this too seriously" and then some other anon said something completely senseless in 'reply', and now other(?) anons are sperging out because they interpret the senseless remark as a weak attempt at a rebuttal of the original anon's remarks.
▶ No.913597
>>913595
>It depends on the context.
Fuck you
▶ No.913598>>913601
>>913596
>they interpret the senseless remark as a weak attempt at a rebuttal of the original anon's remarks.
That's what it was, anon
▶ No.913600>>913618
>>913583
> after learning Haskell well enough to absolutely despise it.
Nice try anon. We know this is a lie.
▶ No.913601>>913602 >>915367
>>913598
it can't be, because it doesn't even contradict the original anon's remarks.
remark 1: "Remember, correlation is not causation. If A causes B and A causes C then that doesn't mean B causes C."
remark 2: "Correlation is not necessarily causation. If A causes B and B causes C, A must cause C."
both agree that correlation is not causation. One proposes that two phenomena (B and C) can have a common cause (A) -- i.e., although the correlation of B and C is strong, neither causes the other. The other proposes that earlier causes in a causal chain of phenomena can be said to directly cause the later effects: e.g., if you light a fire, and then the fire burns down a house, then "you burned down the house".
Both remarks are true. The second does not contradict the first. The second is not a rebuttal. It just sounds like one because the second anon is an idiot.
▶ No.913602
>>913601
>It just sounds like one because the second anon is an idiot.
That's called a shitty attempt
▶ No.913618>>913619 >>913731
>>913600
laziness is simply a bad feature, like 'contexts' in Perl. The expressivity you get from monadic operators is entirely offset by how dense and obscure the language becomes in the hands of its community. Typeclasses are OK. If everyone who learned Haskell had learned OCaml first, they'd be a lot impressed with it.
▶ No.913619
>>913618
"lot less".
ML is where 100% of any magical feelings come from, when someone's learning Haskell. "wow, types can be actually useful." / "wow look at me I've spent the last hour designing this program and I haven't put anything but types to paper." / "wow, so you can do this instead of returning null in error". That stuff's real. Then you go on to appreciating lame gimmicks: oh, an infinite list of primes... well I guess this language has that cool (ML) stuff, so maybe this is also really useful...
▶ No.913626
>>913557
C++, Java, Python and Javascript
▶ No.913665
>>913329
lmaoing at all these fucking front-end developers. No wonder we have all that bloat shit.
▶ No.913676>>913688 >>913731
>>913566
C is unsuitable for modern software development.
C++ is based on C but even worse.
>only exist in a valid state,
I think you mean invariants?
Also
>C++ is typesafe
Gno
>>913579
>Tool doesn't matter
>>>/reddit/
▶ No.913685>>913793
>>913584
>>913583
If you want a hearty laugh, I suggest that you watch at least a minute more.
>When working with the language you should assume you are interacting with a woman.
>Unapologetically femme language
>coreteam all woman
few slides later
>Core team is x y and myself
>That thing giving the talk is a woman.
To top it all off, looking at their shithub
>Latest commit 2bbf642 on Jun 13, 2016
Guess they killed themselves.
▶ No.913688
>>913676
> modern software development
Yuck! Just give me an Amiga, Atari, 8-bit computer, maybe even TempleOS. Those are fun, this other shit ain't.
▶ No.913690
>>913329
That list surprises me. I expected Pajeet Hypertext Processor to be way higher on the list.
▶ No.913691
>>913583
>by Sig Cox
This has to be satire.
▶ No.913695
▶ No.913729>>913751 >>913876
Lua is the best high level language for ease of programming. C is the best high performance high level language.
I don't like assembly because for general purpose programs it's slow, both to execute and to write programs in.
▶ No.913731>>913751 >>913752 >>913763
>>913566
>nope. C++ is ugly and bloated but it's better than C for everything except perhaps embedded programming
no.
>C++ gives you much stronger guarantees than C. [...] any instance of a class can only exist in a valid state, i.e. between calling the constructor and the destructor.
fuck off, even in Java (which is about 100x safer than C++) class invariants are impossible to preserve except by the literally nobody who fully understands the language. see for example https://stackoverflow.com/questions/16159203/why-does-this-java-program-terminate-despite-that-apparently-it-shouldnt-and-d
>C++ also has templates which allow types-safe generic programming.
C++ templates are trash
>C++ does take time to learn, but once you get familiar with it, you don't feel uncomfortable with its complexity
the same claim is made about every PL, yet 99.999999999999999% of the users literally don't understand the language. C is one of the biggest offenders here, so adding new features to C isn't helping
>>913618
>If everyone who learned Haskell had learned OCaml first, they'd be a lot impressed with it.
You mean SML, OCaml is just SML+PHP. Typeclasses only have an advantage over modules because the languages are text based.
>>913676
>implying there's a virtuous definition of modern software development that applies to any existing people
▶ No.913751
>>913731
>>implying there's a virtuous definition of modern software development that applies to any existing people
True, I was spouting meme terminology myself.
Better phrased would be
>C, C++ and java are unsuitable for any software development.
The actual state of software proves my point.
>>913729
I liked the Oz syntax, but unfortunately it is slow as fuck and really only suitable as a research/teaching tool.
▶ No.913752>>913759 >>913763
>>913731
>fuck off, even in Java (which is about 100x safer than C++) class invariants are impossible to preserve except for literally nobody who fully understands the language. see for example https://stackoverflow.com/questions/16159203/why-does-this-java-program-terminate-despite-that-apparently-it-shouldnt-and-d
<you can lose your class invariants if your multithreaded code has a race condition, therefore having at least some guarantees about class invariants in a single-threaded context is no better than having none at all
Please don't be retarded.
▶ No.913759>>913769 >>913779
>>913752
>therefore having at least some guarantees about class invariants in a single-threaded context is no better than having none at all
I never said that. I was just giving an example of how flaky class invariants are in even the safest mainstream languages. C++ is 100x worse. If you unironically think you're getting better code just by using classes, your code is probably broken as fuck. Classes in fact provide pretty much no useful form of invariants at all. The one useful thing about them is probably the ability to make an Abstract Data Type, which can already be done in any typed PL ever, including SML or Haskell (and stuff like Ada IIRC)
▶ No.913763>>913769 >>913893 >>913952
>>913752
>>913731
>A sensitive operation in my lab today went completely wrong. An actuator on an electron microscope went over its boundary, and after a chain of events I lost $12 million of equipment. I've narrowed down over 40K lines in the faulty module to this:
>Sensitive operation
>Expensive equipment
>using Java
HAHAHAHAHAHAHAHAHAHA
Using that fucking piece of shit pajeet language with everything is a class garbage for real time machine control.
▶ No.913769>>913790 >>913893 >>913950
>>913759
So, when you look at the answers, you find that he wasn't doing what he needed to be for those invariants to apply: his class was neither immutable, nor the fields marked as volatile.
>>913763
This dumb motherfucker is a gold mine. I can't tell if he's retarded or just trolling. One of his questions is that he can't understand the error message generated by this "java program":
class A {
    static {
        System.out.println("Hello world");
    }
}
It prints out "Hello world" as expected. Can you GUESS the error message and its cause?
Then, there's this q and a:
Question (not his) is how to do this Python in Java
for x in range(2, n):
    if n % x == 0:
        print n, 'equals', x, '*', n/x
        break
else:
    # loop fell through without finding a factor
    print n, 'is a prime number'
His answer:
class A {
    public static void main(String[] args) {
        int n = 13;
        found: {
            for (int x : new int[]{2,3,4,5,6,7,8,9,10,11,12})
                if (n % x == 0) {
                    System.out.println("" + n + " equals " + x + "*" + (n/x));
                    break found;
                }
            System.out.println("" + n + " is a prime number");
        }
    }
}
I actually feel bad knowing that there are people who think that this is okay.
▶ No.913770>>913893
There are too many ice cream flavors. Back in the day when most frozen food breakthroughs were made there were four. Ice cream men could focus on what they were doing instead of bumbling to keep themselves up to date on the latest flavors to maintain employability.
▶ No.913779>>913790 >>913893
>>913759
>I never said that
The comparison >>913566 made was with C. C++ is 100x worse than what? Java? SML or Haskell? That's not relevant, because even if C++ were just C with constructors and destructors, it would still be 100x better than ordinary C when it comes to preserving invariants.
>Classes in fact don't provide any useful form of invariants at all pretty much.
It just so happens that they can, at least in C++. Not having to manually manage resources for every single data structure that isn't static or entirely on the stack is pretty useful. Also, unlike in Java, object lifetime in C++ is completely deterministic.
▶ No.913788
Sepples is impossibly complicated and it doesn't even have real macros. It forces you to write in the object-oriented paradigm in which everything is a class even if it doesn't need to be. 30 years in and they finally added lambda functions, a feature which has existed in LISP since day one. God, what a clusterfuck of a language.
▶ No.913790
>>913769
>Troll
Could very well be, considering pic related is his avatar. And the fact that within the span of a few weeks he asks a basic question about that hello world as well as one where he is presumably debugging machine control software.
Maybe we are being taken on a ruse cruise by /tech/ from 5 years ago.
>>913779
>>913759
>Some talk about ADT's classes and invariants
Indeed, you don't need classes and curly bracket vomit to do something like that. Well, if you are using a sane language.
package SimpleADT is
   type Point is private                        --Keyword here is "private"
      with Type_Invariant => SomeTest (Point);
   --Functions below are just the specifications.
   --Their bodies would be in the package body.
   procedure StepX (P : in out Point);          --make a step or whatever
   function GetX (P : in Point) return Integer; --Read X coordinate
   function SomeTest (P : Point) return Boolean;
private
   --The full type declaration happens in the private part.
   --That Point is actually a record with two fields is unknown to clients;
   --they just interact with it via the functions provided in the public part.
   type Point is record
      X : Integer := 0;
      Y : Integer := 1;
   end record;
   --The expression function below gets checked at relevant points during runtime.
   function SomeTest (P : Point) return Boolean is
      (P.Y - P.X = 1);
end SimpleADT;
▶ No.913793
▶ No.913876
>>913729
Tough noogies, it keeps out the riff-raff. Plus, you're not constantly second-guessing a shit cumpiler for a shit, halfass standard designed to create the security disaster that we're in today. Fuck C and the cianigger it rode in on.
▶ No.913893>>913938
>>913763
>hurr durr machine control it has to be hard real-time
>>913769
as I said, my point is that people don't know how their PL works, regardless of how "simple" the meme community believes them to be. For example every idiot claims C is simple, yet most C code is garbage and the writer is obviously oblivious of one or more of {the machine, the spec, the implementation}
>>913770
meanwhile in the software world, there's no good software and changing PL every 5 minutes (and guaranteeing everyone in your company has no understanding of the code) isn't helping
>>913779
>C++ is 100x worse than what?
100x worse than C
>malloc
>deterministic
gno
let's be honest: most people only use C++ for games because it's what's used and they believe something bad will happen if they use a simpler language like C (which isn't a simple language, but at least is simpler than C++). then you have about 3 PL nerds who use C++ because they're too virgin to understand that the benefits C++ provides are all moot because of the language's complexity (again, even C was too complex in the first place, but PL virgins don't understand this because they never got fucked by the edge cases yet)
▶ No.913938
>>913893
>100x worse than C
Wrong.
>malloc
Why the fuck are you even bringing up malloc? Constructors and destructors have nothing to do with heap allocation. "Deterministic" here means that you can always know at which point in the program deallocation will occur, unlike with most (all?) garbage-collected languages.
>they believe something bad will happen if they use a simpler language
Or maybe they like the encapsulation provided by namespaces and classes, and don't want to reinvent the wheel in C every time they need some form of dynamic polymorphism? Maybe they want lambdas? Or perhaps they like having generic containers and algorithms, along with proper strings?
<you can encounter some edge cases in this complex language, therefore all benefits are moot :^)
ebin
▶ No.913950
>>913769
>troll
It's impossible to troll on Shit Overflow. Unless you make it really over the top, you'll get buried by a mountain of pajeetposting and retards will believe you're being serious.
▶ No.913952
>>913763
>everything is a class garbage
>class
Do you mean object?
>Java
But Java has primitives. Were you referring to Python?
>shitty code can't be written in good languages
Really?
▶ No.913958>>913964
I've been trying to compile a list of languages that aren't cancer.
Since programming is only a hobby atm I guess I get to do this.
So far the list is:
Forth, Lisp (mostly Scheme nowadays), SML, and Prolog
Everything else is basically just cancerous.
▶ No.913964>>913967
>>913958
>Forth
is dead. It's literally a few autists with their own personal Forth-like languages. Apart from them, the only community you'll find are non-coding LARPers, faker than the most faking-it-till-he-makes-it anon on /tech/ (because they don't even have the goal of one day 'making it')
>Lisp
isn't a language. 'Scheme' is barely even a language. Common Lisp is pretty good though.
>SML
if you typo a type constructor in a pattern-match, you'll bind a variable instead. OCaml fixes this with case-sensitive type constructors. ATS fixes this by requiring that type constructors in pattern matches look like function calls.
OCaml's a little bit weird, vs. SML, but it's not any more cancerous and you'll get more done with it.
>Prolog
oh come on. Prolog is so dead that more Prolog is written in Prolog-like DSLs inside other languages, than in Prologs themselves. Is that the joke? That only in death do we escape cancer?
▶ No.913967>>914159
>>913964
Scheme isn't dead yet. It is interesting that some of the most expressive languages (Forth and Lisp) are essentially dead.
Prolog seems fun just for the mind bending experience of logic programming.
I cannot defend SML.
▶ No.914063>>914066 >>914068 >>914539 >>915127 >>915143 >>915222
>>913400
You don't need to.
C/D/C++/Rust/Ada (pick one)
Java/C#/Go/Swift (pick one)
Python/Ruby/Perl (pick one)
Javascript/PHP (pick one)
Now mix them together
▶ No.914066>>914539
>>914063
>C/D/C++/Rust/Ada (pick one)
C++
>Java/C#/Go/Swift (pick one)
Go
>Python/Ruby/Perl (pick one)
Fuck, none of them
>Javascript/PHP (pick one)
If I have to, javascript
▶ No.914068>>914613
>>914063
a reduced version of Lua which can be compiled into machine code is king, although I don't know whether it's easy to implement the tables that way.
▶ No.914069>>914084 >>914088
C. Pascal. Scheme. Go. Erlang.
What else could you possibly want?
▶ No.914084
>>914069
Are you kidding me? C AND Pascal? Go AND Erlang? Pick one of each.
▶ No.914088
▶ No.914090>>914096
>Lua
>Arrays start at 1
might as well use matlab
▶ No.914091>>914137
>>913408
>everything I don't like is /pol/
>>>/bog/
▶ No.914096>>915222
>>914090
that's one of the only bad aspects of Lua. I barely notice it though, it's only ever been annoying when using command-line arguments and the like (because those arrays start at 0).
▶ No.914137
▶ No.914159
>>913967
Forth isn't dead, since it's used in embedded systems. You could even use it for desktop and server shit if you want to, but there the industry trends are pic related, which is why today you have bloat and botnet galore. And it will get worse as white men are pushed out of IT for political reasons. Fun times ahead!
▶ No.914169
▶ No.914240
the reason why there are so many languages is because of proprietary languages and web design nonsense. girls want a language that fits their small minds, so people make a language for them, but then there are other pajeets or girls with different degrees of brain retardation, so you have to make a language for them also
▶ No.914241>>914306 >>914314 >>914419
>>913318 (OP)
>There are too many programming languages. Back in the day when most computing breakthroughs were made there were four.
Back in the day, there were hundreds of programming languages. There's a paper from 1966 called "The next 700 programming languages" because there were already around 700 in existence at the time. Only a few of them were important, but that doesn't mean people weren't making them. A lot of them were (what we would call) scripting languages for single applications.
>People invested in computing could focus on what they were doing instead of bumbling to keep themselves up to date on latest developments to maintain employability.
That was because computer companies cared about compatibility. The calling conventions supported higher level data structures. Multics and VMS have data descriptors which identify the type of data so you don't have to care about the language as long as the data in memory matches what the descriptor says. The Lisp machines extended it to OOP, closures, GC, and dynamic typing. On Lisp machines, the same data types are shared for every program and the hardware has tags so it knows the type of everything. The same GC is shared for all programs running on the computer. UNIX weenies only care about C, so everyone has to use a C FFI even when they don't want to use C. C++ is from the same company but it's still as much of a second-class citizen on UNIX as it was in 1990. This means fewer common language features are compatible today than they were in the 60s. The only way to share a string is by converting it to a null-terminated string or keep a null on the end, which can cause serious bugs on the C side when the string contains a null somewhere else, and C can't handle garbage collection.
Of course the most amazing thing about function
prototypes that everyone overlooks is that a DECENT
environment, with a decent compiler and linker, would be
able to check all that stuff at link time! There's nothing
wrong with C as it was originally designed, it's just that
nobody thought to have link-type parameter checking built
into the compilation environment. Rather than someone
putting all that work into the linker (what, a few months?)
we have a language standard that forces every user to
perform the job of link editor in their head. Great. And
it's becoming the lingua franca of programming in the 90s.
It's grimly amusing to note that UNIX, the cradle of C as it
were, lags hopelessly behind DOS in the quality of C
programming environments, with the notable exception of
Saber-C. Saber-C is another unusual case, however, since it
gives you a NICE environment, with compile and link error
checking, runtime error checking, and so on, and when you
want to actually generate an executable....? You're on your
own and you're back to using the outdated piece of shit
compiler the vendor gives you with your system - a
descendant of the original 'pcc' or some similar
abomination, hacked on by generations of grad students on
their way to guruhood. Or you can use gcc, if you have the
time, the MIPS, and the disk space, and can tolerate its
little foibles.
The crowning glory of it all is C++, the birthplace
of function prototypes. Function prototypes must have been
implemented the way they were because it was easier for the
compiler writer to get all the parameters before starting to
parse the function, rather than having to figure them out in
the ensuing lines. When a language is bent, twisted, and
malformed to save the compiler writer a bit of effort,
you've got to wonder why the guy's writing compilers in the
first place. Now, C++ could really use a decent compilation
environment, with link time name resolution. But
NnnnnoooooOOOOOOO, rather than "do it right" some late night
caffeine and taco fever nightmare scheme was concocted
whereby symbols get transformed into DES-encrypted dog vomit
and back.
C++ is heading for standardization. All of this
happened because nobody wanted to put a few hooks for type
checking in their linker and object file format. I think
there's a lesson in all this, but I'm damned if I can see
it.
*The story of the programmer who wrote all his function
prototypes to accept (char *) and cast everything to (char
*) before calling any function is pure fiction. Really.
▶ No.914306>>914351
>>914241
>large uncited block quotes
fuck off
▶ No.914351>>914355
>>914306
Its from UnixHaters newfag.
He posts this shit in every other thread.
▶ No.914355
>>914351
>Its from UnixHaters newfag.
And I bitch about it in every thread.
▶ No.914419
>>914241
do you actually think anyone reads the text you quote that you paste in every thread about programming languages?
▶ No.914529
>>913318 (OP)
Engineering is all about finding the objectively best way to do a specific task. The “opinion” fags are what ruined it.
▶ No.914539>>914556
>>914063
>>914066
>Fuck, none of them
You have to
>If I have to, javascript
How about I add Typescript and Coffeescript to the mix?
▶ No.914556>>915222
>>914539
Coffeescript is fucking dead and Typescript is nothing but a temporary transition before actual Javascript implements types.
▶ No.914566>>915222
"There are only two kinds of languages: the ones people complain about and the ones nobody uses." -- Bjarne Stroustrup
>>913400
I happen to agree with your choice of languages, but it could have been another language family above assembly as well. When I was learning C, a lot of people did their programming in Pascal. The object-oriented paradigm became popular and both languages got an object-oriented extension. I have more complaints about C++ than I do about C, but I think that is unavoidable when you design languages to mimic human thinking instead of computer logic. I particularly dislike exceptions. In any case I stick with ANSI for both, because that whole ever-evolving language circus and the maintenance it requires is not for me. I want programming to be about the program, and the language to be a tool. I like languages like Lua and Tcl as scripting extensions to programs, and Perl is my preferred text-processing beast. A POSIX-compliant shell on top of that, and I rarely touch other languages unless it is to patch existing code. Oh, I forgot.. I do use PHP too, not that I particularly enjoy it, but that's not as much an objection to the language as it is to the use case.
My biggest problem is how languages are not stable. I think this is a place where it would be really useful to stick with "do one thing and do it well". I don't want to fight with libraries over which version of a programming language we should be using, and I'm glad there are a lot of C and C++ people who agree and appreciate the ANSI standards. We have a serious dependency hell on Linux, much worse than what we made fun of on Windows, because of this. Can we please stop this incompatibility madness, and then I don't care whatever language you use. Oh thank you, I need both Python 2 and 3 for this project, and thanks again for the upgrade to C++11, it really helped my system become a big mess.
▶ No.914613
>>914068
Terra is exactly this.
▶ No.915127>>915133 >>915144
>>914063
This is a good guide to deciding what to learn. Still, that is 4 languages to keep tabs on. Like learned natural languages, their conventions will keep bouncing around in your head, making you a worse coder in all of them. Now step into a corporate environment where you may need to learn a new language every now and then as you are transferred between teams and projects.
I'm absolutely sure the reason why so many software projects fail (70%) is this pointless jumping around and cognitive burden.
▶ No.915133>>915222
>>915127
> Like learned natural languages, their conventions will keep bouncing around in your head, making you a worse coder in all of them
... what?
Native bilingual faggot here: I excel in one of the languages and "suck" at the other one. I put suck in quotation marks because I actually am fairly fluent in it; it's just that fully context-switching to it often takes whole minutes, since I am not used to speaking it, so I only speak it if strictly necessary, to avoid making a fool of myself. I also happen to be fairly fluent in written English because I deal with it every day, but I suck at spoken English because I could count the times I've had the chance to use it.
Likewise, I deal with several programming languages, at work and at home, and also try to exploit the exclusive properties of each one. I have been working with these languages day after day for the past year, and by now switching between them is fairly cheap. Maybe not free, but I can assure you that once the switch is done there are absolutely no downsides, because as they say, knowledge takes up no space, and practice makes perfect.
>I'm absolutely sure the reason why so many software projects fail (70%) is because of this pointless jumping around and cognitive burden.
This may be true, but that's because context switching kills productivity, and also because learning is slow and mastering even slower. You also pay context switching costs when moving between different files written in the same language, btw.
▶ No.915143
>>914063
It's a good idea to focus on the languages at the top of anon's stack. C and Ada ftw! Java, Python, and JavaScript are all very easy (and have their place), but the job market for those skills is easily saturated. Also don't overlook VHDL/SystemVerilog; hardware design and electrical engineering are both still relatively uncrowded fields where competence is more likely to be rewarded.
▶ No.915144>>915223
>>915127
>conventions
>cognitive burden
lol they are almost the exact same language. His list is shit you can easily achieve.
A proper aim would be to learn a language from each family.
▶ No.915222>>915236 >>915551
>>914063
there are only 2 categories (C, Ada, etc; everything else), not 4. "scripting" is a meme. "web" languages like JS or PHP aren't even a real thing, those are just broken as fuck shit
>>914556
>coffeescript
lol a bunch of regex and macros as a compiler (and the spec is just whatever the compiler produces)
>>914096
lua has only ~1 bad aspect. lol
>We have a serious dependency hell on Linux much worse than what we made fun of on Windows, because of this
>>914566
Just because you don't understand how Windows has dependency hell doesn't mean it's not there. For example, ASLR was impossible to implement on Windows for at least 15 years. I don't care if it's been fixed since the last time I checked; that's still 15 years to get some basic thing implemented.
>>915133
>i understand all my PLs
no, you don't even understand one of them.
▶ No.915223
>>915144
>learn a "web language" like PHP as well as a "scripting language" like Perl.
they're literally the same thing
▶ No.915236>>915240
>>915222
>no, you don't even understand one of them.
I am sure you could recite the spec of your language (it's C, right? or at least that's what you'd like to claim if you actually programmed) to a T, along with every single bug of every single compiler ever produced for it.
▶ No.915240
>>915236
no, i don't understand any PL either. that's the problem. they're all bloat, full of monkeys amending the spec and the implementation. I understand the one I wrote, though, since a necessary basis for a secure OS is being able to understand every detail of the PL.
▶ No.915367
>>913601
>it doesn't even contradict
Yes anon, it's a failure of a response, I agree.
▶ No.915551
>>915222
>you're out with your work laptop being productive
>8pm, Saturday night, local bar
>keep the White Russians coming, please.
>suddenly: "omg anon shit's fucked"
>an entire platform is down
>you start logging into stuff
>suddenly these servers have not enough memory?
>no, the same program is the problem on each of these servers
>wtf is this thing? is it important? what exactly is it doing?
<if scripting is a meme, proceed to BAD END #22: "put a ticket in with the vendor, get a response on Monday, replace the website with a we're-so-sorry notice in the meantime, maybe then get fired for being only as useful as an intern"
<if scripting is not a meme, continue:
>strace process. it's continuously receiving SQL responses
>open script. find only three database queries.
>lol whoever wrote this has no idea how big we are
>confirm process is OK to interrupt
>patch script, kill process, restart script. it works fine now
>mass out process kills and the fix to the affected platform
>get back to drinking
>"wow, good thing we have people like anon around!"
>next paycheck has a bonus
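A hypothetical sketch of the kind of patch the greentext describes: a script whose query materializes every row at once works fine on a small test box but flattens production servers once the table grows, and the fix is to fetch in bounded batches. Everything here is invented for illustration — `sqlite3` stands in for the real database, and the table, column, and function names are made up.

```python
import sqlite3

# Stand-in database with a "big" table (real incident would be a remote SQL server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events (payload) VALUES (?)", [("x",)] * 10_000)

def process(row):
    pass  # placeholder for whatever per-row work the script does

# Before the patch: fetchall() loads the entire result set into memory at once.
# rows = conn.execute("SELECT id, payload FROM events").fetchall()

# After the patch: iterate in fixed-size batches so memory stays bounded
# no matter how large the table has grown.
cur = conn.execute("SELECT id, payload FROM events")
seen = 0
while True:
    batch = cur.fetchmany(500)
    if not batch:
        break
    for row in batch:
        process(row)
    seen += len(batch)

print(seen)  # prints 10000: every row processed, never all in memory at once
```

The point of the story stands either way: being able to open and patch the script on the spot is what "scripting" buys you.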
▶ No.915609
>>913318 (OP)
What's that image?
▶ No.915637>>915693 >>915744
I really, really wish it were possible to measure how many man-hours involved in pic related were wasted by redundancy or just plain lack of users, compared to total man-hours.
▶ No.915639>>915641
>>913329
>C in the last spot
>js in the top spot
/tech/ eternally btfo
▶ No.915641>>915643 >>915644
>>915639
>js is great
>c sucks
>take that, tech
Did you know that suicides in the IT sector are at an all time high?
Especially among indians and chinese?
▶ No.915643
>>915641
Did not know. Can you back that up with some sources?
▶ No.915644
>>915641
I think suicides, period, are at an all time high.
▶ No.915693>>915702
>>915637
this could have been solved with a planned economy
▶ No.915744>>915889 >>915896 >>915916 >>916036
>>915637
Instead of measuring, I'd rather just start from scratch and write a new OS. Logo suggestions?
▶ No.915889
>>915744
How about a cock?
▶ No.915896>>916036
>>915744
How about just getting a copy of TempleOS and then adding internet capabilities to it?
▶ No.915916>>916036
>>915744
In order to make an operating system from scratch, you must first create the universe.
▶ No.916036
>>915744
>>915916
>>915896
Dumpster-fire x86 and start making operating systems for low power, non-botnet microcontrollers. Orders of magnitude easier to do, and you can shed the cancer that is modern software ecosystems in favor of a totally new environment. There is no need, for example, for general purpose "web browsers". 8ch would have a dedicated application, potentially even dedicated hardware if you were so inclined. Plug-and-play.