[–]▶ No.941595>>941643 >>941645 >>941690 >>941743 >>941857 >>941864 >>942027 >>942122 >>943519 [Watch Thread][Show All Posts]
WE FUCKING DID IT!
WE FUCKING WON!!!!
PYTHON BTFO!!!!!!!!!!!!!!!!!! FOREVER!!!!!!!!
Now that PEP 572 is done, I don't ever want to have to fight so hard for a
PEP and find that so many people despise my decisions.
I would like to remove myself entirely from the decision process. I'll
still be there for a while as an ordinary core dev, and I'll still be
available to mentor people -- possibly more available. But I'm basically
giving myself a permanent vacation from being BDFL, and you all will be on
your own.
After all that's eventually going to happen regardless -- there's still
that bus lurking around the corner, and I'm not getting younger... (I'll
spare you the list of medical issues.)
I am not going to appoint a successor.
So what are you all going to do? Create a democracy? Anarchy? A
dictatorship? A federation?
I'm not worried about the day to day decisions in the issue tracker or on
GitHub. Very rarely I get asked for an opinion, and usually it's not
actually important. So this can just be dealt with as it has always been.
The decisions that most matter are probably
- How are PEPs decided
- How are new core devs inducted
We may be able to write up processes for these things as PEPs (maybe those
PEPs will form a kind of constitution). But here's the catch. I'm going to
try and let you all (the current committers) figure it out for yourselves.
Note that there's still the CoC -- if you don't like that document your
only option might be to leave this group voluntarily. Perhaps there are
issues to decide like when should someone be kicked out (this could be
banning people from python-dev or python-ideas too, since those are also
covered by the CoC).
Finally. A reminder that the archives of this list are public (
https://mail.python.org/pipermail/python-committers/) although membership
is closed (limited to core devs).
I'll still be here, but I'm trying to let you all figure something out for
yourselves. I'm tired, and need a very long break.
▶ No.941597>>941599 >>941602 >>941864 >>942325
Python's become a mess of feature creep lately.
▶ No.941598>>941619
Second largest paragraph is his defense of a CoC, good riddance. The fact that he didn't choose a strong womyn of color to be his successor speaks volumes about Trump's America; python is clearly deeply rooted in a toxic white supremacist masculinity. Hopefully a intersectional solution can be provided, which is why the new core devs must all be LGBT, preferably T minorities. Let's move forward together.
▶ No.941599>>941614
>>941597
If even one woman or person of color is helped by what you call "feature creep", then it is worth it. How dare you, as a white male try to remove features that make it easier for historically marginalized people to get ahead. Shame on you.
▶ No.941602>>941617 >>941690
>>941597
Python was never good.
▶ No.941605
▶ No.941606>>941612 >>942325
Good.
Time for the Perl6 comeback.
▶ No.941612
>>941606
Perl 6 sucks shit. We need Perl 5.
▶ No.941614>>941619 >>941637 >>943484
>>941599
It's funny cus I'm literally a sandnigger
▶ No.941617
HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA
>>941602
this
▶ No.941619>>942197
>>941598
Pardon me, it is actually the largest.
>>941614
Some of you guys are alright as long as you stay in your own countries and remove the Jew.
▶ No.941637>>941939
>>941614
Minority = Not being a white cis male + having correct opinions. You have the wrong opinions and are thus not a minority.
▶ No.941643
>>941595 (OP)
tl;dr faggot
▶ No.941645
▶ No.941657
I nominate Brian Fagioli to take his place
▶ No.941678>>941683 >>941748
I'm sorry, but who's that person? What does he have to do with Python? He stepped down from what?
▶ No.941680>>941710 >>942332
PEP 572 should never have been accepted. For a language that strives for readability, why would you add a feature that obfuscates assignments? Everywhere else assignment is done in the form variable = expression, but now we also have to watch out for assignments being done literally anywhere? How did they even think this would be readable?
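For anyone catching up, a sketch of what PEP 572's assignment expressions (the := "walrus" operator, accepted for Python 3.8) look like:

```python
import re

# Pre-572 style: assignment is its own statement, then you test it.
match = re.match(r"(\d+)", "42nd")
if match:
    print(match.group(1))  # 42

# PEP 572 style: the walrus operator assigns inside the condition.
if (m := re.match(r"(\d+)", "42nd")) is not None:
    print(m.group(1))  # 42

# It can also hide inside comprehensions:
values = [y for x in (1, 2, 3) if (y := x * 10) > 10]
print(values)  # [20, 30]
```

This is exactly the complaint above: the binding of `m` and `y` happens mid-expression instead of on its own line.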
▶ No.941683>>941687 >>943781
>>941678
Guido created Python, then oversaw several stupid decisions over its lifetime, such as splintering the language into two mutually incompatible versions that are both still in use a decade later, the global interpreter lock which has permanently gimped Python's parallelism, and PEP 572, which yet again introduces an incompatibility with previous versions of the language. He's now stepping down as leader without appointing a replacement, while inviting the blue hair brigade to splinter the community as a final "fuck you" to everyone.
▶ No.941687>>941705 >>941868
>>941683
Hopefully this will kill python once and for all and we can be free from one more entryist POS language
▶ No.941690>>941700
>>941602
horse shit tbh
>>941595 (OP)
what's the link to this specific message?
what's the context? (what exactly did "we" do/win?)
▶ No.941700>>941701
>>941690
>what's the link to this specific message?
Read the thread
https://mail.python.org/pipermail/python-committers/2018-July/005664.html
>what's the context?
See previous messages in the mailing list
>(what exactly did "we" do/win?)
"Won" in the sense that the guy that annoys anons (aka Guido) is gone.
▶ No.941701>>941713
>>941700
>the guy that annoys anons (aka Guido) is gone
how exactly is this a good thing?
there's nothing wrong with being annoying
>other text
I found that link in the meantime but thx.
▶ No.941705>>941934
>>941687
Are you the kind of retard who hated BASIC when it was the thing?
▶ No.941707>>941711 >>941722 >>942325
Is this actually good? If Python dies, doesn't that mean everything will be written in node.js now (if it wasn't already)?
▶ No.941710>>942325
>>941680
He's got C envy, I think. It makes no sense to do that in Python rather than just call a function twice: it trades simplicity for performance in a language that already abandoned performance.
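To illustrate the trade being described, a sketch (the `expensive` function here is a hypothetical stand-in for a costly call):

```python
def expensive(x):
    # Hypothetical stand-in for a call you'd rather not repeat.
    return x * x

# Pre-572: either call twice inside the comprehension...
doubled_calls = [expensive(x) for x in range(5) if expensive(x) > 4]

# ...or restructure into a plain loop with an ordinary assignment:
looped = []
for x in range(5):
    v = expensive(x)
    if v > 4:
        looped.append(v)

# PEP 572: one call per element, but the assignment is mid-expression.
walrus = [v for x in range(5) if (v := expensive(x)) > 4]

assert doubled_calls == looped == walrus  # [9, 16]
```

So the feature buys back the duplicated call at the cost of the readability argument made above.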
▶ No.941711>>941712 >>941715 >>941720
>>941707
Node.js is only for web pages. (I'm not sure; never used it and never will.)
▶ No.941712>>941719
>>941711
Wrong, you can use it for anything
▶ No.941713>>941722
>>941701
>how exactly is this a good thing?
Ask OP or ask whoever cares.
>I found that link in the meantime
Next time find it beforehand, sweetie.
▶ No.941715>>941721
>>941711
node.js literally does not work in browsers. It's for servers and scripting, not webpages. Javascript is for browsers.
▶ No.941719>>941725 >>941733
>>941712
And how do you put node.js into a vidya game?
Not that I would want to pozz my vidya
Also there are slower alternatives like Lua, and if you want to go faster you can just use C (for the pros) or C# (for the faggots).
▶ No.941720>>941724
>>941711
node.js sadly gets used for more than webshit. If you have Adobe 'Creative Cloud' (subscription DRM Jew they forced on their ecosystem) you'll find it running 24/7 on your desktop doing god knows what with tons of ram.
▶ No.941721>>941723
>>941715
>servers
I thought it runs serverside and builds the page.
▶ No.941722
>>941713
>Next time find it beforehand, sweetie.
who do you call sweetie homo?!
stop with that bullshit or else!
>>941707
probably in Ruby or Lua or Dart or… js.
half of these are shit… fucking russian roulette.
but maybe someone reasonable will fork python and take care of it.
▶ No.941723>>941726 >>942737
>>941721
That's not a core function of node.js. That's just something that people have built using node.js. Either way it's still not run in the browser.
▶ No.941724
>>941720
Holy shit that's awful. KILL THEM ALL REEEEEEEEEEE
▶ No.941725
>>941719
you can't pozz what's already 99.9…% pozz.
▶ No.941726>>941728
>>941723
>it's still not run in the browser
I never claimed or believed that. I'm sorry you misinterpreted my words.
▶ No.941728>>941731 >>941732
>>941726
No, there was no misinterpretation, only you being wrong. This is like saying Java is for webpages because of jsp.
▶ No.941731
>>941728
Okay. I accept that I phrased it wrong. When I said web pages I meant web servers which host websites. English is not my native language and sometimes I phrase things in a weird way.
▶ No.941732
>>941728
And I have no idea about nodejs, but I remember a thread about it where NASA hired pajeets who built a website with it, and the pajeets were like "NodeJS is going to space! This proves JavaScript is actually good!"
▶ No.941733
▶ No.941743>>941870
>>941595 (OP)
The writing is on the wall, Python's days are numbered as Julia creeps towards 1.0.
Guido is wise to quit while Python is still king. It's all downhill from here.
▶ No.941748
>>941678
It's okay that you don't know who he is, but it's not okay that you decided to open your cock holster before finding out. Learn to use a search engine, you useless motherfucker.
▶ No.941864>>949099
>>941595 (OP)
>WE
Did this place become reddit while I was absent?
>as I feared
He should have said thank you and read it instead of this.
>>941597
>lately
Python is good software for training newbies in simple logic imo. Problem is that it was so massively used (to the point of replacing lecture 6.001 at MIT) that it became a shitfest of code and whatnot; the simplest example is how much malware was discovered this year in its packages.
▶ No.941865>>941866 >>941873 >>941891
What are people going to migrate to from Python?
▶ No.941866
>>941865
Another meme language
▶ No.941868>>942347
>>941687
python was good for quick scripts, and its libraries for everything have superseded perl over the last decade, but i know what you mean regarding the larpers who learn basic python and suddenly are software devs. the blue hair brigade will kill the language now that it's anarchy. realistically, what options do we have for simple lightweight scripting languages? lisp of course, but it's lacking in libraries. maybe move to nim, which as a bonus also rejects cocs.
▶ No.941870>>942029
>>941743
matlab is shit and so julia's syntax is shit too. i will stick with c++, lisp, erlang, and nim.
▶ No.941873>>941874 >>941943 >>942325 >>942658
>>941865
There's no competitor, yet. Go was supposed to bridge the gap between shitlangs and goodlangs but it ended up so trash that it's struggling to replace Ruby. Rust shit the bed so thoroughly that even C++ programmers say it's overly complicated with slow compile times. And Perl 6 is everything that was wrong with Perl 5.
The rest are memes: lisp, erlang, nim, julia, haskell, scheme, brainfuck, D
▶ No.941874
>>941873
>The rest are memes
that means those are the good languages
▶ No.941891
>>941865
Javascript probably, and you always know a moron when they call it node.
▶ No.941934>>941941 >>941944
>>941705
BASIC was actually a good language, much closer to the hardware than any of the modern shit languages that want to abstract everything into OO or other lame constructs. Also, PEEK and POKE, motherfucker! That's your gateway to inline machine code.
▶ No.941939
>>941637
This but unironically.
▶ No.941941>>941965 >>942114
>>941934
BASIC was a million miles away from the hardware. The existence of a primitive "unsafe" to escape the language does not make it a low level language.
▶ No.941943
>>941873
>The rest are memes: lisp, erlang, nim, julia, haskell, scheme, brainfuck, D
Actually I worked full time in Erlang for good money a while back. And I've seen unironic job postings for D and Haskell, too.
▶ No.941944>>942325
>>941934
Ctypes & FFI motherfucker.
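A minimal sketch of what's meant, assuming a POSIX system where ctypes can locate libc:

```python
import ctypes
import ctypes.util

# Load the C standard library. The lookup below assumes a POSIX
# system; on other platforms the library name/path differs.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare abs()'s signature, then call straight into libc.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int
print(libc.abs(-7))  # 7
```

Not PEEK/POKE, but it's the stock-Python route to calling arbitrary native code.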
▶ No.941957>>942072 >>942325
>nim
Never heard of this one but it looks promising.
▶ No.941963
I would like Guile to gain more traction, it is the canonical GNU extension language after all, but I fear Andy Wingo is an even bigger cuck than Guido.
▶ No.941965>>942013
>>941941
No you nigger, it gave you direct access to the entire memory and I/O ports. That's miles away from modern nanny OS and their related languages that try very hard to hide the hardware from you, because only Intel and cianiggers are allowed full access on the botnet.
▶ No.942013>>942548
>>941965
How is giving more people unsafe access good?
Do you want every run of the mill program to be able to re-flash your BIOS or something?
▶ No.942027
>>941595 (OP)
The problem I have with this is that a copy of SICP was wasted on someone who was completely hopeless.
▶ No.942029
>>941870
If you like the taste of shit then I'm not going to stop you from eating it, but I won't allow you to contribute to any meal I'm making.
▶ No.942030>>942035 >>942043 >>942058 >>942068 >>942072 >>942101
Welp, I did a small backend in Python with minimal import as an exercise. Will have to replace it but I'm not sure with what.
▶ No.942035
▶ No.942043
▶ No.942058>>942080
>>942030
>Will have to replace it
Why?
▶ No.942068>>942080
>>942030
>rewriting your shit because of anonymous nodev opinions
If you're convinced it's appropriate for your application you're fine.
▶ No.942072>>942080 >>942101
▶ No.942080
>>942058
>>942068
Well it's working, but I need to leech some data somewhere and import it.
For now I set it aside and I'm on another personal project.
>>942072
Thanks, I'll put that in the infinite to-do list
▶ No.942101>>942111
>>942030
Julia.
Don't listen to >>942072. Nim is a one-man spaghettifest that's slow as hell even though it compiles down to C, thanks to a poor, throw-everything-in-without-thinking design that makes its compiler output hard to reason about.
▶ No.942112>>942114
>>942111
Fuck, forgot another link.
> http://www.zverovich.net/2016/05/13/giving-up-on-julia.html#performance
< For example, a trivial hello world program in Julia runs ~27x slower than Python’s version and ~187x slower than the one in C.
▶ No.942114>>942325
>>941941
BASIC is more low level than C.
>>942112
The author is a retard. That benchmark is garbage as it includes the time it takes for the runtime to start (this applies to even C which has to run some initial code). What it really should be doing is measuring the time from inside the program itself and also looping over what you are benchmarking many many times.
Additionally, I'm skeptical about the claims he is making under the libraries section. In C it's simply calling snprintf, but for the other language he counts both the call and "snprintf" itself. He also doesn't give any evidence that the same conversion code isn't reused if you use it somewhere else in the codebase.
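That in-program measurement is exactly what Python's own timeit module does; a quick sketch:

```python
import timeit

# Benchmark the statement itself, repeated many times, so interpreter
# startup is excluded; take the best of several repeats to reduce noise.
stmt = "''.join(str(n) for n in range(100))"
times = timeit.repeat(stmt, number=10_000, repeat=5)
print(f"best of 5 repeats: {min(times):.4f}s per 10,000 runs")
```

Taking the minimum rather than the mean is the usual convention, since the fastest run is the one least disturbed by the rest of the system.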
▶ No.942122>>942143
>>941595 (OP)
lol at all this retarded e-drama surrounding some piece of shit PL. I just started two new projects that are compromising by using an existing piece of shit PL instead of writing my own, and was going to go with Python, but then when I started having to subclass shit to call a function I noped the fuck out and went with SML. Fuck your stupid PLs that have all kinds of complex dynamic semantics because someone in the programming memester community came up with a "problem" that "needs to be solved" last week. Something like SML or C (if C wasn't a complex piece of shit in practice due to loose specs) in contrast is how PLs should be. They just do what you say. Once in a while you have to say 2 or 3 things instead of 1 thing, but big fucking deal, at least you know how the PL works. I'd even put Go in this list even though half of its mechanics are retarded (and it's a product of masturbation by programming memesters).
▶ No.942143>>942660
>>942122
>then when I started having to subclass shit to call a function
What was it?
▶ No.942197
>>941619
how about you remove the jew in your country
▶ No.942325>>942357
>>941597
>feature creep
It should be well-rounded for Django, SciPy, Machine Learning and cryptocurrency applications.
>>941606
Perl 6 is too damn slow; until anaconda helps them it is worthless.
>>941710
Python should always be faster, just like Perl 6 except they suck
>>941707
Ruby, JS, PHP, Typescript, Kotlin (scripting) vs Lua, Go, Rust, D, Ada, Nim (compiled)
Anyway the world just got worse
>>941873
Wew
>>941944 and >>941957
Make it happen then
>>942114
Exactly
▶ No.942331>>942714
>Julia
It's a MATLAB replacement, not a good general-purpose programming language.
▶ No.942332
>>941680
There are a number of other existing assignment syntaxes.
import ..., from ... import ..., import ... as ..., from ... import ... as ...
with ... as ...
for ... in ... (both for loops and generator expressions!)
except ... as ...
def ...(...)
I'm probably forgetting some.
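Spelled out, a sketch of how each of those forms binds a name without a bare `=` (StringIO stands in for a real file here):

```python
from io import StringIO

import math as m                      # import ... as ...
from os import path as p             # from ... import ... as ...

with StringIO("hello\n") as f:        # with ... as ...
    first = f.readline()

squares = [n * n for n in range(3)]   # for ... in ... (comprehension)

try:
    raise ValueError("boom")
except ValueError as e:               # except ... as ...
    msg = str(e)

def double(n):                        # def binds 'double' and parameter 'n'
    return 2 * n
```

None of these require `variable = expression`, which is the point: assignment already happens in several syntactic positions.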
▶ No.942347>>942362
▶ No.942354>>942355 >>942365 >>942740
It is depressing how far Python has fallen since the mid 2000s. I still use it on a daily basis, and am slightly embarrassed about it. The only languages that will still be worth knowing in 15 years are C, Lisp, Java and maybe Haskell. Also, Julia a shit.
▶ No.942355
>>942354
>Java
Please kill yourself.
▶ No.942357>>942358
▶ No.942358
>>942357
>being this ignorant
>luac
▶ No.942362
>>942347
Lua is good for embedding in another application written in C, but it is not meant to be used standalone.
▶ No.942365
>>942354
What do you mean Python has fallen?
▶ No.942548>>942564
>>942013
He wants everything to run in ring 0 as God intended
▶ No.942564>>942572
>>942548
Seriously, people freak out about ME and PSP in their CPUs, which were designed intentionally to provide businesses with hardware-level remote access to their devices, but then likewise freak out about anything that doesn't forbid users from accessing their hardware at its fundamental levels. Really boggins the noggins.
▶ No.942572>>942579 >>942596
>>942564
>TIL the C LARPers are the glow in the darks
Big if true
▶ No.942579>>942596 >>942614
>>942572
What did you think? They are either soyboys or corporate/ABC shills; anyone who has a problem with low level languages like C is one of the two. That doesn't imply C is irreplaceable, but these shills don't point out any alternatives that are improved versions of C. All they want is for people to use their hipster languages. That's why I go to extra lengths to ensure their platforms are not supported on my machine. If I find one installed or enabled, I remove it or gimp it deliberately.
▶ No.942596>>942749
>>942572
>>942579
keep talking to yourself lispcuck
▶ No.942614>>942615 >>942761
>>942579
>But these shills does not point out any alternatives that are improved version of C.
Ada?
▶ No.942615>>942618
>>942614
Ada is really shit. There's a reason why the military ditched it for C++.
▶ No.942618>>942625 >>942632
>>942615
>There's reason why the military ditched it for C++.
Yes, because C++ programmers are easier to find these days; and work for less. Ada has all kinds of excellent features that are still not used today (subtypes). GNAT is a great compiler too... I feel the language is underrated.
▶ No.942625>>942637
>>942618
What I don't understand is who was the bozo that thought it's the language that makes hardware secure?
▶ No.942632>>942635 >>942636
>>942618
Ada is the most verbose piece of shit ever made. Even the government found it too expensive to use. Stop LARPing and try to (properly) port something small to Ada and compare.
▶ No.942635>>942637
>>942632
The verbosity doesn't detract from readability like, say, Java's does. Porting a C program to Ada is very hard, because while C focuses on performance, Ada focuses on security and reliability just as much. The compiler errors can be annoying, and it does seem like a bondage-and-discipline language at first, I agree. But don't forget that C makes you do a lot of boilerplate crap if you want to harden your program.
▶ No.942637>>942643
>>942635
> Ada focuses on security
Why don't you answer my question bozo?
>>942625
▶ No.942641>>942643 >>942647
>>942637
That's a load of crap. C allows many exploits that other languages simply will not allow. Ada (and its compiler) has all kinds of security/debugging features built in. It's not a language where you can simply "hack" away and create something that will compile. I didn't want to respond to you, because criticizing C invariably turns every thread into shit flinging, and I've had enough of that.
▶ No.942643
▶ No.942647>>942652 >>942712
>>942641
C doesn't allow any more exploits than the binary instructions and data allow. Your argument is wrong. Besides, there are plenty of examples of Ada code screwing up (because the programmer is shit) and losing millions of dollars of hardware.
▶ No.942652>>942661
>>942647
There are not "plenty of examples", no, not at all. It's not Ada that fails:
http://www.adapower.com/index.php?Command=Class&ClassID=FAQ&CID=328
>When the rocket flew for the first time, both dual-redundant computers detected the overflow condition. Both presumed that the cause was a hardware failure.
>Both shut down in an attempt to leave the other side in control. They did *exactly* what they were designed to do and in that sense behaved flawlessly.
If an Ada program compiles, you can almost be sure it is safe, because the compiler is highly pedantic. Even if there are actual examples of Ada failing (they exist, of course), insisting that we use C instead is patently stupid.
▶ No.942658
>>941873
welcome to the meme age
I am going to singlehandedly bring back Forth using videogames.
▶ No.942660>>942734
>>942143
for example to make a GUI in GTK. another retarded as fuck example I remember was asyncore, and lots of other big meme libraries have shit like this, not to mention further nightmares like "metaclasses" and other ways of dressing up your objects with weird undefined code
▶ No.942661>>942672 >>942676 >>942712
>>942652
Your argument doesn't justify substituting Ada for other languages either. You claim Ada is secure and safe, and show that Ada has problems, yet you conclude it's better than properly training programmers to understand their hardware. Ada can't have a compiler for every architecture in use without some programmer flaw in the compiler itself. That's why your argument breaks down: it's not the language that's flawed, it's the programmer. Programmers need to understand their hardware, and it's the hardware that should be secure by design. Nothing should rely on software except variable and user input; everything else should be a hardware concern, not a language concern.
▶ No.942672>>942673 >>942712 >>942748
>>942661
Do you think the military contracts amateurs to write their programs? Obviously not; these people have a lot at stake if their programs fail. I'd assume that both the Ada and the C/C++ programmers have a similar (erudite) level of skill. At that point, obviously languages that are more reliable, debuggable, and secure matter. The Ada mandate wasn't a "flip of the coin" decision.
>Programmers need to understand their hardware, and it is the hardware that should be secure from design.
I agree, but we don't live in that kind of world. However, like I said, I doubt the people the military contracts are greenhorns that are fresh out of college.
▶ No.942673
>>942672
Do you think that government contractors never scam the government?
▶ No.942676>>942701 >>942712
>>942661
You can train programmers to understand their hardware and use Ada too, right?
▶ No.942701
>>942676
Do government contractors scam the government?
▶ No.942712>>943487
>>942647
>C doesn't allow any more exploits than the binary instructions and data allow.
Bullshit. The hardware doesn't do these "optimizations" that C compilers do like replacing a loop with "return true" which wouldn't make sense to anyone who worked on compilers in the 80s or earlier. C is less reliable than binary instructions, which is just another reason why it sucks.
>>942661
>>942672
>>942676
The C/C++ weenies they're hiring can't understand the hardware or learn Ada. They may be mentally capable, but weenie brainwashing makes them not want to do it.
Yes, and they've succeeded. Hordes of grumpy C hackers
are complaining about C++ because it's too close to the
right thing. Sometimes the world can be a frightening
place.
I've been wondering about this. I fantasize sometimes
about building better programming environments. It seems
pretty clear that to be commercially viable at this point
you'd have to start with C or C++. A painful idea, but.
What really worries me is the impression that C hackers
might actively avoid anything that would raise their
productivity.
I don't quite understand this. My best guess is that
it's sort of another manifestation of the ``simple
implementation over all other considerations'' philosophy.
Namely, u-weenies have a fixed idea about how much they
should have to know in order to program: the amount they
know about C and unix. Any additional power would come at
the cost of having to learn something new. And they aren't
willing to make that investment in order to get greater
productivity later.
This certainly seems to be a lot of the resistance to
lisp machines. ``But it's got *all* *those* *manuals*!''
Yeah, but once you know that stuff you can program ten times
as fast. (Literally, I should think. I wish people would
do studies to quantify these things.) If you think of a
programming system as a long-term investment, it's worth
spending 30% of your time for a couple years learning new
stuff if it's going to give you an n-fold speed up later.
▶ No.942714>>942760
>>942331
It's actually both. It's just more famous for the latter because they wanted to focus on something important (programming for scientists, engineers, and mathematicians) rather than the web dev shit that other languages (Python, PHP, Perl) are king of.
There are webservers written in Julia and they are highly performant.
https://github.com/essenciary/Genie.jl
It works perfectly fine as a general purpose language.
▶ No.942715
>C is less reliable than binary instructions, which is just another reason why it sucks.
this.
t. C and assembly programmer
▶ No.942721
I like python and I'm going to continue to program using it.
▶ No.942734
>>942660
I don't think PyGTK requires classes. It just makes things more manageable in most cases. I haven't used it before, but the example on https://python-gtk-3-tutorial.readthedocs.io/en/latest/introduction.html#extended-example is equivalent to
import gi
gi.require_version('Gtk', '3.0')
from gi.repository import Gtk

def on_button_clicked(widget):
    print("Hello World")

window = Gtk.Window(title="Hello World")
button = Gtk.Button(label="Click Here")
button.connect("clicked", on_button_clicked)
window.add(button)
window.connect("destroy", Gtk.main_quit)
window.show_all()
Gtk.main()
I have used other libraries that did require writing classes, but it seemed like a sensible decision. If you need handlers for foo, bar and baz, then this is a nice way to lay out the code:
class MyHandler(Handler):
    def on_foo(self, event):
        ...
    def on_bar(self, event):
        ...
    def on_baz(self, event):
        ...
There are alternatives, but this version saves you the trouble of manually hooking up the functions and gives you internal state management and inheritance for free.
It could have been made like this (in fact, it's trivial to write a wrapper around it to enable it):
def on_foo(handler, event):
    ...
def on_bar(handler, event):
    ...
def on_baz(handler, event):
    ...

run_handler(on_foo=on_foo, on_bar=on_bar, on_baz=on_baz)
But that's less convenient to write and rarely offers any advantages.
Metaclasses are well-defined, and even fairly simple (although possibly hard to wrap your head around the first time). But you rarely need to use them, and when you do need to use them it's typically easy to understand their behavior without fully understanding metaclasses in general.
As in any language, there are things that seem bizarre at first. But they're sensible once you do understand them, it doesn't take all that long to understand them, and they're usable even before you understand them.
Then again, it's true that it has a lot more layers than idealized C and SML. Its abstraction is decent, but it's still abstraction. I'm continually surprised by how many of the moving parts between my mental model of the code and the actual computational model of the code I can ignore without writing misbehaving programs. So it's reasonable to dislike that.
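For what it's worth, a minimal metaclass looks like this; a hypothetical sketch of the registration pattern, where class creation itself is hooked to record every Handler subclass:

```python
# A metaclass customizes class creation. Here RegisteringMeta and the
# registry dict are invented for illustration: every subclass of
# Handler is recorded by name the moment its class statement runs.
registry = {}

class RegisteringMeta(type):
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # skip the root Handler class itself
            registry[name] = cls
        return cls

class Handler(metaclass=RegisteringMeta):
    pass

class FooHandler(Handler):
    pass

class BarHandler(Handler):
    pass

print(sorted(registry))  # ['BarHandler', 'FooHandler']
```

Well-defined, as said: `type.__new__` runs once per class statement, and you rarely need more than this.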
▶ No.942737
>>941723
Javascript was a mistake. So pathetic that so many people on /tech/ do not even know the extent of the js cancer. Feel shame all of you fake-ass /tech/ lurkers. Maybe go suck dick back on reddit.
▶ No.942740
>>942354
>Java
leave and don't come back.
▶ No.942748
>>942672
>Appeal to authority
▶ No.942749
>>942596
Kys nigger, I've never written a single line of LISP
▶ No.942760
>>942714
It's slow as shit except when doing what it was made to do: repetitive, formulaic calculations. The REPL is slow as shit and its memory usage for simple I/O is horrendous. The links are in this very thread. Oh and the syntax is shit.
▶ No.942761>>947690
>>942614
>Ada
>replace C
The same way some retards claim that C++ or Rust are replacement for C. They're not, they're more complex and abstract by several orders of magnitude.
▶ No.943484>>943789
>>941614
Hello my fellow sandnig. Kifak.
▶ No.943487>>943491 >>943540
>>942712
If Ada is so great then why is no one using it anymore?
And try to give me an actual answer. Don't whine about
>wahhhh muh wheenies who make 100k more than me are not using my shitty LARPing language wahhh
Oh wait, you can't.
▶ No.943491>>943509
>>943487
The thing is, it's not the programming language that makes software secure, it's the hardware. Always has been, always will be. Arguing about how to tell the hardware what to do is moot, and in safety and security situations, programming in any language other than its native binary ops and data is wrong.
▶ No.943509>>943516
>>943491
Then x86 is an absolute pile of shit.
▶ No.943516
>>943509
It's provided a lot of technological improvements over its generations. I personally would rather use another ISA, but it serves its purpose. If I could, I would get a bunch of dedicated ASICs to talk to each other to produce output, but I doubt the efficiency of that kind of setup would even match mid-00s performance. At least it wouldn't be botnet though.
▶ No.943519
>>941595 (OP)
>Note that there's still the CoC
Get me the hot coffee
▶ No.943540>>943542 >>943544 >>943586 >>943723
>>943487
1) You can't say an Ada implementation is standards compliant without it passing a validation suite, and you have to pay for that test. GCC can claim to be C standards compliant even when it's not. The only free Ada compiler is government-sponsored.
2) Pascal does stupid shit with arrays. I don't know if Ada makes the same mistakes, but it is based on Pascal, and that may leave a bad taste in people's mouths.
3) Writing shitty code is job security, so programmers prefer shitty languages.
4) Rendezvous. I bet you can't even tell me what rendezvous is.
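For anyone who can't: rendezvous is Ada's synchronous tasking primitive, where a calling task blocks on an entry call until a serving task accepts it, work happens during the meeting, and both proceed afterwards. A rough Python sketch of the idea (names are illustrative; this is not Ada semantics, just the shape of it):

```python
import queue
import threading

class Entry:
    """Toy stand-in for an Ada task entry."""
    def __init__(self):
        self._calls = queue.Queue()

    def call(self, arg):
        """Caller side: block until the server accepts and replies."""
        reply = queue.Queue()
        self._calls.put((arg, reply))
        return reply.get()  # blocks for the duration of the rendezvous

    def accept(self, handler):
        """Server side: block until a caller arrives, then serve it."""
        arg, reply = self._calls.get()
        reply.put(handler(arg))

entry = Entry()
server = threading.Thread(target=lambda: entry.accept(lambda x: x * 2))
server.start()
result = entry.call(21)  # both tasks meet here
server.join()
print(result)  # 42
```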
▶ No.943542
>>943540
>I bet you can't even tell me what rendezvous is.
It's what I had with your mother last night. Bitch may not be much to look at, but she can suck the chrome off a trailer hitch.
▶ No.943544>>943547
>>943540
It didn't look like anon was talking about GCC ever. Besides, GNU has a long history of not complying with standards and adding its own quirks into shit.
▶ No.943547>>943548
>>943544
>It didn't look like anon was talking about GCC ever
It applies to any C compiler. I just singled one out.
▶ No.943548
▶ No.943586
>>943540
>(((government-sponsored)))
Into the trash it goes.
▶ No.943723>>943762
>>943540
>Writing shitty code is job security, so programmers prefer shitty languages
shit code is possible in every language, you dinghole
▶ No.943725
Looks like Ruby finally won.
▶ No.943762
>>943723
I don't think it's true in the first place, but if you were going for that, shitty languages are better than good languages. They give you deniability. Idiomatic PHP will make you look better than poor Ada, even if they're equally unmaintainable.
▶ No.943781>>947696
>>941683
Let's just clone Python 3 into a new language (no CoC gayness) and deprecate Python altogether. I'm so sick of having to have 10 different Python versions on Linux, even with the python3only profile.
▶ No.943789
▶ No.947690>>947721 >>947755
>>942761
lolwat.
C++ is simply an extension of C to support the OOP paradigm. HOLY SHIT THEY RENAMED STRUCTS TO CLASSES AND ALLOWED YOU TO INCLUDE MEMBER FUNCTIONS AS WELL AS MEMBER VARIABLES. ZOMG.
▶ No.947696
>>943781
We can call it mamba.
▶ No.947721
>>947690
You know damn well that's not all C++ is, and "you don't have to use those features lol" is meaningless. Someone _else_ will use those features and I'm going to have to deal with the mess.
▶ No.947755
>>947690
>renamed structs to classes
Changing the language to couple data and behavior together is not simply renaming structs to classes, you codemonkey.
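To make the distinction concrete, here's a minimal Python sketch (illustrative names, not from any real codebase): a struct-like record holds data only, while a class couples behavior to that data so the operation travels with the record.

```python
from dataclasses import dataclass

# Struct-like record: data only, roughly what C gives you.
@dataclass
class Point:
    x: float
    y: float

# Class in the C++ sense: the same data, but with behavior coupled to it.
@dataclass
class MovablePoint:
    x: float
    y: float

    def translate(self, dx: float, dy: float) -> None:
        self.x += dx
        self.y += dy

p = MovablePoint(0.0, 0.0)
p.translate(3.0, 4.0)
print(p.x, p.y)  # 3.0 4.0
```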
▶ No.948478>>948526
>python is dying
>webshits and normalfags will shill JS everywhere
Damn you monkey paw
▶ No.948526>>948538 >>948543 >>949162
>>948478
At least JS doesn't have faggot fucking whitespace issues.
▶ No.948538
▶ No.948543>>948550
>>948526
Are you implying that it's better than Python? m8, Python is shit, but it's far from JS (even if JS has good JITs).
▶ No.948550>>949112
>>948543
Yes, as a matter of fact I am implying that it's better than python. Last time I checked I didn't have to install node2 and node3 because half the fucking shit ever written in it was deprecated in newer versions.
▶ No.948571>>948577
So did they already choose the gaggle of gay trannies to replace him?
▶ No.948577
>>948571
Yes. They're busy conducting a thorough code review, find-replacing all references to "white privilege space" in the documentation.
It will now be referred to as "safe spacing".
Also, scope is highly problematic and oppressive. All variables are now global scope. Open borders!
▶ No.949099
>>941864
>Did this place become reddit while I was absent ?
Reddit/4chan. It's actually gotten better since the 2016-2017 /pol/ election frenzy that attracted all the fucking 4chan and the_donald shitposting morons. Seeing some /g/-tier cancer threads and Faglioli posting was a legit low point for the board.
▶ No.949112
>>948550
>half the fucking shit in it
Nigger, please. Any library anybody cares about has already been rewritten for 3, and most of the syntax changes are trivial enough that they can be done by an automated tool.
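The real tool is 2to3. A toy sketch of the kind of mechanical rewrite it performs (regex-based here purely for illustration; the actual tool rewrites a parse tree and handles far more cases, and `fix_print` is an invented name):

```python
import re

def fix_print(line: str) -> str:
    """Rewrite a bare Python 2 print statement as a Python 3 call."""
    m = re.match(r'^(\s*)print\s+(.+)$', line)
    # Leave lines alone that already look like a print() call.
    if m and not m.group(2).startswith('('):
        return '{}print({})'.format(m.group(1), m.group(2))
    return line

print(fix_print('print "hello"'))  # print("hello")
```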
▶ No.949162
>>948526
This 100%. I can't stand writing nested loops and having to deal with figuring out the indents.
Copy/pasting code should not break your program logic.
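For what it's worth, the failure mode anon is complaining about is real: in Python, indentation is the block structure, so pasting a line at the wrong depth silently changes the logic. A minimal sketch (illustrative function, not from any real codebase):

```python
def count_evens(nums):
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += 1
    return total  # indented one level deeper, inside the loop, this would
                  # instead return after the first iteration

print(count_evens([1, 2, 3, 4]))  # 2
```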