▶ No.829637>>829641 >>829651 >>829658 >>829787 >>829924 >>832111
What is it useful for? Is it gay? Is it worth learning? What makes it superior/inferior to other languages?
▶ No.829641>>829645 >>832757
>>829637 (OP)
rewrite it in rust
▶ No.829645
>>829641
ebin :-----------------------------D
▶ No.829651>>829657 >>829663 >>830363
>>829637 (OP)
It expands your mind, but I wouldn't use it as a day-to-day programming language. It's useful for symbolic computation, e.g. the kind of math they have you do in high school like calculating derivatives: (e^(e^x))' = (e^x)*(e^(e^x)), and for studying program transformations. Once it gets under your skin you'll be better at structuring programs, though.
http://www.paulgraham.com/rootsoflisp.html
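To give a taste, here's a toy sketch of that kind of symbolic differentiation (my own throwaway code, nothing from the article; it only knows numbers, the variable X, + and *):
(defun deriv (expr)
  (cond ((eq expr 'x) 1)                      ; dx/dx = 1
        ((numberp expr) 0)                    ; constants vanish
        ((eq (first expr) '+)                 ; sum rule
         (list '+ (deriv (second expr)) (deriv (third expr))))
        ((eq (first expr) '*)                 ; product rule
         (list '+
               (list '* (deriv (second expr)) (third expr))
               (list '* (second expr) (deriv (third expr)))))
        (t (error "unhandled expression: ~a" expr))))
;; (deriv '(+ (* x x) 3)) => (+ (+ (* 1 X) (* X 1)) 0)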
▶ No.829657>>829658
>>829651
So I wouldn't be writing small scripts or anything with it? What makes it a bad day-to-day language? Sorry for my retardation, but I've never really gotten into programming and don't really understand it yet.
▶ No.829658>>829669 >>829674 >>829696 >>830000 >>832086 >>832140 >>832147
>>829637 (OP)
It's mental poison. It abstracts you away from how the machine works because math autists wanted things to work more like how math works, when what you should be doing is changing your thinking to be more like machines work. The whole language was masturbation by people who didn't want to make software, and it shows as they made almost nothing with it over 60 years.
>>829657
>What makes it a bad day-to-day language?
Try to make a stand-alone "Hello, World!" executable with it, for starters.
▶ No.829663
>>829651
It's not used for anything. It's a masturbation language. Actual math work is done in languages like R.
▶ No.829669>>829670 >>829672
>>829658
What would you recommend that I use instead? I'm almost completely inexperienced when it comes to programming.
▶ No.829670>>829674 >>829696 >>829916
>>829669
I start people on javascript as it's the easiest to get started with by far and can make things that you'd find worthwhile. /tech/ will hate it, though. They'll tell you to install arch and write code in erlang or something.
▶ No.829672>>829682 >>829696
>>829669
Btw, languages I use at least once a week:
C, C++, python, javascript
I'm a low-level guy but python is great as glue (don't use shellscript), and javascript is often necessary because so many things depend on it.
▶ No.829673
I know Racket well, and it's very useful for quickly prototyping ideas and writing simple programs (not that they have to be simple). If performance isn't critical, a lisp language is what I'll turn to, because it's so quick to develop in. I use Racket for things others would choose python for.
▶ No.829674>>829675 >>830000
>>829670
>I start people
Please stop, you're in no position to be advising anyone if you're responsible for the idiocy in this post >>829658
▶ No.829675>>829719
>>829674
Sorry arch user, but I actually write software for a living and teach people how to get shit done.
▶ No.829682>>829686
>>829672
How do you handle fork and process faggotry in python?
▶ No.829686>>829804
>>829682
You try not to. Multiprocessing is pretty awful: the standard library code for it is poorly written, and signals and the like will cause misbehavior. To be fair, it's VERY tricky code to write that virtually no one does 100% correctly, even though they all think they do.
In some cases, knowing the risks, I create process pools via python's multiprocessing library and dispatch work across them. If you need a more industrial solution, run one single-threaded Python process per worker and design the application like a cluster (this is a general solution that works for almost any shitty language).
▶ No.829696>>829702 >>829710
>>829658
>>829670
>>829672
Do you even fucking get the point of functional programming, ya dumb fags?
▶ No.829702>>829708 >>829775
>>829696
Yeah, to keep people from getting anything done. I've been hearing "functional is the future!" for almost 30 years now. It continues to not be the future.
▶ No.829708>>829715 >>829716
>>829702
>I've been hearing "functional is the future!" for almost 30 years now.
Never heard such things, and I'm getting a strong sensation of larping here.
Functional programming is, as far as I know, never used in enterprise; it's mostly limited to academic environments, because that's where it was born and it relies on mathematical concepts.
Scientists, mathematicians and physicists use functional languages; codemonkeys lack the knowledge to even comprehend how they work.
Stallman is known for Emacs Lisp, and guess what, he's a physicist.
▶ No.829710>>829720
>>829696
I still have a hard time understanding what functional programming means. Is it that every function evaluates to another function?
▶ No.829714>>829721 >>829724 >>829736 >>829774 >>830792
I don't know jack about Lisp, but I read this story years ago and always wondered why more people didn't use the language: http://www.paulgraham.com/avg.html
My first instinct is to believe that managers just don't like giving individual programmers that much power. The rise and popularity of Java reinforces that thought. Also in parallel the same happened with Perl vs. Python. Managers are allergic to individuality and freedom. They want everything to be done by the book, in a perfectly predictable, boring fashion. That's why that story in the above URL doesn't have a "happy ending". Sure the authors made lots of money (and so did Yahoo), but it didn't serve to create more Lisp jobs or whatever. They instead just rewrote (badly) the Lisp code into other languages.
BTW, on the same topic of "de-skilling", I recommend watching this video: https://www.youtube.com/watch?v=4rnJEdDNDsI
Not sure I'll ever learn Lisp, but I've been playing with Forth. Main advantage here is it runs *very* well on even old 8-bit computers or microcontrollers with tiny memory.
▶ No.829715
>>829708
They don't use it for anything useful in academia other than wanking in CS. I used to do some supercomputing and the code was all built on top of Sun's custom CC. Even AI was a mix of C and PROLOG. Voice recognition did use some LISP but only because the early papers used it and to remain paper-compatible they continued the practice. When you run into LISP outside of academia and open sores, it's usually connected to voice recognition because that's the language those people are familiar with.
Webdev faggots use it for some things as they always want to be using something hip and new even if it's untested or shit. Like you might have heard of Discord. Functional code tends to cripple companies over time as it is extremely difficult to modify without rewriting, the talent pool is small, and the lifetime of the toolchains short.
▶ No.829716
>>829708
So what you're saying is that functional programming isn't useful outside of academic environments? So it's useless anywhere else? Or is it just too difficult to be practically used in large projects?
▶ No.829719>>829723
>>829675
Yes, I can tell you're a code monkey by the derision you show towards mathematics and anything abstracting your notion of low level. In academia, it is important to teach concepts, not to churn out drones who can spit out code for a particular machine type.
▶ No.829720
>>829710
Functional programming means computation is done by evaluating expressions, rather than statements.
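A tiny illustration (my own minimal sketch, in Lisp): IF is itself an expression that yields a value, so you bind the result instead of assigning to a variable from inside a statement:
(defun sign-label (n)
  (if (minusp n) "negative" "non-negative"))  ; IF returns a value; no assignment statement needed
;; (sign-label -3) => "negative"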
▶ No.829721
>>829714
Perl vs Python had nothing to do with managers the way you're thinking. Perl was designed to be very flexible, with a motto like "There's more than one way to do it". And yes, there were easily 10 different ways to do anything. And if you had 10 programmers working on some code, you'd have the same problems solved in 10 very different ways all over that code. So to effectively work on existing code, you'd need to have learned all these different ways to do the same thing, almost like needing to learn 10 languages. Very few programmers ever had that level of mastery of Perl, which was kinda a problem when you needed programmers.
Managers had to come down super hard on Perl projects with coding standards. The longest coding standards I've ever seen were for a Perl project. They'd, through a lot of effort, try to reduce the language to "One good way to do it". But this was very difficult to enforce and imo hopeless as you ended up having to basically retrain anyone you hired.
The lesson learned (which should have already been learned from LISP) was that unnecessary flexibility makes a language /less/ efficient.
So Python came around and recognized this problem with Perl and mostly avoided that trap. While it wasn't as quick to use as Perl for a lot of cases, 'good' was better than 'great' due to the reduced flexibility making it much faster to write with holistically, and much larger programs could be made with it before collapsing in on themselves. If you do "import this" in Python, you'll see an easter egg that contains, "There should be one-- and preferably only one --obvious way to do it.". That's a riff on Perl's motto, and why it lost to Python.
▶ No.829723>>829728
>>829719
I have a Masters in CS from a top UC school and several well-cited papers I'm author/coauthor on so I understand academia and how they do nothing of value. Think of a great programmer and you'll find that they were a great programmer before university.
▶ No.829724
>>829714
Like Forth is good for being able to run on small memory. Lisp is good because you can write most of the language in 100 lines or something retarded like that.
▶ No.829728>>829730
>>829723
Who cares if they were good before, the point is they become greater. By sixteen I was a "great" programmer if industry standards are anything to go by, but then of course you learn how much you do not know.
▶ No.829730>>829735
>>829728
They don't. They all realize it's bullshit and drop out before getting a PhD. Universities panic and give them honorary PhDs later. Again, think of a great programmer and then see if they have a PhD.
▶ No.829735>>829737
>>829730
I think you've lost touch with your own argument. Earning a PhD is only important if you're interested in research; of course many choose to work in industry instead.
▶ No.829736>>829741 >>829774 >>830000
>>829714
Thanks, anon. I'm going straight for lisp.
▶ No.829737>>829739
>>829735
Do you think great programmers aren't interested in research?
▶ No.829739
>>829737
You're talking about two totally different career paths. Some may straddle both, but that is rare. I do not think pursuing a PhD is important for all programmers, but an undergraduate degree is beneficial.
▶ No.829741
>>829736
Good choice. Ignore the fools telling you it's a meme or a waste of time. Even if you never use it (you will), you'll benefit from exploring a new paradigm.
▶ No.829774
>>829714
>I don't know jack about Lisp
>>829736
>I'm going straight for lisp
fucking lol. You deserve everything you get.
▶ No.829775
>>829702
<functional is the future!
>but never the present
▶ No.829787>>829830 >>830113
>>829637 (OP)
Common Lisp should have been just a stopgap solution in the history of Lisp. When the various Lisp dialects were dying out, a new Lisp was needed which would have the best of all the other Lisps and make porting programs easy. In this, Common Lisp did succeed. However, the end result is a horrible hodgepodge of a language. It should have remained just a stopgap until a proper modern Lisp got developed.
The real problem with Lisp (any Lisp) is that few people know it. Reddit for example was originally written in Common Lisp, but they couldn't find enough programmers so they ended up rewriting it in Python.
https://redditblog.com/2005/12/05/on-lisp/
The library ecosystem is something to take into consideration as well. I haven't used much of it, so I have no idea how well it compares to other languages. Really, if you intend to use Common Lisp you will find yourself with a very powerful tool, but it will be a lonely experience.
Another problem is that the standard leaves a lot up to the implementation. For example, there is no portable way of opening a file. There is a standard function, but each implementation has its own quirks on how exactly to handle the arguments. The best advice would be to pick one implementation that you expect to be around for a while and stick to that one.
Personally, I have chosen Racket as my Lisp language. It has its own issues, and some things are worse than in Common Lisp, some are better.
▶ No.829804>>829810
>>829686
>do one thread per python and design the application like a cluster (this is a general solution that works for almost any shitty language).
>spoken like a JS fag "muh node.js multithreading solution".
>pretends the GIL issue hasn't been solved/worked around in a gorillion ways...
Please don't. Depending on your needs, use one of the many proven task queues such as Celery (a bit involved) or a dead simple one such as RQ; if task queues are out of the question, Stackless makes threading a treat, stupid simple, and runs circles around Go "goroutines". Used all of them in various ways in the industrial space, telephony, energy, and never ran into an instance of "Oops! Muh app shit the bed."
Modern Python isn't tricky by any means, jump right in. Don't let a crusty old "Pythonista" scare you away.
▶ No.829810>>829837
>>829804
>don't design it like a cluster
>use a message queue
Uhh.. anon?
▶ No.829830>>830352
>>829787
You seem to know a thing or two about lisps.
What do you think about GNU Guile?
▶ No.829837>>829840
>>829810
Task Queue != Message Queue sweetheart.
You've never used a task queue to divvy up work to utilize all cores? Like any other grown up serious app that might need to scale down the road and on top of that, makes it easy to just throw more machines/instances at it as needed?
Who the fuck runs multiple single threaded app instances these days to utilize all cores? Oh that's right node.js niggers.
▶ No.829838>>829899 >>829922
Common Lisp is a superb language, as are most of the other Lisps. It's very fun to programme in, there's a decent community, and the literature surrounding Lisp is some of the best.
There's a Lisp thread on Lainchan which is steadily active and usually has good discussion going on. The only problem is that most other languages will seem terrible by comparison.
If you want practical software to hack around with you can look at Stumpwm which is a nice window manager.
▶ No.829840>>829842
>>829824
>>829827
>>829829
>>829837
I think you're defective, anon.
▶ No.829842>>829888
>>829840
So you can all admire this.
▶ No.829888
>>829842
Thank you. It's magical.
▶ No.829899
>>829838
install frankenwm
▶ No.829916>>829920
>>829670
erlang is a beautiful language
▶ No.829920>>829927
>>829916
and is doubly beautiful when running on Arch
▶ No.829922>>829965
>>829838
>Stumpwm
<If you want a minimalist tiling window manager, then StumpWM is not what you're looking for. The code base is ~15k sloc, the binaries produced are ~60mb.
Why after 60 years can't Lisp compile shit like that down to 5k?
▶ No.829924>>829980
>>829637 (OP)
>What makes it superior
▶ No.829927
>>829920
and erlang is pow(beautiful, beautiful) if running on a compute cluster running arch
▶ No.829965>>829984
>>829922
Metaprogramming and the ability for self-evaluation mean that the entire runtime has to be packaged with any CL programme.
A compiler like SBCL can produce much smaller binaries if you guarantee that you will not be using those features, but that does defeat some of the purpose of using Lisp.
▶ No.829980
▶ No.829984>>829987
>>829965
What a great language. So it's unfixable, then.
▶ No.829987>>830017
>>829984
Only in an environment like Unix that is oriented towards small, standalone binaries. On a Lisp machine you would have a system-wide lisp listener running that can evaluate arbitrary programmes as the user pleases. This is also a sensible structure for something that's constantly running, like a WM or Emacs.
It should be noted that Lisp is not slow anymore, like it was in the 70s. SBCL can generate very tight x86 assembly. Optimised CL is about a third the speed of C for a trivial programme, which makes it much faster than Java, Python, or most other high level languages.
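For the curious, this is roughly what you feed SBCL to get that tight code (a minimal sketch, not production code; the declarations promise the compiler the argument really is a fixnum vector and that the sum stays in fixnum range):
(defun sum-fixnums (v)
  (declare (type (simple-array fixnum (*)) v)
           (optimize (speed 3) (safety 0)))
  (let ((acc 0))
    (declare (type fixnum acc))
    (dotimes (i (length v) acc)
      (incf acc (aref v i)))))
;; (disassemble #'sum-fixnums) prints the generated x86 assembly straight in the REPL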
▶ No.830000>>830004 >>830034
>>829658
>It's mental poison. It abstracts you away from how the machine works because math autists wanted things to work more like how math works, when what you should be doing is changing your thinking to be more like machines work. The whole language was masturbation by people who didn't want to make software, and it shows as they made almost nothing with it over 60 years.
It gives you a bird's eye view of programming. Learning only low-level languages, you won't be able to see the forest for the trees.
I say that as a guy whose first programming language was C. It was all fine and dandy until I actually needed to write something with more levels of abstraction (e.g. a program that takes the string of an elementary function and returns the string of its derivative). Then I got lost in book-keeping details and didn't know how to begin. Then I found lisp. Now, am I able to write the program in C? Of course, but only because I learned lisp. Granted, I wouldn't recommend lisp as anyone's first language, that should be C. You have a point there. >>829736 But lisp should definitely be your second language to learn.
>>829674
This whole thread is a mess. This is the reason /tech/ needs IDs.
▶ No.830004
>>830000
Quads confirm, we need ids. Good post too.
▶ No.830017>>830101
>>829987
>Only in an environment like Unix that is oriented towards small, standalone binaries. On a Lisp machine you would have a system-wide lisp listener running that can evaluate arbitrary programmes as the user pleases.
We can simplify this statement to: it solves a very limited set of problems. Today, every target I want to deploy software to is not a lisp machine. That's not good news for lisp, it seems.
>It should be noted that Lisp is not slow anymore, like it was in the 70s. SBCL can generate very tight x86 assembly.
Yet whenever we have a programming contest here, lisp gets absolutely destroyed. I think in the last one the lisp entry was literally close to 100,000x slower than the top entry, and that was run via SBCL. Until the lisp community here can show us their language actually performing, I group these claims with those of the Java community. Maybe it only performs well if you're willing to spend obscene amounts of time tweaking the code?
▶ No.830034>>830063
>>830000
>It gives you a bird's eye view of programming
I don't believe it does. I'm somewhat old for a channer and started out with BASIC and while extremely simple that was a language designed to fit hardware limitations. You learn a little bit about negotiating with the machine through learning BASIC. LISP is a very different beast. It's designed specifically to remove hardware limitations and move programs into a more pure mathematical space. You end up not learning programming, you end up learning some variant of math. LISP users usually get hooked on that purism and they end up mired in the Haskell community or whatever other pure functional language is trendy and never really get into programming. Programming isn't pure. It's dirty and violent. Languages learned along the way should acclimate a programmer to that.
>I say that as a guy who's first programing language was C. It was all fine and dandy until I actually needed to write something with more levels of abstraction. (eg. a program that takes the string of an elementary function and returns the string of its derivative). Then I got lost in book-keeping details and didn't know how to begin
I don't recommend starting with C because that's all the gore up front. But I do recommend languages that gradually increase the gore along the way. You eventually want to have mastered it, as that is a close approximation of the machine that you are expected to control. You also want to learn some assembly at the end of that path so you can debug the really hard problems rather than have to go beg your company's Steve for sage advice.
I recommend javascript as a first language despite absolutely despising javascript and hating every second I'm forced to use it. It's a language where there is some hardware pressure felt, it can scale from absolute beginner to billion dollar company, it's trivial to start with as you already have the tools, it's easy to make useful things for yourself and share them with your friends, it familiarizes you with the huge class of languages with C-like syntax, it covers many domains as it can run on the browser, server, desktop, embedded, etc. today, and you could literally write nothing but javascript your whole life if you choose to as we'll still be dealing with it for at least 100 years. All great qualities of a first language.
▶ No.830063>>830075
>>830034
>math autists wanted things to work more like how math works, when what you should be doing is changing your thinking to be more like machines work
A machine, in and of itself, does nothing meaningful. All it does is move registers around.
The programs we make are meaningful because we impose meaning on them. The question is not 'how do I move registers around?' but 'how do I command the programming language to move registers around in a way that is meaningful to me?' Lisp makes it easier to express what you want to do, while C gets you close to the heart of the machine and the gritty details. Unless you're able to put aside the machine and think of the problem at hand from a high vantage point you'll never be able to solve the problem at hand. Sure, the code you write will be efficient, but you'll never be able to write code that does what you actually want to do. Assuming, of course, that you want to do anything non-trivial. C provides the survival gear and training for going through the Amazonian jungle on foot, and lisp gives you the map of the whole place, so you can plot where you're going. Knowing them both, you'll be able to reach your destination.
<I recommend to start with a language that's as close to non-abstracted as possible whilst still being a scripting language and offering no direct access to the machine. And a fucked up type system, to boot.
What kind of middlebrow masochist are you?
<today, and you could literally write nothing but javascript your whole life if you choose to as we'll still be dealing with it for at least 100 years
Now I know you're just baiting.
Learn C. Learn Lisp. 90% of whatever other languages you'll encounter will be a combination of the two, with varying levels of fuckery.
▶ No.830075>>832265
>>830063
Here's an example. Cryptography is something that would like to exist in that pure math space, but if you treat it that way and don't consider how your algorithms will be executed by the actual machine you'll end up with cache timing attacks and similar problems. While the algorithm is fine, the program is not, and it's an algorithm problem, not an implementation problem.
If you intend to write programs and not formulas, you need to learn how to think like the machine.
>Unless you're able to put aside the machine and think of the problem at hand from a high vantage point you'll never be able to solve the problem at hand.
Untrue. The machine can solve any software problem. Math and programming are two separate things people try to combine. It's good to know both, but you don't learn one from the other.
>Now I know you're just baiting.
I'm not. C stayed relevant for ~60 years and likely 60 more despite a much smaller initial boom. Javascript is everywhere and I'm quite certain people will be working with it for at least 100 years. Even if people don't want to use it, all that code out there needs to not only be maintained, but to have features added to it. We're stuck with it at this point.
▶ No.830101
>>830017
By the same token, Lisp solves a subset of problems extremely well. I know I prefer to use specialised tools instead of reaching for the general but poorly suited one every time.
Lisp Machines are not the only environment where Lisp is suited, anything running long term where it would be nice to inspect and evaluate code at runtime is very well suited to Lisp. Web servers in CL are very nice to use for example.
>Maybe it only performs well if willing to spend obscene amounts of time tweaking the code?
Not obscene amounts of time, but you do have to provide directives to the compiler and tell it to optimise for speed, neither of which benchmark sites bother with. Like all projects you write once, profile, and optimise the performance-critical parts. Good thing CL provides a wealth of tools to accomplish this.
Here's a good introduction
http://blog.kingcons.io/posts/Going-Faster-with-Lisp.html
▶ No.830113>>830352
>>829787
> The real problem with Lisp (any Lisp) is that few people know it.
That doesn't matter. When you're a (real) programmer, you can adapt to and use any language, even write your own if need be. It happens all the time, for various reasons. Practical example: http://www.anotherworld.fr/anotherworld_uk/page_realisation.htm
▶ No.830352>>830451 >>830723
>>829830
I am by no means an authority on Lisps, I dabbled in some of them before settling on Racket. I have used Guild a bit, but I cannot comment on its technical merits. The documentation in Info format was decent and it's the official GNU extension language, so I guess if you are looking for a Lisp to integrate with your application, or one that won't be collecting dust on some repo in a far off corner of the internet, it's a good choice. On the other hand, its maintainer is LARPing as a WW2 resistance fighter on the internet, so there is a chance he'll snap one day.
>>830113
Yes, you can do that and re-invent the wheel or create a language that's so domain-specific that it will be useless for anything else. Let's say you want to parse some JSON data, in most languages you just grab a JSON library and move on. If you want to roll your own you will have to waste a significant amount of time on a side-task.
▶ No.830363>>830629
>>829651
know what else expands your mind?
PCP
you don't see programmers using PCP all day either.
▶ No.830383>>830422
I don't like dynamic typing. C is shit too and it's no wonder these retards recommend both C and dynamic typing. Even Go has a better type system.
▶ No.830422
>>830383
Common Lisp has type annotations you can supply. There is also Typed Racket, a variant of Racket that forces you to do static typing.
https://docs.racket-lang.org/ts-guide/index.html
I like to write my code in untyped Racket first to get it working and then port it to Typed Racket. Looking at the extra work you have to go through really puts into perspective how much hassle dynamic typing saves you, but I still think that having an entire class of bugs caught at compile time is worth the hassle.
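For reference, the CL-side annotations look like this (a tiny sketch; ADD-COUNTS is a made-up example name, and SBCL treats such declarations either as assertions to check or as optimization hints, depending on safety settings):
(declaim (ftype (function (fixnum fixnum) fixnum) add-counts))  ; promise: fixnums in, fixnum out
(defun add-counts (a b)
  (+ a b))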
▶ No.830451>>830583 >>830720
>>830352
Do you mean Guile? Read the creator's blog for more than 10 seconds and if you're anything other than leftypol, you'll never want to hear of it again. That guy is a mess.
▶ No.830583>>830720
>>830451
<the hardest thing about hiring is avoiding the fash
It makes me really sad to see all these people who were good programmers in the '90s having been mindfucked into a suicide cult.
<Never mind that CS was women's work until it became high-status
He's not even in the same reality as the rest of us anymore. His history's been rewritten.
▶ No.830629>>830631
>>830363
A small dose of LiSP can go a long way in improving your programming mindset. ;)
▶ No.830631>>830648 >>830726
>>830629
Druggies usually produce terrible code, but I've never personally seen code from an LSD user. We did have a pothead and he produced the most terrifying spaghetti I've ever seen: huge sections of like 1,000 lines just copypasted with one variable changed, a single C file that #includes all other C files, 'event driven' code where there are at least 5 separate event loops on the same thread, etc.
▶ No.830648>>830654
>>830631
With pot I can imagine how that might be the case.
The article is about micro-dosing LSD. I pasted it because this 'seeing the whole system at a high level' thing seemed analogous to my experience of coding in C after having learned and coded in Lisp. As a response to that 'PCP' comparison.
Sage for off-topic.
▶ No.830654
>>830648
LSD microdoses have been a meme for decades. If they worked, they'd already have been abused for all the things people claim they work for. As far as programming drugs go, Adderall helps shit programmers enter the autist zone temporarily to compete with the true autists. It's quite popular with the hacks in SV.
▶ No.830720
>>830451
Yeah, I meant Guile. As long as there is no other maintainer for it I guess he's better than nothing.
>>830583
>He's not even in the same reality as the rest of us anymore. His history's been rewritten.
It's one of those memes like "Ada Lovelace was the first programmer, hurr durr" that people keep repeating without thinking. The "CS work" he is referring to was not actual computer science; these women were basically human assemblers and linkers. Back in the old days when an engineer needed a program he would explain the problem to a group of mathematicians, who would then find an appropriate algorithm, write the code and hand it to a secretary who would type it onto a paper tape, splicing in routines from a (literal) library when necessary.
In those primitive days just crunching numbers was work enough. As the field advanced, the job of the secretary got automated away, replaced by the assemblers, compilers and linkers we have today. I remember watching a YouTube video about this a while ago, but I can't find it. It was an old black&white video from back in the day.
▶ No.830723>>832105
>>830352
> Let's say you want to parse some JSON data, in most languages you just grab a JSON library and move on. If you want to roll your own you will have to waste a significant amount of time on a side-task.
Yeah well the thing is, sometimes you want to do exactly that. There are people who somehow believe that everyone should always use the same library all the time. That's the Python mentality: do it this way only. Then there's the Perl mentality: just do it however you want. In reality, practical uses will be somewhere in between those two, and you have to be able to make the right judgement, not just blindly follow some ideology.
▶ No.830726
▶ No.830776
I'm just going to poke in and give my opinion.
Common Lisp is very much worth learning... but not actually using.
It has shit library support; they say it's highly portable, but as it turns out it ain't if you want to use anything besides basic I/O. It doesn't support threads or libraries by default, and it demands you use Emacs (aka gtk bloat). Worst of all, Common Lispers love to boast about their super advanced language and will bash every other language in existence telling you why your taste is shit.
So why is it worth learning? It does have some nice interesting things that most other languages don't, my favorite being the Metaobject Protocol. Not that you're at all likely to end up using it, but it's a very nice read indeed.
I do all my programming in languages that don't get in the way, yet I study lisp because it's so interesting, it's like magic. Only not the kind of magic that you would actually want to use in your project.
▶ No.830792
>>829714
>I don't know jack about Lisp, but I read this story years ago and always wondered why more people didn't use the language: http://www.paulgraham.com/avg.html
Because it was fucking 1995 and the better options didn't exist yet. For fast development of fast-running web applications, look into Go, since Google created that language explicitly to replace Python with better performance.
▶ No.832086
>>829658
;; How to make a 50 MB "Hello, World!" executable in SBCL: dump the whole image.
(sb-ext:save-lisp-and-die "helloworld.exe"
                          :executable t
                          :toplevel (lambda () (format t "Hello World!~%")))
It's not mental poison and it's not really meant to generate standalone executables. You are right that it abstracts away the lower levels a bit much. But I think it's a good idea to operate on 2 levels: near the hardware with C/C++, far from the hardware with Lisp.
If it is such a mental poison, why does every language reimplement some feature of lisp poorly and pass it off as a brand new innovation?
▶ No.832102>>832103
If you want to learn a lisp, learn Clojure. It's a modern, practical lisp and pushes you more into functional programming than Common Lisp does. It's also a small language and really easy to learn.
▶ No.832103
>>832102
>Clojure
>JVM, ever
If you want sluggish code that depends on a mishmash of Java libraries, listen to this fool.
▶ No.832105>>832109
>>830723
If there's a library available, that doesn't stop you from rolling your own. But if there's no library available, you're forced to roll your own.
If you want to go in between, you need a language with lots of libraries.
▶ No.832109
>>832105
> But if there's no library available you're forced to roll your own.
Not necessarily, if your language lets you leverage an existing library for another language. Look at SDL, it's for C, but there are bindings for many others. Same idea with Tk, Gtk, and so on. You only need to write the glue parts.
▶ No.832111>>832231
>>829637 (OP)
I use CL somewhat regularly myself, it's a pretty good language but not without drawbacks I'll admit.
Executable size is indeed a bitch but CL works best if you just use ASDF to import libraries and create small wrappers around it. Most implementations have a script function for a reason.
Common LISP is simply the best for numerical computing. It has built-in fraction (rational) types that keep many calculations exact, where floating point would accumulate rounding error.
Many may say LISP is very slow, and it's true I suppose but optimization is incredibly easy as you can measure function performance down to how many CPU cycles it took to complete.
LISP is a very batteries-included language. Just about anything you need is included by default, usually you don't need to import anything. While LISP has very few libraries, very few are actually needed as they tend to follow the batteries-included approach.
Overall, I wouldn't think to do any extremely performance important task in the language but it excels at simple number processing and list processing. It's also very easy and fun to write your own small libraries for small utilities here and there.
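For example (a quick REPL sketch; the cycle report is SBCL-specific output):
(+ 1/3 1/6)           ; => 1/2, exact, no rounding error
(* 10 1/3)            ; => 10/3
(time (expt 3 10000)) ; TIME reports run time, bytes consed and, on SBCL, processor cycles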
▶ No.832140
>>829658
>they made almost nothing with it [for] over 60 years.
Hacker News and Reddit* were made in Lisp :^)
*originally, now it's in C++ IIRC
▶ No.832147>>832161
>>829658
>what you should be doing is changing your thinking to be more like machines work
Suck it.
Machines are to serve their masters, not the other way around. Thinking like a machine is only a prerequisite to making better machines, or fixing problems you can't solve abstractly (e.g. making secure crypto, etc.).
▶ No.832161>>835790
>>832147
Try writing GPGPU code with that "I shouldn't have to think like a machine" attitude and see how far you get.
▶ No.832204>>832232 >>832240
Lisp isn't more "math" than any other programming language (the most "exotic" math is bignums and rational numbers), and C sucks even for low level pointer-heavy programming. C is a huge problem and causes billions of dollars in damage and wasted time, but Lisp is an "Emperor's New Clothes" situation. Lisp doesn't even give 10% of the benefits they claim. There are some benefits, comparable to a scripting language, and it is probably 100 times more productive than C, but any good language is 100 times more productive than C.
That brings me to all of the C shilling in this thread. Shills do not want to replace C because C is the cause of suffering and problems, which leads to an infinite cycle of "researchers" and other charlatans. The suffering allows them to beat dead horses from the 70s like "functional programming" instead of focusing on solutions that worked for decades outside of the C/UNIX/PC ghetto. Haskell is not a C replacement and anyone who thinks it is is a moron.
▶ No.832231>>832233 >>832725
>>832111
> It has built in fraction types that make its floating point calculations significantly more accurate.
So does Python. However, Python also has numpy.
What makes it better than Python?
▶ No.832232>>832241 >>832264
>>832204
>shills
Experienced programmers know there's nothing that can replace C today and are sick of you kiddos that just jumped on board in the last couple years due to the facebooks telling you you needed to be STEM pulling a dunning-kruger and saying we can replace C with whatever this month's popular language on HN is. The arguments are just painful as you'd expect from 'programmers' who have at best managed to write a Hello World arguing with people who have been doing this for a living for decades.
>the compile times are absurd and impractical for even small amounts of code
<sounds like you just need to git gud and stop recompiling so much, faggot!
>it doesn't integrate into build systems
<still using build systems, grandpa? Get with the times and use webshitpackager with docker and duplicate the whole gentoo build environment and rewrite every dependency in the same language only on Intel x86-64!
>it's garbage collected
<there's literally nothing wrong with garbage collection today! It usually freezes the whole process for no more than 10ms, you're afraid of a delay so small that it is impossible to notice!
>it's slow
<raw speed doesn't matter, it's all about algorithms! And parallelism! C can't do parallelism!
>its libraries are unsafe as they are unmaintained one-offs by nobodies and are not managed by a distro maintainer or archived
<hah, cfag talking about unsafe! omg unsafe pointers, UB, overflow! that disqualifies this entire conversation so we won't even talk about how we have no idea what code we're incorporating or what quality it is!
>it isn't easily used with/by other languages
<why would you ever need more than one language? everything works if you just rewrite everything in our language, just do it!
>it can only run on {web,mobile,desktop,server}
<why would you ever need to have only one language? you should use these other languages for those things and write all your library code multiple times! should be easy to find programmers that are masters of all of those languages.
▶ No.832233>>832237
>>832231
>some faggot invests in a good library for a shit language
>shit language is forever the best
This attitude will serve you well when JavaScript and PHP are the best languages on the planet, after infinite amounts of PHD students have been ground up and used to force benefits into PHP and JS implementations.
▶ No.832237>>832239
>>832233
I use python for numerical computing. Most people around me use python for numerical computing.
If common lisp is better for numerical computing I'd like to know. However, all the reasons that post listed also apply to python. They're not enough to make me try common lisp, because properly trying it is a major investment of time and effort. But if there are other reasons that don't also apply to python, or apply moreso to common lisp than to python, that might change things, so I asked.
▶ No.832239>>832277
>>832237
>numerical computing
I have a feeling people in this thread have very different ideas about what that phrase entails.
▶ No.832240>>832242 >>832698
>>832204
>Lisp isn't more "math" than any other programming language
It's certainly better for expressing math, seeing as a Scheme program is built entirely out of recursive expressions-to-be-evaluated. It makes bringing concepts like Peano arithmetic and lambda calculus into computer science (=/= programming) studies natural, where they would seem out of place even in a high level derivative of Algol which has come closer to Lisp (like Python).
You can write in it as if it were any other language (albeit an awkward but powerful one), but that's missing the point of why academics and Lisp hackers love it.
>C is a huge problem and causes billions of dollars in damage and wasted time
Most programmers struggle with C and write buggy code because they're lazy and don't understand UNIX philosophy; besides that, C is smushed in where it shouldn't be because of tech-illiterate managers who see that C is popular and decide to use it. That much is true.
If you're getting segfaults and buffer overflows though, that's on you. It's a fine language if you learn it right. Pointers aren't that hard.
>Lisp is an "Emperor's New Clothes" situation.
You don't even realize what a powerful language you're dealing with when it comes to Lisp, do you? Are you Paul Graham's theoretical "Blub programmer"? Have you actually tried out Lisp? Have you ever used a macro? It's unlike anything else out there.
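If you haven't, here is about the smallest possible taste (a toy sketch; CL already has UNLESS built in, this just shows the mechanism of writing code that writes code):
(defmacro my-unless (test &body body)
  `(if ,test nil (progn ,@body)))
;; (my-unless (= 1 2) (print "runs")) expands to (IF (= 1 2) NIL (PROGN (PRINT "runs")))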
>Lisp doesn't even give 10% of the benefits they claim.
That's mostly because other languages have taken so many of the innovations which made Lisp special and incorporated them. Lisp is still the only one to combine them into something off-the-scales powerful, though.
>instead of focusing on solutions that worked for decades outside of the C/UNIX/PC ghetto
What do you suggest to replace C? Pascal? Pascal probably would be better for making low-level code for OSes (Pascal microkernel when?) when considering where things have headed (high security software demands readable, hackable code a la BSD) and the decline in programmer quality (codemonkeys taught HTML, Java, and Python need simplicity shoved down their throats), but it's dead because no one uses it. Even if it is a good all-rounder of a reasonably low-level language, it has no momentum now and no calling card to gain any momentum.
Compare this to how there will always be a few Lisp hackers every generation because of its unique mix of a lack of syntax, s-expressions, homoiconicity, dynamic typing, macros, etc.
That's the problem with replacing C, whether you like C or not - the language does so much that any language which could entirely replace it would either never be noticed or be too stretched out (C's generalism plus a more powerful feature, all from scratch at a level of abstraction which no one wants to deal with because we don't need to milk a 256MiB RAM machine for performance). It's a legacy relic in critical but ugly niches.
The one instance in which something new might be built is that of programming languages for microcontrollers like Arduino and the like, a modern fork of Forth to address C's issues. There are also quantum and neuromorphic computers which need their own low-level languages (although Lisp can already be ported to the latter, and maybe even the former as well).
>Haskell is not a C replacement and anyone who thinks it is is a moron.
There's a Haskell compiler which converts it to C-- (C stripped down to portable assembly code). IDK if it's any good or not, I don't use Haskell.
▶ No.832241
>>832232
this anon gets it.
90% of programming advice on the internet takes the form of "you should eat that plant and get back to me on whether it's poisonous or not"
▶ No.832242>>832264 >>832268
>>832240
>Most programmers struggle with C and write buggy code because they're lazy and don't understand UNIX philosophy
But writing buggy code because of laziness is completely in the spirit of Unix. I'm not even joking.
▶ No.832264
▶ No.832265
>>830075
Think how fast tech was advancing in the 70s vs today. If you think that plays no role in the longevity of a language then you have water in your skull. Keep in mind that the entire computer industry was built on COBOL and C. The role of C has stayed the same because its design left little to be desired in its field (that'd be very low level stuff, not some userland buggy software). In 10 years the web will be the first application your mobile computer starts up, and it sure as hell won't be written in javascript.
▶ No.832268>>832692
>>832242
Maybe it was when compared to "The Right Thing", but not any more. Code was written for a group of users which included the programmers themselves. Lazy was something that kind of worked, did its one job, and didn't take up resources.
Now, you shit out as much code as possible (in whatever language your idiot boss tells you to use) to make a hackneyed bells-and-whistles solution that barely works because otherwise some $2-a-day guy in India will do it first, and people eat it up and throw it out anyways. By comparison, UNIX philosophy looks austere, out of touch, and impractical.
Different times, different lazy.
▶ No.832277>>832289
>>832239
When python programmers say they are doing numerical computing, they're using their language to call other libraries, nearly all of which are written in C/C++/Fortran/Asm and do the heavy lifting.
▶ No.832289>>832294
>>832277
Python('s most popular implementation) itself is also implemented in C. Numpy is written specifically for Python and integrates very well with it. What's your point?
▶ No.832294>>832295
>>832289
>What's your point?
That it isn't strange for Python to be used for numerical computing.
▶ No.832295>>832299
>>832294
Right, sorry. Your post pattern-matched to some weird arguments I've seen in the past.
▶ No.832299
>>832295
I think it's because people know Python is a slow language which seems antithetical for number crunching. Yet those same people wouldn't bat an eye if someone told them they were doing numerical computing in Matlab or Mathematica, which are both (very) slow languages. Of course in all three cases, you're calling highly optimized routines, and funny enough, it always seems to be GMP, BLAS and LAPACK at the bottom.
▶ No.832692
>>832268
The Right Thing is not complicated. It just means you throw away code and start from scratch every so often. That is the secret to clean, elegant solutions.
The web is UNIX philosophy. The whole incrementalist approach is favored by UNIX and C hackers. If it was made by some other group, they would have made a HTML 2 and a Script 2 based on everything they learned in the last 20 years, and completely broke backwards compatibility. These incremental things all suck because they "kind of work" instead of work, which is the UNIX philosophy (Worse is Better) at its finest.
▶ No.832698
>>832240
>It's certainly better for expressing math, seeing as a Scheme program is built entirely out of recursive expressions-to-be-evaluated.
Some posts said Lisp is a "variant of math" or "how math works" or "a more pure mathematical space". I would be more likely to believe that if it was a Haskell or Coq thread, but even those languages depend on the machine.
>Most programmers struggle with C and write buggy code because they're lazy and don't understand UNIX philosophy, beside that C is smushed in where it shouldn't be because of tech-illiterate managers who see that C is popular and decide to use it. That much is true.
The creators of C are lazy and UNIX philosophy is cancer. The real UNIX philosophy is "Make something that sucks and form a cult around it so you don't have to fix it."
>If you're getting segfaults and buffer overflows though, that's on you. It's a fine language if you learn it right. Pointers aren't that hard.
C is a bad language even for pointers, which are the only thing C is supposed to be good at. C continues to cause billions of dollars in damage but programming languages older than C solved these problems. Why are we using something 50 years later that is worse than what they used in the 60s on computers with 64 KB of memory? It makes no sense to me. Does that make any sense to you?
>You don't even realize what a powerful language you're dealing with when it comes to Lisp, do you? Are you Paul Graham's theoretical "Blub programmer"? Have you actually tried out Lisp? Have you ever used a macro? It's unlike anything else out there.
Since you're the one defending C, you sound like the "Blub programmer". Lisp increases productivity by orders of magnitude compared to C, but not compared to a good language.
>That's mostly because other languages have taken so many of the innovations which made Lisp special and incorporated them. Lisp is still the only one to combine them into something off-the-scales powerful, though.
Scripting languages have a lot of Lisp features anyway, so they add more Lisp features because they're the best fit. Other languages don't really copy Lisp unless some Lisp shill does it so they can say everyone is copying Lisp, like lambdas in Java and C++.
>What do you suggest to replace C? Pascal? Pascal probably would be better for making low-level code for OSes (Pascal microkernel when?) when considering where things have headed (high security software demands readable, hackable code a la BSD) and the decline in programmer quality (codemonkeys taught HTML, Java, and Python need simplicity shoved down their throats), but it's dead because no one uses it. Even if it is a good all-rounder of a reasonably low-level language, it has no momentum now and no calling card to gain any momentum.
Pascal is one choice. The universities were shilling that FP meme since the 70s and what really happened? Programming got worse, education got worse, software got less secure and more bloated. None of that is FP's fault, mostly because FP didn't catch on at all, but what it did do was "steal oxygen" and researchers from more viable languages. There were a lot of known problems in CS, but the FP shills were talking about getting rid of von Neumann machines and other things that never happened, so they decided not to solve them.
>Compare this to how there will always be a few Lisp hackers every generation because of its unique mix of a lack of syntax, s-expressions, homoiconicity, dynamic typing, macros, etc.
There are Pascal, BASIC, COBOL, APL, and MUMPS fans.
>That's the problem with replacing C, whether you like C or not - the language does so much that any language which could entirely replace it would either never be noticed or be too stretched out (C's generalism plus a more powerful feature, all from scratch at a level of abstraction which no one wants to deal with because we don't need to milk a 256MiB RAM machine for performance). It's a legacy relic in critical but ugly niches.
C doesn't do much at all, and what it does do, it does poorly. Just add pointers and pointer arithmetic, and your language does everything that's special about C and does it better.
▶ No.832706>>832720
>thread about a lisp
>ada fag still manages to make it about UNIX and C
lol
▶ No.832720>>833328
>>832706
106 replies in, and you're the first to mention the Ada language. Mentally ill?
▶ No.832725>>832748
>>832231
Well, CL has fractions built in, not requiring another library. Fractions are also handled much the same as standard integers and the like. According to the benchmarks game, CL is generally faster than Python at numerical calculations as well. Python has no function that can load data and code from any arbitrary file; with CL, saving and loading data is a breeze because you can easily read LISP data out of a file.
CL can be used very efficiently to process lists of numbers as well, LISP stands for LISt Processing for a reason you know.
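Something like this (a rough sketch; the file name is made up):
(defparameter *data* (list 1/3 2/7 (+ 1/3 2/7)))        ; exact rationals => (1/3 2/7 13/21)
(with-open-file (out "data.lisp" :direction :output :if-exists :supersede)
  (print *data* out))                                   ; written out as readable Lisp data
(with-open-file (in "data.lisp")
  (read in))                                            ; => (1/3 2/7 13/21)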
▶ No.832748>>832793 >>833407
>>832725
Python has fractions in the standard library. They implement an abstract base class, which ensures they're handled properly.
Python with numpy is really fast. It's the standard for numerical processing in Python when you care about performance.
Pickle can supposedly store and load arbitrary Python objects, but I haven't tried it. The usual data structures convert cleanly to and from json.
Numpy is vector/array-based, which is sort of like lists. Vanilla Python is famous for its list comprehensions.
Do you have experience with numpy?
▶ No.832793>>832803 >>832968
>>832748
>Python with numpy is really fast.
It's usually even slower than regular python, which is already impossibly slow. Time this with and without the float32 casts.
#!/usr/bin/python
from numpy import float32
d = float32(0.1)
n = float32(0)
for i in xrange(1000000):
    n += d
print n
▶ No.832803>>832829 >>832968
>>832793
$ for x in *.py ocaml; do cat $x; echo; echo -n "-- timing: "; ./$x; echo; echo; done
#!/usr/bin/python
from numpy import float32
import time
before = time.time()
d = float32(0.1)
n = float32(0)
for i in xrange(1000000):
    n += d
print time.time() - before
-- timing: 0.12046289444
#!/usr/bin/python
from numpy import float32
import time
before = time.time()
d = 0.1
n = 0
for i in xrange(1000000):
    n += d
print time.time() - before
-- timing: 0.0890231132507
#! /usr/bin/env ocamlscript
Ocaml.packs := ["unix"]
--
let before = Unix.gettimeofday ()
let n = ref 0.
let d = 0.1
let () =
for i = 0 to 1000000 do
n := !n +. d
done;
print_float (Unix.gettimeofday () -. before)
-- timing: 0.00366306304932
well that's not fair. the parts that are optimized are optimized you know. you can't just expect everything to not be shittier than usual.
▶ No.832829
>>832803
numpy in general is for accuracy and compatibility. Doing anything other than array operations with it will be quite slow, as it has to go through several additional layers of python.
▶ No.832895
The problem with Lisp is that it was just too damn slow and bloaty back in the day, and when hardware finally caught up more sensible high-level languages with easier to grasp syntax became available to use instead. There's no real reason to learn it now unless you wanna customize Emacs.
▶ No.832968>>832984 >>833036
>>832793
>>832803
Okay, now I know you've never seen numpy code before. Here's the proper way to do it:
#!/usr/bin/python
import numpy as np
import time
before = time.time()
ns = np.linspace(0, 100000, 1000000)
print time.time() - before
numpy is based around vectorized operations. If you're using loops you're doing it wrong.
Now, imagine we do something almost useful, instead of incrementing a number without using it for anything. Let's say we want to square the numbers, and calculate their sum. This is how you'd do it with numpy:
#!/usr/bin/python
import numpy as np
import time
before = time.time()
n = np.sum(np.linspace(0, 100000, 1000000)**2)
print time.time() - before
▶ No.832984>>832995
>>832968
>Here's the proper way to do it:
.. that's not even remotely the same thing, though.
>numpy is based around vectorized operations. If you're using loops you're doing it wrong.
.. I explained that about array operations. But it's also wrong to suggest loops are 'doing it wrong' as many algorithms can't be expressed in terms of big array operations.
▶ No.832995>>833004
>>832984
>.. that's not even remotely the same thing, though.
It doesn't make any sense to do the same thing, because the same thing is useless. It's something you'd only ever do as a step in the process of doing something useful. Numpy has a useful way to do whatever that thing is using linspace, probably.
The method using linspace calculates all the same numbers your loop calculates. It's the numpythonic way of getting a million evenly spaced numbers.
>.. I explained that about array operations. But it's also wrong to suggest loops are 'doing it wrong' as many algorithms can't be expressed in terms of big array operations.
Can you give an example of a calculation that can't be expressed in numpy that way, in that case?
Numpy uses plenty of loops to implement these operations, but if you're writing your own loops you're usually doing it wrong. If the algorithm requires loops then you can still leave those loops to numpy.
Numpy derives its performance from not spending a lot of time executing python code. If you loop through xrange(1000000) you execute a million python loops. If you use a numpy function to generate a million numbers then you only execute a single python function. The bottleneck is moved out of your own code and into numpy's own compiled code.
▶ No.833004>>833022
>>832995
>It doesn't make any sense to do the same thing,
Creating a sequence instead of printing a value and having that sequence sum to 50000000000 rather than 100000 and calling it "the proper way" made more sense?
>because the same thing is useless
That code is similar to what I used to graph percentage error in accumulating frame time from Unity. It's not useless. A more complex version of that was used to show some devs why things were going haywire after a few days played on 144hz monitors. I needed numpy for the float32 support.
>Can you give an example of a calculation that can't be expressed in numpy that way, in that case?
Seems I already did, but in general, data dependencies limit array/vector operations. Something like AES might be a good example for you as while the math can be expressed in terms of arrays, those arrays are intentionally limited for various reasons and are so small that you'd likely be better off just writing a big loop than dealing with numpy's overhead.
▶ No.833022
>>833004
>Creating a sequence instead of printing a value and having that sequence sum to 50000000000 rather than 100000 and calling it "the proper way" made more sense?
For almost all numerical computing where I would be tempted to use that loop, yes. It looks like a common pattern for doing calculations in python. For your purpose, no, but I didn't know what your purpose was.
>That code is similar to what I used to graph percentage error in accumulating frame time from Unity. It's not useless. A more complex version of that was used to show some devs why things were going haywire after a few days played on 144hz monitors. I needed numpy for the float32 support.
That makes sense. But it's not a good way to benchmark numpy, because you're not even making numpy do the heavy lifting.
I can't find a fast way to do it in numpy that keeps all the imprecision you're looking for. But if you're not deliberately trying to make your answer inaccurate, this is a good equivalent:
>>> np.sum(np.full(1000000, 0.1, dtype=np.float32))
100000.09
It's about a hundred times as fast as your code. It doesn't do the same thing, because the answer it gets is much closer to the real answer, but that's the opposite of a problem in most computations.
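For reference, the slow accumulating version presumably looks something like this (an assumed reconstruction, since the original snippet isn't quoted here); sequential float32 rounding is what produces the drift:
import numpy as np

total = np.float32(0.0)
for _ in range(1000000):
    total += np.float32(0.1)     # every addition rounds to float32
print(total)                      # lands well above 100000 (roughly 100958 with IEEE-754 single precision)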
>Seems I already did, but in general, data dependencies limit array/vector operations. Something like AES might be a good example for you as while the math can be expressed in terms of arrays, those arrays are intentionally limited for various reasons and are so small that you'd likely be better off just writing a big loop than dealing with numpy's overhead.
AES does look like a good counterexample.
But many other things with data dependencies can still be made a lot faster using numpy as long as you can express those data dependencies without using python loops. Numpy uses loops all over the place internally, but because they're not implemented in Python they're a lot faster.
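A running total is a simple example of that: each element depends on the previous one, but numpy ships a compiled version of exactly that loop, so you never have to write it in Python:
import numpy as np

xs = np.linspace(0, 100000, 1000000)

# Python-loop version: a million interpreted iterations.
acc = 0.0
running = []
for x in xs:
    acc += x
    running.append(acc)

# numpy version: same data dependency, one call into compiled code.
running_np = np.cumsum(xs)
print(np.allclose(running, running_np))   # True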
▶ No.833036>>833038
>>832968
The point, I thought, was that numpy slows down normal-ass numpy code. If the numpy-ass numpy code weren't much faster, people wouldn't be using it. I think people who use Python have bad taste and poor morals; I don't think they have an alternate experience of the passage of time.
▶ No.833038
>>833036
>normal-ass numpy code
Nobody uses numpy that way.
▶ No.833113>>833120
I've been searching for a way to write Common Lisp in nvim, but I have not come across a single good and working plugin. Does anyone know if there is a good way to write clisp in nvim?
▶ No.833120>>833208
>>833113
What do you mean by a "good way"? Rainbow parentheses? Auto-indenting? Auto-matching parentheses? A REPL plugin? You have to be more specific.
▶ No.833208>>833307
>>833120
I have no idea what im doing anon.
▶ No.833307
>>833208
Here is what I have installed. Keep in mind that I know how to use Common Lisp, but I have not yet written anything big in it.
< Rainbow Parentheses
I use https://github.com/luochen1990/rainbow mostly because it is not limited to parentheses, it also works with stuff like HTML tags.
< Auto-matching parentheses
https://github.com/Raimondi/delimitMate Not just for Lisp languages; it's handy for pretty much anything, and it isn't limited to parentheses.
< Auto-indenting
No idea, it just works out of the box
< REPL
There are a couple, I use https://github.com/HiPhish/repl.nvim
< Something like Slime
There is SlimV https://github.com/kovisoft/slimv, but I haven't tried it yet. The main downside is that it's a Vim plugin, so it's subject to Vim's limitations. I wanted to try my hand at writing a proper Neovim plugin, but when I looked at the source of Swank (the server part of Slime) I threw up my hands and just gave up. I don't have the autism superpowers to dig through that mess.
▶ No.833328>>833332
>>832720
Ada did nothing wrong.
▶ No.833332>>833340 >>833678
>>833328
80% of the pre-Rust mentions of Ada I read describe it as a giant inelegant overdesigned clusterfuck. Sometimes it's compared to PL/I.
I haven't used it, but it makes me sceptical.
▶ No.833340
>>833332
>taking "$foo did nothing wrong" seriously
Of course it's bloated; it was designed for military use and the high-assurance meme. I still want to learn it so I can show off, but it's not a 'brainlet repellent' like glorious Lisp, nor does it have zero-cost abstractions.
▶ No.833407>>833409 >>833422 >>833476
>>832748
Try 999/201*201 in Python and compare with (* 201 (/ 999 201))
I'll wait.
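For anyone who doesn't want to wait: in Common Lisp (/ 999 201) is the exact rational 333/67, so (* 201 (/ 999 201)) is exactly 999. A quick sketch of the Python side (Python 3 assumed; Python 2's integer division would give 804 instead):
print(999 / 201 * 201)         # 999.0000000000001 on a typical IEEE-754 double
print(999 / 201 * 201 == 999)  # False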
▶ No.833409
>>833407
So that's why lisp is so insanely slow then.
▶ No.833422>>833461
>>833407
Don't get me wrong, I love Scheme, but I'm pretty sure numpy would offer that. And any language can get that functionality with https://gmplib.org.
▶ No.833461
>>833422
any language can get that functionality with floating point:
utop # 999. /. 201. *. 201.;;
- : float = 999.00000000000011
if given machine integers you can take them and compose bignums, rationals, a CAS if you want. Scheme just takes away machine integers, giving you instead tools that are 'better' but which are no longer a solid platform for anything you might make.
but python still sucks
▶ No.833476>>833478 >>836106
>>833407
>>> from fractions import Fraction
>>> Fraction(999) / Fraction(201) * Fraction(201)
Fraction(999, 1)
A predecessor of Python, also developed at the CWI, used fractions by default. It turned out to be a mistake: most calculations didn't need that precision, but it made them significantly slower, and in some cases programs got endlessly slower as the numerators and denominators kept piling up. It's not something you should have by default. So Python puts it in the standard library, but makes it optional.
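A minimal illustration of why (a hypothetical recurrence, just to show the mechanism): with exact rationals the denominators keep growing, so every step costs more than the one before it.
from fractions import Fraction

# Iterate x -> x*x + 1/7 with exact fractions: the denominator's digit count
# roughly doubles each step, so the arithmetic keeps getting slower.
x = Fraction(1, 3)
for i in range(15):
    x = x * x + Fraction(1, 7)
    print(i, len(str(x.denominator)), "digits in the denominator")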
▶ No.833478
>>833476
This works too, by the way.
>>> Fraction(999) / 201 * 201
Fraction(999, 1)
>>> sum([5, Fraction(999), 201, 100])
Fraction(1305, 1)
You don't need to make every single number you use a Fraction manually. It propagates.
▶ No.833678
>>833332
>mentions of Ada I read describe it as a giant inelegant overdesigned clusterfuck.
What languages do they want you to use instead?
>Sometimes it's compared to PL/I.
That's because there are actual similarities to PL/I. They both have fixed-point numbers, tasking, nested subprograms, exception handling, decimal numbers, region-based memory management, dynamic arrays in records, variables declared at specific addresses, etc. C does not have any of this (Blub paradox).
>I haven't used it, but it makes me sceptical.
If someone said that about Lisp, you would call them a Blub programmer.
▶ No.833735
Honestly, give me anything with strong and/or static typing and no garbage collection. I was almost interested in Haskell until I saw the GC.
▶ No.834737>>834749
What distinguishes scheme from common lisp?
Have a webm in exchange for your knowledge.
▶ No.834749
>>834737
It's a different language with similar syntax.
Common Lisp is large. Scheme is small. Scheme's standard is shorter than the table of contents of Common Lisp's standard.
Scheme used to be popular for teaching. It's the language used in well-known meme book SICP.
▶ No.835790>>836112
>>832161
>or fixing problems you can't solve abstractly. (e.g. make secure crypto, etc.)
can you even read?
▶ No.836106>>836113 >>836192
>>833476
You're still missing the point. Fractions are automatic in CL but must be imported in Python.
▶ No.836112
>>835790
All problems are better solved thinking like a machine, 'tard. That you can abstract 1+1 in the most dirty, inefficient way possible and have it limp across a special olympics finish line does not make it good, and if you compete for real you will be destroyed.
▶ No.836113
>>836106
Defaulting to fractions is awful. It's why no one uses CL.
▶ No.836192
>>836106
Python's behavior is better. Most of the time you shouldn't use them. Making them available is fine, defaulting to them is stupid.
van Rossum knew this. He worked on another programming language that did default to fractions, and lo and behold, it sucked.
>Numbers are one of the places where I strayed most from ABC. ABC had two types of numbers at run time; exact numbers which were represented as arbitrary precision rational numbers and approximate numbers which were represented as binary floating point with extended exponent range. The rational numbers didn’t pan out in my view.