▶ No.924106>>924118 >>924143 >>924180 >>924365 >>924499 >>937727
https://www.hooktube.com/watch?v=oKg1hTOQXoY
https://archive.is/kzpa
(This is all the dialogue from the video, for those who will not watch/scrape it.)
Instead of an *actual* revolution, we have the following:
Java, C++, JavaScript, Web engineering [sic], Windows, modern Unix derivatives (including Linux, BSDs, OS X, iOS, Minix, Android, Plan 9), x86, ARM, and the whole "Cloud" business. (Also, decades before all this happened, C and Unix essentially extirpated more interesting languages (Pascal, Forth, Fortran, Lisp) and operating systems (ITS, Multics, VMS).)
Basically, post-Smalltalk and post-Symbolics computing is snake oil. We're in the dark ages, utilising repainted hardware and software from the 70s. The only thing that prevents it from collapsing is Moore's law, which is now failing.
All the things named were pushed as a "next big thing" while the _real_ important technologies have faded away into obscurity. x86 should have died at the hands of Symbolics Ivory or DEC Alpha, any Unix or Windows systems should have been replaced by some offspring of Genera and Smalltalk-80, C++/Java are bloated and inferior to contemporaries like Common Lisp, et cetera, et cetera. The modern computer industry is a joke, it's unable to discern buzzword-powered kludges from the real thing.
I've also provided the essay by Alan Kay in pdf format.
▶ No.924118>>924124 >>924141
>>924106 (OP)
(Disclaimer, I haven't read the PDF.)
>Also, decades before all this happened, C and Unix essentially extirpated more interesting languages (Pascal, Forth, Fortran, Lisp) and Operating Systems (ITS, Multics, VMS.)
You know what that's called? Natural selection.
>inb4 there's nothing natural about it because this isn't nature so let me focus on your terminology rather than idea
Get over it. The ugly brute that is C was better, practically speaking, than the rest. Nobody's stopping you from using Pascal if you want to. Get a compiler and IDE, grab a tutorial, register on a newbie forum and start coding like hell.
>inb4 but nobody else is doing it so I feel alone
Tough shit, you being a genius and all. The tyranny of the mediocre majority, stepping on each other's toes rather than standing on each other's shoulders, etcetera.
>The modern computer industry is a joke, it's unable to discern buzzword-powered kludges from the real thing.
Are you going to include OOP with its design patterns in that? I would. But back on track:
>The Real Computer Revolution Hasn’t Happened Yet
Yes it did. Every animal has a handheld computer with internet connection, high res video camera, touchscreen, functioning as a wireless telephone sometimes. That's the real computer revolution, and it happened. Giving magic technology to the plebs and them being able to use it even without understanding it.
What is it that you wanted, exactly, in the "real" revolution? Tech that was so arcane and difficult to use that only specialized CS doctors were allowed to touch it?
▶ No.924124>>924141 >>924180 >>924418
>>924118
>Get a compiler and IDE, grab a tutorial, register on a newbie forum and start coding like hell.
Why do you assume I can't program? I do program in FreePascal and SBCL (Common Lisp) for (believe it or not) end-user (personal, to be fair) programs. I'm planning on learning Ada soon (that is another language that wasn't allowed to shine because of C/C++). Anyway, that is beside the point.
>Are you going to include OOP with its design patterns in that
Alan Kay does not consider Java/C++ to be OOP. His definition of OO is Lisp and Smalltalk.
>What is it that you wanted, exactly, in the "real" revolution? Tech that was so arcane and difficult to use that only specialized CS doctors were allowed to touch it?
Alan Kay spearheaded the idea of personal computing* and Lisp Machines had far more documentation than Unix (and every other OS today, really).
>Giving magic technology to the plebs and them being able to use it even without understanding it.
This could have been done with the elegance and dignity of computers retained.
*http://history-computer.com/ModernComputer/Personal/Dynabook.html
▶ No.924141>>924187 >>924251
t-this entire board is gonna be nothing but lisp meanies, isn't it?
>>924118
>You know what that's called? Natural selection.
This. Where's your modern Lisp OS now? Don't say it's too hard to do. There are tons of super-niche software projects like Genode and 9Front and Haiku, and even super-niche hardware projects like EOMA68 and TALOS II.
>Are you going to include OOP with its design patterns in that? I would. But back on track:
a lot of silly anons think OOP is stupid and for pajeets ^.^
I dont think so, but I dont think its really necessary for a lot of stuff.
maybe making the OOP stuff optional would be a better idea. Languages like Java and C# require you to do all the OOP class and object stuff even if you're not really using it.
>What is it that you wanted, exactly, in the "real" revolution?
apparently for C-like languages to not exist anymore because they hurt his feels UwU
My "real" revolution would be a Free Software revolution, but that's just me.
>>924124
>Alan Kay does not consider Java/C++ to be OOP. His definition of OO is Lisp and Smalltalk.
It feels like you meanies' definition of everything is Lisp and whatever the other dead OSes used.
>Lisp Machines had far more documentation than Unix (and every other OS today, really).
*giggles*
>This could have been done with the elegance and dignity of computers retained.
once again, you lisp meanies' definition of everything good seems to be "not C or C-like."
▶ No.924143>>924146
>>924106 (OP)
>Chosen Lisp spam
▶ No.924146>>937826
>>924143
Do you understand the difference between a family of programming languages and one implementation of one dialect of a programming language?
Lisp is jewish, like AWK, but that screenshot is the least coherent way to try to prove it.
▶ No.924147>>924148 >>924378 >>924381
The real redpill is database filesystems. UNIX spread the idiotic idea of hierarchical filesystems, while database & metadata-based filesystems are objectively superior.
▶ No.924148>>924150
>>924147
That's too far back. We need to go back to the paradigm of early Unix, when filesystems were simply directed graphs, no tree structure imposed.
▶ No.924150>>924151 >>924153 >>924187
>>924148
Yeah, no. We need tag-based filesystems only. No filenames, no file extensions. No one needs that shit.
▶ No.924151
>>924150
All persistent data should be stored in a single flat file, directly mapped to disk storage. The rest of the filesystem is kept around only as an abstraction layer for operating system facilities.
▶ No.924153>>924162
>>924150
That'd be so shit. I'd need to explicitly tag all the shit I create. It would be easy to do bad shit like forget to keep the metadata accurate as the data changes. How would the computer even explicitly reference one particular file?
▶ No.924162>>927963
>>924153
Per-file IDs? Also, the OS itself would probably add per-file tags, but I didn't think of that. Like if a file contains an image, it gets an image tag, etc.
Here's an actual working concept of what I'm describing http://www.eecs.harvard.edu/~margo/papers/hotos09/paper.pdf
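To make the idea concrete, here is a toy in-memory sketch in C of a tag -> file-ID index. Every name in it is invented for illustration; a real system like the one in that paper would persist the index and have the OS add type tags automatically.

    #include <stdio.h>
    #include <string.h>

    /* Toy tag index: files are known only by numeric IDs, and lookups go
       through tags instead of path names.  Purely illustrative. */
    #define MAX_ENTRIES 64

    struct entry {
        unsigned long file_id;   /* per-file ID instead of a path */
        const char   *tag;
    };

    static struct entry idx[MAX_ENTRIES];
    static int n_entries;

    static void tag_file(unsigned long id, const char *tag)
    {
        if (n_entries < MAX_ENTRIES)
            idx[n_entries++] = (struct entry){ id, tag };
    }

    static void find_by_tag(const char *tag)
    {
        for (int i = 0; i < n_entries; i++)
            if (strcmp(idx[i].tag, tag) == 0)
                printf("tag \"%s\" -> file %lu\n", tag, idx[i].file_id);
    }

    int main(void)
    {
        tag_file(1001, "image");     /* OS-added type tag */
        tag_file(1001, "vacation");  /* user-added tag */
        tag_file(1002, "image");
        find_by_tag("image");
        return 0;
    }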
▶ No.924180>>924356 >>926795
>>924106 (OP)
>(Also, decades before all this happened, C and Unix essentially extirpated more interesting languages (Pascal, Forth, Fortran, Lisp) and Operating Systems (ITS, Multics, VMS.)
UNIX leaders say those older systems are supposed to be bad and obsolete. UNIX weenies think Multics is primitive and bloated.
It's the same with programming languages. "Why Pascal is not my favorite programming language" is C marketing. He says "Comparing C and Pascal is rather like comparing a Learjet to a Piper Cub" which is like saying Learjets are unreliable, have a lot of design defects, and cause billions of dollars in damage. Comparing Pascal to C is how I know it's marketing. "The size of an array is part of its type" is true in Ada, PL/I, and Fortran too, but the bounds of parameters are handled better. The C "solution" is array decay, which sucks, so C can't do bounds checking or multidimensional arrays properly. He talks about #include as a good thing, but new languages in 1981 already had modules. Pascal now has modules too and C is stuck with #include in 2018 and it can't be fixed. If it wasn't marketing for C, it would have included a few more languages to show how they solved those problems, but that would make C look worse than Pascal.
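To make the array-decay point concrete, here's a minimal C sketch. It's not taken from the Kernighan paper, it just shows the behaviour being complained about: the declared bound on the parameter is silently discarded, so the callee only ever sees a pointer.

    #include <stdio.h>

    /* An array passed to a function "decays" to a pointer to its first
       element: the callee sees only the address, never the bounds. */
    void print_size(int a[10])          /* the "[10]" is ignored */
    {
        printf("inside callee: sizeof a = %zu\n", sizeof a);  /* size of a pointer */
    }

    int main(void)
    {
        int a[10];
        printf("in caller:     sizeof a = %zu\n", sizeof a);  /* 10 * sizeof(int) */
        print_size(a);
        return 0;
    }

Ada, PL/I, Pascal and Fortran pass the bounds along with the array, which is why they can check them.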
>The modern computer industry is a joke, it's unable to discern buzzword-powered kludges from the real thing.
What really sucks is that the buzzword-powered kludges resemble real solutions made by other communities, like dynamic linking, OOP, high level systems languages, and so on, but they don't have the same benefits as the real thing.
>>924124
>I'm planning on learning Ada soon (that is another language that wasn't allowed to shine because of C/C++).
Why am I retraining myself in Ada? Because since 1979 I
have been trying to write reliable code in C. (Definition:
reliable code never gives wrong answers without an explicit
apology.) Trying and failing. I have been frustrated to
the screaming point by trying to write code that could
survive (some) run-time errors in other people's code linked
with it. I'd look wistfully at BSD's three-argument signal
handlers, which at least offered the possibility of providing
hardware-specific recovery code in #ifdefs, but grit my
teeth and struggle on having to write code that would work
in System V as well.
There are times when I feel that clocks are running faster
but the calendar is running backwards. My first serious
programming was done in Burroughs B6700 Extended Algol. I
got used to the idea that if the hardware can't give you the
right answer, it complains, and your ON OVERFLOW statement
has a chance to do something else. That saved my bacon more
than once.
When I met C, it was obviously pathetic compared with the
_real_ languages I'd used, but heck, it ran on a 16-bit
machine, and it was better than 'as'. When the VAX came
out, I was very pleased: "the interrupt on integer overflow
bit is _just_ what I want". Then I was very disappointed:
"the wretched C system _has_ a signal for integer overflow
but makes sure it never happens even when it ought to".
It would be a good thing if hardware designers would
remember that the ANSI C standard provides _two_ forms of
"integer" arithmetic: 'unsigned' arithmetic which must wrap
around, and 'signed' arithmetic which MAY TRAP (or wrap, or
make demons fly out of your nose). "Portable C
programmers", know that they CANNOT rely on integer
arithmetic _not_ trapping, and they know (if they have done
their homework) that there are commercially significant
machines where C integer overflow _is_ trapped, so they
would rather the Alpha trapped so that they could use the
Alpha as a porting base.
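What that quote is describing, in minimal C terms (a sketch of standard behaviour, not something from the quoted post):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned int u = UINT_MAX;
        u = u + 1;                 /* well defined: unsigned arithmetic must wrap to 0 */
        printf("unsigned wrap: %u\n", u);

        int i = INT_MAX;
        /* i = i + 1;   <- undefined behaviour: the standard allows a trap, a wrap,
           or anything else -- exactly the latitude the poster wants hardware
           designers (and compiler writers) to remember. */
        printf("INT_MAX: %d\n", i);
        return 0;
    }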
▶ No.924186
It was better when everyone had to use 8.3 filenames on a floppy disk without subdirectories. All "improvements" eventually just get leveraged to a greater extent by big brother and big data.
▶ No.924187>>924251 >>924362 >>937712 >>937955
>>924141
>This. Where's your modern Lisp OS now? Don't say it's too hard to do.
Lisp machines died because they needed dedicated hardware support and Intel microprocessors killed all of that.
>>924150
this
▶ No.924229
Computing for its own sake has been tainted by businessfags and normalfags. They vastly outnumber anyone actually interested in computing and are only interested in "getting the job done" whatever that means.
▶ No.924251>>924362
>>924141
>Where's your modern Lisp OS now?
>Don't say it's too hard to do.
What? Do you want us to make a FPGA implementation of a LISPM and develop an operating system for it as a one man team? That's a big undertaking.
>Languages like Java and C# require you to do all the OOP class and object stuff even if you're not really using it.
They don't though. Both of these languages have char, int, long, etc. This is one reason why those aren't object oriented languages.
>>924187
I was doing some LISPM research through my university's research database and came across this take on it.
Then came the summer of 1986. A lot of users were concerned with the high price
of LISP machines and started running applications on more economical delivery
vehicles, like IBM Personal Computer ATs and Sun workstations. Sure, there was
a performance differential, but the trade-off was worth it. Why buy a Symbolics
3600 when you could get three Sun computers for the same price? The AI
consulting firm DM Data estimated that by the end of 1986, there were more than
6,000 LISP machines installed worldwide -- more than a third of them from
Symbolics. However, DM Data also estimated that there were probably fewer than
6,000 LISP programmers qualified to take full advantage of the machine. That
means there are LISP machines sitting in companies with nothing to do.
▶ No.924299>>924334
The real computer revolution has come and gone and it has left most of humanity behind, OP.
Tech illiterates are just getting the hand-me-downs from technologists.
It hasn't helped man achieve that much in regards to mental work. It has instead enabled people to amuse themselves with idle entertainment.
▶ No.924317>>924334 >>924582
LISP machines died because they served no purpose other than to fuel MIT's terrible computer science department.
>the early beginnings of the microcomputer revolution, which would sweep away the minicomputer and workstation makers, cheaper desktop PCs soon could run Lisp programs even faster than Lisp machines, with no use of special purpose hardware.
If LISP ran faster on UNIX- and C-based (even Windows) computers than on LISP machines, why is it still so constipated in practical use? LISP is nothing more than a fanciful research language that has no real practical purpose.
And if LISP runs faster on those machines anyway, then why even spend the time implementing any kind of LISP hardware when it's unnecessary bloat?
▶ No.924334>>924335
>>924299
Entertainment will be the death of mankind.
>>924317
Lisp machines are from the 60s, you can't compare them to modern day computers. What if we make a LISP that actually runs fast though?
▶ No.924335>>924337
>>924334
>LISP machines from the 60's
You mean those LISP machines that Symbolics were producing in the 80's? Wow.
▶ No.924337>>924352
>>924335
Damn. I didn't know they made them up to that date. What kind of technology were they based on?
▶ No.924352
>>924337
See for yourself.
▶ No.924356>>924370 >>924461 >>924464 >>924582
>>924180
Hey (((LISPnigger))), if C is so bad and (((LISP))) is so great, why does every machine in existance use an operating system programmed mostly in C, and why is (((LISP))) seldom used in development?
If it costs millions in dollars than why do companies still use C and are completely fine? They wouldn't use C if it truly costed them money, and they would use (((LISP))) if it had any positives. Natural selection. Also why did Multics machines had such bad latency compared to a RISC machine like the SGI Indy?
Hard mode:
no obscure and old bbs newsgroup opinion that magically prove X
▶ No.924362>>924364 >>924367
>>924187
>>924251
>Lisp machines died because they needed dedicated hardware support
>What? Do you want us to make a FPGA implementation of a LISPM and develop an operating system for it as a one man team? That's a big undertaking.
ooh so it's because lispy machines need special snowflake hardware and can't work on normal processors.
And don't say that "those processors are made for UNIX! [insert Sun workstation 'General Purpose' blockquote]". Microsoft clearly has not had any issue creating their non-UNIX operating systems on there. They even had support for quite the variety of stuff too, with their now-dead CE version running on MIPS, SuperH, and PowerPC. What is it about Lisp OSes that makes them not run on 99% of CPUs?
Also, I wasn't implying you do it all by yourself, that would be crazy. I was just showing that it wouldn't necessarily take a company the size of Intel or AMD to do this stuff.
>Both of these languages have char, int, long, etc. This is one reason why those aren't object oriented languages.
so wait, to be a """real""" OOP language it can't have data types? Keep in mind that data types in C# are actually objects and have methods such as TryParse and whatnot.
▶ No.924364
▶ No.924365
▶ No.924367
>>924362
>need special snowflake hardware
FPGAs are pretty much the step before ASICs. (The CPU you are using right now is 99.99% an ASIC.) Not only are ASICs extremely expensive to manufacture, there is not yet a free software toolchain to produce one, unlike the current situation with FPGAs.
A lisp machine refers to both the operating system AND the hardware it is running on.
<J-just use other people's processors
This is the same thinking that keeps us stuck with UNIX. We have to keep working around and fixing all this legacy software. There comes a point where it turns out that the experiment was a failure and you need to scrap it.
>it can't have data types
An OOP language pretty much must have data types. Every variable is an object, and its type is what kind of object it is.
▶ No.924370>>924371 >>924375 >>924384
>>924356
>why does every machine in existance use an operating system programmed mostly in C
Because it's easy and every drooling programmer today knows how to program in C, despite Ada being much better for device drivers/critical systems, and Pascal/Lisp being better for end-user programs.
>They wouldn't use C if it truly costed them money
What does this even mean? I think it's idiotic that people use C/C++ for embedded systems/rockets/drivers. They used C++ for the F-35 because of the availability of C++ programmers. But, "Worse is Better" what the industry has stagnated on.
▶ No.924371
>>924370
>But, "Worse is Better" what the industry has stagnated on.
But, "Worse is Better" is what the industry has stagnated on.
▶ No.924375
>>924370
You know how I know you don't program?
▶ No.924378>>928586 >>928629 >>928716 >>937944 >>937955
>Genera
>Smalltalk-80
>LISP
>Java
Anything dependent on emulation in all its euphemism treadmill incarnations (interpreted runtime, bytecode, garbage collection, etc.) isn't a real programming language. Ultimate redpill is ALGOL/Modula/Oberon.
>>924147
Reminder M$ tried and failed at exactly this with Longhorn, while Apple's resource forks and Be's BeOS both dipped their toes in the idea.
▶ No.924381
>>924147
besides your opinion, what objective facts do you have to back what you said up?
▶ No.924384>>924385 >>924396
>>924370
I do it as a hobby; yes, I am not employed as a programmer. Why does that matter? C/C++ *is* the wrong choice for critical systems - that should be obvious. They're replacing Ada/SPARK with C/C++ for military systems too, that should be great.
Also, this:
http://www.adapower.com/index.php?Command=Class&ClassID=FAQ&CID=328
I shudder to think what would've happened had it been programmed in C.
▶ No.924385>>924387
>>924384
We already know you don't program.
▶ No.924387>>924391
>>924385
I still don't understand what gives you that impression. Why is it that if you hate C/Unix you're considered a LARPer here?
▶ No.924391>>924392
>>924387
You don't show any proficiency knowing what kinds of problems programmers face in actual practice. You roll around with these ideals that have no practical use, and when applied in real-world scenarios, they are proved time and again to be less powerful, less efficient, and costly experiments that have not aided the programmer, the user, or the company.
▶ No.924392>>924394
>>924391
Yeah, no. Ada/SPARK is far, far superior to C/C++ for anything critical. The fact that they're being phased out has everything to do with the lack of Ada/SPARK programmers - and nothing to do with C/C++ being the pragmatic solution.
▶ No.924394>>924396 >>924397
>>924392
Provide proof. Otherwise your opinion is shit.
▶ No.924396
>>924394
Prepare for blockquotes! OwO
>>924384
>They're replacing Ada/SPARK with C/C++ for military systems too, that should be great.
you mean this?
https://www.sbir.gov/sbirsearch/detail/826011
https://sel4.systems/Info/Docs/GD-NICTA-whitepaper.pdf
>seL4-Enabled Safe & Secure Soldier Helmet Display
>a complete high assurance system of significant complexity, ultimately a military UAV.
I think it will go really well!!
▶ No.924397>>924401 >>924571
▶ No.924401>>924402 >>924582
>>924397
>ada
>ada
>wiki
>ada
>ada
>wiki
Not exactly the best sources.
>Something about F-35
Did you write the programs for them?
Nothing you've provided does away with the requirement of a properly trained and hard-working programmer. It's all bytes in the end, and whatever high-level language you use doesn't matter in the long run if you don't know how the low level works. In other words, shut the fuck up.
▶ No.924402>>924403 >>924406
>>924401
>complains about sources
You're captious and braindead.
>Something about the F-35
Yes, something incredibly apropos to what you were asking that supported my point - bug-prone languages like C/C++ should never be used in critical/military situations. You know how I know you don't program? You think that inherently bug-prone languages are better than (relatively) bug-free languages; because of "pragmatism" and "It's all bytes in the end" (sic) - stop drooling on your keyboard and go back to working your prole job.
Also, here's an article that directly compares C and Ada: https://archive.is/8F3cB
You'll find a petty, cavillous way to write it off as babble though - I just know it.
▶ No.924403>>924419
>>924402
Sources are everything. If you can't provide a decent, unbiased source, then your argument is opinion, and I don't care to argue about your opinion.
▶ No.924406>>924417
>>924402
>ou think that inherently bug-prone languages are better than (relatively) bug-free languages; because of "pragmatism" and "It's all bytes in the end"
Here (((you))) go again. Shut the fuck up. If you don't know the low-level then your are uneducated for any high-level application of that hardware. Shut the fuck up.
▶ No.924408
Linux isn't a UNIX derivative. It was merely inspired by MINIX, and it is currently the best kernel.
▶ No.924417>>924419 >>924421
>>924406
>hurr durr they are biased *dribbles onto shirt* *shifts corpulent mass in chair* C da best!
I just *knew* you'd pull this card. You didn't read them at all. I have no idea how you're some kind of savant "real" "pragmatic" programmer while never having heard of the safety features built into Ada/SPARK.
Another source: https://archive.is/tyHlI
You also need to read up on what "Steelman language requirements" are. Air traffic control systems and the fly-by-wire controls on the Boeing are all written in Ada for a reason - programming languages are tools, and some are better for the job than others.
>>924406
>jewish echo
>your are uneducated
>Shut the fuck up.
>Shut the fuck up.
You aren't even worth my time, mouth breather.
▶ No.924418
>>924124
> I'm planning on learning Ada soon
Good choice. Get the Barnes book, it's a pretty comfy read.
▶ No.924419
>>924417
The first part of my post was meant for >>924403
▶ No.924421>>924424
>>924417
All you can rely upon is ad hominem. Shut the fuck up.
▶ No.924424>>924426 >>924477
>>924421
All you can rely on are your Jewish parenthesis. Read about the Steelman language requirements and then realize that using C for safety-critical software is idiotic.
https://www.dwheeler.com/steelman/steeltab.htm
https://www.dwheeler.com/steelman/steelman.htm
▶ No.924426
>>924424
Coming from (((you))), no thanks.
▶ No.924461
>>924356
Enjoy the sound of silence friend.
▶ No.924464
>>924356
Your entire post is argumentum ad populum. Why do people praise Resnais and Weerasethakul if hacks like Kubrick and Lynch are what normalfags like?
▶ No.924478>>924482 >>924484
>>924477
I don't care what you think about the Jews (they are our humble, self-effacing benefactors :^)), but is the way he was spamming the jewish echo/parentheses not obnoxious? Why is it that (in recent years) you can't have discourse without meme verbiage?
▶ No.924482
>>924478
>Why is it that (in recent years) you can't have discourse without meme verbiage?
Because it's becoming impossible to have discourse in general. The memes are just part of the internet's scenery.
▶ No.924484>>924522
>>924478
> he was
Did (((you))) just appropriate my gender?
▶ No.924485>>924501 >>924505
Is there an Ada dialect that doesn't look like pascal?
▶ No.924499>>924527 >>926975
▶ No.924501>>924505
>>924485
No. You get used to it after a while though, and in my opinion, it makes the code very easy to read back later.
▶ No.924505>>924532
>>924485
>>924501
I'm also a fan of the syntax - the language being case-insensitive is another blessing.
▶ No.924507>>924513
Here, a file system that works
>>>/hydrus/
▶ No.924513
>>924507
I like Hydrus in principle, but its developer is almost as incompetent as the Calibre developer, and his unwarranted success doesn't change the fact that Hydrus is a bloated, unstable pile of shit.
▶ No.924515
The reason all the better computer solutions lost is Moore's Law. Symbolics could not keep up with Intel and Moore's Law; it's not that their machines were inferior. We are in the perfect era to try something like a modern Lisp machine again.
▶ No.924520>>924582
6 months ago, every thread was turned into a RUST thread, a language which is now seldom mentioned. Now it's LISP machines being shilled relentlessly. I'm starting to wonder if this is the same anon, someone who goes from idea to idea and feels the need to evangelize his chosen path.
▶ No.924522
>>924484
Get out halfnigger.
▶ No.924527>>926970 >>926975
>>924499
Honestly, came here to say this.
Have fun decrementing nock, guys.
▶ No.924532>>924535
>>924505
>the language being case-insensitive is another blessing.
How's unicode support?
▶ No.924535>>924536 >>925826
>>924532
Unicode was a mistake. Supporting hieroglyph moonrunes is completely retarded.
▶ No.924536>>925826
>>924535
Into the trash it goes, then.
Of course, why would you elaborate on your statement to share your knowledge?
▶ No.924539>>924541 >>924582
So what stops you people from writing your own Lisp-based user space on top of Linux?
All you need to do is realize that Linux has a language-agnostic system call interface where you just need to put the right arguments in the right register and issue one instruction. This isn't heavy wizardry, kids. You can easily create a JIT compiler that generates exactly those instructions. Your lisp could easily have (CL probably does have) an address data type with peek/poke functions.
This would allow you to interact with the kernel as a first class citizen and with zero dependencies. You can run your Lisp directly on top of Linux with exactly 0 lines of C code being executed in user space. You can scrap all that Unixy GNU garbage and write your entire user space on top of this language. It wouldn't be POSIX compliant but really who gives a fuck?
Write the init system in Lisp. Write the service manager, and the services themselves, in Lisp. Rewrite the core utilities as Lisp functions. Render graphics from Lisp using kernel mode setting and DRM -- it's literally just a few ioctl calls away. Become the god of your new Lisp world and shape your promised land with your own bare hands.
Just don't attempt to make a Lisp OS for fuck's sake. Linux is too valuable to just throw it away. Use it, build on top of it. Trust me, pretty much all of the complaints about Linux are really complaints about Linux user space programs that retards think are part of the operating system. Just scrap all that and start over -- on top of Linux.
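For the curious, this is the register convention the post is talking about, shown as C with inline assembly only because it's easier to paste here than JIT output; a Lisp JIT would emit the same instruction sequence directly, with no C involved. The sketch assumes x86-64 Linux and the GCC/Clang inline-asm extension; syscall number 1 is write(2) on that ABI.

    /* Linux x86-64 syscall ABI: call number in rax, first three arguments
       in rdi/rsi/rdx, then the `syscall` instruction traps into the kernel.
       The kernel clobbers rcx and r11 and returns the result in rax. */
    static long raw_syscall3(long nr, long a1, long a2, long a3)
    {
        long ret;
        __asm__ volatile ("syscall"
                          : "=a"(ret)
                          : "a"(nr), "D"(a1), "S"(a2), "d"(a3)
                          : "rcx", "r11", "memory");
        return ret;
    }

    int main(void)
    {
        const char msg[] = "hello from a raw syscall\n";
        raw_syscall3(1 /* __NR_write */, 1 /* stdout */,
                     (long)msg, sizeof msg - 1);
        return 0;
    }

A Lisp with an address type and peek/poke (or SBCL-style foreign memory access) can build the same thing without touching libc.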
▶ No.924541>>924542 >>924546
>>924539
There's few enough kernels as it is.
▶ No.924542>>924545 >>924546
>>924541
Writing a fully functional modern kernel is pretty trivial, even something like VxWorks or QNX fits the bill.
The main asset of something popular like Linux is its panoply of actively maintained hardware drivers.
▶ No.924545>>924548 >>924553
>>924542
I guess that's what it all comes down to. Hardware drivers. This shit would be easier if homemade hardware was possible. Then we could intimately tie hardware to software. Maybe that's the next technological revolution?
▶ No.924546
>>924541
Do you want to realize your Lisp dream? Or do you want to waste time rewriting Linux?
Look at Emacs. It's the closest we have to the Lisp dream today. Instead of reaching out and covering all user space with Lisp, it contains Lisp inside itself, a safe space, and people little by little write Lispy interfaces to the big scary C world out there so that fellow Lispers don't have to leave Emacs ever again.
I say go out, write your Lisp compiler and let the Lisp code dominate user space.
>>924542
Of course it is about the drivers. That's what I implied by "Linux is too valuable to throw away". Using Linux allows your Lisp code to work with hardware with minimal work on your part. You can concentrate on your dream.
▶ No.924548
>>924545
As someone who occasionally pokes at meme platforms like Haiku, Plan 9, and HURD, the absolutely crippling driver situation sometimes makes me wonder why the first program kerneldevs write isn't something along the lines of NDISwrapper.
▶ No.924553
>>924545
>Then we could intimately tie hardware to software.
To what end? Drivers would just be yet another library. Sounds like you're trying to reinvent DPDK.
▶ No.924571
>>924397
>keeps mentioning reliability
>doesn't mention Erlang
Fuck off
▶ No.924582>>924586 >>924611 >>925808
>>924317
The benefits of Lisp machines come from the tagged architecture, garbage collection, and single address space. They're made for Lisp, but there is nothing forcing you to use Lisp. Dynamically typed languages would be fast and all programs would be simpler, especially GUI programs and compilers. RISC and UNIX are made for C, so everything else is slow, but you can still use other languages on a RISC.
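For anyone who hasn't seen what "tagged" means in practice, here is a rough C sketch of the software version that Lisps (and every other dynamically typed runtime) must do on stock hardware. The 2-bit low-tag scheme is purely illustrative, not any particular implementation's layout; the Lisp machine point is that the hardware checked these tags in parallel with the ALU, for free.

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative 2-bit tag kept in the low bits of every word. */
    enum tag { TAG_FIXNUM = 0, TAG_CONS = 1, TAG_OTHER = 2 };

    typedef uintptr_t lispobj;

    static lispobj  make_fixnum(intptr_t n)  { return ((uintptr_t)n << 2) | TAG_FIXNUM; }
    static intptr_t fixnum_value(lispobj o)  { return (intptr_t)o >> 2; }
    static enum tag tag_of(lispobj o)        { return (enum tag)(o & 3); }

    static lispobj add(lispobj a, lispobj b)
    {
        /* the run-time type check a tagged architecture would do in hardware */
        assert(tag_of(a) == TAG_FIXNUM && tag_of(b) == TAG_FIXNUM);
        return make_fixnum(fixnum_value(a) + fixnum_value(b));
    }

    int main(void)
    {
        printf("%ld\n", (long)fixnum_value(add(make_fixnum(20), make_fixnum(22))));
        return 0;
    }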
>>924356
>C is so bad and (((LISP))) is so great, why does every machine in existance use an operating system programmed mostly in C, and why is (((LISP))) seldom used in development?
C is brain dead and incompatible. You used to be able to combine different languages into a single program because they were compatible. The Lisp machines, Multics, and VMS are examples of that. When languages are compatible, you don't have to be stuck with a language you don't like because you can rewrite parts of the program one part at a time. Everyone outside AT&T understood the importance of this, but you don't appreciate it because you can just download GCC or Clang and make your program depend on millions more lines of bloat. With C, you are forced to do everything the way C does it, like null-terminated strings, array decay, bad memory management, broken error handling, and other bullshit. That means every language has to reinvent wheels and can't share anything with other languages. With Lisp, everything on the computer could share the same packages, objects, classes, functions, bignums, rationals, strings, arrays, structures, GC, and error handling system.
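The null-terminated-string complaint in concrete form, as a small C sketch (this is standard library behaviour, nothing exotic):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* strncpy fills the buffer but does NOT append '\0' when the source
           is too long, so the "string" silently has no end marker.  Languages
           whose strings carry an explicit length (Lisp, Ada, Pascal) have no
           equivalent failure mode. */
        char buf[8];
        strncpy(buf, "a string longer than eight bytes", sizeof buf);
        buf[sizeof buf - 1] = '\0';   /* the fix every C programmer must remember */
        printf("%s\n", buf);
        return 0;
    }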
>If it costs millions in dollars than why do companies still use C and are completely fine? They wouldn't use C if it truly costed them money, and they would use (((LISP))) if it had any positives.
They have been told that C is "simple" and that the language doesn't matter at all, so they hire C weenies who had to learn C to keep their university UNIX computer running for more than 15 minutes, and because of the incompatibility and poor design of C, they start depending on null-terminated strings and other C bugs and then they can't get rid of C. When someone compares C to Lisp, Ada, Fortran, or other languages, C shills don't reply by saying how C is better, they just say some bullshit like "meme" or "it doesn't matter", which sucks. C is another example of the extreme contempt the AT&T employees who were paid to make UNIX have for their users.
>>924401
>It's all bytes in the end, and whatever high-level language you use doesn't matter in the long run if you don't know how the low level works.
Bullshit. Why is it that only people who are shilling UNIX languages like C, JavaScript, and PHP say the language doesn't matter but at the same time don't want a non-UNIX language?
>>924520
I don't like Rust because there's too much UNIX brain damage and it looks like C++/Perl, but it's better than C and C++ (so is Java). I never made a Rust thread or promoted it.
>>924539
Because Linux sucks. Lisp machines eliminate complexity and bloat from the computer and increase code reuse. Lisp machines don't have panics, OOM killers, broken system calls, broken signals, and all that other UNIX bullshit. They don't need 15 million lines of broken code.
I suspect that the True Un*x Way to do this is to print, not
to the printer, but down an Intestine (this is what I call
p*pes) that leads to some parser "utility" that is just
barely general enough to scan for the page headings and
suppress pages that don't have the right page-numbers at the
top. The result is intestined directly to the stool file --
uh, that's "spoop file" -- DAMMIT! "SPOOL file" -- whose
name you have to know and is different on every machine. Of
course if you don't do it right (and you won't) a couple of
blank lines will sneak in, and your page headings will come
out about five lines down, preceded by text from the south
end of the preceding page. Your true weenie won't even try
again, he'll be so pleased at having made this cludge almost
work.
I need hardly say that I do not want to see a gleeful
posting explaining how to do this in gory detail. I remind
all listening weenies that I have a Symbolics machine. I
can network to your Ux*x, and when it asks me whether my
local machine has authorized my UID and GID before it lets
me delete your directory, I can simply, and truthfully, say
"yes". And it will believe me.
I am reminded of the Charles Addams cartoon depicting a
college reunion. Underneath the "Welcome Class of '54"
banner sit the alums, all of whom are bums and winos.
One bum turns to another and says "I used to think it
was just me, but maybe it's the damned school!"
▶ No.924586>>924616
>>924582
>tagged architecture
Waste of memory.
>garbage collection
waste of CPU cycles
>and single address space
But we already have that?
>You used to be able to combine different languages into a single program because they were compatible.
But you can easily do that with C.
>just download GCC or Clang and make your program depend on millions more lines of bloat.
I can easily compile programs to contain little to no bloat with gcc or clang.
>With C, you are forced to do everything the way C does it,
With LISP, you are forced to do everything the way LISP does it
>like null-terminated strings
no
>array decay
no
>bad memory management
no
>broken error handling
arguable
>and other bullshit.
You don't know what you are talking about.
>AT&T
But C wasn't invented by AT&T.
>UNIX languages like [...] JavaScript
wat
>block of code that nobody ever reads
Look, you're so full of shit I don't know where to begin with. You're confusing language, operating systems, and computer architecture.
▶ No.924611
>>924582
Lisp machines don't exist anymore, and are inferior to modern hardware in pretty much every measurable aspect. Linux is the most successful operating system ever created: it runs on a metric fuck ton of architectures, is supported by tech giants and hardware makers, has all the drivers you need, lets you shape its user space into whatever you want and is free software.
You can either make a software-based Lisp ecosystem that runs on top of Linux, or you can try to rewrite Linux in Lisp and fail miserably just like all the other purists who came before you.
▶ No.924616>>925800
>>924586
The rest of us just ignore him and hope that he will eventually focus his interest in LISP in a constructive way.
▶ No.925632
Can the Unix hater tell me the benefits of Smalltalk?
▶ No.925800>>926102
>>924616
I would love to see him write a "pure Lisp" OS. Something tells me that he won't do it any time soon.
▶ No.925808>>925811 >>925816 >>926084
>>924582
You still haven't answered my questions:
1. >why does every machine in existance use an operating system programmed mostly in C
2. >and why is (((LISP))) seldom used in development?
3. >If it costs millions in dollars than why do companies still use C and are completely fine?
All your statements contridict themselves are you're on some weird downward spiral on how the shills are out to get you and your shitty language. You seem massively out of touch with reality and specifically use ad-hominim as well as many false equivilences and other logical falicies. Are you a Jew? Because you sound like one.
Let's talk about hardware complexitity. Don't (((LISP))) machines require tons of glue logic to operate to implement the interpreter and memory management instead of a normal general purpose CPU like M68K, ARM, or MIPS? If the interpreter is what makes (((LISP))) special, why haven't you gotten a BASIC stamp and made the interpreter (((LISP))) based?
▶ No.925811>>925824 >>937756
>>925808
Not him, but I already told you that what you're saying is argumentum ad populum. Also, Hitler was an idiot who killed tons of Europeans and is essentially responsible for the state Europe is in now.
>Hurrdurr 14/88 EVRVPE kill all (((niggerz))) XD.
That is what you sound like.
▶ No.925816>>925827
>>925808
Think about the questions you're asking, dude. What operating systems are there today besides those that were already written in C-based languages, like Windows and Unix-derivatives?
C continues to be used simply by the huge momentum it has, much like the x86 architecture continues to exist for that reason, even though it's buggy shit that needed to be replaced at least a couple decades ago.
▶ No.925824>>925825
▶ No.925825>>925842
>>925824
>You must be le Jewish
Epic!
▶ No.925826
>>924536
>>924535
Use Big 5 + HKSCS then, remove kebabs and poo-in-loos
▶ No.925827>>925838
>>925816
No one is forcing you to use C. In fact, most documentation for Windows and Mac APIs completely ignores C bindings in favor of .NET and Swift. If you want to use something else, go ahead.
▶ No.925830
>>muh revolution
0s and 1s will work even in Monarchy
▶ No.925838>>925840
>>925827
Note that he was specifically asking about the OS, not what you "can" use to write some software with.
> 1. >why does every machine in existance use an operating system programmed mostly in C 2
Personally, I don't write anything new in C, and typically only modify existing C code for my needs (or to fix some bugs). On Unix, I normally just use Perl instead, since it can leverage most of the system and library stuff. But Perl itself is in C, and so are many of its modules, and the same probably also holds true for Ruby, Python, etc. So in the end, you're using C one way or another on modern systems.
▶ No.925840>>925845
>>925838
If you are talking about scripting languages, then whatever the interpreter is written in is what you are using. I don't see how it's difficult to understand that regardless of whatever high-level language you use, everything is ultimately binary instructions and data.
▶ No.925842
>>925825
You do realise 8chan is "/pol/ the imageboard" right? We own this place
▶ No.925845>>925848
>>925840
Well that doesn't sound right. If I were using C, then I'd have to deal with the undefined behavior cases and its nefarious string handling myself. I use Perl to avoid that hassle (and also, of course, because I don't want to write a main function and explicitly do #include's for every single one-liner).
Yes, everything does come down to binary instructions in the end. But when your tools are in C, those tools can exhibit undesired side-effects to a greater extent than if the same tools were written in Ada or even Pascal. The C problems will exist so long as the foundations (the OS itself, compilers, runtimes, libraries) are written in that language.
▶ No.925848>>925853
>>925845
If you want to use a different programming language, have at it. However, I don't mind taking the extra time to allocate/free my memory and handle my strings appropriately. It's a challenge in my opinion when I have to fix bugs, a challenge which I know makes me a better programmer than I was before. I have no problems with Perl. That syntax looks difficult to master, but I don't see a pressing need for myself to learn something like that right now.
The problem is the handful of posters coming on here with their insistent demands that everyone defame and stop using C because they don't like it. That's tough nuts. They can take that back to their hugbox and instead let people have real discussions here.
▶ No.925853>>925857
>>925848
>However, I don't mind taking the extra time to allocate/free my memory and handle my strings appropriately.
>It's a challenge in my opinion when I have to fix bugs, a challenge which I know makes me a better programmer than I was before.
You have C stockholm syndrome, much like I did. Give FreePascal a chance and free yourself.
▶ No.925857
>>925853
No thanks. I am learning C because I want to learn C and enjoy learning C. I don't have time to run through another language. Besides, it's insulting to dismiss my reasons for preferring C just to pitch something that is inconsequential.
▶ No.926084>>926089 >>926161
>>925808
>1. >why does every machine in existance use an operating system programmed mostly in C
That's a false statement. The most popular OSes are either UNIX/UNIX-like or Windows but that does not mean "every machine in existance" uses C anywhere.
2. >and why is (((LISP))) seldom used in development?
Because not many people know it anymore. Academics have become volunteer AT&T employees. They spend all their time shilling and "fixing" UNIX and C even though AT&T doesn't pay them anything.
https://pdos.csail.mit.edu/~rsc/plan9.html
>This is not Plan 9 hacking per se, but it would be interesting to fix dot-dot in Unix. There might even be enough issues involved to make it a good senior or master's thesis. I don't use Unixes enough to do it myself, but if someone were interested I'd be glad to talk through problems and lend help.
If a product has "enough issues involved to make it a good senior or master's thesis" when nobody else's product had those problems, that means it sucks. It takes more time to fix that bullshit than it did to invent the hierarchical file system on Multics.
3. >If it costs millions in dollars than why do companies still use C and are completely fine?
The cost of C shows up in canceled or delayed products like the F-35, WinFS, Longhorn, the Hurd, Taligent, and Workplace OS. It also shows up in countless bugs and security patches and in all the UNIX "academic" papers that are "reinventing" worse ways of doing what better operating systems did in the 60s, which they weren't taught because it makes the UNIX way look bad.
https://en.wikipedia.org/wiki/Workplace_OS
>Written in C
>With protracted development spanning four years and $2 billion (or 0.6% of IBM's revenue for that period), the project suffered from feature creep and the second-system effect. In October 1995, the product was commercially introduced for a few models of IBM PowerPC hardware and the entire project was immediately discontinued.[1] In 1996, a second and final version was released that also supported x86 and ARM processors.[1]:22
IBM listened to the C weenies, which cost them $2 billion. They blamed microkernels instead of C, which is the real problem, and microkernels got a bad reputation for years.
>Let's talk about hardware complexitity. Don't (((LISP))) machines require tons of glue logic to operate to implement the interpreter and memory management instead of a normal general purpose CPU like M68K, ARM, or MIPS? If the interpreter is what makes (((LISP))) special, why haven't you gotten a BASIC stamp and made the interpreter (((LISP))) based?
Lisp machines are simpler and need much less code. If you wanted a browser or a complex game on a Lisp machine, it would use a fraction of the code and programs would be more reliable and less buggy. This is due to code reuse, the object system, the greater productivity of Lisp, and because the hardware itself is easier to program.
If there's one thing which truly pisses me off, it is the
attempt to pretend that there is anything vaguely "academic"
about this stuff. I mean, can you think of anything closer
to hell on earth than a "conference" full of unix geeks
presenting their oh-so-rigourous "papers" on, say, "SMURFY:
An automatic cron-driven fsck-daemon"?
I don't see how being "professional" can help anything;
anybody with a vaguely professional (ie non-twinkie-addled)
attitude to producing robust software knows the emperor has
no clothes. The problem is a generation of swine -- both
programmers and marketeers -- whose comparative view of unix
comes from the vale of MS-DOS and who are particularly
susceptible to the superficial dogma of the unix cult.
(They actually rather remind me of typical hyper-reactionary
Soviet emigres.)
These people are seemingly -incapable- of even believing
that not only is better possible, but that better could have
once existed in the world before driven out by worse. Well,
perhaps they acknowledge that there might be room for some
incidental clean-ups, but nothing that the boys at Bell Labs
or Sun aren't about to deal with using C++ or Plan-9, or,
alternately, that the sacred Founding Fathers hadn't
expressed more perfectly in the original V7 writ (if only we
paid more heed to the true, original strains of the unix
creed!)
▶ No.926089>>926719
>>926084
>Academics have become volunteer AT&T employees
You're spending too much time in the past. AT&T has no relevance to present-day Unix.
Don't just regurgitate old mailing lists, read them critically.
▶ No.926102>>926109
>>925800
You can't write a “pure LISP” OS without dedicated hardware.
▶ No.926109
>>926102
>you can't do it because I don't know how
Why don't you study a little more?
▶ No.926161>>926719
>>926084
>IBM made an operating system in c, and it sucked. That means c sucks!
No, it means IBM sucks. Are you going to use Watson to rail on the machine learning field next?
▶ No.926259>>926262 >>926719
According to a conspiracy theory long popular among ITS and TOPS-20 fans, Unix's growth is the result of a plot, hatched during the 1970s at Bell Labs, whose intent was to hobble AT&T's competitors by making them dependent upon a system whose future evolution was to be under AT&T's control. This would be accomplished by disseminating an operating system that is apparently inexpensive and "easily" portable, but also relatively unreliable and insecure (so as to require continuing upgrades from AT&T). This theory was lent a substantial impetus in 1984 by the paper referenced in the back door entry. In this view, Unix was designed to be one of the first computer viruses (see virus) --- but a virus spread to computers indirectly by people and market forces, rather than directly through disks and networks. Adherents of this ‘Unix virus’ theory like to cite the fact that the well-known quotation “Unix is snake oil” was uttered by DEC president Kenneth Olsen shortly before DEC began actively promoting its own family of Unix workstations. (Olsen now claims to have been misquoted.) If there was ever such a conspiracy, it got thoroughly out of the plotters' control after 1990. AT&T sold its Unix operation to Novell around the same time Linux and other free-Unix distributions were beginning to make noise.
▶ No.926262
>>926259
you've got to pull in conspiracy theories now? worthless, jaded opinions didn't help your case enough?
▶ No.926719>>926814 >>928767
>>926089
>AT&T has no relevance to present-day Unix.
AT&T is relevant because they are responsible for UNIX existing and sucking. The universities spend more time fixing AT&T's bugs than they do on real research. Every paper about C or UNIX is wasted effort on bullshit that was already solved in the 60s and 70s. Even if they fixed every major flaw in C or UNIX, it would still suck: even a bug-free C would still be unproductive and need a lot more code to do anything, and the real C adds bugs, exploits, slowness, and unnecessary duplicated work on top of that.
>>926161
>No, it means IBM sucks.
Bullshit. IBM wrote a lot of OSes in assembly language and didn't have these problems. IBM's PL/I and related languages didn't have these problems. You conveniently ignored the fact that all these other projects were canceled and/or delayed too because C is not designed for robust and quality software.
>>926259
>disseminating an operating system that is apparently inexpensive and "easily" portable, but also relatively unreliable and insecure
>but a virus spread to computers indirectly by people and market forces
That makes a lot of sense. UNIX weenies are so brainwashed that if the creators of UNIX came out and told everyone that UNIX intentionally sucks and they wanted everyone to suffer, they would still defend the UNIX mistakes.
So what kind of operating system has standard utilities with
undocumented diagnostic messages? Oh, I see. If someone actually
documented Unix in its entirety, just the "Bugs, Lossage, and Error
Messages" section would look like a set of VMS documentation.
UNTALK, without a screen refresh routine, was never this bad.
Subject: why Unix sucks
Some Andrew weenie, writing of Unix buffer-length bugs, says:
> The big ones are grep(1) and sort(1). Their "silent
> truncation" have introduced the most heinous of subtle bugs
> in shell script database programs. Bugs that don't show up
> until the system has been working perfectly for a long time,
> and when they do show up, their only clue might be that some
> inverted index doesn't have as many matches as were expected.
Unix encourages, by egregious example, the most
irresponsible programming style imaginable. No error
checking. No error messages. No conscience. If a student
here turned in code like that, I'd flunk his ass.
Unix software comes as close to real software as Teenage
Mutant Ninja Turtles comes to the classic Three Musketeers:
a childish, vulgar, totally unsatisfying imitation.
▶ No.926814>>926817
>>926719
What do you not understand about "present-day"?
Also
>AT&T is relevant because they are responsible for UNIX existing and sucking.
What is Bell Labs?
>more incoherent ramblings
>quotes that no one ever reads
▶ No.926816>>926817 >>926819
>Alan Kay
>Adjunct Professor at MIT
This guy isn't exactly free of bias.
▶ No.926817>>926819 >>926822
>>926814
>What is Bell Labs?
pic related
>>926816
<not knowing who the creator of Smalltalk is
▶ No.926822>>926830
>>926817
Smalltalk. You mean that thing that pushed the idea of object-oriented programming? How does that have anything to do with the fact that he was a professor at MIT?
>>926819
I don't see any reference that he worked for Bell Labs, but he was a fellow at HP Labs.
▶ No.926830
>>926822
>he was a professor at MIT
The bias seen in his talk is much more attributable to his being the creator of Smalltalk than to his having been a professor at MIT.
>>926822
>I don't any reference that he worked for Bell Labs, but he was a fellow for HP Labs.
UNIX and C (and C++ for that matter) were created at Bell Labs.
▶ No.926970>>926975
▶ No.926973
>970
Yeah, but that wasn't meant ironically.
It actually is fun.
▶ No.926975>>931030
>>924499
>>924527
>>926970
I just went to the urbit website, it doesn't work without javashit. That said, (from what I can surmise) urbit is a distributed functional userspace over Linux which is easily hackable?
▶ No.927963
>>924162
Per-file ID? dang if only there was a simple way to uniquely identify a file... like some sort of a string?
▶ No.928586>>928590
>>924378
Oberon became Rust right?
▶ No.928590>>928710 >>937929
>>928586
No.
"There is a stampede of people rushing to program in Java right now. Java takes some features of Lisp from 1960 (e.g., automatic storage allocation and garbage collection) and some features from SmallTalk circa 1975 (e.g., object classes) and combines them with C syntax so that the current crop of programming drones doesn't get too bad a shock when adapting to these 20 or 35 year-old "innovations"."
Why is that paragraph relevant? Because Ada already did everything Rust does (and better) 30+ years ago. Rust is just like Java in that it gets accolades for old ideas.
▶ No.928629
>>924378
Smalltalk and Lisp were real enough to have/be their own OSes.
Also, don't Oberon and the Modula languages have garbage collection?
▶ No.928710>>928726 >>928773
>>928590
When someone makes an OS like this, the weenies want to add UNIX. What happens when you add the brain damage of C and UNIX to an OS that can fit on a floppy disk? It becomes bloated and sucks. They discovered that this technique also works for microkernels.
http://www.hpl.hp.com/techreports/Compaq-DEC/SRC-RR-21.pdf
>One example of interference caused by an interface state variable is the stream position pointer within a UNIX open file [14]. The pointer is implicitly read and updated by the stream-like read and write procedures and is explicitly set by the seek procedure. If two threads use this interface to make independent random accesses to the same open file, they have to serialize all their seek-read and seek-write sequences. Another example is the UNIX library routine ctime, which returns a pointer to a statically allocated buffer containing its result and so is not usable by concurrent threads.
>While it is important to avoid unnecessary serialization of clients of an interface, serialization within the implementation of a multithreaded interface containing shared data structures is often necessary. This is to be expected and will often consist of fine-grain locking that minimizes interference between threads.
Multics solved these problems before UNIX created them.
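The ctime example from that report, concretely (a small sketch; ctime_r is the POSIX reentrant variant, which may need _POSIX_C_SOURCE on some systems):

    #define _POSIX_C_SOURCE 200809L
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);
        time_t later = now + 3600;

        /* ctime() returns a pointer into a single static buffer, so the second
           call overwrites the first result -- the interface itself forces
           callers to serialize, exactly as the quoted report describes. */
        char *a = ctime(&now);
        char *b = ctime(&later);
        printf("%s%s", a, b);            /* both lines show the *later* time */

        /* The reentrant variant makes the caller supply the storage. */
        char buf1[26], buf2[26];
        printf("%s%s", ctime_r(&now, buf1), ctime_r(&later, buf2));
        return 0;
    }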
The lesson I just learned is: When developing
with Make on 2 different machines, make sure
their clocks do not differ by more than one
minute.
Raise your hand if you remember when file systems
had version numbers.
Don't. The paranoiac weenies in charge of Unix
proselytizing will shoot you dead. They don't like
people who know the truth.
Heck, I remember when the filesystem was mapped into the
address space! I even re<BANG!>
▶ No.928716>>928782
>>924378
>ALGOL/Modula/Oberon.
Brainlet here, what are the advantages of these languages? I've never even heard of them. And why aren't they more widely known?
no bully
▶ No.928726
>>928710
>The lesson I just learned is: When developing with Make on 2 different machines, make sure their clocks do not differ by more than one minute.
Jesus christo
▶ No.928767>>928772
>>926719
>The big ones are grep(1) and sort(1).
Does Linux also have these bugs created by grep and sort?
▶ No.928772>>928834 >>937653
>>928767
This book that dude keeps quoting was written in the early 1990s by a bunch of butthurt MIT professors and grads. They were butthurt because their entire academic careers focused on using a language that has no real industrial application.
▶ No.928773>>928784
>>928710
>When someone makes an OS like this, the weenies want to add UNIX.
Stop making up excuses and Start Making Lisp Great Again.
▶ No.928782
>>928716
ALGOL was very important, but its real-world use was crippled because the standard lacked I/O facilities, among other things. It significantly influenced pretty much all modern programming languages.
I know less about Modula and Oberon. I know that one or two unusual but tasteful design decisions in Python were copied from some iteration of Modula, and that Oberon is a combination programming language/operating system/user interface that encourages re-implementations.
▶ No.928784>>928789
>>928773
You would think that if LISP could be made into anything that it isn't, they would've done so when there were commercial pressures through competition, yet that didn't make it any better; it made it worse. LISP is purely an academic language to teach the concepts of computer science. It's sad they don't teach these concepts in widely used languages rather than an academic one. It's also just a couple of schools pushing the LISP-for-academia meme. Most large state schools use industrial languages like Java to introduce freshmen to computer science.
▶ No.928789>>928799
>>928784
>large state schools teaching programming
absolute state tbh
CS isn't about programming.
▶ No.928790
My school eurofag engineer just assumes everyone can program C on their very first day. If you don't, you better learn quick!
▶ No.928799>>928844
>>928789
So what? My point is that, to introduce students to working with the computer as a user in the original sense of the term, the majority of schools use languages which are widely familiar and expose the students to what they would expect to see in the job market. Academics always perpetuate teaching to secure their job. It's never to punctuate teaching with a skill to secure their students' jobs.
▶ No.928834>>928838
>>928772
>This book that dude keeps quoting was written in the early 1990's by a bunch of butthurt MIT professors and grads.
Surely their critique is based in at least partial truth...
▶ No.928838
>>928834
It's anachronistic to use complaints from 20 years ago. They hold no ground with regard to contemporary versions. They are also biased academics who realized they wasted the poignant years of their academic career chasing the dragon instead of doing something constructive.
▶ No.928844>>928849
>>928799
>My point is that to introduce students into working with the computer as a user, in the original sense of the term,
In what sense? Today's computers are used by users to play candy crush. In the past computers have been used to calculate stuff with pre-written programs. That's what users do. They use abstract user interfaces that hide technical details because they don't care, or they don't have time to care. And they most certainly don't write fancy algorithms. They never did.
▶ No.928849>>928861
>>928844
Today's computers are consumed by consumers so they can consume candy crush. A user tells the computer what he wants it to do. A consumer expects the computer to behave a certain way after pressing some button.
▶ No.928861>>928868
>>928849
>A user tells the computer what he wants it to do.
>A consumer expects the computer to behave a certain way after pressing some button.
There's no difference.
▶ No.928868>>928981
>>928861
You derail my argument with your petty conflated nonsense about what makes a user a user and a consumer a consumer. Get lost.
▶ No.928981>>928987
>>928868
Your argument is conflated nonsense. Or just try again?
▶ No.928987
>>928981
Or you can get hurt.
▶ No.931030
>>926975
>it doesn't work without js
then again what does
i dont really get it but urbit is supposed to be a hermetic environment because the computing itself happens in the arvo vm not in the linux environment itself.
so i guess something like a repl?
▶ No.937653>>937765
>>928772
>This book that dude keeps quoting was written in the early 1990's by a bunch of butthurt MIT professors and grads
What's the name of the book? It seems interesting tbh
▶ No.937712
>>924187
forth machines are still around, what's your excuse now?
▶ No.937727
>>924106 (OP)
I warmly suggest you study Japan's Fifth Generation computer project, especially the part where it failed to keep up with what you decree as the inferior approach.
▶ No.937756>>937831 >>937871
>>925811
>Hitler was an idiot
wrong
>who killed tons of Europeans
because they attacked/were forced to attacked by jews
>and is essentially responsible for the state Europe is in now
no he actually prevented a lot of damage
the communism from the east would have fallen in either way and the capitalist west made young people go to the army for bread because of the great depression you retard
under him there were more Germans after the war than before
▶ No.937765
>>937653
It's The UNIX-HATERS Handbook.
Read it, you can learn some history.
https://homes.cs.washington.edu/~weise/uhh-download.html
▶ No.937826
>>924146
>Lisp is jewish, like AWK
What the fucking fuck in this retardation.
▶ No.937831>>937852 >>937925
>>937756
You're misunderstanding what "responsible for the state Europe is in now" means; and it's silly how you're defending killing other (yes, white) Europeans because of the Jews. Germany has a ballast around their neck with "holocaust" on it. They will *always* have to bend over backwards because of Hitler, and we'll be seeing more people like Merkel until the day Germany is turned into an Islamic state. (By the way, I don't understand the fetishism for Germans on imageboards, even Poland is more respectable - Germany sat on their land for years and years, but you hear nothing about this.)
▶ No.937852>>937933
>>937831
>I don't understand the fetishism for Germans on imageboards, even Poland is more respectable
You can praise 40s Poland nearly anywhere. There aren't many places on the internet you can do so for 40s Germany without getting quickly censored or worse.
▶ No.937871>>937928
>>937756
>under him there were more Germans after the war than before
>1933: 65,362,115
>1946: 65,137,274
$ python
Python 2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> 65362115 < 65137274
False
Hmm...
▶ No.937925
>>937831
>even Poland is more respectable
t. Polack
Go back to stealing our cars.
>Germany sat on their land for years and years
>their land
>nobody else is allowed to live in border regions
>only jews and subhumans are allowed to live in Pooland
A shit, they slaughtered us. That's why drove over and restored order within 4 days. There was no resistance to speak of.
▶ No.937928
▶ No.937929
>>928590
But, is Ada standardized like Rust?
▶ No.937933>>937934
>>937852
So, contrarian for the sake of it?
▶ No.937934
>>937933
Go to inclib.i2p and pick up some books. Learn some history. If not you probably are a newfag.
8chan is /pol/-chan since Gamergate.
▶ No.937944
>>924378
>>>924147
>Reminder M$ tried and failed at exactly this with Longhorn, while Apple's resource forks and Be's BeOS both dipped their toes in the idea.
WinFS was brilliant but it was never released. I bet it returns one day; then spies won't have to parse everyone's custom file formats anymore. The incentive is too great not to bring it back.
▶ No.937955
>>924187
there have been multiple Lisp-based OSes that have run on x86, such as Interim or Mezzano, and D-Wave's quantum computers run on an in-house Lisp OS with deprecated Python bindings.
>>924378
don't you know that compilers and assemblers emulate you writing the machinecode? anything but machinecode isn't a real programming language.