
/tech/ - Technology


File: 339964ed9398b40⋯.jpg (17.49 KB, 425x346, reality is a mistake.jpg)


 No.1059252>>1059325 >>1059379 >>1059544 >>1059566 >>1060014 >>1060447 >>1064709

Why the hell is the GNOME shell written in Javascript?

I thought it was just the extension engine, but no, Javascript has literally permeated the entire thing. This is completely absurd. No wonder it's so slow.

 No.1059263>>1059265 >>1060014

> 2019

> Using GNOME

lol wew laddie


 No.1059265>>1059569

>>1059263

Look, it's well supported and has plugins, ok? I can't go back to WindowMaker.


 No.1059295>>1059353 >>1059393 >>1059449 >>1060448

Javascript is much easier to use than C. Programmer time is much more important to the Gnome application developers and computer time is less important to them.


 No.1059325

>>1059252 (OP)

That's funny because node.js is written in C and C++.


 No.1059353>>1059357

>>1059295

>Programmer laziness is much more important to the Gnome application developers and user experience is less important to them.

Yes, they've made that painfully obvious to anyone who's tried to use that slow, bloated mess.


 No.1059355>>1059358 >>1059359

>using a DE

>using a WM

>not just running a straight X environment and launching your programs via the tty

enjoy your bloat, plebs.


 No.1059357

>>1059353

Who are you quoting?


 No.1059358

>>1059355

>running X

>not just running everything on the frame buffer and calling it a day


 No.1059359

>>1059355

I use Gnome3, Gnome Shell, and Xmonad as my DE.


 No.1059379>>1059451

>>1059252 (OP)

>Javascript has literally permeated the entire thing

Explain. The entire codebase is 100% Javascript? I doubt that.


 No.1059387

Is it just Gnome 3?


 No.1059393>>1059398 >>1059403

>>1059295

>Javascript is much easier to use than C.

I've been programming in Javascript and it is worse than programming in C. You get the worst of both worlds if you program in Javascript: it's a weakly typed, poorly defined dynamic language. It makes you wish you were programming in the most restrictive ball-and-chain language with absurdly strong static typing. JSON is the only good thing to come out of this mess.


 No.1059398>>1059403 >>1059424

>>1059393

If you don't document and plan out the data structures and processes that your JS application uses, then yes, it is trivial to run into type-mismatch problems. I don't have type-mismatch problems in any of my software, regardless of the language, because I make the effort to formally document various aspects of my application and I keep the documentation up to date. What's much more difficult in C is the resource accounting and pointer accounting that is inherent in any C application.


 No.1059403>>1059424

>>1059398

>>1059393

I've been coding JS web apps and node.js servers professionally for years now and we never plan out data structures or any of that shit. Almost never run into type mismatch issues.

Like, how fucking draconian and "enterprise" is your codebase that you constantly run into type issues? When you know where your variables come from and where they go, you don't have an issue.


 No.1059424

>>1059403

>>1059398

>What's much more difficult in C is the resource accounting and pointer accounting that is inherent in any C application.

Sure, but you go into C expecting to manage memory and pointers. There are many dynamically typed languages that just do a better job than JS when it comes to types.

>Like, how fucking draconian and "enterprise" is your codebase that you constantly run into type issues? When you know where your variables come from and where they go, you don't have an issue.

It's more of a problem when you program with idiots or people who simply don't care. But most of the time JS's behavior is just annoying, not really harmful.


 No.1059433>>1059450 >>1059540

That's not even the worst part: the whole display server and kernel, I hear, are written in C. What weenie came up with that idea?


 No.1059449>>1059465

File: cdf78028c1828ab⋯.png (6.21 KB, 337x60, image (3).png)

File: ca4749c722d8278⋯.png (10.29 KB, 631x64, image (2).png)

>>1059295

>Javascript is much easier to use than C

Pic related


 No.1059450>>1059525

>>1059433

Well, considering Rust and D didn't exist at the time, it was a sensible choice.


 No.1059451>>1059621

File: 6b51c6b8e529328⋯.png (48.79 KB, 1301x325, ClipboardImage.png)

>>1059379

55% of it is in Javascript and the rest is C (although I'm guessing most of that is just code from Mutter).

https://gitlab.gnome.org/GNOME/gnome-shell


 No.1059465

>>1059449

>why my string not the same as int ?!?!

please off yourself.


 No.1059525>>1059540

>>1059450

To be fair, Ada and Eiffel existed at the time, as well as Pascal and some Pascal-likes.


 No.1059540

>>1059433

>>1059525

>Ada

Ada took years to get decent compilers, let alone fast ones. X began development shortly after the first validated Ada implementation and as far as I know, there weren't any FOSS Ada compilers back when Linus began working on Linux.

>Eiffel

Postdates X and I doubt it had a FOSS compiler when Linux came around either.

>Pascal

Only one of Kernighan's complaints was addressed in time for X. The rest came in Extended Pascal and six million incompatible, compiler-specific extensions, so even if it had a FOSS compiler in time for Linux the kernel would be very closely tied to the compiler, far closer than it's currently tied to GCC.


 No.1059543>>1059545

GNOME uses JavaScript because, even though it sucks, it's better than C. All these weenie scripting languages like JavaScript were created to avoid C or work around some C brain damage (like not having a working type system). A lot of languages, like Lisp, Fortran, Ada, Algol, Pascal, and PL/I, were used for systems programming, but they did not develop this scripting-language philosophy. Instead of scripting, any language with compatible calling conventions was used directly by the programmer. This goes with dynamic linking and the ability to compile extensions and add or remove them while the program is running. UNIX uses scripting languages for extensions instead of dynamic linking.

This appeared in a message requesting papers for the "USENIX
SYMPOSIUM ON VERY HIGH LEVEL LANGUAGES (VHLL)":

UNIX has long supported very high level languages:
consider awk and the various shells. Often programmers
create what are essentially new little languages whenever
a problem appears of sufficient complexity to merit a
higher level programming interface -- consider
sendmail.cf. In recent years many UNIX programmers have
been turning to VHLLs for both rapid prototypes and
complete applications. They take advantage of these
languages' higher level of abstraction to complete
projects more rapidly and more easily than they could
have using lower-level languages.

So now we understand the food chain of advanced UNIX
languages:

level         languages                   analogous organism
-----         ---------                   ------------------
low           assembler (low-level PDP)   amoeba
intermediate  regular expressions         tadpole
high          C (high-level PDP)          monkey
very high     awk, csh, sendmail.cf       UNIX man


 No.1059544

File: 11c2af7fb01310a⋯.webm (1013.66 KB, 480x360, get gnomed on.webm)

File: 7ddb444afa473c3⋯.webm (518.84 KB, 640x360, gnomed.webm)

>>1059252 (OP)

I always wondered why it seemed to run like shit


 No.1059545>>1059947

>>1059543

>GNOME uses JavaScript because, even though it sucks, it's better than C

Quite possibly the most unbased thing you've said yet.

>All these weenie scripting languages like JavaScript were created to avoid C

Fun fact: Brendan Eich just wanted to put Scheme in a web browser but his bosses kept telling him to add more features and make it look like Java. He had ten fucking days to get it ready for a Netscape Navigator beta and, surprise surprise, the jury-rigged language was shit.

>Pascal

>used for systems programming

When? Certainly not before ISO 7185.


 No.1059566>>1059621

>>1059252 (OP)

GNOME Shell isn't written in JavaScript, it's written in C. GNOME Shell Extensions are written in JavaScript, which is a different thing.


 No.1059569>>1059573 >>1059625 >>1059636

>>1059265

just use unity, it's like gnome but not retarded


 No.1059573

>>1059569

Canonical doesn't care to maintain Unity any more. Are you aware of the Unity community fork?


 No.1059621

>>1059566

>GNOME Shell isn't written in JavaScript, it's written in C. GNOME Shell Extensions are written in JavaScript, which is a different thing.

See >>1059451


 No.1059625>>1059627

File: 125c40f3f2cbbaa⋯.png (1.06 MB, 1920x1200, cinnamon.png)

>>1059569

No, that would be Cinnamon.


 No.1059627

>>1059625

I almost want something with a taskbar, but I just can't find a good place for it on this dual-monitor system. It just looks weird if it's only on one monitor, but even weirder if it's on both.


 No.1059636

>>1059569

This but unironically


 No.1059947>>1059995 >>1060041

>>1059545

>Quite possibly the most unbased thing you've said yet.

Perhaps, but it's true. JavaScript is at least as good as C in every area and better in several; in other words, it's better than C. Strings and arrays are better in JavaScript, and it has exception handling, bounds checking, GC, and objects. JavaScript sucks and is worse than most scripting languages, but all the parts of it that suck come from C. The problem is that too many "programmers" have no knowledge of anything besides UNIX languages, so they don't know what a good systems language or a good dynamically typed language looks like. Plenty of low-level systems languages have real strings, for example, and a lot of the time they were added to make use of specific string instructions on the computer.

C is incredibly bad when it comes to syntax, semantics, and types. Code shouldn't have to be wrapped in a useless "do { ... } while (0)" loop to be treated as a single statement, but that is the "correct" way to do it in C (see the sketch below). Arrays shouldn't "decay" to pointers, because they're totally different things. Assigning an array is not the same as assigning a pointer in any other language, because one copies all the elements and the other copies an address. Everything is broken, and what would be considered terrible kludges in any other language (even ones as bad as PHP and JavaScript) are taught as how "programmers" (aka UNIX weenies) do things.
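
For anyone who hasn't suffered it, here is a minimal sketch of that do { ... } while (0) idiom, with SWAP as a made-up example macro:

#include <stdio.h>

/* Without the wrapper, a multi-statement macro breaks under an unbraced
   `if`: only the first statement ends up guarded by the condition. The
   "correct" C way is to wrap the body so the whole macro acts as a single
   statement and still swallows a trailing semicolon. */
#define SWAP(a, b) do { int tmp = (a); (a) = (b); (b) = tmp; } while (0)

int main(void) {
    int x = 1, y = 2;
    if (x < y)
        SWAP(x, y);           /* expands to exactly one statement */
    printf("%d %d\n", x, y);  /* prints: 2 1 */
    return 0;
}

The wrapper exists only so the macro survives an unbraced if/else; that it is needed at all is the complaint.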

>Fun fact: Brendan Eich just wanted to put Scheme in a web browser but his bosses kept telling him to add more features and make it look like Java.

That's just more proof that the bad parts in JavaScript came from C (directly or indirectly). JavaScript without UNIX brain damage would be Scheme with map-like objects and prototypal inheritance.

>He had ten fucking days to get it ready for a Netscape Navigator beta and, surprise surprise, the jury-rigged language was shit.

Just like B, C, C++, sh, awk, Perl, UNIX, HTTP, and so on.

>When? Certainly not before ISO 7185.

https://en.wikipedia.org/wiki/PERQ

https://en.wikipedia.org/wiki/Pascal_MicroEngine

Subject: Advisory Locks are Like Advisory Array Bounds

And like "advisory" type systems.

You know, the systems that check for problems only when
the checks can't possibly fail. (That way, you can optimize
better, right?)


 No.1059987

>all these butthurt unix-weenies complaining because a desktop environment actually managed to be good

Keep whining, but I'm going to keep enjoying the heavily customizable and human-guided experience that GNOME provides.


 No.1059995>>1060294

>>1059947

Languages and operating systems are more than a collection of features; what truly matters is how well those features work and fit together. JavaScript suffers from the C++ problem of piling on features regardless of whether they actually fit, and, like C++, writing good code in it means restricting yourself to a small subset that shuns most of the fancy features.

Saying a scripting language is better than a janky systems programming language because it has six million poorly implemented features instead of a few is retarded, especially when that scripting language needs a legendarily fat VM to support it.

>That's just more proof that the bad parts in JavaScript came from C (directly or indirectly)

Again, if you keep using the genetic fallacy to blame Unix and C for every way their descendants screwed up, there's absolutely nothing stopping me from taking it a little further and blaming Multics and PL/I for spawning Unix and C.

>PERQ and Pascal MicroEngine

Both of which relied on custom extensions of the language, fitting the time-honoured Pascal tradition.


 No.1060014

>>1059252 (OP)

What this guy said

>>1059263

Just a few minutes into Ubuntu, and I was scrabbling for something else. Currently going with KDE.


 No.1060019>>1060041

Why can't everything be in C and C++, like everything in Windows?

Linux's own downfall is the fragmentation of user space.


 No.1060041>>1060294

>>1059947

>GC

I'm sure you mean runtime GC; in that case, kill yourself. Promoting laziness and enormous source complexity in a system language is not based. To be really honest, I'm sure guys like you have never tried to implement a GC, or seen what something like V8's or OpenJDK's GCs have to look like to perform remotely well.

If you were talking about stuff like deterministic build-time GC (Rust, Mercury or Carp), then you'd have a point.

>objects

Mainly solving the namespacing problem with retardation.

>inheritance

Worst thing ever invented. When you read procedural code, you just look at function input/output and you know what's available and what the side effects are (if you don't use globals too much, of course). In POO, you must be like the salmon and swim up the inheritance chain before you know what's there.

>good dynamically typed language

Why must you open your mouth to say such retarded drivel? Dynamic typing is just there to work around the lack of a good type system; look at how the ML family does it. It speaks for itself when the JS and Python crowds' new fad is adding type information (TypeScript, Mypy).

All you're whining about in C is its historical baggage, which does make C shitty, but that's still better than having your language's core ideas be bad from the beginning.

>>1060019

What are you talking about? Windows is a lot of C#, too. In the end, choice is what makes GNU/Linux what it is.


 No.1060051>>1060059

use i3 or dwm like a real huwhite man


 No.1060059

>>1060051

Fuck you. OpenBox 4 Lyfe


 No.1060206

There is not a single good DE on loonix/bsd, so stick to window managers; there are plenty of decent WMs.

Unity was halfway decent, but Canonical abandoned it just like they abandoned everything else.


 No.1060294>>1060300 >>1060302 >>1060305

>>1059995

>Languages and operating systems are more than a collection of features; what truly matters is how well those features work and fit together.

That's why I hate UNIX so much. Nothing in UNIX works together. Instead of proper APIs, there are commands that output text that have to be parsed by some kludgy shit "language" like awk or sed.

>Saying a scripting language is better than a janky systems programming language because it has six million poorly implemented features instead of a few is retarded, especially when that scripting language needs a legendarily fat VM to support it.

It has nothing to do with features. JavaScript VMs are bloated because the language is full of brain damage. Lua and Scheme are similar to JavaScript but are much smaller (and some would say more powerful). Most UNIX languages are like this, including C, C++, and Java.

>Again, if you keep using the genetic fallacy to blame Unix and C for every way their descendants screwed up, there's absolutely nothing stopping me from taking it a little further and blaming Multics and PL/I for spawning Unix and C.

No, because it would be dishonest to blame an OS and language that did it right for problems that are not present in that OS and language. PL/I doesn't have array decay, null-terminated "strings", "everything is a tape drive" I/O, or any of that other bullshit. It also has a condition system that can check and recover from a variety of errors, including overflows and array bounds. Multics also has a very different approach to users, access control, error handling, files, and so on. You can blame Multics for the hierarchical file system, the names of some commands, and some terminology, but that's about it. A lot of things from Multics that were added to UNIX years later, like memory-mapped files, were still worse than the originals. If these UNIX weenies made something this bad after using Multics and PL/I, imagine how much it would have sucked if they didn't know about them.

>Both of which relied on custom extensions of the language, fitting the time-honoured Pascal tradition.

It's like the Pascal community recognizes flaws in their language and fixes them.

>>1060041

>Promoting laziness and enormous source complexity in a system language is not based.

I'm not talking about real-time or embedded systems (in which case Ada would be better); I'm talking about desktops and other home computers (phones and tablets) that use GC anyway.

>Mainly solving the namespacing problem with retardation.

>>inheritance

Objects are a lot more powerful than you think. Look at CLOS or IBM's System Object Model, which was designed to support Smalltalk, C++, Common Lisp, Java, etc. objects in one framework.

>Dynamic typing is just there to work around the lack of a good type system; look at how the ML family does it.

The ML family doesn't do it.

https://en.wikipedia.org/wiki/Multiple_dispatch

>All you're whining about in C is its historical baggage, which does make C shitty, but that's still better than having your language's core ideas be bad from the beginning.

C's core ideas were bad from the beginning. Brain damage that was known to be bad when it was made doesn't become acceptable just because it's old. PL/I doesn't have these problems. A lot of languages like Lisp, Pascal, Fortran, and BASIC had problems worth complaining about but most of them were fixed over the years. Some of the problems in Lisp were dynamic scope and not having strings (using symbols and lists of symbols instead). A lot of languages didn't have strings in the 60s but now they do.

Subject: Revisionist weenies


For reasons I'm ashamed to admit, I am taking an "Intro
to Un*x" course. (Partly to give me a reason to get back on
this list...) Last night the instructor stated "Before
Un*x, no file system had a tree structure." I almost
screamed out "Bullshit!" but stopped myself just in time.

I knew beforehand this guy definitely wasn't playing
with a full deck, but can any of the old-timers on this list
please tell me which OS was the first with a tree-structured
file system? My guess is Multics, in the late '60s.


 No.1060300>>1060302 >>1060305 >>1060396

>>1060294

>Look at CLOS or IBM's System Object Model, which was designed to support Smalltalk, C++, Common Lisp, Java, etc. objects in one framework.

Same problem as other OOP implementations: complexity without real reasons. OOP should just have been the concept of instantiation and namespacing, with no inheritance departing from the mathematical model of function input/output and its powerful simplicity.

>The ML family doesn't do it.

That's my point. ML is statically typed but with a type system expressive enough to not need dynamism.

>C's core ideas were bad from the beginning

C's (and UNIX's, for that matter) core ideas are historical baggage. They're a reaction to the extreme bloat of Multics and PL/I (which took so much time just to reach usable performance, or simply a fully compliant compiler). A reaction too extreme? Was it standardized too soon and without enough effort to think about the future? Perhaps, but still better than what was fundamentally the Enterprise (tm) Java (r) OS of its time.

Of course, now you can spout your drivel because powerful hardware exists, but implying that this applies to Multics' era and that UNIX won because of sheer luck and retardation amongst hackers is stupid.

>It has nothing to do with features. JavaScript VMs are bloated because the language is full of brain damage. Lua and Scheme are similar to JavaScript but are much smaller (and some would say more powerful). Most UNIX languages are like this, including C, C++, and Java.

This is where you don't make any sense. Praising stuff like PL/I, Ada and Multics then implying that bloat comes from UNIX and not from the Enterprise (tm) crowd responsible for your fat idols.

By the way, I use Tcl with C as an extension language, I'm not an OpenBSD tier masturbating monkey.


 No.1060302>>1060305 >>1060396

>>1060294

>That's why I hate UNIX so much. Nothing in UNIX works together. Instead of proper APIs, there are commands that output text that have to be parsed by some kludgy shit "language" like awk or sed

Just because something doesn't work exactly the way you want doesn't mean it doesn't work.

>It has nothing to do with features, it's brain damage

And where do you suppose that brain damage originates from if it doesn't come from poorly thought-out and implemented features? Is brain damage some ethereal quality that magically makes Unix languages (in your vague "anything remotely influenced by Unix or C that I don't like" sense) worse in every way?

>No, because it would be dishonest to blame an OS and language that did it right for problems that are not present in that OS and language

And yet you blame all the flaws of post-Unix and C operating systems and languages on Unix and C even when their flaws are not present in Unix and C. If you jump to the "but they're a reaction to how much Unix and C sucks" part, Unix and C were also reactions to Multics and PL/I yet you don't blame those two for Unix and C-specific problems.

>It's like the Pascal community recognizes flaws in their language and fixes them.

Except everyone and their mother chose a different way of fixing them, so a lot of Pascal code ends up tied to specific compilers and platforms.

>>1060300

Multics is a very interesting operating system and deserves more study (especially with today's more powerful hardware), but it certainly had its problems. From my research, many of the later speedups came from replacing PL/I code with assembly and Multics' performance issues were severe enough that Honeywell gave several universities free hardware upgrades to save face and keep their customers.


 No.1060305

>>1060294

based

>>1060300

unbased

>>1060302

unbased


 No.1060396>>1060428 >>1060468 >>1064349

>>1060300

>Same problem as other OOP implementations: complexity without real reasons.

There are real reasons, but they're not compatible with the UNIX way of doing things. Imagine an OS where everything is an object. Your PNGs and JPEGs are image objects that carry their methods with them. Text files embed fonts and can contain objects inside them. This was before web browsers, which spread UNIX brain damage to the rest of the world.

>no inheritance departing from the mathematical model of function input/output and its powerful simplicity.

Are you talking about Haskell? What does that have to do with C and UNIX?

>C's (and UNIX's, for that matter) core ideas are historical baggage.

No, they're brain damage. The "core ideas" were known to be bad when they were made. The only thing that changed is that a lot of people today have never seen the right way to do things.

>Was it standardized too soon and without enough effort to think about the future?

C was standardized in 1989, long after Lisp machines. There were more kinds of hardware at the time, including segmented and tagged architectures. Hardware "research" now is basically a worse way to do something we could do 50 years ago, but with the ability to run C and UNIX. Compare Intel MPX to real array bounds and descriptors.

>Perhaps, but still better than what was fundamentally the Enterprise (tm) Java (r) OS of its time.

Java is a UNIX language made by Sun.

>This is where you don't make any sense. Praising stuff like PL/I, Ada and Multics then implying that bloat comes from UNIX and not from the Enterprise (tm) crowd responsible for your fat idols.

None of the software that people complain about is written in PL/I or Ada, or runs on Multics. It's all written in C and C++ or in a language with an interpreter or VM written in C and C++. The Linux kernel is 7 MB or larger, plus a bloated RAM disk and all this other bullshit. Full PL/I ran on computers with 64 KB of memory. Multics supported tens if not hundreds of users on 1 MB.

>>1060302

>Just because something doesn't work exactly the way you want doesn't mean it doesn't work.

Having to generate and parse text, which wastes billions of cycles, really doesn't work. Look at dtoa.c some time to see how much C code is needed to convert floats to strings and back. That's called thousands of times to produce a table or XML, just to have it piped to more code to parse them back into floats.
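
To make the cost concrete, here is a minimal C sketch of the float -> text -> float round trip that every hop in a text pipeline pays for; %.17g is what it takes for a double to survive the trip intact:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    double before = 0.1 + 0.2;
    char buf[64];

    snprintf(buf, sizeof buf, "%.17g", before);  /* float -> text */
    double after = strtod(buf, NULL);            /* text -> float */

    printf("text: %s, round-trip exact: %s\n",
           buf, before == after ? "yes" : "no");  /* prints: yes */
    return 0;
}

Both conversions run nontrivial digit-by-digit algorithms; a typed interface would just hand over the 8 bytes.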

>And where do you suppose that brain damage originates from if it doesn't come from poorly thought-out and implemented features?

Brain damage comes from anti-features that have no rational explanation. Arrays are a feature. Array decay is brain damage. Numbers in different bases are a feature. Treating numbers that start with 0 as octal is brain damage. Function types are a feature. C function pointer syntax is brain damage.
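
Two of those fit in a few lines of C. Note that set_handler is a hypothetical declaration that merely mirrors the shape of signal(2); it is not a real API:

#include <stdio.h>

/* A function taking an int and a handler, returning a handler,
   spelled in raw C function pointer syntax: */
void (*set_handler(int sig, void (*fn)(int)))(int);

int main(void) {
    int dec = 10;
    int oct = 010;                /* leading zero = octal: 8, not 10 */
    printf("%d %d\n", dec, oct);  /* prints: 10 8 */
    return 0;
}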

>And yet you blame all the flaws of post-Unix and C operating systems and languages on Unix and C even when their flaws are not present in Unix and C.

The flaws in C++ and JavaScript come from copying C directly. The flaws in GNU/Linux come from copying UNIX directly. The flaws and bugs in C operating systems come from being written in C.

>Unix and C were also reactions to Multics and PL/I yet you don't blame those two for Unix and C-specific problems.

No, because Multics and PL/I did all of these things correctly. If someone said "Ford cars are too complicated" and made a "car" out of sticks and poured gas in the "tank" and burned his garage down, you can't blame Ford for that.

>Except everyone and their mother chose a different way of fixing them, so a lot of Pascal code ends up tied to specific compilers and platforms.

That's better than not fixing them at all.

Date: Tue, 24 Sep 91 14:15:45 -0700
Subject: Decay

What I find totally incredible is that the general level
of systems seems to be lower in almost every respect than it
was ten years ago -- the exception being that the machines
run ten times faster. (Well, mine doesn't; it's a straight
3600 and thoroughly canine. But everyone else's does.)

I maintain a couple of large mailing lists, one of which
has a nine-year history. Nine years ago it ran perfectly on
ITS, and then for another five years on OZ. Then we moved
it to Reagan (a bolix) which really didn't work well, but
was tolerable, but got more and more broken. We've left it
there, despite lost mail and infinite maintenance
aggravation because *we can't find any machine that we have
access to that has more reliable mailing list service*.


 No.1060428>>1061023

>>1060396

>Your PNGs and JPEGs are image objects that carry their methods with them.

That sounds even worse than UNIX braindamage.

Otherwise your post is BASED as usual.


 No.1060447

>>1059252 (OP)

gnome was always slow, you retard.

Fuck this place. You idiots still haven't fixed the captcha after weeks. You literally solve 2 captchas per post, and 100 if you have JS disabled.


 No.1060448

>>1059295

gnome has saved so much programmer time that it takes an entire minute for Eye of GNOME to open up and display an image on anything other than a high-end gayming rig


 No.1060468>>1060534 >>1061023

>>1060396

>Your PNGs and JPEGs are image objects that carry their methods with them. Text files embed fonts and can contain objects inside them.

This is hilarious coming from the man who screeches about Unix's text autism being too bloated.

>Multics supported tens if not hundreds of users on 1 MB

Late in its lifecycle, maybe. I do know that Bristol started out with one and a half megabytes in 1979 and it took several memory upgrades (including a free half-megabyte from Honeywell, nothing to sneeze at back then) to run anything near a hundred without performance tanking.

>Brain damage comes from anti-features that have no rational explanation. Arrays are a feature. Array decay is brain damage.

Languages don't emerge from a vacuum. Array decay exists for partial backwards compatibility with BCPL and B code and has some performance implications. Was it worth it? Maybe not, but pretending there's absolutely no rational reason for it is a lie.

>inb4 sacrificing safety for performance is brain damage

That would make Amigafags the pinnacle of brain damage.

>The flaws in C++ and JavaScript come from copying C directly. The flaws in GNU/Linux come from copying UNIX directly. The flaws and bugs in C operating systems come from being written in C.

Again, this boils down to your refusal to admit flaws come from anything but Unix and C, even in inconsistent "muh features" dumpsterfires.

Let's rephrase your mantra. C++ and Javascript have no flaws outside C influence. GNU/Linux has no flaws when it deviates from Unix. None of the flaws and bugs in C operating systems come from anything but C. Does this still sound sensible to you?

>No, because Multics and PL/I did all of these things correctly. If someone said "Ford cars are too complicated" and made a "car" out of sticks and poured gas in the "tank" and burned his garage down, you can't blame Ford for that.

Congratulations: you completely missed the point. Not only that, in trying to make Unix sound like absolute dogshit you're making Multics' failure sound even more embarrassing. Imagine Ford actually being driven out of business by a fucking stick car because a Ford employee was tired of his car running out of gas before reaching the grocery store.


 No.1060482

There is no reason to use Javascript for anything.

There are better, faster and even easier scripting languages like Lua.

Is KDE faster than GNOME?


 No.1060483>>1060518

I think the real question is why are you still using Gnome?


 No.1060518

>>1060483

Because WindowMaker isn't well maintained and GNOME has good plugins.


 No.1060534

>>1060468

unbased


 No.1061023>>1061128

>>1060428

>That sounds even worse than UNIX braindamage.

>>1060468

>This is hilarious coming from the man who screeches about Unix's text autism being too bloated.

Alan Kay thinks it's a good idea. Instead of each program including decoders for various file types, there's only one on the computer and it only has to be there if you want to view that type of object.

http://worrydream.com/EarlyHistoryOfSmalltalk/

>Languages don't emerge from a vacuum.

That's one of the reasons I keep bringing up PL/I.

>Array decay exists for partial backwards compatibility with BCPL and B code

BCPL is a different language and didn't run on UNIX. C had a lot of changes that made it incompatible with B, like changing =- to -= and the string terminator from EOT to null.

https://www.bell-labs.com/usr/dmr/www/btut.html

>the unsuspecting might believe that u[0] and u[1] contain the old values of v[0] and v[1]; in fact, since u is just a pointer to v[0] , u actually refers to the new content. The statement "u=v" does not cause any copy of information into the elements of u; they may indeed be lost because u no longer points to them.

In B, arrays don't decay to pointers, but "vector" variables are pointers themselves that can be assigned. That sucks too, but it destroys your "compatibility" argument.

>and has some performance implications.

All bad ones. It makes optimization more difficult and needs more complex compilers to come close to the performance of real arrays. Most of the techniques that have been known since the '70s can't be used. There's no way to assign arrays or use array operations either, and there is no way to add them to C, even though they were added to Fortran. C weenies want a 4-byte array to be treated differently than an integer that happens to be 4 bytes, but in PL/I, Ada, and other non-UNIX languages, assigning the array does the same operation of copying 4 bytes.
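
The inconsistency fits in a dozen lines of C: the same 16 bytes are assignable or not depending on whether a struct is wrapped around them (vec4 is a made-up name):

#include <stdio.h>

int main(void) {
    int a[4] = {1, 2, 3, 4};
    int b[4];
    /* b = a;   -- refuses to compile: arrays are not assignable */

    /* Wrap the same 16 bytes in a struct and plain assignment
       suddenly copies every element, so the restriction is
       syntax, not hardware. */
    struct vec4 { int v[4]; };
    struct vec4 p = {{1, 2, 3, 4}};
    struct vec4 q = p;           /* copies all four ints */

    printf("%d\n", q.v[3]);      /* prints: 4 */
    (void)a; (void)b;
    return 0;
}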

>Was it worth it? Maybe not, but pretending there's absolutely no rational reason for it is a lie.

What's rational about "compatibility" with a language with totally different syntax that didn't even run on that OS or computer? Or "compatibility" with an older language that isn't actually compatible?

>That would make Amigafags the pinnacle of brain damage.

No, brain damage is when there is no rational explanation for something; in other words, whoever made it must have brain damage, or it must have come from cosmic rays altering bits, or something other than intelligent design. Sacrificing safety for performance is not brain damage if it really does perform better (and that's acceptable for that kind of software). The problem is that UNIX weenies don't actually care about performance. I have never seen anyone change GCC to use string length prefixes and compare speed and memory usage with null-terminated strings.
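
For reference, a minimal sketch of a length-prefixed string in C; pstr and PSTR are invented names for illustration:

#include <stdio.h>
#include <string.h>

/* Length up front: finding it is O(1), and embedded NUL bytes are fine. */
struct pstr {
    size_t len;
    const char *data;   /* not required to be NUL-terminated */
};

#define PSTR(lit) ((struct pstr){ sizeof(lit) - 1, (lit) })

int main(void) {
    struct pstr s = PSTR("hello, world");
    printf("len = %zu, no scan needed\n", s.len);
    printf("strlen walks all %zu bytes to learn the same thing\n",
           strlen("hello, world"));
    return 0;
}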

>Let's rephrase your mantra. C++ and Javascript have no flaws outside C influence. GNU/Linux has no flaws when it deviates from Unix. None of the flaws and bugs in C operating systems come from anything but C. Does this still sound sensible to you?

Flaws in languages based on C come from C. Flaws in UNIX clones come from UNIX. C++ and JavaScript have additional flaws that are not present in C, but those come from hacking features from other languages onto C or using C-isms for different semantics. You can't properly add OOP, arrays, generics, exceptions, dynamic linking, or anything else to C without it sucking. Bugs in C operating systems come from C. Nearly all of those BSoDs and exploits in Windows (even though it's not UNIX) came from memory bugs caused by C code. I'm not saying other languages and operating systems have no flaws, but blaming a language or OS that does something right because UNIX weenies copied it wrong doesn't make any sense.

>Imagine Ford actually being driven out of business by a fucking stick car because a Ford employee was tired of his car running out of gas before reaching the grocery store.

Except weenies don't realize cars are supposed to have an engine and they got other companies to stop including engines too.

   > ...
> There's nothing wrong with C as it was originally
> designed,
> ...

bullshite.

Since when is it acceptable for a language to
incorporate two entirely diverse concepts such as setf
and cadr into the same operator (=),
...

And what can you say about a language which is largely used
for processing strings (how much time does Unix spend
comparing characters to zero and adding one to pointers?)
but which has no string data type? Can't decide if an array
is an aggregate or an address? Doesn't know if strings are
constants or variables? Allows them as initializers
sometimes but not others?


 No.1061128

>>1061023

>In B, arrays don't decay to pointers, but "vector" variables are pointers themselves that can be assigned. That sucks too, but it destroys your "compatibility" argument.

It's a partial compatibility feature.


The semantics of arrays remained exactly as in B and BCPL: the declarations
of iarray and carray create cells dynamically initialized with a value pointing
to the first of a sequence of 10 integers and characters respectively. The
declarations for ipointer and cpointer omit the size, to assert that no storage
should be allocated automatically. Within procedures, the language's
interpretation of the pointers was identical to that of the array variables: a
pointer declaration created a cell differing from an array declaration only in
that the programmer was expected to assign a referent, instead of letting
the compiler allocate the space and initialize the cell.

Values stored in the cells bound to array and pointer names were the
machine addresses, measured in bytes, of the corresponding storage area.
Therefore, indirection through a pointer implied no run-time overhead to
scale the pointer from word to byte offset. On the other hand, the machine
code for array subscripting and pointer arithmetic now depended on the type
of the array or the pointer: to compute iarray[i] or ipointer+i implied scaling
the addend i by the size of the object referred to.

These semantics represented an easy transition from B, and I experimented
with them for some months. Problems became evident when I tried to
extend the type notation, especially to add structured (record) types.
Structures, it seemed, should map in an intuitive way onto memory in the
machine, but in a structure containing an array, there was no good place to
stash the pointer containing the base of the array, nor any convenient way
to arrange that it be initialized. For example, the directory entries of early
Unix systems might be described in C as

struct {
    int inumber;
    char name[14];
};

I wanted the structure not merely to characterize an abstract object but
also to describe a collection of bits that might be read from a directory.
Where could the compiler hide the pointer to name that the semantics
demanded? Even if structures were thought of more abstractly, and the
space for pointers could be hidden somehow, how could I handle the
technical problem of properly initializing these pointers when allocating
a complicated object, perhaps one that specified structures containing
arrays containing structures to arbitrary depth?

The solution constituted the crucial jump in the evolutionary chain
between typeless BCPL and typed C. It eliminated the materialization of
the pointer in storage, and instead caused the creation of the pointer when
the array name is mentioned in an expression. The rule, which survives in
today's C, is that values of array type are converted, when they appear in
expressions, into pointers to the first of the objects making up the array.

This invention enabled most existing B code to continue to work, despite
the underlying shift in the language's semantics. The few programs that
assigned new values to an array name to adjust its origin—possible in B
and BCPL, meaningless in C—were easily repaired. More important, the
new language retained a coherent and workable (if unusual) explanation
of the semantics of arrays, while opening the way to a more
comprehensive type structure.
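
To see the property Ritchie is describing, here is a sketch in modern C of raw directory bytes mapped straight onto that struct, with no hidden pointer to initialize; int16_t stands in for the PDP-11's 2-byte int, and a little-endian host is assumed:

#include <stdio.h>
#include <string.h>
#include <stdint.h>

struct dirent_v6 {
    int16_t inumber;    /* the PDP-11's 2-byte int */
    char    name[14];
};                      /* 16 bytes, same as the on-disk record */

int main(void) {
    unsigned char raw[16] = {0};      /* bytes as read from a directory */
    raw[0] = 42;                      /* inode 42 (little-endian host)  */
    memcpy(&raw[2], "passwd", 6);     /* name, NUL-padded to 14 bytes   */

    struct dirent_v6 d;
    memcpy(&d, raw, sizeof d);        /* no hidden pointer to set up */
    printf("inode %d, name %.14s\n", d.inumber, d.name);
    return 0;
}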


 No.1061421

>If someone said "Ford cars are too complicated" and made a "car" out of sticks and poured gas in the "tank" and burned his garage down, you can't blame Ford for that.

Guys, guys! Multicsfag is looking into a mirror for the first time!


 No.1064349>>1064382

>>1060396

...so what you're saying is, you want to use an Array type with {pointer, size, type} instead of just pointers. Write that library and use it in your own code.
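
A minimal sketch of what such a library could look like; int_array and int_array_get are invented names, and the element type plays the role of the "type" field:

#include <stdio.h>
#include <stdlib.h>

struct int_array {
    int    *data;
    size_t  len;       /* the "size" that raw pointers throw away */
};

int int_array_get(struct int_array a, size_t i) {
    if (i >= a.len) {
        fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, a.len);
        abort();       /* trap, instead of silently reading garbage */
    }
    return a.data[i];
}

int main(void) {
    int storage[3] = {10, 20, 30};
    struct int_array a = { storage, 3 };
    printf("%d\n", int_array_get(a, 2));   /* ok: prints 30 */
    printf("%d\n", int_array_get(a, 3));   /* aborts with a message */
    return 0;
}

Of course nothing in the language makes anyone use it, which is the complaint: bounds stay opt-in library policy instead of a checked property of the array type.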


 No.1064382

>>1064349

Are you stupid or something? I'm not even a coder and even I know that in C arrays and pointers are the exact same thing, cancer

>C function pointer syntax is brain damage.

An array with one object translates to the exact same thing as specifying * in C. More than one object just means extended length. Correct me if I am wrong, based unix hater. According to cuckexchange:

>The reason that this works is that the array dereferencing operator in C, [ ], is defined in terms of pointers. x[y] means: start with the pointer x, step y elements forward after what the pointer points to, and then take whatever is there. Using pointer arithmetic syntax, x[y] can also be written as *(x+y).

There's got to be a more understandable way to communicate that. Especially if you are using multithreading, what if pointer X that is 8 bytes into array Y changes because an instruction got executed early or something? I honestly don't know.
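
The equivalence itself is easy to check; this sketch has nothing to do with threading, it just demonstrates the pointer arithmetic:

#include <stdio.h>

int main(void) {
    int y[4] = {10, 20, 30, 40};
    int *x = y;        /* the array decays to a pointer here */

    /* Indexing is defined as pointer arithmetic, so all four of these
       name the same element -- even the backwards 2[x]. */
    printf("%d %d %d %d\n", x[2], *(x + 2), y[2], 2[x]);  /* 30 30 30 30 */
    return 0;
}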


 No.1064709>>1064710

>>1059252 (OP)

Why is /tech/ always just faggots complaining about tech that other people developed? LEARN TO CODE AND CODE SOMETHING BETTER OR STFU.

/tech/gamer/


Can somebody please write 100,000 lines of code so muh vydia will run on Linux.

/tech/for_no_reason


Install Gentoo

Nobody uses arch, archbang, crunchbang, or any of that homebrew shit either. Ubuntu, Debian, Red Hat, SUSE, and CentOS are the majority of Linux installs. But really, the majority is Ubuntu/Debian, hands down.

/tech/cry_baby


The operating system that philanthropist coders have donated their code to is not good enough, and I want to bitch even though I have contributed nothing to the code base. I want to complain about something that is free but still better than Windows 10 Pro, which costs $200.


 No.1064710

>>1064709

butthurt gnome dev detected



