
/tech/ - Technology



 No.992006>>992015 >>992105 >>992182 >>992196 >>992247 >>992613 >>996067 >>996075 >>996146 >>996233 >>999386 >>999395

Why is the Unix philosophy of small, general-purpose tools you creatively string together with pipes and shellshit completely abandoned whenever people design graphical software and toolkits? It's like we forget everything that makes Unix good the moment we look beyond the shell and just copy whatever Windows and the other graphical OSes are doing.

One way of fixing this is abandoning the one window == one program idea and giving graphical programs some STDIN/STDOUT + pipes equivalent. We could break down software suites into smaller, independent programs and let users create their own workflows with these tools by dragging multiple programs into a window-like container, then connecting their STDIN/STDOUT equivalents with whatever the graphical version of pipes is. As an example, an image editing suite could be a collection of smaller programs sending their output to an image viewing program which takes input, sends its output to programs which request it (for example, filtering software), and maybe handles stuff like undoing and redoing.

An advantage of this approach is that we could solve many software suites' architectural, scalability, and responsiveness problems by breaking them down into smaller components and handing them off to the OS' process/threading model. It's also much closer to both the terminal workflow and most real-world workflows, where people prefer using small, general-purpose tools together in creative ways over monolithic and complicated single-purpose machines. The downside is that you'd need a really good process model, along with a new GUI toolkit and maybe a new display server. There are already suites like Blender which try creating what I've described within the confines of their program, but they all implement this differently and this experience never carries over to the rest of the OS.
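To be concrete, this is the kind of terminal composition I mean; the "request log" here is made-up data, the point is the plumbing:

```shell
# Four small, general-purpose tools strung together with pipes:
# emit some request lines, extract the path, sort, count duplicates,
# then rank the paths by frequency.
printf 'GET /a\nGET /b\nPOST /a\nGET /a\n' \
  | awk '{ print $2 }' \
  | sort | uniq -c | sort -rn
# /a is counted three times, /b once.
```

None of those four tools knows anything about "request logs"; the workflow exists only in the pipeline the user wrote.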

So what do you fags think?

 No.992008>>992266

Isn't Imagemagick a front-end for a variety of image manipulation programs?


 No.992009

unix philosophy sux


 No.992011

>Unix

>good


 No.992013

Lisp machine academics please respond.


 No.992014

>pipes

>good


 No.992015>>992016 >>992052 >>996074

>>992006 (OP)

>Why is the Unix philosophy of small, general-purpose tools you creatively string together with pipes and shellshit completely abandoned whenever people design graphical software and toolkits?

Because that is an inherently inefficient and overly simplistic philosophy.

>We could break down software suites into smaller, independent programs and let users create their own workflows with these tools by dragging multiple programs into a window-like container, then connecting their STDIN/STDOUT equivalents with whatever the graphical version of pipes is.

You're asking the user to code part of the program for you, and you aren't making things that much more flexible, because programs usually handle very specific kinds of data that don't really mean anything when interpreted as text.

Your very example of the image editor shows it: you don't stop to consider what the output of an image viewer should actually be. Should it output a render of the image plus layers, an object containing the plain image and the layers separately, or both?

What if a tool only needs to operate on certain layers, are you going to spam STDIN with a ton of mostly useless data or have one in/out channel for every layer?

Also the casual way you write

>and maybe handles stuff like undoing and redoing.

is telling: undoing and redoing might rely on state contained in programs other than the viewer, so the viewer alone cannot possibly handle them.

And no, using pipes and shellshit to sync and rollback state across multiple programs is not saner than putting all relevant parts in a single program.

>An advantage of this approach is that we could solve many software suites' architectural, scalability, and responsiveness problems by breaking them down into smaller components and handing them off to the OS' process/threading model.

Most of those issues come from the program logic, not from shoddy implementation: you won't magically multithread things well by splitting every program in a stupid amount of coroutines.

>where people prefer using small, general-purpose tools together in creative ways over monolithic and complicated single-purpose machines.

This fantasy is literally the opposite of the entire history of technology.

Even right now, we use massive, complicated, mostly monolithic CPUs instead of ASICs because of that: single-purpose is much cheaper and faster, and we are clever enough to use it creatively anyways.

On the other hand, "small and modular" hides massive costs in the form of the time, expertise, and effort required to get all those parts to work together without issues.


 No.992016>>992018

>>992015

>mostly monolithic CPUs instead of ASICs

CPUs are ASICs


 No.992018

>>992016

Meant FPGAs; I always mix the two up for some reason, even though "Field Programmable Gate Array" should be a pretty good clue.


 No.992052>>992395 >>992789

>>992015

>Because that is an inherently inefficient and overly simplistic philosophy

And the "just stuff everything into one program" mindset of today's monolithic software is much better and totally doesn't lead to severe performance and architectural issues. It totally doesn't hurt the system's consistency or user experience either.

>you don't stop to consider what the output of an image viewer should actually be

Not in that brief sentence, no.

>muh layers

>are you going to spam STDIN with a ton of mostly useless data or have one in/out channel for every layer?

Layer selection can be handled within the central image viewer, and the external program is fed, and spits its output back to, whatever layers you've selected, assuming said program even needs to read the layer at all; something like a brush engine doesn't unless it's doing shit like erasing or blurring. What if the user selects a different layer while a filter is processing? Add a layer ID system.

>undoing and redoing might rely on state contained in programs other than the viewer

>And no, using pipes and shellshit to sync and rollback state across multiple programs is not saner than putting all relevant parts in a single program.

I said maybe for a reason, faggot. There's multiple ways of handling it and I haven't settled on one yet.

>you won't magically multithread things well by splitting every program in a stupid amount of coroutines

I wasn't talking about multithreading alone, of course you'd assume that, but threading smaller programs (if they even need it) is generally easier than threading larger ones. Dividing these into separate programs could also minimise the time and data lost if one program locks up or crashes.

>This fantasy is literally the opposite of the entire history of technology.

Outside computer hardware design and the past couple decades of anti-consumer bullshit, mostly false.

>Even right now, we use massive, complicated, mostly monolithic CPUs instead of FPGAs because of that: single-purpose is much cheaper and faster, and we are clever enough to use it creatively anyways.

If you literally only need a single thing done quickly and efficiently, an FPGA is often the better option.

>On the other hand, "small and modular" hides massive costs in the form of the time, expertise, and effort required to get all those parts to work together without issues.

On the other hand, monolithic "do everything" software becomes an absolute nightmare to debug and fix the moment something goes wrong. Sure, designing smaller, interoperable programs requires more thought than stuffing everything into a monolithic blob, but they're also usually easier to comprehend and troubleshoot in the end.

This also applies to appliances and machinery.

tl;dr

>why isn't anon's one sentence image editor example a detailed paragraph

I'm aware this isn't fully fleshed-out, it's just an example of some stuff I'm thinking about. That's why I asked what you fags think.


 No.992105>>992172 >>992179

>>992006 (OP)

Apple had something like this: You could edit a video in iMovie, select a portion of a clip and drag&drop it into Keynote to make it part of your slides.

This only worked because Apple made QuickTime a de-facto standard on their OS. On the terminal you have text as your standard format, but what is the standard format when dealing with binary data like images or video clips? Also, the shell defines only two contact points, stdin and stdout, so daisy-chaining programs is straightforward, but how do you daisy-chain a video editor and a slideshow program?
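For what it's worth, Unix already has a partial answer to the two-contact-points limit in named pipes (FIFOs), which let one process expose several channels. A toy sketch, with all the names invented for illustration:

```shell
# stdin/stdout give a process one input and one output, but FIFOs
# let it have several. Here a "consumer" merges two channels.
dir=$(mktemp -d)
mkfifo "$dir/video" "$dir/audio"

# Two "producers" each write to their own channel in the background
# (the open blocks until a reader shows up on the other end)...
printf 'frame1\nframe2\n' > "$dir/video" &
printf 'beep\nboop\n'     > "$dir/audio" &

# ...and one consumer reads both, pairing them up line by line.
paste "$dir/video" "$dir/audio"

wait
rm -rf "$dir"
```

This still doesn't answer what the data *format* between a video editor and a slideshow program should be, but it shows the plumbing for more than two contact points already exists.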


 No.992172>>992179


https://en.wikipedia.org/wiki/OpenDoc

OpenDoc was pretty much exactly what you're talking about. Instead of being application-centric, it was document-centric, with each element in a document being editable by a "tool" that was actually an entire program, with OpenDoc itself serving as a transclusion mechanism to facilitate that in realtime.

Predictably, its main attraction for users was also its main flaw for developers.

By breaking monolithic software down into its individual features, there's no need for every competing package to reimplement every single "checklist feature", which lowers entry barriers and lets new entrants write only the minimum set of features needed for an innovative product.

For developers of dominant packages (MS Office, Adobe Photoshop, QuarkXPress, Macromedia FreeHand, etc.), breaking their bloatware apart into OpenDoc components would have been not only laborious but suicidal.

>>992105

Leading up to OpenDoc, Apple had a variety of standards forcing interoperability:

>QuickDraw/PICT

>the clipboard

>publish & subscribe in System 7

>various other container formats like QuickDraw 3D 3DMF and QuickDraw GX fonts

>AppleEvents

>recordable AppleScript applications


 No.992179>>992258 >>992395

>>992105

>but what is the standard format when dealing with binary data like images or video clips?

Whatever the application supports, I guess.

>>992172

That's another issue. Even if you managed to overcome OpenDoc's bloat problems, many developers of big proprietary software suites would piss themselves and you'd have to rely on curious freetards to implement components.

Also,

>Another issue was that OpenDoc had little in common with most "real world" document formats, and so OpenDoc documents could really only be used by other OpenDoc machines. Although one would expect some effort to allow the system to export to other formats, this was often impractical because each component held its own data. For instance, it took significant effort for the system to be able to turn a text file with some pictures into a Microsoft Word document, both because the text editor had no idea what was in the embedded objects, and because the proprietary Microsoft format was undocumented and required reverse engineering.

>Another problem was the fact that each part saved its data within Bento (the former name of an OpenDoc compound document file format) in its own internal binary format, and it was very common to find one component could not open a document created by another, even though the internal data represented similar objects (spreadsheet data for instance). OpenDoc attempted to solve this problem by allowing developers to store multiple formats to represent the same document object. For instance, it was both possible and encouraged to store a common format like JPEG along with editable binary format, but in practice few developers followed this recommendation. This problem was not unique to OpenDoc, and in fact was also experienced by the Microsoft equivalent, Object Linking and Embedding (OLE). Indeed, many years later, XML documents which attempt to perform embedding of other XML formats also encounter similar issues.

What the fuck were these fags smoking?


 No.992182

>>992006 (OP)

This never happened because the idea of pipes for graphical processes is inherently difficult to handle.

How would we even pipe between windows?

How would we handle images? How would we even signify a link between windows?

Would that be cross-wm compatible?

How do we know where the output goes?

Would all these pipes also be graphically represented?

Would they obscure your view?

What do you do about processes trying to output to the same window?

What if two processes use two different toolkits?

Can we use this with terminal pipes?

How is all this data represented?

It's just too complex to make work.


 No.992196>>992210 >>992258 >>992390 >>992395

>>992006 (OP)

You mean like GIMP does?

>The GIMP (GNU Image Manipulation program) is a graphics editor designed to be driven through an interactive GUI. But GIMP is built as a library of image-manipulation and housekeeping routines called by a relatively thin layer of control code. The driver code knows about the GUI, but not directly about image formats; the library routines reverse this by knowing about image formats and operations but not about the GUI.

(from The Art of UNIX Programming)


 No.992210>>992211 >>992214 >>996139 >>996858 >>999395


>>992196

Didn't the GIMP also integrate parts of Mypaintg recently? Might be why I've been seeing more drawfags using it.


 No.992211

>>992210

MyPaint*


 No.992213>>992258 >>992279

1. Imagemagick

2. Gimp already has a scripting mode where you can pipe together graphics operations making use of the Guile extension system.


 No.992214>>992216

>>992210

Gimp integrated the Mypaint library. You can paint in Gimp using the same Mypaint algorithm as Mypaint itself.


 No.992216>>992220 >>992221 >>996858


>>992214

So what's the point of MyPaint now?


 No.992220

>>992216

If your sole purpose is to paint digitally, Mypaint can help with that. It will have a lower memory footprint and faster startup speed because it doesn't offer all the features that Gimp provides.


 No.992221>>992357

>>992216

to not be as bloated as gimp; i mean, gimp for a long time loaded every god damn font before starting up


 No.992247

>>992006 (OP)

Start by implementing the display server in the kernel.


 No.992254>>992395

For this kind of thing to happen, a standard format should be decided upon. UNIX and POSIX had the good idea (small tools that you can compose endlessly) but a terrible implementation, because they didn't specify any interchange format. So, when you create POSIX-2 or whatever you call it, don't forget this part:

* TSV for tables (forbid \t and \n inside fields; in fact, only allow [:graph:] plus the space character)

* Don't care about trees and graphs, they're probably not needed

* NetPBM for images

* What about vector images? svg is shit, we need something simpler

* Something to replace WAV and all its extensions that fixes its 4GB limit and all its bloat; the NetPBM of audio (the pam part, though)

Anything else?
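As a sketch of why NetPBM-style formats compose well over pipes: plain PGM ("P2") is just a tiny ASCII header (magic, width/height, maxval) followed by pixel values, so a filter can be a few lines of awk. The 3x1 "image" below is made up for illustration:

```shell
# A 3x1 grayscale plain-PGM image piped through an awk filter that
# inverts every pixel value, leaving the three header lines alone.
printf 'P2\n3 1\n255\n0 128 255\n' \
  | awk 'NR<=3 { print; next } { for (i=1;i<=NF;i++) $i = 255-$i; print }'
# The pixel row 0 128 255 comes out as 255 127 0.
```

Because the format is that dumb, the "filter program" needed zero knowledge of any image library; that's the interchange property being asked for.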


 No.992258>>992279

>>992179

>Whatever the application supports, I guess.

But now you run into the same problem as that OpenDoc quote: you can pipe data between Microsoft Word and Excel, but no other application. Text does not have the problem of obscurity, and if you hit incompatibilities you can simply insert some short AWK script or something between the two programs to format the output of one into suitable input for the other.
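For example, the adapter trick looks like this; the two end "programs" are stand-ins (printf playing the producer), but the awk glue in the middle is the point:

```shell
# Producer emits comma-separated records, consumer (sort) wants
# whitespace-separated columns: don't patch either tool, drop a
# one-line awk adapter between them.
printf 'alice,42\nbob,7\n' \
  | awk -F, '{ print $1, $2 }' \
  | sort -k2 -n
# Sorted numerically by the second column: bob 7, then alice 42.
```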

>>992213

I thought Gimp uses its own Scheme-like scripting system called Script-Fu or Python, not Guile?

>>992196

How hard would it be to replace the Gimp UI? I feel like the thing holding Gimp back the most is its interface.


 No.992259

>making graphical software less usable and uglier

LOL


 No.992266

>>992008

It's a set of programs


 No.992279

>>992258

>>992213

Sorry, I got it wrong. Gimp uses the TinyScheme library to implement the Script-Fu system. It was GNU Emacs that people tried to move onto Guile rather than its own Lisp interpreter.


 No.992357

>>992221

That was fixed. Try a new Appimage or Flatpak.


 No.992390>>992597 >>996136 >>999321

UNIX pipes are virtual PDP-11 tape drives. AT&T shills convinced DOS users that their virtual tape drives are "the only way" to do modularity, and some even believe that pipes brought modularity into programming, both of which are bullshit and flat-out lies. Modularity is improved by not using pipes, which suck.

Before UNIX existed, Multics used dynamic linking to combine different segments into one process, without pretending your RAM is actually a tape drive that can only read and write one byte at a time. Segments can share code and data between programs so you can use CPU instructions and addressing instead of reading and writing virtual tapes. UNIX weenies shit on dynamic linking because their implementation sucks and because they don't want to admit Multics did it right and the UNIX way is wrong.

https://multicians.org/multics-vm.html

>The fundamental advantage of direct addressability is that information copying is no longer mandatory. Since all instructions and data items in the system are processor-addressable, duplication of procedures and data is unnecessary. This means, for example, that core images of programs need not be prepared by loading and binding together copies of procedures before execution; instead, the original procedures may be used directly in a computation. Also, partial copies of data files need not be read, via requests to an I/O system, into core buffers for subsequent use and then returned, by means of another I/O request, to their original locations; instead the central processor executing a computation can directly address just those required data items in the original version of the file. This kind of access to information promises a very attractive reduction in program complexity for the programmer.

>>992196

>But GIMP is built as a library of image-manipulation and housekeeping routines called by a relatively thin layer of control code. The driver code knows about the GUI, but not directly about image formats; the library routines reverse this by knowing about image formats and operations but not about the GUI.

That's the anti-UNIX way and has more in common with Multics. If it was the UNIX way, all of those libraries would be separate processes in separate address spaces that pretend they're running on PDP-11s with virtual tape drives.

>If there's one thing which truly pisses me off, it is the attempt to pretend that there is anything vaguely "academic" about this stuff. I mean, can you think of anything closer to hell on earth than a "conference" full of unix geeks presenting their oh-so-rigourous "papers" on, say, "SMURFY: An automatic cron-driven fsck-daemon"?

>I don't see how being "professional" can help anything; anybody with a vaguely professional (ie non-twinkie-addled) attitude to producing robust software knows the emperor has no clothes. The problem is a generation of swine -- both programmers and marketeers -- whose comparative view of unix comes from the vale of MS-DOS and who are particularly susceptible to the superficial dogma of the unix cult. (They actually rather remind me of typical hyper-reactionary Soviet emigres.)

>These people are seemingly -incapable- of even believing that not only is better possible, but that better could have once existed in the world before being driven out by worse. Well, perhaps they acknowledge that there might be room for some incidental clean-ups, but nothing that the boys at Bell Labs or Sun aren't about to deal with using C++ or Plan-9, or, alternately, that the sacred Founding Fathers hadn't expressed more perfectly in the original V7 writ (if only we paid more heed to the true, original strains of the unix creed!)

>In particular, I would like to see such an article separate, as much as possible, the fundamental design flaws of Unix from the more incidental implementation bugs.

>My perspective on this matter, and my "reading" of the material which is the subject of this list, is that the two are inseparable. The "fundamental design flaw" of unix is an -attitude-, an attitude that says that 70% is good enough, that robustness is no virtue, that millions of users and programmers should be hostage to the convenience or laziness of a cadre of "systems programmers", that one's time should be valued at nothing and that one's knowledge should be regarded as provisional at best and expendable at a moment's notice.


 No.992395>>992408 >>992747

>>992052

>And the "just stuff everything into one program" mindset of today's monolithic software is much better

Yes.

It's not perfect and there is room for improvement (the widespread development of public-facing APIs being a notable one in recent years), but it works, and products made with that model are the ones that are actually being used.

Again, nothing stops anyone from coding what you described, and if you make a good program out of it you'll be rich/famous/smugposting on a tibetan monkey shaving forum.

>>992179

>Whatever the application supports, I guess.

Congratulations, you reinvented command line arguments.

>>992196

I feel like that explains the awful UI.

>>992254

>Don't care about trees and graphs, they're probably not needed.

Is this bait?


 No.992408

>>992395

then you end up with systemd-gimpctl


 No.992597

>>992390

So basically mmaping everything is better than pipes somehow?


 No.992613>>992635 >>999447 >>999532


>>992006 (OP)

Unix is garbage, so it's kinda hard to follow its philosophy when you're trying to produce quality. Linux and the BSDs don't fucking work (and they are really just Unix, let's not pretend there is much of a difference), so there's no incentive to imitate them. I have been trying a couple of different OSs lately, like AROS and RISC OS, and they work shockingly well, no issues. It has been way too long since I installed something that actually just worked, while half of the distros out there don't work on my computers and none of the BSDs are compatible. Unix is a waste of effort. The amount of work it took for it to survive this long was enormous, and it's still a broken piece of shit anyway. There has to be something wrong there when OSs made by 6 people work better than something as huge as Unix. Really, the only practical advantage it has over Windows is that you can customize it a lot more, but at the end of the day you are just wasting time polishing a turd. Even pirating Windows 7 may actually be a better choice.


 No.992635>>992650

>>992613

Your grievances seem more like you're mostly frustrated with hardware compatibility. What computers do you have?


 No.992650>>992656 >>992666 >>999512

>>992635

I am frustrated with how much time and effort I wasted trying to use shit that doesn't work.

>What computers do you have?

I have a lot of them. Some old cheap pre-builts and some that I made (and a Pi, which I got because people finally convinced me). Still, even without compatibility issues, I am sick of dealing with shitty software. The OSs themselves have a lot of stupid issues that make no sense, and the amount of problem solving required to do anything is big enough that I might as well develop my own software (I would be doing that, but the computer I normally use for it is fucked and my setup is a mess because I have been testing a lot of things). I am really sick of having to read gigantic man pages and search the internet for information to do fucking everything. Then I decide to check out some alternatives (because Linus's cuckoldry finally made me do it, though I'd wanted to for a while) and realize that I'm just inconveniencing myself for no reason, wasting a lot of time not being productive and not having fun. I kinda miss Windows XP at this point. Well, 95 is still my favorite, so maybe I miss that. Maybe I should just install old shit and do my own thing. Fuck this decade.


 No.992656>>992663 >>992666

>>992650

I hear you. My take on it though is that *nix might not be perfect, but it's the best we have. I mean what would you even do with an old Windows install? I've fantasized about going back to 98 which was on my PC as a kid for years. But as soon as you plug an Ethernet cable into that shit, it's dicks up your ass. Anything XP and earlier is that way. You'd have to give up the internet, unless you're the type of person to have unprotected sex with south american prostitutes. So what would you do? Copy installers to CDs and try to get them to run? Buy old software off ebay and hope it's legit? It all just seems so hopeless.

At least now, I can waste time online, and play around with different programs and games. I can watch too much anime and talk to people on imageboards. I have all the entertainment in the world, constantly at my fingertips. So why does it feel so fucking unsatisfying? Why do we still wish we could go back to a better time?

I guess this turned into something totally different than what you were complaining about. I just get the feeling that everyone's dissatisfied for different reasons and nobody really knows what's wrong, or what needs to be done.


 No.992663

File: untitled.PNG (Spoiler Image, 78.3 KB, 800x600)

>>992656

>unless you're the type of person to have unprotected sex with south american prostitutes

I might be at this point. At least in this situation. Here's a picture of me doing it right now. At least the firewall is on. Wouldn't want to be too unsafe, right?

Anyway, I guess everything is horrible and nothing will ever be good again, and becoming a skeleton in a coffin will be an improvement, so the real solution is to get drunk(er) and watch more ancient horror movies. That's my solution right now, other than installing Windows XP for fun, though maybe OS/2 could be a better choice for this one. Maybe I can do that later, it's not like I have a life.


 No.992666>>992672

>>992650

That seems to be more a product of these OSs trying to run on a multitude of different systems with wildly varying hardware. Yeah, it sometimes sucks trying to hold onto what few shrinking freedoms we simple end users have left.

Linux (especially desktop) is mostly just a jumble of software built with wildly varying design philosophies, and a lot of the time the pieces don't want to play nicely. Linux is a fractured target too, with its various distributions, architectures, and installed software. Run all that on closed hardware that was built specifically for another OS and you'll have headaches. Then there's the "rolling release" shit some distros pull, which creates an even more unstable system.

BSDs are better in many ways in that the OS is designed as a whole system, but then you start adding ancillary software which may or may not mesh together well or even support the OS, and it ends up having lots of the same problems as Linux. They're still worse in other areas, like hardware support. Honestly, I'd be using a BSD full time if I didn't want to play games.

The only systems that actually work well are those whose software and hardware are designed as an inseparable combo. You see this in places like phones, game consoles, appliances, and Apple products.

As for Windows, it's almost the same... Every laptop is made to run Windows. Every piece of desktop hardware is made to run Windows. Every application is made to run on Windows. Every alternative OS is forever playing catch-up in the "just works" department. All this convenience comes at a price I'm not willing to pay, though. I fully understand I have to compensate somewhat with my own time, but I don't mind it at all. Well, mostly.

>>992656

>I just get the feeling that everyone's dissatisfied for different reasons and nobody really knows what's wrong, or what needs to be done.

Same here. Many times I wish I were ignorant enough to just stupidly use Windows, OSX, etc., completely oblivious to the amount of AIDS-laden semen being blown up my ass. I hope I'm not just wearing misshapen nostalgia goggles, but I think the future of technology is bleak.


 No.992667

because graphical software isn't shit


 No.992672

>>992666

>The only systems that actually work well are those whos software and hardware are designed to be an inseparable combo

Everything was like this in the past, but now only Apple really does it (while everyone else just supports Windows). Their hardware is absolute garbage, though. They are complete shit overall, but the ridiculously expensive inferior hardware already means that they aren't even an option. It's a shame that they suck, because they could be a viable alternative if they didn't, though really, Apple has been going to shit since the beginning. Just compare the Apple II to the Macintosh and the Apple III and you can see the beginning of the Jewish tricks. Honestly, I would rather use pirated Windows on old computers. At least it costs very little or nothing at all.

>Every laptop is made to run Windows. Every desktop hardware is made to run Windows.

Well, the current version of Windows. Compatibility with 7 will disappear eventually and I will never use Windows 10 no matter what. Not that I feel the need to buy new hardware anyway. There is nothing that I do that can't easily be done with hardware from 8 to 10 years ago.


 No.992747

>>992395

>>Don't care about trees and graphs, they're probably not needed.

>Is this bait?

I should have been more precise: for system tools, I guess.


 No.992789

>>992052

>Outside computer hardware design and the past couple decades of anti-consumer bullshit, mostly false.

Nothing from windmills to cars has become more modular in the last centuries.

Tools have become more capable, which inherently allows for some flexibility via inefficient uses, but the ever increasing complexity of them means it's harder and harder to even perform basic repairs, let alone having well separated parts that can be useful on their own.


 No.996067

>>992006 (OP)

>Why is the Unix philosophy of small, general-purpose tools you creatively string together with pipes and shellshit completely abandoned whenever people design graphical software and toolkits?

I don't think it always is. Think of it like this: when you string together a bunch of commands with pipes to accomplish something, that is the equivalent of the GUI application. Think of the GUI application as being the strung-together commands. The GUI application can be made with a variety of different libraries. Each library does a single thing and does it well (ideally). You are stringing together the functionality of these different libraries in order to solve a problem. Seriously, think about it. What is the difference between me issuing a string of curl, jshon, and ImageMagick commands from the command line vs. writing an application which leverages the libcurl, jsoncpp, and Magick++ libraries?
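A sketch of the command-line half of that comparison, with printf standing in for the curl fetch and sed doing jshon's job (the JSON document and field names here are invented for illustration):

```shell
# Pretend this printf is `curl https://example.invalid/api/image`:
# fetch a JSON blob, then extract one field from it with sed.
printf '{"title":"cat.jpg","width":640}\n' \
  | sed -n 's/.*"title":"\([^"]*\)".*/\1/p'
# prints cat.jpg
```

The library version of the same program would call libcurl and a JSON parser in-process; the data flow is identical, only the composition mechanism differs.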

I'm not saying every GUI application follows the Unix philosophy. I just think your idea of the Unix philosophy feels a bit narrow.


 No.996074>>996874 >>999424

>>992015

How about using a simple GUI to string the parts together and define how they integrate?

It doesn't have to be a shell, but I like OP's idea of modularity and general-purpose tools.

It will probably end up looking like a Simulink GUI.


 No.996075

>>992006 (OP)

>with pipes and shellshit

this is quite a retarded way to communicate once you go past trivial stuff


 No.996076

We don't have a decent image editor period, and you're asking to make it even more complicated.


 No.996136

>>992390

>one's knowledge should be regarded as provisional at best and expendable at a moment's notice

The programming world is going to kill itself with this attitude.


 No.996137

The only way to make graphical software more Unix-like is to split the GUI from the actual functionality, creating a backend and a frontend.

Then, make the GUI completely optional and up to the implementation.


 No.996139

>>992210

>gimp has mypaint and a functional text tool

KRITA BTFO


 No.996146>>998791

>>992006 (OP)

Smalltalk.


 No.996233

>>992006 (OP)

There are programs like that.

Like the "non" music DAW, which splits itself up into mixer, tracker, and so forth.


 No.996241>>996253 >>996388 >>999447

when the fuck do you compose anything in unix? almost never. all you can do is pipe some shit to some other shit and then if that doesn't crash it because it needs to be escaped in 30 different ways, it probably works. UNIX is shit nigger. don't compare UNIX to the pinnacle of composability. as for GUIs, they're all made by niggers, that's the only reason they're shit


 No.996253>>996874 >>998791 >>999424 >>999447

>>996241

Almost always. Let me describe a simple programmer's workflow. You open a text file in your editor, which itself is running in your shell. You make some edits, then close the editor. You open a makefile and add a few lines. The first takes the text file and compiles it into an object file. The next takes the object file and links it with libraries to form a binary. You take the binary and open it in a debugger, which runs your program. In this simple example, you use a terminal, a shell, an editor, a compiler, a linker, and a debugger for the relatively simple task of writing a program. None of these steps involve pipes, but we could easily add one: the programmer wants to find a file which he knows is somewhere in a directory tree, so he uses find to list all files and grep to filter out only the ones he wants.

You might think this all sounds trivial but in fact the majority of modern software completely ignores this principle. Take the modest IDE. Every single one completely reimplements from scratch: text editor, terminal, debugger, file browser, build system, vcs (at least porcelain), and any other tools a programmer might need. This means that all IDEs are bloated pieces of shit, and you need to relearn them from scratch for every new language you program in.
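The find-plus-grep step above looks like this (hypothetical directory tree, created just for the demo):

```shell
# find lists every file; grep filters for the ones we care about
mkdir -p /tmp/proj/net /tmp/proj/ui
touch /tmp/proj/net/tcp.c /tmp/proj/ui/draw.c
find /tmp/proj -type f -name '*.c' | grep net
```

Neither tool knows about the other; the shell's pipe is the only glue.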


 No.996388

>>996241

If you use a sane shell (zsh) and sane coreutils (GNU-like), you can have NUL in pipes AND in variables. No more problems, then.
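The usual portable workaround for awkward filenames is the NUL-delimited convention, which works with GNU-like coreutils regardless of shell (hypothetical files, created for the demo):

```shell
# NUL-delimited pipeline: filenames with spaces (or newlines) survive intact
mkdir -p /tmp/nul-demo
touch '/tmp/nul-demo/a file.txt' /tmp/nul-demo/plain.txt
find /tmp/nul-demo -name '*.txt' -print0 | xargs -0 -n1 echo | sort
```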


 No.996858

>>992210

It looks like the artist used the standard hard/soft round brushes that have always come with GIMP and didn't use MyPaint brushes at all.

>>992216

I find MyPaint's interface a lot nicer than GIMP's for drawing. The biggest things MyPaint needs to implement are basic selection/transformation tools. Even AzPainter has them.

Basic adjustment filters like HSV wouldn't hurt either.


 No.996874

>>996253

>Take the modest IDE. Every single one completely reimplements from scratch: text editor, terminal, debugger, file browser, build system, vcs (at least porcelain), and any other tools a programmer might need. This means that all IDEs are bloated pieces of shit, and you need to relearn them from scratch for every new language you program in.

Modern IDEs and software suites are basically their own operating systems.

>>996074

>How about using a simple gui to string the parts together and how they integrate?

>It will probably end up looking like a simulink gui

That's what I'm interested in: a graphical way to represent inter-process communication and connect programs.


 No.998748>>998768 >>998808

Give us an example of a Unixlike UI.


 No.998768

>>998748

sxiv and imv, I guess.


 No.998791>>998849 >>998887 >>999395

>>996146

lol pretty much exactly

>>996253

That's fantasy workflow and it ONLY works when you are doing it nearly everyday. Once you stop for a month you'll forget all those retarded gdb commands. You'll have to look up gcc switches all over again, using your own Makefiles. God forbid you lose your text editor rice setup. And you'll realise how much you rely on shell history and autocomplete.

Compare that to any single IDE. The learning curve is non-existent.

So please let's not jerk off over how intuitive and simple one of the most unintuitive collections of software is.


 No.998808>>998887 >>998908

>>998748

Plan 9's rio.

A program sees three files in /dev: window, for drawing; cons, for reading keyboard input; and mouse, for reading mouse input (it can also be written to set the pointer position).

These files can be multiplexed by the program, which means you can have an instance of rio inside rio.

This allows things like taking a screenshot by just filtering the device file through a PNG converter into a file:

[code]
#entire screen
topng < /dev/screen > yourfile.png

#current window
topng < /dev/window > yourfile.png

#other window (n is the window's number)
topng < /dev/wsys/n/window > yourfile.png
[/code]


 No.998849

>>998791

That's why simple stuff is the best. So you can't forget it.


 No.998887>>998908 >>998962

>>998791

>That's fantasy workflow

I can attest to it

>it ONLY works when you are doing it nearly everyday. Once you stop for a month you'll forget all those retarded gdb commands. You'll have to look up gcc switches all over again, using your own Makefiles.

If you use a tool a couple times, it will take a lot longer than a month to forget how to use it

>God forbid you lose your text editor rice setup.

God forbid. Remember to always keep backups

>And you'll realise how much you rely on shell history and autocomplete.

Not really. Both nice for saving time, but I have everything memorized. It's IDEfags that need autocomplete.

>Compare that to any single IDE. The learning curve is non-existent.

For trivial stuff, sure. Whenever I hit something more complicated I generally end up opening a terminal.

>let's not jerk off over how intuitive and simple one of the most unintuitive collections of software is.

I didn't jerk off over intuitiveness. Still, knowing how everything works underneath makes it a lot easier to fix when it breaks. Having to consult a man page doesn't make something unintuitive.

>>998808

That sounds really neat actually. What's the best way to try plan9? I used 9front in virtualbox, but I couldn't get past the install.


 No.998908>>998962

>>998808

plumbing rules and custom scripts are the most god-tier GUI possible.

>>998887

You really want to RTFM the install dude, it's confusing at first because it handles drives differently to what you expect.


 No.998962>>998991 >>999447

>>998887

Try 9front in a virtual machine. Regarding visuals, it won't blow your mind, but if you're a programmer, try writing trivial programs and you'll see how Plan 9's simplicity shines. For instance, there's no "select" call to see which FID has data available; you simply rfork a new process to do this for you. rfork is really impressive for how it lets you control resource sharing with the parent. Even FreeBSD implemented it (see rfork(2)).

>>998908

>plumber

Yes, the plumber pretty much solves many of the things people are debating here. Need this file opened? Just plumb it. If you have the acme editor open and plumb a file from another terminal, it uses the same instance of acme instead of creating a new window. It's incredible how many text editors today don't do this, starting a new instance without the state of the one already running.
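For anyone who hasn't seen one, a plumbing rule looks roughly like this. This is paraphrased from memory of the image rule in Plan 9's /sys/lib/plumb/basic, so treat the exact patterns as a sketch rather than gospel:

```
# send any text naming an image file to the page(1) viewer
type is text
data matches '[a-zA-Z0-9_\-./]+\.(jpe?g|gif|png)'
arg isfile $0
plumb to image
plumb start page $file
```

The rules live outside any one program, which is why every application on the system gets the same behaviour for free.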


 No.998991

>>998962

It's just the way processes work in Unix. If you run ed or vi in separate terminals, you have two entirely different copies of those editors, each with their own memory, file descriptors, etc. Emacs did it differently because it was always a memory hog and it wants to be your OS.


 No.999117>>999125 >>999149

File (hide): c46a3c2a7b24e22⋯.gif (1.06 MB, 360x270, 4:3, 1509534278484-co.gif) (h) (u)

The Unix Philosophy is shit and becomes a nightmare with complex programs. No kernel can follow it and X doesn't either.

Gates got it, Torvalds got it, Stallman got it. Everyone got it except the retards at BSD, whatever UNIX OS is still in development (or should I say "life support"?), and the two guys working on Plan 9. Guess what? The world runs on Windows, Android, and GNU's Not Unix. Things could be better and there is a lot of room for improvement, but going back to UNIX is not the answer.

It's been almost 50 years since 1970. Let UNIX die already.


 No.999125

>>999117

And most realtime and/or important stuff runs on microkernels like QNX, OKL4, or VxWorks. Was there a point besides the usual argumentum ad populum?

The only meaningful thing you implied is that the UNIX philosophy is better suited for user space than kernel space.


 No.999149>>999376 >>999447

>>999117

Hey, guy that shows up just to spout your opinion without adding anything constructive. Did you forget your gigantic quote wall this time?

>The Unix Philosophy is shit

Why is it shit? Explain or, at least, point to an article explaining it. The Unix philosophy is basically small, simple programs cooperating to achieve a bigger task. Are you against that?

>and becomes a nightmare with complex programs.

How so?

>No kernel can follow it and X doesn't either.

The philosophy is about user-space programs using facilities from the kernel to cooperate, not about the kernel by itself. By X you mean X11? It was designed for a single purpose at MIT and then its scope went way beyond the original idea when it was adopted by everyone else. Unix should have used the Blit instead of X11.

>Gates got it, Torvalds got, Stallman got it.

Gotta need citations there.

>Everyone got it except the retards at BSD, whatever UNIX OS is still in development (or should I say "life support"?), and the two guys working on Plan 9. Guess what?

>The world runs on Windows, Android and GNU's not Unix.

That's an appeal to popularity, and the fact that GNU stands for "GNU's Not Unix" doesn't mean a rejection of Unix; it's just a name.

>Things could be better and there is a lot of room for improvement, but going back to UNIX is not the answer.

I agree with this. We should have been using an evolution of Unix like Inferno or even The Octopus from LSUB, but we're stuck with 70s state-of-the-art computing, using abstractions like teletypes, thanks to HP, IBM, DEC, and Sun with their Unix wars of the 80s and 90s, and now Linux.

For anyone more interested about the Unix philosophy, have a read of this paper.


 No.999321

>>992390

sad but true


 No.999376>>999387 >>999389 >>999404 >>999405

>>999149

Linus considers the 'Unix way' to be a guideline. Stallman rejects it entirely. Just look at Emacs.


 No.999386

>>992006 (OP)

>One way of fixing this is abandoning the one window == one program idea

The Mac way: even if a program has a ton of windows open, it's still presented in the dock as a single instance of a program.


 No.999387>>999390

>>999376

You weren't asked to rephrase your point, but to provide a source for your claim.

And Emacs is designed as many small Lisp components working together, passing pointer-like structures around, on top of a small base of C routines for speed.


 No.999389>>999447

>>999376

That's ironic, because GNU Emacs is often cited in Unix Haters when detailing the flaws of Unix. It's certainly far more amiable towards Unix environments than it is Windows land.


 No.999390

>>999387

BTW, check the chapter "How Lists are Implemented" in the book "An Introduction to Programming in Emacs Lisp" available in Emacs itself for source.


 No.999395>>999399

File (hide): e23c60c016a6fd0⋯.png (2.58 MB, 1080x1086, 180:181, ClipboardImage.png) (h) (u)

File (hide): ec95e81815d64c4⋯.png (2.68 MB, 1288x1294, 644:647, ClipboardImage.png) (h) (u)

>>992006 (OP)

>Making graphical software more unixlike

>I have not yet discovered ffmpeg, ImageMagick, or g'mic

Before and after pics from using ImageMagick. git gud

>>992210

thanks for the tip. I ditched Krita for MyPaint, but it crashes too often (less than Krita though). Good to be able to ditch it too and stick with one.

>>998791

>That's fantasy workflow and it ONLY works when you are

...organized. The only addition the IDE brings is organizing everything into one app, ergo you only find it essential if unorganized. God forbid your riced IDE isn't available to edit your code.


 No.999399>>999406

>>999395

ImageMagick and ffmpeg are great stuff, but I was talking about making software with graphical user interfaces more unixlike.


 No.999404

>>999376

What is HURD?


 No.999405

>>999376

What is HURD?


 No.999406>>999407

>>999399

>graphical user interfaces more unixlike.

unix-like = command line for me, but ymmv

Basically you would be re-creating command lines with a graphical overlay, otherwise re-inventing the wheel.

However if taking the re-invent approach, it would be (like an omni-wheel to a standard wheel) a pretty futuristic GUI, and long overdue. The data passing between the elements seamlessly is the core element, and where I imagine the major headache would lie.


 No.999407

>>999406

>recreating command lines with a graphical overlay

>reinventing the wheel

More like making the OS more consistent. I may try hacking something together using 9P.


 No.999424>>999453

>>996074

>How about using a simple gui to string the parts together and how they integrate?

The fundamental issue with modularity is that for it to work you need the user to:

1. Know very well the tools at his disposal.

2. Make a good model of the problem in relation to the available tools.

3. Figure out a way to create a solution by combining those tools.

Simulink is in fact something aimed at engineers, who are extremely qualified and intelligent workers that do this kind of problem solving for a living: for them, such modularity is perfect.

But the average person is not an engineer and does not have decades of practice in abstract problem solving: for them, sensible defaults are a must and modularity is not directly relevant. It can be a plus if it allows for extensions/plugins, but that's it.

I think having good APIs and a codebase built with plugins in mind is far more important than the extreme modularity suggested in the OP.

>>996253

>This means that all IDEs are bloated pieces of shit

No, it means IDEs are easy to use, easily portable across different systems, and an example of good encapsulation as they only show the user the parts they want to work on.

Bloat is an issue (rewriting everything in JS), but using your own text editor instead of relying on whatever non-standard editor is installed on the system is good practice and not bloat.


 No.999447>>999457 >>999464

>>992613

>Unix is a waste of effort. The amount of work that it took for it to survive this long was enormous and it's still a broken piece of shit anyway. There has to be something wrong there when OSs made by 6 people work better than something as huge as Unix.

That's exactly right. Even the amount of work put into Multics and other mainframe OSes is minuscule compared to Linux, and most of that was R&D work, like inventing new hardware architectures, file storage structures, programming languages, storage devices, and so on. Linux needs 15,600 "programmers" just for the kernel of a clone of a PDP-11 OS.

>>996241

UNIX weenies mean the ability to replace a UNIX "tool" with a GNU "tool" that behaves identically. What really sucks is that everything is based on text instead of APIs and binary, so you end up never being able to fix text formats because computers depend on text being a specific sequence of bytes instead of being for people to read.

>>996253

>In this simple example, you use a terminal, a shell, an editor, a compiler, a linker, and a debugger, for the relatively simple task of writing a program.

That's the UNIX workflow. The Lisp, Smalltalk, BASIC, and FORTH workflow is using a REPL/prompt to directly alter the program and state of the machine. All of that is combined into one program, the opposite of the UNIX philosophy.

>>998962

>For instance, there's no "select" call to see which FID has data available, you simple rfork a new process to do this for you.

That's as "simple and elegant" as using a PDP-11 tape backup program to copy a directory hierarchy.

>>999149

>Unix philosophy is basically small, simple programs cooperating to archive a bigger task. Are you against that?

That's bullshit. UNIX pipes are much worse for making programs cooperate than dynamic linking and Multics segments. Most uses of pipelines are because the shell sucks at text processing so you need all these other programs that also suck.

>I agree with this, we should've been using an evolution to Unix like Inferno or even The Octopus from LSUB but we're stuck with 70's state of the art computing using abstractions like teletypes thanks to HP, IBM, DEC and Sun for their Unix wars from the 80's ~ 90's and now Linux.

Calling UNIX "70's state of the art computing" is like calling shitting in the streets "70's state of the art plumbing." Xerox Alto, VMS, and many other good systems are from the 70s. I never heard of "The Octopus" but it's more UNIX bullshit with the same UNIX problems.

http://lsub.org/ls/octopus.html

>The system derives from Plan B, therefore it is heavily influenced by Plan 9, and shares most of the source code with both systems.

Plan 9 itself shares most of its source code with UNIX.

>>999389

The problem with GNU Emacs is that it's bloated due to running on top of UNIX. Most of the UNIX Haters were Lispers who like Emacs and used a variant on ITS, Multics, or Lisp machines. They hate how much code GNU Emacs needs compared to the other versions because C and UNIX suck.

With respect to Emacs, may I remind you that the original
version ran on ITS on a PDP-10, whose address space was 1
moby, i.e. 256 thousand 36-bit words (that's a little over 1
Mbyte). It had plenty of space to contain many large files,
and the actual program was a not-too-large fraction of that
space.

There are many reasons why GNU Emacs is as big as it is
while its original ITS counterpart was much smaller:

- C is a horrible language in which to implement such things
as a Lisp interpreter and an interactive program. In
particular any program that wants to be careful not to crash
(and dump core) in the presence of errors has to become
bloated because it has to check everywhere. A reasonable
condition system would reduce the size of the code.

- Unix is a horrible operating system for which to write an
Emacs-like editor because it does not provide adequate
support for anything except trivial "Hello world" programs.
In particular, there is no standard good way (or even any in
many variants) to control your virtual memory sharing
properties.

- Unix presents such a poor interaction environment to users
(the various shells are pitiful) that GNU Emacs has had to
import a lot of the functionality that a minimally adequate
"shell" would provide. Many programmers at TLA never
directly interact with the shell, GNU Emacs IS their shell,
because it is the only adequate choice, and isolates them
from the various Unix (and even OS) variants.

Don't complain about TLA programs vs. Unix. The typical
workstation Unix requires 3 - 6 Mb just for the kernel, and
provides less functionality (at the OS level) than the OSs
of yesteryear. It is not surprising that programs that ran
on adequate amounts of memory under those OSs have to
reimplement some of the functionality that Unix has never
provided.


 No.999453>>999615

>>999424

>easily portable across different systems

Just a fancy way of saying unintegrated with the OS. Coming from Windows I can see why you'd think this was a benefit, but on most systems the OS is there to help you.


 No.999457>>999495

File (hide): a3b7096232ec6a4⋯.png (307.13 KB, 500x465, 100:93, 1433860135028.png) (h) (u)

>>999447

>shits on Unix while promoting even less productive operating systems, languages, and their lazy communities

If you hate Unix so much, stop whining about it like a little bitch and write some fucking code. Prove the Unix weenies wrong with working software inb4 you point to some proprietary OS that we can't test without paying thousands of dollars to some greedy jews.


 No.999464>>999475

>>999447

The Multics guy is back, I'm sure your PDP11 is up and running because it's the only thing that can run Multics.

>That's as "simple and elegant" as using a PDP-11 tape backup program to copy a directory hierarchy.

Forking in Plan 9 is a cheap operation[1a], so instead of implementing more system calls, you just use what the system has.

>That's bullshit. UNIX pipes are much worse for making programs cooperate than dynamic linking and Multics segments.

How is it worse? How do Multics segments do better? Wait, no one uses Multics anymore, never mind. And how can you dynamically link on the fly like you can with pipes?

>Xerox Alto, VMS, and many other good systems are from the 70s.

Of all these systems, only VMS is still alive, as OpenVMS. According to the Wikipedia page, it looks good, with stuff like the Common Language Environment, so why aren't you using it? For curiosity's sake, also check how much the Windows NT kernel took from VMS due to MS hiring developers from DEC[2].

>I never heard of "The Octopus" but it's more UNIX bullshit with the same UNIX problems.

I have a more open mind than you. I never assumed Multics was bad before reading about it on multicians.org, indeed it was a nice system, but bloated and trapped in the PDP hardware, unable to adapt and died like the dinosaur it was. But I salute it for file hierarchy and other good ideas.

>Plan 9 itself shares most of its source code with UNIX.

No. [1b]

Plan 9 borrows many ideas from Unix, while improving them. The code is different.

[1]http://doc.cat-v.org/plan_9/4th_edition/papers/9

[1a]A single class of process is a feasible approach in Plan 9 because the kernel has an efficient system call interface and cheap process creation and scheduling.

[1b]Producing a more efficient way to run the old UNIX warhorses is empty engineering

[2]http://archive.is/AMN4f Windows NT and VMS: The Rest of the Story


 No.999475>>999482

>>999464

>PDP11

>Multics

Sad bait, but he'll probably fall for it.


 No.999482>>999498

>>999475

It wasn't bait; I was wrong. Yep, the GE-645 was the computer designed to run Multics. It just happens that the PDP line was around when Multics was being conceived and was popular at the time. My point stands: Multics was trapped in a specific architecture, incapable of adapting, and died.


 No.999495

>>999457 Unix-like OSes have ever been the best OSes!


 No.999498

>>999482

It's great bait because the Multicuck really, really hates the PDP-11 for being the original Unix machine. Multics was designed for specific mainframes and, like most of his favourite OSes, depended on unusual hardware features with tradeoffs. The entire reason Unix exists is because his wunderOS didn't scale down to a cheaper minicomputer, so some ex-Multicians made a "good enough" operating system which just happened to be inherently portable. Most of his complaints are nitpicking about legacy holdovers from the PDP-11 days and trying to blame software bloat and bad 80s corporate software on the Unix philosophy.

Again, the Multicuck's biggest problem is that he sees an operating system as a collection of features rather than a cohesive whole. Every time you ask how he'd fix Unix, he goes full Poettering and suggests slapping on his favourite shit from other operating systems with no regard for how well it would fit the rest of the OS. If he really knew his shit he'd be contributing to an existing non-Unix OS or writing his own, but instead he's bitching about it on the internet in hopes one of us will write his dream cluttered-box-of-features OS for him.


 No.999512

>>992650

NetBSD works well on the RPi boards. The support for that hardware is in fact much better than for basically all the other ARM boards. See here:

http://wiki.NetBSD.org/ports/evbarm/raspberry_pi/


 No.999532

>>992613

>Linux and the BSDs don't fucking work

Myself and anyone with a LAMP stack would like to have a word with you.


 No.999615>>999634 >>999668

>>999453

>Just a fancy way of saying unintegrated with the OS.

Which is exactly what you want on a large enough project.

You don't want to rely on the OS unless strictly necessary, so you aren't stuck on a suboptimal OS as easily.

Imagine coding something that relies on the X server and then having to redo it all when wayland becomes virtually mandatory.


 No.999622

Lisp machines are a funny metaphor for communism. :^)


 No.999634>>999668 >>999751

>>999615

A software library is a library of software functions. When you have multiple libraries that are conglomerated to form a big cohesive library spanning a wide range of general software functionality, this situation is often referred to as a software platform. It is very normal for programmers to target specific software platforms rather than to recreate a targeted subset of what a software platform provides. When your software targets the X11 protocol, that is a design choice that the programmer makes. If the team decides for the program to target the Wayland platform, that will take significant effort. However, I would bet that this effort is smaller compared to the effort required to independently implement the tiny subset of needed functions that are offered by the X11 platform or the Wayland platform.


 No.999668>>999751

>>999615

As someone writing software, you are interested in minimizing the amount of work you have to do. That's why you probably prefer writing webapps over real applications. As a user, I prefer software that uses familiar idioms, and integrates with the rest of my OS, so that it minimizes work for me. As such, I prefer software that is a library first and foremost, which works completely independent of the OS, then has a layer of UI code surrounding it to help me get my work done. Besides allowing for the application to be ported between OSes, this allows someone to consume the library directly in their programs. They could even write their own custom ui.

>>999634

I believe he would recommend that you use some framework like qt or electron rather than target the windowing system directly.


 No.999751

>>999634

>It is very normal for programmers to target specific software platforms rather than to recreate a targeted subset of what a software platform provides.

Sure, because most programs don't have the requirements an IDE is expected to offer nowadays.

If my game has a tiny difference in chat text formatting between Linux and Windows, that's unlikely to even be noticed, while an IDE formatting text differently on different systems could be a big source of headaches.

Combine that with the extreme portability demands, and forking your own solutions starts to sound good.

>>999668

>That's why you probably prefer writing webapps over real applications.

I can't say writing webapps is a pleasant experience, especially since performance requirements bite you in the ass as soon as you try to do anything interesting with them.

>As such, I prefer software that is a library first and foremost, which works completely independent of the OS, then has a layer of UI code surrounding it to help me get my work done.

Which is in general a wonderful idea; it's so good that even the most webshit IDEs follow it by having some impressive internal modularity. Of course, they try to lock down user-facing modularity so they can jew you out of money for dark themes and such.

Also, I would only recommend Electron to my worst enemy, and even then only if they really pissed me off. Qt is pretty neat though, even if you were developing only for Windows.


 No.1000019>>1000068


 No.1000068

>>1000019

that seems pretty cool, and a good example of sensible modularity as it's based off real life workflows.



