
/tech/ - Technology


File: Real_Sonic.gif (4.91 MB, 320x320)


No.901458>>901462 >>901464 >>901497 >>901754 >>901758 >>901942

What is the ideal browser for people on this board, and what language should it be written in? How hard is it to make a browser from scratch, and what skill level is needed for a person or a group to build one?

 No.901462>>901477 >>901497 >>901500

>>901458 (OP)

>What is the ideal browser for people on this board

It should be fast, secure, open-source, botnet-free, have a clean codebase, be minimal, yet be able to handle both older websites and newer websites (HTML5, CSS3, JavaScript (at the user's option), and when appropriate, ES6).

>and what language should it be written in?

Rust, of course.

>How hard is it to make a browser from scratch, and what skill level is needed for a person or a group to build one?

As with any programming project, the hard part is coming up with all of the good ideas, running the Github and Slack, and creating the artwork/design. After we've gotten that nailed down, it should be pretty easy for a few anons to actually write a browser that meets the above criteria.

I'll set up the Github and get started on the logo.


 No.901464>>901477 >>901497

>>901458 (OP)

>What is the ideal browser for people on this board

Subjective, but I prefer a fork of Firefox. On non-GNU/Linux systems I prefer Waterfox, and on GNU/Linux I like GNU IceCat.

>and what language should it be written in?

Currently, I think Firefox is written mostly in C++, although its new CSS rendering engine and probably some other shit will be in Rust.

>How hard is it to make a browser from scratch, and what skill level is needed for a person or a group to build one?

Using an already-created web rendering engine like WebKitGTK, QtWebEngine, or others like that? Shouldn't be that hard.

Building literally everything from the ground up? Hard as fuck. See Netrunner.


 No.901477

>>901462

Is there a browser that does not handle old websites well? What could cause that, and are there any examples of browsers failing to handle sites well? Also, for the new websites, aren't they horribly written, or do they just use a heavy amount of JavaScript or Flash, like Amazon? Also, would you define minimal? That can mean many things to an individual. On a related note, why would someone even put a botnet in their own browser? By that I mean, what advantage does the developer or company gain from it in the first place?

>>901464

Would there be any disadvantage in using a rendering engine like the ones you mentioned, or is it fine to use one?

I would like to have a better-running Firefox that did not get rid of the old-fashioned add-ons, but from the looks of the other forks that may be possible, so I am wondering whether something similar should be built from scratch.

On an unrelated note, how good would a browser have to be for you to pay for it, whether one time, as a recurring subscription, or just as a donation?


 No.901483>>901504

>>901477

>Is there a browser that does not handle old websites well? What could cause that, and are there any examples of browsers failing to handle sites well?

Dillo. Plenty of old websites use previous versions of HTML, and few websites, old or new, actually contain 100% valid markup. As a result, browsers like Firefox and Chrome contain a ton of code that tries to account for all of this malformed markup but still display the page the way the author intended. Dillo doesn't. If your webpage isn't valid markup, there's no guarantee that it will render correctly, and it often doesn't.

So if you want to create a browser that most people will find acceptable, you'll need to ensure that it takes a whole lot of badly written webpages into account, as well as old HTML standards.
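
To make the tag-soup point concrete, here is a toy sketch in Rust (not a real parser; the function and its behaviour are invented for illustration) of the forgiving approach: scan for tag names and shrug off unclosed or sloppy tags instead of rejecting the page.

// Toy illustration of forgiving tag handling; not a real HTML parser.
fn tag_names(html: &str) -> Vec<String> {
    let mut tags = Vec::new();
    let mut rest = html;
    while let Some(start) = rest.find('<') {
        let after = &rest[start + 1..];
        // A strict parser would demand a proper '>'; here we stop at whatever
        // comes first and shrug off the rest.
        let end = after
            .find(|c: char| c == '>' || c == '<' || c.is_whitespace())
            .unwrap_or(after.len());
        let name = after[..end].trim_start_matches('/').to_lowercase();
        if !name.is_empty() {
            tags.push(name);
        }
        rest = &after[end..];
    }
    tags
}

fn main() {
    // "<p>hello<P>world<br" never closes anything and even drops a '>', yet
    // the forgiving scanner still yields something usable.
    println!("{:?}", tag_names("<p>hello<P>world<br")); // ["p", "p", "br"]
}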

>Also, for the new websites, aren't they horribly written, or do they just use a heavy amount of JavaScript or Flash, like Amazon?

Flash is dying. Lots of modern websites use a lot of js. Whether they're horribly written or not varies and is largely a subjective judgment anyway.

>Also, would you define minimal? That can mean many things to an individual.

Merriam-Webster sez:

> : relating to or being a minimum: such as

>a : the least possible

> a victory won with minimal loss of life

>b : barely adequate

> a minimal standard of living

>c : very small or slight

> a minimal interest in art

I'm talking about meanings a and c.

>On a related note, why would someone even put a botnet in their own browser? By that I mean, what advantage does the developer or company gain from it in the first place?

Do you consider a billion or more U.S. dollars to be an advantage? Google didn't botnetify Chrom(e/ium) on a lark because they were bored and couldn't figure out anything else to do.


 No.901497>>901504 >>901874

>>901458 (OP)

>What is the ideal browser for people on this board

A browser that renders only text and images. Anything else, like remote code execution, is a security issue and will slow it down.

It should at least have options for anonymized P2P mesh-network navigation, with an integrated content firewall like uMatrix to filter what gets loaded.

Video links and other content that is normally rendered in a modern browser should automatically be opened in the appropriate external software (with an option to turn this on or off).

It must be under the GPLv3+
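
A rough sketch in Rust of that hand-off, using only the standard library; the file-extension check and the mpv/xdg-open choices are illustrative assumptions, nothing more:

use std::process::Command;

// The browser never decodes video itself; it just hands the URL to an
// external player and moves on.
fn open_externally(url: &str) -> std::io::Result<()> {
    let handler = if url.ends_with(".webm") || url.ends_with(".mp4") {
        "mpv" // example player; could be anything the user configures
    } else {
        "xdg-open" // fall back to the desktop's default handler
    };
    // spawn() rather than status(): don't block the UI waiting for the player.
    Command::new(handler).arg(url).spawn()?;
    Ok(())
}

fn main() {
    if let Err(e) = open_externally("https://example.com/clip.webm") {
        eprintln!("could not launch external handler: {e}");
    }
}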

>and what language should it be written in?

C, Ada, or one of the Lisp-family languages.

>>901462

baito

>have a clean codebase, be minimal,

>(HTML5, CSS3, JavaScript

Choose one.

>>901464

> I prefer Waterfox

Placebo


 No.901500>>901504

>>901462

>HTML5, CSS3, JavaScript

It's impossible to do this without a big team.

The moment you add JS to your browser, it automatically becomes a mess that is impossible to secure and maintain over time.


 No.901504>>901522 >>901616 >>901874

>>901483

Amazon takes a long time to load and gets buggy, so that is why I consider it bad, but I am not sure if it is just a Waterfox thing on my part. Also, is what you are mentioning about the old sites solely an issue with the minimalist browsers, or does it affect the more packaged browsers as well? Also, it would be nice to know why you prefer minimalist browsers over more packaged browsers. I would assume that it is more than just that they run much better than the bigger browsers, right?

>>901500

Is it impossible not to use JavaScript, or at least to minimize its usage?

>>901497

What browser comes close to fulfilling that description?

Also, two other things I would like to ask everyone: what is the browser that is closest to your ideal browser, and how would this browser fund itself without botnet, by starting a non-profit like Mozilla or just running ads on their website like Pale Moon?


 No.901510

We already have the ideal browser - it's Links in graphics mode.


 No.901522>>901596 >>901609 >>901874

>>901504

First of all, forget about the anons saying your browser shouldn't include JavaScript. That's a quick way to the grave. 95% of sites will be unusable. Unless you want a browser only for anons to browse 8ch, JavaScript must be there.

>Also, two other things I would like to ask everyone: what is the browser that is closest to your ideal browser

Concept-wise, Otter Browser, but it's a long way from being usable (no addons).

>and how would this browser fund itself without botnet, by starting a non-profit like Mozilla or just running ads on their website like Pale Moon?

Mozilla has been funded by Google, Yahoo, etc. I don't know, try donations. The only ethical way.

As for the actual browser: an old-looking UI. None of the shiny, slick shit of FF or Chrome. No dialog boxes or other stuff popping out of nowhere bothering the user.

Privacy focused. Nothing whatsoever sent out without the user's explicit permission. uMatrix, SSL Enforcer, Decentraleyes built in. Addon support should not be required if the functionality is there by default. User agent would be the newest Firefox (or Chrome) by default. One-button Tor toggle.
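
One way to read "nothing sent without permission" is a default-deny table keyed by origin, roughly in the spirit of uMatrix. A minimal Rust sketch; the struct and field names are invented for illustration:

use std::collections::HashMap;

// Default-deny, uMatrix-style: anything not explicitly whitelisted for an
// origin stays off.
#[derive(Clone, Copy, Default)]
struct SitePolicy {
    scripts: bool,
    cookies: bool,
    third_party_requests: bool,
}

struct PolicyStore {
    per_site: HashMap<String, SitePolicy>,
}

impl PolicyStore {
    fn policy_for(&self, origin: &str) -> SitePolicy {
        // Default::default() is all-false, i.e. deny everything by default.
        self.per_site.get(origin).copied().unwrap_or_default()
    }
}

fn main() {
    let mut per_site = HashMap::new();
    per_site.insert(
        "8ch.net".to_string(),
        SitePolicy { scripts: false, cookies: true, third_party_requests: false },
    );
    let store = PolicyStore { per_site };
    println!("scripts on 8ch.net? {}", store.policy_for("8ch.net").scripts);
    println!("scripts on example.com? {}", store.policy_for("example.com").scripts);
}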

Forget minimal. They already exist and they suck. No one uses them.


 No.901596>>901609

>>901522

OR fuck it, try a kickstarter or something? Video games were funded this way, and a browser is much simpler.


 No.901609>>901640 >>901874

>>901596

Has someone attempted to kickstart a browser before? Did any forked browsers ask for donations?

>>901522

So how does one make JavaScript good? Another anon suggested Rust; is that better than C or C++? Also, do you like Otter Browser for the fact that it retains the old Opera direction, or is there something more to it than that?


 No.901616

>>901504

>Amazon takes a long time to load and gets buggy, so that is why I consider it bad, but I am not sure if it is just a Waterfox thing on my part. Also, is what you are mentioning about the old sites solely an issue with the minimalist browsers, or does it affect the more packaged browsers as well? Also, it would be nice to know why you prefer minimalist browsers over more packaged browsers. I would assume that it is more than just that they run much better than the bigger browsers, right?

Plenty of minimalist browsers parse old/incorrect HTML markup just fine. It's just Dillo (AFAIK) that insists on sticking to the standards. I'm just making the point that people are going to expect a new browser to function like Firefox and Chrome, and that means being able to handle old HTML standards and badly-marked-up pages, which means a lot more coding.

Speed is not my primary consideration in advocating for minimalism, though it's important. I think the browser codebase should be minimalist so that it is easy to maintain and easy to spot security flaws and other bugs in. But the browser should also be fully-featured, including support for js and the latest (X)HTML and CSS specs. Minimal in codebase, maximal in features and user experience. It should be pretty easy to code in Rust. I'm not sure why it's taking Mozilla so long.


 No.901621

Just carefully continue to improve NetSurf.


 No.901640>>901874

>>901609

What do you mean "make javascript good" ? Just have Javascript support, like every normal browser. I think webkit handles that...

Whatever language you use does not matter. Don't concern yourself with these useless issues; it's just something for /tech/ autists to stroke their dicks over.

I don't know how the old Opera looked, but Otter's UI is exactly how it should be. I mean, even something like Tools > Cookies is something that other browsers don't have. Or importing custom styles without a fucking addon.


 No.901754

>>901458 (OP)

It already exists.

It's called Links.


 No.901758

>>901458 (OP)

>what is the ideal thing that runs 30K pages of specs and executes arbitrary JS code

something less shit than firecox or its forks for starters. i mean Links is the best but it doesn't work on all sites. once you start supporting more sites your browser becomes a piece of shit

>what PL

one with memory safety


 No.901760

>How hard is it to make a browser from scratch, and what skill level is needed for a person or a group to build one?

it's not hard at all, it's just tedious, and creating an ideal browser is an unsolvable problem because sites have too much freedom (JS, CSS, complex protocols, etc). no matter what you do it's going to be shit. you're literally better off just writing scrapers for every site you use and making a program to aggregate the scraped data
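
If someone actually went the scraper route, a minimal per-site scraper might look like this, assuming the reqwest (with the blocking feature) and scraper crates; the URL and the CSS selector are placeholders:

use scraper::{Html, Selector};

// Fetch one page and pull out its links; an "aggregator" would just run many
// of these and merge the results.
fn main() -> Result<(), Box<dyn std::error::Error>> {
    let body = reqwest::blocking::get("https://example.com/")?.text()?;
    let doc = Html::parse_document(&body);
    let links = Selector::parse("a").unwrap();
    for a in doc.select(&links) {
        let text: String = a.text().collect();
        println!("{} -> {}", text.trim(), a.value().attr("href").unwrap_or("(no href)"));
    }
    Ok(())
}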


 No.901874>>901925

>>901504

>Is it impossible not to use JavaScript, or at least to minimize its usage?

No, it isn't. JS is a full programming language; there's no real limit to what people can do with it.

Simple proof of that is the sandboxing in each browser, which keeps getting pwned every year.

>What browser comes close to fulfilling that description?

The closest one would be Links, but half of what was described in that post is missing from it.

>and how would this browser fund itself without botnet

The browser that was described in post >>901497 doesn't need that much funding or many people to make; two or three people are capable of this. Simple funding like Krita does would be good, but such a browser would have a limited audience, since it wouldn't be compatible with websites that require JS to load.

Why would this be possible?

Because it doesn't require goddamn JS. Once you implement JS, it's a perpetual game of cat and mouse on the security level.

And I'm not even talking about the W3C, who spread their ass cheeks each time a megacorp wants to reinvent the wheel with backdoors in browsers thanks to JS.

>>901522

> That's a quick way to the grave. 95% of sites will be unusable

And 99.9% of web problems will be gone.

>JavaScript must be there.

No, it's not mandatory.

>Nothing whatsoever sent out without the user's explicit permission.

Then don't implement JS; it's the core source of fingerprinting/tracking.

>No one uses them.

<Since I don't use them nobody does

>>901609

>So how does one make javascript good?

You don't; it's a language that was thrown together in one week.

>another anon suggested rust, is that better than C or C++

Rust is too young.

ANSI C is good, C++ is bloat.

I recommend Ada.

>>901640

>Just have JavaScript support

Yeah, sure thing anon, just copy-paste this shit into this folder, IT'S JUST GONNA WORK.


 No.901917

HTML only, and have people host their own web servers from their toasters.

I know the second part isn't really about the web browser, but there is no real reason for anyone to switch browsers if no one is actually utilizing the technology.
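
For scale, here is a sketch of how little it takes to serve one static HTML page with nothing but the Rust standard library; the port, the page contents, and the total lack of error handling are all placeholder choices:

use std::io::{Read, Write};
use std::net::TcpListener;

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("0.0.0.0:8080")?;
    let page = "<html><body><h1>served from a toaster</h1></body></html>";
    for stream in listener.incoming() {
        let mut stream = stream?;
        // Read and ignore the request; every path gets the same page.
        let mut buf = [0u8; 1024];
        let _ = stream.read(&mut buf)?;
        let response = format!(
            "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\nContent-Length: {}\r\n\r\n{}",
            page.len(),
            page
        );
        stream.write_all(response.as_bytes())?;
    }
    Ok(())
}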


 No.901925>>902328

>>901874

>And 99.9% of web problems will be gone.

No, you will just hide them.

>Then don't implement JS; it's the core source of fingerprinting/tracking.

Modern browsers send many requests on their own, regardless of JS.

<Since I don't use them nobody does

Then they can keep using them; there doesn't need to be another surf, uzbl, etc...

By the way, it's not necessarily true that JS is insecure. If OP follows my suggestion of including uMatrix functionality by default, you can choose what scripts to run.


 No.901942>>901945

>>901458 (OP)

>How hard is it to make a browser from scratch, and what skill level is needed for a person or a group to build one?

As others have pointed out, it depends on what you consider to be a browser. If you want to use an existing browser engine the task is non-trivial, but it is an attainable goal. If you want to also write your own engines, then forget about it. Not only are the official standards a pure nightmare; in order to be compatible with most websites your browser must also be able to deal with all sorts of malformed shit.

This is a design problem, in my opinion. When a browser sees invalid markup it should not render anything, only display an error and where it lies. Just like a compiler. Unfortunately, browser developers raced to the bottom to see who could be more compatible with shit code, and as a consequence people writing web pages saw that it "worked" and assumed that the code was correct.
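
A toy Rust sketch of that "fail like a compiler" idea; it checks only one trivial property (every '<' closes before the next one opens), but it shows the shape: refuse to render and report where the markup breaks.

fn first_error(html: &str) -> Option<usize> {
    let mut open_at: Option<usize> = None;
    for (i, c) in html.char_indices() {
        match c {
            '<' if open_at.is_some() => return open_at, // new tag before the last one closed
            '<' => open_at = Some(i),
            '>' => open_at = None,
            _ => {}
        }
    }
    open_at // a trailing unterminated tag is also an error
}

fn main() {
    match first_error("<p>fine</p><br") {
        Some(offset) => eprintln!("error: malformed tag starting at byte {offset}; refusing to render"),
        None => println!("markup ok, rendering"),
    }
}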

>>901458 (OP)

>What is the ideal browser for people on this board, and what language should it be written in?

A barebones browser that does nothing but display web pages (supporting CSS and JS of course). Everything else should be a plugin. Yes, even the user interface with things like forward/backwards buttons. Some essential plugins would be shipped with the standard installation, but on a technical level they would not be hard-coded into the browser. Instead the browser would just provide an API that these plugins can hook up to. Something like the design of Neovim, except for browsers.
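
A very rough Rust sketch of that core-plus-plugins shape; the Plugin trait, the hook names, and the Core struct are all invented here purely to illustrate the idea:

// Everything outside of fetching/rendering hangs off hooks like these.
trait Plugin {
    fn name(&self) -> &str;
    // Returning false cancels the navigation (e.g. a blocker plugin).
    fn on_navigate(&mut self, _url: &str) -> bool { true }
    // Called with the raw HTML once a page has loaded.
    fn on_page_loaded(&mut self, _url: &str, _html: &str) {}
}

struct HistoryPlugin { visited: Vec<String> }

impl Plugin for HistoryPlugin {
    fn name(&self) -> &str { "history" }
    fn on_page_loaded(&mut self, url: &str, _html: &str) {
        self.visited.push(url.to_string());
    }
}

struct Core { plugins: Vec<Box<dyn Plugin>> }

impl Core {
    fn navigate(&mut self, url: &str) {
        // Ask every plugin first; any of them can veto.
        if self.plugins.iter_mut().all(|p| p.on_navigate(url)) {
            let html = "<html>...</html>"; // actual fetching/rendering omitted
            for p in &mut self.plugins {
                p.on_page_loaded(url, html);
            }
        }
    }
}

fn main() {
    let plugins: Vec<Box<dyn Plugin>> = vec![Box::new(HistoryPlugin { visited: Vec::new() })];
    let mut core = Core { plugins };
    core.navigate("https://example.com");
    for p in &core.plugins {
        println!("active plugin: {}", p.name());
    }
}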


 No.901945>>902304

>>901942

>A barebones browser that does nothing but display web pages (supporting CSS and JS of course). Everything else should be a plugin. Yes, even the user interface with things like forward/backwards buttons. Some essential plugins would be shipped with the standard installation, but on a technical level they would not be hard-coded into the browser. Instead the browser would just provide an API that these plugins can hook up to. Something like the design of Neovim, except for browsers.

What you just described is basically XUL, as used by Pale Moon. It is essentially a CSS, JS, and HTML interpreter with plugins/platforms for literally anything else. Pozzilla used to use XUL to make the Thunderbird email client, but then pozzed the shit out of XUL. Pale Moon's devs seem to be trying to restore XUL so it's usable like a plugin system again, but at the coding/C level and not at a click-and-it-just-works level like Chrome/modern browser plugins. Actually, now that I think of it, XUL could be the emacs of C.


 No.902304>>902328 >>902585

>>901945

> Pale Moon's devs seem to be trying to restore XUL so it's usable like a plugin system again, but at the coding/C level and not at a click-and-it-just-works level like Chrome/modern browser plugins.

Is there a reason for doing that? Also, is there a practical reason why XUL was made obsolete by Firefox? I never did find out the reason for it. On a related note, is the Basilisk browser any good?


 No.902328>>902411

>>901925

>No, you will just hide them.

This is incorrect. If you don't implement JS support, you cannot execute any of it in any conceivable way, so you completely avoid the problem instead of putting it behind a firewall the way uMatrix does.

> If OP follows my suggestion of including uMatrix functionality by default, you can choose what scripts to run.

uMatrix is basically a firewall; it's not 100% failure-proof. Depending on how the controlled opposition that is the W3C accepts new, vague implementations, it can go wrong on many levels without us knowing about it.

>Modern browsers send many requests on their own

I was talking about external entities that use functions in the browser, not the browser itself sending X on its own.

Software in general needs to avoid the BIG BALL OF MUD for as long as possible, and not implementing JS is one thing that helps with that. Of course, JS isn't the only cause; for example, rendering videos in the browser instead of handing them to appropriate external software is part of what creates the Big Ball of Mud.

> it's not necessarily true that JS is insecure

The past decade proves you wrong.

>>902304

>Is there a reason for doing that?

C is fast.

> Also, is there a practical reason why XUL was made obsolete by Firefox?

Economic/time reasons: they dropped XUL simply because they thought it would take too much time to fix, and went instead with what Chrome did.


 No.902372>>902759

File: 1920x1080.png (1.58 MB, 1920x1080)

M

O

T

H

R

A


 No.902411

>>902328

>This is incorrect. If you don't implement JS support, you cannot execute any of it in any conceivable way, so you completely avoid the problem instead of putting it behind a firewall the way uMatrix does.

So you hide them (by not being able to visit the websites that use JS). But they will still use it, and you will be limited to shitposting on 8ch for the rest of your life - assuming it too does not require JS at some point.

If you consider JS a problem, that's not the way to fix it because eventually it will be required everywhere - so fix THAT, instead of hiding behind a browser that does not support it. Besides, those browsers already exist, so it's pointless to make another.


No.902585>>902745

>>902304

XUL was never designed to support security or a multiprocessing model. It's impossible to bolt security into XUL or make it do multiprocessing without destroying backwards compatibility of the existing XUL API.


 No.902598>>902601 >>902607 >>902630

Regex-based configuration.

. { -browser-javascript: off; }

(google|search) { -browser-autocomplete: off; }

^..\.wiki\w+\.org a:visited { font-size: larger; }

myfavoritesite\.com { -browser-screen: full; font-size: large; }

\.com$ { -browser-cookie: off; }

\.com$ input:last-of-type { border-color: red; }
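
For what it's worth, looking up such rules for a host is only a few lines. A sketch in Rust, assuming the regex crate; the Rule struct is invented and only a few of the rules above are reproduced:

use regex::Regex;

struct Rule {
    pattern: Regex,
    declarations: &'static str,
}

fn main() {
    // A subset of the rules above, compiled once at startup.
    let rules = vec![
        Rule { pattern: Regex::new(r".").unwrap(), declarations: "-browser-javascript: off;" },
        Rule { pattern: Regex::new(r"(google|search)").unwrap(), declarations: "-browser-autocomplete: off;" },
        Rule { pattern: Regex::new(r"\.com$").unwrap(), declarations: "-browser-cookie: off;" },
    ];

    let host = "www.google.com";
    // Every matching rule applies in order, the same way CSS cascades.
    for rule in &rules {
        if rule.pattern.is_match(host) {
            println!("{host}: {}", rule.declarations);
        }
    }
}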


 No.902601

>>902598

In fact, I imagine it should be possible to hack Firefox to allow changing its existing config directives on such a per-site basis... if not in the "it's already possible" sense, then in the "it SHOULD be possible" sense.


 No.902607

>>902598

Also, upon rereading the thread: directives such as

-browser-request: block;

-browser-proxy: myproxysettings;

would also be a thing of course.


 No.902630>>902638 >>902639

>>902598

Cool idea, but the vast majority of people wouldn't use this. I mean, I've even been programming for a long time, and I don't know regex...


 No.902635>>902640

Completely programmable and automatable. Allows user to override standard behavior; for example, control what some JS functions return or if they're even defined at all. Targeted at developers.


 No.902638

>>902630

Your loss. The web is inherently string-based by means of URLs, as opposed to IPs, which regexes are far less suitable for.


 No.902639

>>902630

>I've even been programming for a long time, and I don't know regex...

Programming in what and how long is this "long time"?


 No.902640

>>902635

>control what some JS functions return or if they're even defined at all.

Can't you do this already? Won't the following work?

alert = prompt; // fight me


 No.902745>>902771

>>902585

>XUL was never designed to support security or a multiprocessing model. It's impossible to bolt security into XUL or make it do multiprocessing without destroying backwards compatibility of the existing XUL API.

So is keeping XUL not doable? Is it better to make a new XUL that can do both of these things than to continue the old XUL? Also, what makes XUL so vulnerable and unsuited for multiprocessing?


 No.902759

File: nein.webm (221.98 KB, 480x360)


 No.902771>>902803

>>902745

You can keep XUL today, just fork the old XUL implementations and do as you wish. What you won't be able to do is to extend the XUL system so that it has a security model to segregate the different concerns of different applications within Firefox while remaining backwards compatible with old XUL extensions.

In addition to this problem, you will not be able to implement a multi-processing model within XUL while remaining backwards compatible with old XUL extensions. Multi-processing is not a trivial problem if you want correct computation; single processing is straightforward. Multi-processing requires a much more involved understanding of your computing model, because the improper timing of processes can totally screw up the logic of the whole system. The XUL system was only designed to work in the single-processing model; it will break down when you try to use it in a multi-processing context with its current architecture. Once you change the XUL architecture to be multi-processing compliant, it will no longer be backwards compatible with the old XUL extensions, because all of them assume single processing.
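
A tiny Rust illustration of that timing point (the "extensions" here are just threads writing to shared state, nothing XUL-specific): each individual write is safe, but the overall interleaving differs from run to run, so anything written against a fixed single-process order silently breaks.

use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    let log = Arc::new(Mutex::new(Vec::new()));
    let mut handles = Vec::new();
    for name in ["extension-a", "extension-b"] {
        let log = Arc::clone(&log);
        handles.push(thread::spawn(move || {
            for i in 0..3 {
                // Each lock/push is safe, but the global ordering is not fixed.
                log.lock().unwrap().push(format!("{name} step {i}"));
            }
        }));
    }
    for h in handles {
        h.join().unwrap();
    }
    // Prints a different interleaving on different runs.
    println!("{:#?}", log.lock().unwrap());
}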


 No.902803

>>902771

But if you accomplished this you would have the emacs of C.



