▶ No.861337
>C++
if you're going to start from scratch, why start from scratch with bloat?
▶ No.861338>>861347 >>861351
>>861250 (OP)
> discord
fuck off
▶ No.861347>>861532
>>861338
>conveniently ignoring the IRC
/g/ user detected.
▶ No.861348>>861371
>>861250 (OP)
I am really happy someone is doing this. Thank you for all your hard work. Don't let the negativity demotivate you. Even if the end result can't compete with the mainstream browsers, it is still a way to escape the WebKit/Blink/Gecko hellhole.
I am not a developer, but if I were making my own web browser I would only build a basic rendering engine plus an ES5 JS engine. Everything else, like the bookmark manager, password manager, history, and downloads, would be implemented through some kind of plugin API (maybe Lua? see the sketch below).
The whole GUI would be minimalist: maybe just tabs, with everything else accessible from an integrated "drop-up" terminal like in qutebrowser. That way it could actually be finished at some point and maintained without getting bloated and abandoned.
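Roughly the shape I have in mind, sketched in Python (every name here is made up; the real API could just as well be Lua):

# Hypothetical plugin API: the core only routes commands, and features
# like bookmarks live entirely in plugins.
class BookmarkPlugin:
    def __init__(self):
        self.bookmarks = []

    def on_command(self, arg):
        # invoked when the user types ":bookmark <url>" in the drop-up terminal
        self.bookmarks.append(arg)
        return "saved " + arg

def register(browser):
    # the core would call register() once per plugin at startup
    browser.add_command("bookmark", BookmarkPlugin().on_command)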
▶ No.861351
>>861338
>irc.rizon.net
>#/g/netrunner
▶ No.861359
There's already a Linux distribution and a TCG called Netrunner.
▶ No.861371
>>861348
>to escape WebKit/Blink/Gecko hellhole.
I actually really like Servo. The project accomplishes what I set out to do when making netrunner, except it is a full browser engine. That is why I stopped developing netrunner.
<Rust is bloated
The web is bloated.
They are actually doing some cool stuff, like their new font-rendering engine.
Maybe one day I'll build a browser on top of Servo.
▶ No.861468
>>861332
>not an argument
Not an argument.
▶ No.861532
>>861347
>Having discord at all
back to /g/
▶ No.861600
>>861250 (OP)
Dead on arrival. The web has become so bloated that it's impossible to create a rendering engine without massive funding. You're better off taking an existing rendering engine and building your browser around it. Put a layer between the browser and the engine so you can swap the engine out (see the sketch below). If the browser catches on, you could then work on your own engine and eventually swap it in.
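Sketched in Python for brevity (all class and method names here are invented, not from any real project), the layer would look something like:

# The browser programs against an abstract engine; concrete engines
# plug in behind it and can be swapped without touching the rest.
from abc import ABC, abstractmethod

class RenderEngine(ABC):
    # everything the browser is allowed to know about an engine
    @abstractmethod
    def load(self, url: str) -> None: ...

    @abstractmethod
    def render(self) -> bytes: ...   # one rasterized frame

class ExistingEngine(RenderEngine):  # wraps WebKit/Blink/whatever today
    def load(self, url): print("existing engine loading", url)
    def render(self): return b""

class OwnEngine(RenderEngine):       # the home-grown engine, swapped in later
    def load(self, url): print("own engine loading", url)
    def render(self): return b""

def browse(engine: RenderEngine, url: str) -> bytes:
    engine.load(url)        # the rest of the browser never names a concrete engine
    return engine.render()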
▶ No.861602>>861670 >>861684
>>861250 (OP)
Why not make a new web? We need a new protocol, a new document standard (something that isn't HTML), and a new browser for all of it. We need to cut ourselves off from the bloated WWW; only by building something new can we accomplish that.
▶ No.861670
>>861602
you could potentially embed a new protocol in netrunner.
▶ No.861684>>861700
>>861602
>bloated www
It's literally plain text.
Anyway, have fun on ReddiNet with the 6 other users
▶ No.861700
>>861684
>text protocols cannot be bloated
▶ No.861708
>shilling a cuckchan project
Saged and reported.
▶ No.861710
Gopher is the closest to plain text; even HTML 1.0 doesn't come close. And recent versions are unreadable once you add in all the obfuscated javashit that's all over the place.
▶ No.863832>>863833
>I'd like to develop text-based structures for communication between each piece
To the trash it goes then.
Text will never be an efficient format.
Should have used Protobuf or something similar.
▶ No.863833>>863841 >>863934
>>863832
It's all going to get compressed anyway, so the "inefficiency" is mostly theoretical. Once JSON is compressed, for example, it's about the same size. The parsing still wastes a bit of time, but most binary protocols also have to be parsed. There are some new "no serialization / deserialization" protocols; these mostly just leave the conversion to access time, through a function you call to access an element.
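Easy enough to try yourself (a Python sketch; the records are made up, and the numbers depend heavily on the data, so run it on something resembling your real payloads):

# Size of JSON vs a packed binary encoding, before and after gzip.
import gzip, json, struct

# 10,000 fake records: an id and two small fields
records = [(i, i % 7, (i * 31) % 100) for i in range(10_000)]

as_json = json.dumps([{"id": a, "type": b, "x": c} for a, b, c in records]).encode()
as_bin  = b"".join(struct.pack("<IBB", a, b, c) for a, b, c in records)

for name, blob in (("json", as_json), ("packed", as_bin)):
    print(name, len(blob), "->", len(gzip.compress(blob)))
# the raw gap is large; the compressed gap is much smaller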
▶ No.863841>>863867
>>863833
>The parsing still wastes a bit of time but most binary protocols also have to be parsed
"parsing" a good length-prefixed binary format is a completely different thing than parsing a fucking JSON.
and since this is what will be used to communicate between local components, it could become the bottleneck easily.
imagine what will happen if you start using JSON for sending graphic bitmaps back and forth
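For scale, "parsing" a length-prefixed frame is basically just this (a minimal Python sketch, not from the project):

import io, struct

def read_frame(stream):
    (length,) = struct.unpack("<I", stream.read(4))  # fixed 4-byte length prefix
    return stream.read(length)                       # payload passed along untouched

# demo: an in-memory stream standing in for a local socket
buf = io.BytesIO(struct.pack("<I", 3) + b"abc")
print(read_frame(buf))  # b'abc' -- no scanning, no escaping, no number parsing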
▶ No.863867>>863868 >>863900 >>863939
>>863841
>completely different thing
>imagine what will happen if you start using JSON for sending graphic bitmaps back and forth
The overhead of sending base64-encoded images inside JSON would not be that high in a compressed context. Serialization is almost never the bottleneck.
https://auth0.com/blog/beating-json-performance-with-protobuf/
In many benchmarks that have been done, performance is pretty similar with compression enabled. And you are going to be compressing those binary protocols all the same; otherwise you are a retard wasting space with them too.
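You can put a rough number on it yourself (a Python sketch; the synthetic bitmap is only a stand-in for real image data):

# base64 inflates the payload by ~33%, but in a compressed context
# gzip recovers most of that on typical, non-random image data.
import base64, gzip, json

bitmap = bytes(range(256)) * 4096  # ~1 MiB stand-in for a raw bitmap
wrapped = json.dumps({"img": base64.b64encode(bitmap).decode()}).encode()

print("raw     ", len(bitmap),  "->", len(gzip.compress(bitmap)))
print("json+b64", len(wrapped), "->", len(gzip.compress(wrapped)))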
▶ No.863868>>863872
>>863867
>I have zero experience but I want to pretend to be clever: the post
▶ No.863871
BRING BACK THE MAC TONIGHT LOGO!
▶ No.863872>>863875
>>863868
>I have never actually benchmarked anything and am talking out of my ass: the response
▶ No.863875>>863877
>>863872
factually incorrect
▶ No.863877>>863891
>>863875
Really? Then why don't you post the benchmark code?
▶ No.863891>>863904
>>863877
It's not intended to be publicly available as-is, and I don't think the reward of extracting it is worth the effort.
▶ No.863900>>863904
>>863867
>performance with compression enabled
Compressing already compressed data usually doesn't offer much of an improvement.
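Trivial to check (Python; random bytes below stand in for already-compressed data):

import gzip, os

data = os.urandom(1 << 20)  # incompressible bytes, like compressed output
once = gzip.compress(data)
print(len(data), len(once), len(gzip.compress(once)))  # the second pass gains nothing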
▶ No.863904>>864270
>>863900
>Compressing already compressed data usually doesn't offer much of an improvement.
A bitmap is not compressed. Binary serialization is also not a compression algorithm.
>>863891
Well, I linked you to benchmarks demonstrating that there is no in-practice difference between them.
▶ No.863934>>863938
>>863833
>compressed json is as small as a sane binary format
>deserializing json is as fast as deserializing a binary format
are you retarded?
▶ No.863938>>864273
>>863934
>>compressed json is as small as a sane binary format
I literally linked to benchmarks showing exactly this
>>deserializing json is as fast as deserializing a binary format
Deserialization is almost never the bottleneck; network or disk speed is. Even then, the benchmarks I linked show that deserialization is almost as fast for JSON, except for a couple of types like double-precision floats.
▶ No.863939>>863942
>>863867
>https://auth0.com/blog/beating-json-performance-with-protobuf/
You BTFO yourself you retard. This blog post shows that Protobuf is way faster and smaller than JSON. Only when you pipe it through a shitty compression algorithm can JSON catch up.
>In many benchmarks that have been done, performance is pretty similar with compression enabled. And you are going to be compressing those binary protocols all the same; otherwise you are a retard wasting space with them too.
There exist better compression algorithms than those webshitters are currently using. Stop being a retard.
▶ No.863942>>864294
>>863939
>I did not read anything you said or understand the context of the discussion, but you BTFO'd yourself!
Well, let's see what was linked, and in what context:
>In many benchmarks that have been done, performance is pretty similar with compression enabled
>This blog post shows that Protobuf is way faster and smaller than JSON
Wrong. If you are not using compression, you are wasting a shit ton of space, binary or not.
So the link was given in the context of "these things are all about the same size when compressed". I wonder what "only when you pipe it through a shitty compression algorithm can JSON catch up" has to do with that.
Now on to your next comment:
>There exist better compression algorithms than those webshitters are currently using.
Other compression algorithms exist, but they bottleneck before the network does, so they are useless and just decrease the performance of the entire system.
▶ No.864270>>864312
>>863904
>A bitmap is not compressed. Binary serialization is also not a compression algorithm.
They either come pre-compressed (FLIF, etc) which is better than any blind general purpose compression, or they don't need to be compressed to be transfered locally.
Are you unironically saying that the interprocess communication inside one machine needs compression? This is not even funny anymore.
▶ No.864273>>864312
>>863938
>Deserialization is almost never the bottleneck; network or disk speed is. Even then, the benchmarks I linked show that deserialization is almost as fast for JSON, except for a couple of types like double-precision floats.
The discussed project aims to use that shit for local communication between components, which pretty much excludes network and disk, unless they are even more retarded than they look now.
▶ No.864294>>864312
>>863942
Fucking retard. Your link shows clearly that Protobuf BTFOs JSON. Even when compressed.
>This can sound like nothing, but considering that Protobuf has to be converted from binary to JSON - JavaScript code uses JSON as its object literal format - it is amazing that Protobuf managed to be faster than its counterpart.
▶ No.864312>>864332
>>864270
>don't need to be compressed to be transferred locally.
If by "local" you also mean on a filesystem, then compression is still needed and will improve performance.
>>864294
>Your link shows clearly that Protobuf BTFOs JSON. Even when compressed.
Only by a trivial amount.
>>864273
>shit for local communication between components
Why are you serializing during IPC at all? Just send a struct directly; if it's IPC, both ends are always the same architecture. If you need to save it and use it on a different computer, then you might as well use JSON.
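i.e. something like this (a Python sketch of the idea; in C you would write the struct's bytes straight into the pipe):

# Fixed-layout message through a pipe: both ends agree on the layout,
# so there is no serialization framework at all.
import os, struct

MSG = struct.Struct("<Iff")              # id, x, y -- same layout on both ends
r, w = os.pipe()
os.write(w, MSG.pack(42, 1.5, -3.0))     # sender
print(MSG.unpack(os.read(r, MSG.size)))  # receiver: (42, 1.5, -3.0)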
▶ No.864332>>864361
>>864312
>being this retarded
>This can sound like nothing, but considering that Protobuf has to be converted from binary to JSON - JavaScript code uses JSON as its object literal format - it is amazing that Protobuf managed to be faster than its counterpart.
holy shit dude. just fucking end yourself
▶ No.864361>>864363
>>864332
>This can sound like nothing, but considering that Protobuf has to be converted from binary to JSON
Who are you even quoting here? Yourself?
▶ No.864363>>864378
>>864361
Your fucking blog post, the one you posted to prove that text-based formats are as fast as binary formats.
▶ No.864378>>864416
>>864363
You know that statement is about running in a web browser, right? It's not saying that Protobuf is JSON inside.
▶ No.864416>>864423
>>864378
>he keeps being retarded
The shit you posted shows clearly that Protobuf is vastly superior to JSON. Stop being a retard and admit that text formats are shit.
▶ No.864423>>864426
>>864416
No, it shows that Protobuf is only slightly faster and smaller than JSON in real-world scenarios. Not at all worth having to add the whole Protobuf toolchain when literally every language already has JSON support.
▶ No.864425
so... will it be as good as qutebrowser?
▶ No.864426>>864427
>>864423
>This can sound like nothing, but considering that Protobuf has to be converted from binary to JSON - JavaScript code uses JSON as its object literal format - it is amazing that Protobuf managed to be faster than its counterpart.
Have you not read the fucking blog post you posted??? The difference is not small. It is fucking massive.
▶ No.864427>>864453
>>864426
Not in real-world scenarios.
▶ No.864453>>864462
>>864427
>real world scenarios
>webdev
▶ No.864462>>864511
>>864453
>webdev
uhhhh nooooooo: loading from disk, loading from network, using compression. Who the fuck uses something like Protobuf for IPC? That is a retarded use case.
▶ No.864511>>864538
>>864462
At least Protobuf has use cases. JSON is webdev cancer.
▶ No.864538>>864548
>>864511
JSON has the use case of being a common exchange format for literally every programming language, and it doesn't require a protocol compiler.
▶ No.864548>>864556
>>864538
But JSON is slow. Also there are other binary formats.
▶ No.864556>>864557
>>864548
Nah, it's not slow; the benchmarks linked earlier show that in the real world (disk, network, compression) performance is pretty similar.
A binary format is going to be much more annoying to work with across many different languages at the same time. If everything you are using supports a particular binary protocol, go for it; if not, it's a PITA.
▶ No.864557>>864583
>>864556
>This can sound like nothing, but considering that Protobuf has to be converted from binary to JSON - JavaScript code uses JSON as its object literal format - it is amazing that Protobuf managed to be faster than its counterpart.
▶ No.864583
>>864557
Not the bottleneck, and a waste of time to set up.