
/tech/ - Technology



 No.1030587>>1030657 >>1030664 >>1030738 >>1031559 >>1031657 >>1031721 >>1032225 >>1032615 >>1032647 >>1035008

https://blogs.nvidia.com/blog/2019/02/05/adacore-secure-autonomous-driving/?linkId=100000004938589

>To ensure that this vital software is secure, NVIDIA is working with AdaCore, a development and verification tool provider for safety and security critical software. By implementing the Ada and SPARK programming languages into certain firmware elements, we can reduce the potential for human error.

>

>Both languages were designed with reliability, robustness and security in mind. Using them for programming can bring more efficiency to the process of verifying that code is free of bugs and vulnerabilities.

https://www.adacore.com/press/adacore-enhances-security-critical-firmware-with-nvidia

> Some NVIDIA system-on-a-chip product lines will migrate to a new architecture using the RISC-V Instruction Set Architecture (ISA). Also, NVIDIA plans to upgrade select security-critical firmware software, rewriting it from C to Ada and SPARK. Both moves are intended to increase verification efficiencies to achieve compliance with the functional safety standard ISO-26262.

 No.1030593>>1030610

YEAR OF ADA


 No.1030602>>1030605 >>1030610 >>1030656 >>1031909

Nothing has ever become better when there's a push to rewrite something that was previously C/C++ into some other meme language.


 No.1030605>>1030657

>>1030602

>Ada

>Meme language


 No.1030607>>1030664

Rust BTFO

How is that web browser engine coming along? Is it ready yet?


 No.1030610>>1030613 >>1030614 >>1030615 >>1030616 >>1030635 >>1030714 >>1031720 >>1033507

>>1030593

Maybe -- the Ada 2020 Standard is coming along nicely and has some really nice features: lock-free nonblocking containers, the `parallel` blocks/for-loops, a few new attributes like 'Reduce/'Parallel_Reduce, among others.
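
For a taste of 'Reduce -- this is draft Ada 2020 syntax, so it may still shift before the standard is final; just a minimal sketch:

type Vector is array (Positive range <>) of Integer;

-- Draft reduction expression; folds "+" over the elements, starting from 0.
function Sum (V : Vector) return Integer is
  (V'Reduce ("+", 0));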

>>1030602

> Nothing has ever become better when there's a push to rewrite something that was previously C/C++ into some other meme language.

Patently false; and known for years -- http://archive.adaic.com/intro/ada-vs-c/cada_art.pdf & file:///C:/Users/tiger/AppData/Local/Temp/1991_003_001_15905.pdf


 No.1030611>>1030617

YEEEEAAAAAAAHHHHHH!

for once, they finally did something right.


 No.1030613>>1030617

>>1030610

>file:///C:/Users/tiger/AppData/Local/Temp/1991_003_001_15905.pdf

How embarrassing, Tiger.


 No.1030614

>>1030610

Sorry, got the local link instead of URL -- http://archive.adaic.com/intro/ada-vs-c/cada_art.pdf


 No.1030615>>1030629

>>1030610

>want to talk about daemon I rewrote from C because the C was particularly shit. Fixed ten bugs just getting the thing going on our servers because other-than-author's configuration options were never tested

>will now have people calling me 'Tiger'


 No.1030616>>1030620

>>1030610

> file:///C:/Users/tiger/AppData/Local/Temp/1991_003_001_15905.pdf

I don't know what's worse: the fact that you linked a URL from your hard drive, or the fact that it's a Windows file URL.


 No.1030617

>>1030613

Yep, it is -- but I popped the correct link.

>>1030611

>YEEEEAAAAAAAHHHHHH!

>for once, they finally did something right.

Agreed -- I remember being *SEVERELY* disappointed with CUDA because it was essentially manual insertion of GPU primitives, not significantly different from inline-assembly, rather than using Ada and taking advantage of Task and Protected types/objects.

I mean, it would have been really nice to use an implementation-pragma like:

Pragma CUDA( Some_Task );

And have the compiler pop the task onto the GPU automatically or issue a warning that it can't and use the normal CPU -- you also get the advantage of the programs being portable to other Ada compilers (though w/o the GPU support, obviously) AND when you get better GPU/compiler tech you can increase the capabilities w/o impacting extant code-bases.
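
Something like this (the pragma is purely hypothetical, mind -- just sketching what I mean):

task Some_Task;              -- an ordinary Ada task
Pragma CUDA( Some_Task );    -- hypothetical: ask the compiler to offload it to the GPU

task body Some_Task is
begin
   null;  -- the number-crunching; unchanged whether it lands on the CPU or GPU
end Some_Task;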


 No.1030620

>>1030616

*shrug*

At work I have a Windows box, a Solaris box, a VxWorks box and [in storage] a Mac.


 No.1030624>>1030629 >>1030895 >>1031913 >>1037716

>writing in a language created by a kike

Write in Forth, homos.


 No.1030629

>>1030624

>C was particularly shit. Fixed ten bugs just getting the thing going on our servers because other-than-author's configuration options were never tested

That's one thing that I really hate about C: it's *SO* fragile on those compiler-switches/build-environment. It really makes me appreciate when I (e.g.) took a non-trivial 30-year-old Ada program written for a different compiler on a different architecture and got it building with only about a half-dozen changes (mostly because I was using an Ada 2012 compiler, and the source had used as an identifier what became new keywords).

>>1030615

I love Forth -- I am thinking about using it as the backend/code-emission target for a compiler just to leverage Forth's porting power when bootstrapping the compiler.


 No.1030635>>1030644

File: 2c4aa2ebc35d40cafcc68603eb….png

>>1030610

>tiger


 No.1030640>>1030644

>tiger

Remember that Paki rape gang taximan wanting his victims to call him that?


 No.1030644>>1030649

>>1030635

Yes, we know.

>>1030640

>Remember that Paki rape gang taximan wanting his victims to call him like that?

No, and I don't want to.


 No.1030649>>1030650

File: 1490905674746.jpg


 No.1030650

>>1030649

No. Why would I be frustrated?


 No.1030656

>>1030602

Usually programs get rewritten in C/C++ and become worse, like the F-35.


 No.1030657>>1030693

>>1030587 (OP)

How does this change anything? Firmware isn't user accessible anyway for NVIDIA GPUs etc. so it doesn't really matter. What really matters is if performance will be affected on future cards. People won't use them because they are "safe" if it means a noticeable drop in output.

>>1030605

It is a meme language. It isn't even used in the industry for which it was designed. If that isn't meme, then meme on.


 No.1030664>>1030733


 No.1030685>>1030693

Never really looked at Ada or SPARK before. It looks awesome. I wish I had been taught this instead of Java. I can't even imagine programming in a world with such compiler-enforced pre-/post-conditions.


 No.1030693>>1030728 >>1030827 >>1035008

File: F35 - Forever.jpg

>>1030657

>It is a meme language. It isn't even used in the industry for which it was designed.

You're talking out of your ass:

https://www2.seas.gwu.edu/~mfeldman/ada-project-summary.html

Unless you're referring to the money-sink that is the F-35 and how they used C++ "because they couldn't find any Ada programmers" there's a large unwritten caveat here: "at a price we wanted to pay."

Also, train-up time in Ada is pretty good:

https://www.reddit.com/r/ada/comments/a62y4o/success_with_introducing_ada_to_three_college/

>>1030685

> Never really looked at Ada or SPARK before. It looks awesome. I wish I had been taught this instead of Java. I can't even imagine programming in a world with such compiler-enforced pre-/post-conditions.

It is!

That would actually be a really good question to ask on comp.lang.ada -- and the Ada 2020 standard is going to have some *NICE* stuff.


 No.1030714>>1030718 >>1030730 >>1030836

>>1030610

lmao you hate unix because you're a fucking wangblows nigger


 No.1030718>>1030735 >>1030744

>>1030714

No, I hate Unix because of terrible design decisions.

I'd unironically suggest that, as a base system, VMS has a much better design than Unix.


 No.1030728>>1030729 >>1030737

>>1030693

You posted one source and expect that to apply to the whole industry, while the law stipulates that _any_ language can be used if it will save money for the project. The law even calls it a meme language.


 No.1030729>>1030737

>>1030728

And we have a good example of money saving right there. Instead of paying some Ada programmers N dollars a year for five years, you can pay four times the number of C++ programmers N/2 dollars a year for 50 years.


 No.1030730>>1030737 >>1030744

>>1030714

>Windows users are the autistic unix is bad faggots

R U stupid? Most Linux distros and Windows alike are just Unix-like, and if they are Unix-compliant then that's because some corporate distro bought the label.

It doesn't say anything about the OS.


 No.1030733

>>1030664

Mozilla seems to have partnered up with Ubisoft recently, too.

Not entirely sure in which direction the money is going.

[https://archive.fo/Qy7Ge] https://variety.com/2019/gaming/news/ubisoft-and-mozilla-announce-clever-commit-1203137446/


 No.1030735>>1030739

>>1030718

>VMS

Do you mean OpenVMS? It's probably way worse than Haiku.


 No.1030737>>1030761

>>1030728

I didn't post a law.

Why are you making shit up Anon?

>>1030730

There is that, too.

>>1030729

>And we have a good example of money saving right there. Instead of a paying some Ada programmers N dollars a year for five years, you can pay four times the number of C++ programmers N/2 dollars a year for 50 years.

You know, I wonder if they even put *that* much thought into it. At some places, there's absolutely *ZERO* discussion on what to implement a new project with, and it seems to me like they go with what's popular precisely to avoid training.

Some of the mental gymnastics I've seen to avoid proper training in the corporate world are *astounding* -- akin to "yes, a safety class on power tools would take an hour; but rocks are free."


 No.1030738>>1030740 >>1030741 >>1035030

>>1030587 (OP)

For those claiming Ada is best for systems programming, why were colleges apprehensive about providing courses for Ada programming after it was designed and standardized? It's a language from the '70s, but very few colleges actually teach it. Why? It's simply because it's not an efficient language for the needs that systems programmers have. It also shows how people are incapable of understanding that the problems with overflows and """"""safety"""""" are not intrinsic to software but to hardware, and anyone who argues the contrary is either trying to goad an argument or is ignorant.


 No.1030739>>1030750

>>1030735

>Do you mean OpenVMS? It's probably way worse than Haiku.

Why do you say *probably*?

And, you should note I *am* talking about design, particularly the underlying design-philosophies, rather than the actual implementation or polish.


 No.1030740>>1030763

>>1030738

>why were a bunch of communists and hippies unwilling to promote the official language of capitalism and baby-killing?

I dunno it's a mystery.


 No.1030741>>1030763

>>1030738

>For those claiming ada is best for systems programming, why were colleges apprehensive about providing courses for ada programming after it was designed and standardized?

Source needed.

> It's a language from the 70's, but very few colleges actually teach it.

The language was standardized in the '80s, 1983 to be precise, and this was prior to any compiler existing, incorporating bleeding-edge compiler technology into the standard as requirements.

> Why? It's simply because it's not an efficient language for the needs that systems programmers have.

Oh? What needs, in particular, do you have within systems programming that are not addressed?


 No.1030744>>1030747 >>1030750

>>1030718

lol nope. You hate Unix because you are a retarded wangblows nigger. Your bizarre self-justification is not something I take seriously.

>>1030730

>#notall wangblows niggers

lmao


 No.1030747>>1030807

>>1030744

Hm, interesting; using your astounding mental powers, tell me more about my own personal preferences.

Also, could you tell me how, since they differ from yours, they are automatically wrong?


 No.1030750>>1030760 >>1030807

>>1030739

>probably

Because I haven't tested it personally but multicore processors didn't even exist back then, so it probably doesn't have support for it.

>particularly the underlying design-philosophies

>closed-source, proprietary

>virtual memory system

Great, memory mapping. Something every OS has now.

>Motif user interface (based on CDE) layered on top of OpenVMS's X11

>Motif

>X11

Both shit. Windows and MacOSX both have a proper compositor now and Linux has Wayland coming with support from GTK and Qt.

>represents system time as the 64-bit number

One thing done well. It prevents time from overflowing.

Are you perhaps referring to the "Do one thing and do it well!" phrase? Cause everyone knows that's total bullshit and totally inefficient.

>>1030744

Are you one of those GNIggers that think every Linux distro is UNIX or are you one of those Macfags who think their OS is somehow based on Linux because Mac has the bash shell and other stolen components from BSD?


 No.1030760>>1030766

>>1030750

No, not merely features/implementations.

Things like:

* Common Language Environment

* Record Management System

and how they fit together and present an entire system to the user (or programmer).


 No.1030761>>1030762

>>1030737

Read it yourself

>SEC. 9070. Notwithstanding any other provision of law, where cost effective, all Department of Defense software shall be written in the programming language Ada, in the absence of special exemption by an official designated by the Secretary of Defense.

https://www.govinfo.gov/content/pkg/STATUTE-106/pdf/STATUTE-106-Pg1876.pdf

So not only does the cost influence Ada's status of being a meme language, any official can meme Ada out of a project simply because it will increase the unnecessary complexity of the project.

Meme language defined by law.


 No.1030762>>1030764

>>1030761

I didn't cite that.


 No.1030763>>1030769 >>1030821

>>1030740

Not a point.

>>1030741

>Source needed.

Show me any ivy league college that teaches Ada as a required course in any CS or other degree program.

>What needs, in particular, do you have within systems programming that are not addressed?

Low-latency and accuracy are two needs that Ada does not address.

>completely ignores the fact that all problems with """"""safety"""""" are inherently a fault at the hardware level and have nothing to do with software

So you are ignorant.


 No.1030764

>>1030762

I did. Ada is a meme.


 No.1030766>>1030769 >>1030771

>>1030760

>Common Language Environment, a strictly defined standard that specifies calling conventions for functions and routines, including use of stacks, registers, etc., independent of programming language

Solid argument. I bet this increases interoperability by a lot.

>Record Management System

Seems like an integrated serializer or am I wrong?


 No.1030769>>1030771 >>1030773

>>1030763

>Show me any ivy league college that teaches Ada as a required course in any CS or other degree program.

Does MIT count? -- https://ocw.mit.edu/courses/aeronautics-and-astronautics/16-01-unified-engineering-i-ii-iii-iv-fall-2005-spring-2006/comps-programming/

>>1030763

>Low-latency and accuracy are two needs that Ada does not address.

Yes it does; look at the Real Time Annex, Numerics Annex, and the Systems Programming Annex. (D, G, & C, respectively)

http://www.ada-auth.org/standards/rm12_w_tc1/html/RM-TOC.html

>>1030766

>>Record Management System

>Seems like an integrated serializer or am I wrong?

Not *entirely*, but yes... RMS is more like being able to define on-disk database records. WRT text, you can have the system natively know about lines and "do the right thing" rather than having to screw around with line-endings (CR vs CRLF vs LF).


 No.1030771


 No.1030773>>1030787 >>1031723

>>1030769

The problem of """"""safety"""""" is fundamentally a fault of hardware and has nothing to do with software.


 No.1030787>>1030880

>>1030773

You're an idiot; safety has multiple facets.

Yes, our hardware is one area that could be much better (look at the tagged architectures like the Burroughs) -- and indeed should be -- but saying that it has *nothing* to do with software is just stupid.


 No.1030799>>1030802

OH SHIT

Ada might become popular now, FUCK! My pajeet free life is over!


 No.1030802>>1030809 >>1030815

>>1030799

>Ada might become popular now, FUCK! My pajeet free life is over!

Yeah -- Ada does tend to be a good filter against pajeet.


 No.1030807>>1030823

>>1030747

A wangblows nigger who criticises Unix is like a gay man who talks about MGTOW.

>b-but he might have a point

Doubtful. All he can do is spout talking points in a vain attempt to get more cocks in his mouth.

>>1030750

oh no no no I know Unix is awful trash. That's why I use a totally different thing called Linux.


 No.1030809

>>1030802

> That's why I use a totally different thing called Linux.

... yeah, "totally different".


 No.1030815>>1030818

>>1030802

How many Ada seminars have been created in India upon this news breaking?

>Get ahead of the crowd and learn 100% from the Oracle of Ada for guaranteed success in our teachings of material to avoid learning the unnecessary in becoming quick Ada ready hireable tomorrow programmer.


 No.1030818

>>1030815

>How many Ada seminars have been created in India upon this news breaking?

Who knows.

But I doubt Pajeet would have the patience to learn anything -- they can barely handle call-center scripts, after all.


 No.1030821

>>1030763

>Not a point

you want to know why colleges don't do a thing. The character of the people at the colleges is part of the explanation.


 No.1030823

>>1030807

>a gay man who talks about MGTOW.

Gay men live in civilization with the rest of us, they occasionally come into conflict with women, and they're actively getting kicked out of the protected-class club. They'll be talking about the same shit.

will be: https://www.youtube.com/watch?v=i0BRn016EGI

female liberation and its consequences are the problem. The non-gay aspects of MGTOW are just people talking about libido management methods, marriage, and ... eh... PUAs being fags. It's pretty much all stuff a fag could get on board with.

Gays might be exempted from the first round of Bachelor taxes. That's about the pace of things.


 No.1030827>>1030829 >>1030830

>>1030693

Does it include functions as first class citizens? That may be the only thing that's pushing Ada down in my priority queue.


 No.1030829

>>1030827

yeah. You can have HOFs, you can take pointers and pass them around. You can put them in variables. You can have closures as well. There are some limitations that you can run into due to subprogram lifetimes: your function pointer can't live longer than the function. In my code all this has meant is that I've moved 'local' function definitions off to a package.
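
Rough sketch (names made up, just to show the shape of it):

with Ada.Text_IO;
procedure HOF_Demo is
   -- An access-to-function type; any function matching the profile will do.
   type Unary is access function (X : Integer) return Integer;

   function Double (X : Integer) return Integer is (X * 2);

   -- Takes a function value like any other parameter.
   function Apply_Twice (F : Unary; X : Integer) return Integer is
     (F (F (X)));
begin
   Ada.Text_IO.Put_Line (Integer'Image (Apply_Twice (Double'Access, 7)));  -- prints 28
end HOF_Demo;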


 No.1030830

>>1030827

Yes[-ish].

You can pass a function as a parameter to a GENERIC, you can pass an access to a function as a subprogram parameter, and you can use RENAMES to give a name to a function call/result.

See also:

http://okasaki.blogspot.com/2008/07/functional-programming-inada.html

https://groups.google.com/forum/#!topic/comp.lang.ada/RyjqQ2QOS1g


 No.1030836

>>1030714

maybe he is using windows because he hates unix


 No.1030841>>1030866

>> Does it include functions as first class citizens? That may be the only thing that's pushing Ada down in my priority queue.

>

> You can pass a function as a parameter to a GENERIC


Generic
   Type Element is private;  -- Any definite, non-limited type.
   Type Index is (<>);       -- Any discrete type.
   Type Vector is Array(Index range <>) of Element;
   with Procedure Swap(Left, Right : in out Element);
   with Function "<" (Left, Right : Element) return Boolean is <>;
Procedure Generic_Sort( Object : in out Vector );
--...
Procedure Generic_Sort( Object : in out Vector ) is
Begin
   -- Bubble-sort
   For Outer in Object'Range loop
      exit when Outer = Object'Last;  -- nothing left to compare against
      For Inner in Index'Succ(Outer)..Object'Last loop
         if Object(Outer) < Object(Inner) then
            Swap( Object(Outer), Object(Inner) );
         end if;
      end loop;
   end loop;
End Generic_Sort;

Then you could say have a "count_swap" that increments a counter as a side-effect and pass that in as the actual for "Swap" in the instantiation.


 No.1030866

>>1030841

This is actually a pretty neat feature.


 No.1030880>>1030882 >>1030887

>>1030787

It has nothing to do with software. If the hardware provides the checks intrinsically, then any software check would be redundant and unnecessary bloat.


 No.1030882

>>1030880

Are you stupid? You still have to encode the checks themselves into the hardware, and the best way we have of doing that is... software. Many checks in Ada actually can be elided and moved into CPU instructions as it stands. LARP harder next time.
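
E.g. a range check on a constrained subtype (trivial sketch; To_Percent is just a made-up name):

subtype Percent is Integer range 0 .. 100;

function To_Percent (Raw : Integer) return Percent is
begin
   return Raw;  -- range check here: raises Constraint_Error if Raw is outside 0 .. 100;
                -- on most targets this is a compare-and-branch, and it's elided
                -- entirely when the compiler can prove Raw is already in range
end To_Percent;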


 No.1030887>>1030888

>>1030880

>If the hardware provides the checks intrinsically,

And what hardware is doing this?

Heartbleed shows us there's not a lot that does in common, everyday, consumer-grade usage.


 No.1030888>>1030891 >>1030895

>>1030887

>what hardware is doing this?

Careful, you're going to summon the LISP machine poster.


 No.1030891>>1030895

>>1030888

>Careful, you're going to summon the LISP machine poster.

That would actually be kind of cool... I was hoping to get my hands on a copy of the source for Symbolics Ada.


 No.1030895>>1030899 >>1031058

>>1030624

Forth is awesome. People recommended it a few times and I finally tried it. I really like the way that it was designed, and it's the only language that I can't find anything to complain about, and it's so flexible that I can't see why you would be unable to do anything with it, since you can just build it up to whatever level you think is required. It's easy to see why someone would argue that it's the ultimate language. The postfix notation is really nice once you get used to it as well, and it legitimately makes more sense than anything else. Unique as hell and comfy as fuck. Still, I think Ada does look nice, but really verbose, and perhaps too close to natural language, which is inefficient, and kinda pointless since there is no need for a language to try that hard to be readable by people that haven't studied it. I would easily take it over Lisp (Smalltalk too, but again, really verbose). The parentheses make my head hurt and my autism doesn't find it aesthetically pleasing enough, even though that never bothers me. I like how it works, but I couldn't get used to that.

>>1030888

>>1030891

Just mention C and/or Unix. Maybe talk about how great Unix is even though it's not. This might even be enough, actually.


 No.1030899>>1030907

>>1030895

Forth breaks down on generic code, on system-portable code, and on troubleshooting. And its hell of "other people's code" is much worse than most languages. Even fairly common problems like stack errors are a bitch to fix, even after you have tooling for them.

And there are a few places where Forth as practiced is pointlessly pessimal on modern systems. Like CREATE DOES> , "the pearl of Forth", often requires unnecessary memory fetches.

Implementation quality is generally poor, with a lot of surprising variations in performance, to the point that you might want different parts of the same project to use different implementations.

I'm not one to say that popularity matters that much, but Forth is so unpopular that any serious project will be like mowing an abandoned lawn with tall grass without checking it over carefully--every other yard, you have to stop and pull a dead body or stolen statuary or evil spirits out of the lawn mower.

And if you think /tech/'s full of LARPers, the Forth community'll make your head explode.


 No.1030907

>>1030899

Well, I knew that all current implementations are kinda weak to begin with, but I was thinking about writing my own because that's the kind of thing that I'm interested in. The actual language is what really matters, since my original idea was to use what I learned from other languages to design my own. I don't intend to rely on someone else's work and I have no interest in "communities" (pretty damn sure that they are all crap), so it doesn't matter too much. My interest is to pretend to be living in the golden age of computing and develop basic shit on my own from scratch, pretending that it's being done for the first time, so I can understand how it was done.

Also, I am incapable of teamwork and almost dislike forking, so I have no intent of relying on anyone else. I only care about optimizations. Readability is only important if it doesn't prevent that and makes my own life easier. The machine comes first, other humans come last. I would even say that if you can't read optimized code, then you probably shouldn't be reading it. In my case, it's just fun hobby shit so it will never matter. I'm just trying to learn a lot of shit and occasionally hear what other people have to say, and then use the information to improve my ideas.


 No.1031058>>1031557

>>1030895

>Still, I think Ada does look nice, but really verbose, and perhaps too close to natural language, which is inefficient,

It's actually *really* good for error-detection. Especially when you're refactoring nested constructs.
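
(A toy example of the kind of thing that gets caught -- Grid here is just some assumed two-dimensional array:)

Outer : for Row in Grid'Range (1) loop
   for Col in Grid'Range (2) loop
      exit Outer when Grid (Row, Col) = 0;   -- exits both loops
   end loop;
end loop Outer;   -- a bare "end loop;" is rejected here, since the loop is named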

> and kinda pointless since there is no need for a language to try that hard to be readable by people that haven't studied it.

I flatly disagree; the readability for non-programmers is absolutely needed so that you can have domain-experts who aren't programmers understand it (say physicists or statisticians).


 No.1031437>>1031450

jesus christ, another group of tech nerds who couldn't produce a string length function or a webpage that does a database query without RCE vulns wrote an article about security, ep. #35723582375. +1 me on HN and twitter

>Ada is crap

>C is good

>programming is hard, the vulns aren't my fault

>C is the bestest PL ever made

>Ada is crap and impractical

>cars crash just from ABS malfunction and ECU vulns and bugs

>programming is hard, not my fault

>Ada is good! let's use it to make self driving cars


 No.1031450>>1032615

>>1031437

You're not wrong -- perhaps the biggest problem/difficulty is that the industry standardized on C and C++ and waste untold millions trying to "fix C" or make a "Better C" (Java, C#, Swift, ObjectiveC, etc) and now have such emotional investiture it's like looking at a fully internalized Sunk Cost Fallacy.

Though it really is super-frustrating to be plagued by bugs that are easily detectable if not outright avoided.


 No.1031466

wtf I love Windows and Ada now


 No.1031557>>1031558 >>1031562

>>1031058

>It actually *really* good for error-detection. Especially when you're refactoring nested constructs.

Well, I guess assuming that anything about Ada is the way it is because it's good for that wouldn't lead me to a very inaccurate understanding of the language. The language didn't find its niche for nothing.

>the readability for non-programmers is absolutely needed so that you can have domain-experts who aren't programmers understand it (say physicists or statisticians).

Well, only for source code that these people are expected to read. Other than that it doesn't matter and it's better to prioritize performance and make it easier to write. At least in my case, I don't think it's even easier to read unless you just don't know the language. Personally, I don't like having to rely on completion, I would rather have a more minimal syntax, but both have their place, I suppose. That's one reason why I enjoyed writing Forth, and it's also one of my reasons for appreciating Lisp even though it makes my head explode (just like graph paper fucks with my head and my eyes because of the excessive amount of squares, Lisp does that with parentheses). Still, I really like the way that Forth is structured. But if I had to pick a language that is currently in actual use, I think I would pick Ada.


 No.1031558>>1031566

>>1031557

>Other than that it doesn't matter and it's better to prioritize performance and make it easier to write.

Why? -- Even disregarding the need for technical non-programmers to understand it, code is read much more than written, so it makes sense to optimize for readability, no?

> But if I had to pick a language that is currently in actual use, I think I would pick Ada.

Go for it!

Most of the Ada programmers I've met/talked with are pretty friendly.


 No.1031559

>>1030587 (OP)

How long until they get kicked out for being filthy kikes only to sue Adacore for 6 million shekels in libel and copyright infringement damages?


 No.1031562

>>1031557

> just like graph paper fucks with my head and my eyes because of the excessive amount of squares, Lisp does that with parenthesis

Interesting, I have developed a "blindness" to both the squares on the graph paper as well as to the parentheses in Lisp. Of course I still see them, but my brain just ignores them.


 No.1031566>>1031571

>>1031558

>code is read much more than written

Yes, by the computer. Computers will read it many more times than humans ever will, and optimizations will increase the program's longevity as well.

>so it makes sense to optimize for readability, no?

Someone that can't read optimized source code is clearly not competent enough to do anything with it. If it's unoptimized, then it's not worth using and therefore not worth writing. It's good to keep people that aren't good enough away from your source. Quality control.

>Most of the Ada programmers I've met/talked with are pretty friendly.

Well, I don't socialize. But the fact that Ada is actually used for some serious shit makes it easier to respect. It's safe to say that it's better engineered than the languages that crap consumer software is written in. If it's used by the military, and in aviation, then it's definitely doing something right considering how unreliable a lot of the C and C++ software that I have used has been. Then again, I'm sure that the people working in those fields are more competent than average as well.


 No.1031571

>>1031566

>>code is read much more than written

>Yes, by the computer. Computers will read it many more times than humans ever will, and optimizations will increase the program's longevity as well.

By the human, too. -- Optimization for the computer is a function/property of the translator; in theory, the source language has little impact here.

>>1031566

>Someone that can't read optimized source code is clearly not competent enough to do anything with it. If it's unoptimized, then it's not worth using and therefore not worth writing. It's good to keep people that aren't good enough away from your source. Quality control.

I think you're misunderstanding -- Ada was designed for readability; this is completely orthogonal to the optimization of generated code. It also makes it easier to maintain, and in my opinion Ada code is a lot easier to "come back to" after X weeks/months.


 No.1031624>>1031632 >>1031634 >>1031638 >>1031662

I can't into Ada. I keep writing junk in C.

Am I closet pajeet?


 No.1031632

>>1031624

Learn Pascal so you can wrap your head around the syntax.


 No.1031634

>>1031624

why can't you?


 No.1031638

>>1031624

How are you trying to learn it? Have you read the Barnes book?


 No.1031657>>1031660

>>1030587 (OP)

Dupix btfo, the thread. C and Unix fags, please don't kill yourselves.


 No.1031660

>>1031657

Program a lot of NVIDIA firmware; do ya?


 No.1031662>>1031664

>>1031624

Learning C requires a healthy, preferably white brain that can be properly damaged. People that don't understand toilets probably don't have that.


 No.1031664

>>1031662

Nobody can write correct and safe C code, it is impossible.


 No.1031682>>1031685 >>1031796

I'm a Haskell/F#/Scala dev; somebody sell me on Ada. How would I use/implement monads in it? For example, like in: https://fsharpforfunandprofit.com/rop/


 No.1031685>>1031686 >>1031696

>>1031682

You wouldn't. Haskell has

1. ML features, which are pretty cool, and which you can find in Ada.

2. a bunch of bad decisions

3. a bunch of features that only make sense in the context of #2

you should be well used to looking at other languages and saying "oh... weirdly, to use this I'll have to put aside all that weird stuff I learned in Haskell, like men putting away childhood toys."


 No.1031686>>1031690 >>1031696

>>1031685

Contrary to popular belief, monads aren't just some workaround to perform I/O, they're practical design patterns that make code easier to reason about, as demonstrated by that anon's "railway oriented programming" link.

If Ada claims "ease of maintenance" while not supporting such basic abstractions, then it's a waste of time to learn it when you can just use the equally rigorous safety of Haskell instead.


 No.1031690>>1031696 >>1031698 >>1031707

>>1031686

yeah, yeah, yeah. I ran out of patience for Haskeller cultist bullshit even before I started drinking to try and free up the completely wasted skill points investments that the language encourages. You wouldn't like Ada because it's readable, and 'remotely readable' is a bad code smell to a Haskeller.


 No.1031696>>1031699 >>1031796

>>1031685

>>1031686

>>1031690

Ada is supposed to be non-functional and low-level, so monads have to be thrown out the window by default


 No.1031698>>1031700

File: 13-Figure4-1.png

>>1031690

>'remotely readable' is a bad code smell to a Haskeller.

I have some bad news for you:

https://www.semanticscholar.org/paper/An-Experiment-in-Software-Prototyping-Productivity-Hudak-Jones/4029a3c5b19365ea3e0c453c4245eb184e038c75

>Understandability

>Ada: A

>Haskell: A+

>Learnability

>Ada: C

>Haskell: A


 No.1031699

>>1031696

>Ada is supposed be non functional

I knew Ada was a meme language, but I didn't know it was intentionally trying to be useless, lmao


 No.1031700

>>1031698

>bad news

good joke though. Not going to waste my time digging into how they managed to arrive at Haskell being either of those things.


 No.1031707>>1031783

>>1031690

<Y-y-you're j-j-just a cult

Get off my board brainlet


 No.1031720

>>1030610

>file://

>tiger

>C:

For shit like this I come here


 No.1031721>>1031796

>>1030587 (OP)

ADA was the right choice for something as crucial as this.

kernel written in ADA when?


 No.1031723

>>1030773

Enjoy your overflow or frame attacks


 No.1031724>>1031742

I'm a registered sex offender, is Ada the right language for me?


 No.1031742

>>1031724

>LGBTPBBQ

No. Ada is too complicated for your cout<< code.

cd..


 No.1031783

>>1031707

If a brick landed on your head and took away only your knowledge of Haskell, you'd become a better programmer.


 No.1031796>>1031797

>>1031682

Well, it depends -- some simple monadic things are built into the language; for example, modular arithmetic.

-- Declares a numeric type [0..20], w/ wrap-around.
Type Cycle is mod 20;

Or you could use generics to encapsulate your monad and its generic functionality.
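
E.g. (just to show the wrap-around):

declare
   C : Cycle := Cycle'Last;   -- the largest value of the type
begin
   C := C + 1;                -- wraps around to 0 instead of overflowing
end;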

>>1031696

>Ada is supposed be non functional and low-level, so monads has to be thrown out the window by default

Funny you should say that -- Chris Okasaki has a blog post about using Ada for functional programming:

http://okasaki.blogspot.com/2008/07/functional-programming-inada.html

>>1031721

>kernel written in ADA when?

There is/was some interest in comp.lang.ada about doing an OS in Ada.

I don't know if you'd count it, but there's the Muen Separation Kernel:

https://muen.codelabs.ch/


 No.1031797

>>1031796

Sorry, that should be [0..19] -- haven't had coffee yet.


 No.1031811

looks like fun


 No.1031851>>1031855 >>1031856 >>1031859 >>1031966

>be the only retard in the whole imageboard ecosystem to shill Ada

>nobody listens to me

>ff 2 years

>more and more people shill Ada

>be b& from 4chan because /g/anny unironically shits his pants when he sees cartoon frogs

>go to its goat fucker commie twin

>see this

Mind explaining why Ada is making a comeback? Cause I don't see any reason.


 No.1031855>>1031866

>>1031851

you were shilling Ada on /g/?

I only saw a few of their AoC threads, but I didn't see any Ada.

You missed an opportunity there.


 No.1031856>>1031866

>>1031851

Do you really expect anyone to believe that shit?


 No.1031857>>1031862

The thing I don't get about Ada is the license. Can I write closed-source software with GNAT? Is the compiler itself open source? I've been interested in it for some time but this issue is holding me back from investing time learning it.


 No.1031859>>1031866

>>1031851

>cuckchan nigger confused about why anyone would use the thing he spent 2 years shilling

easily the most pathetic site and users on the internet


 No.1031862>>1031864

>>1031857

fsf-gnat is free to use.

adacore-gnat you can freely use for any open source project. commercial projects require a license.

gnat's not the only Ada implementation, too.


 No.1031864>>1031869

>>1031862

What does adacore have that gnat doesn't?


 No.1031866>>1031868 >>1031869 >>1031966

>>1031855

>you were shilling Ada on /g/?

On /g/ and /tech/, but it was 2 years ago.

>>1031856

Look at /g/ in the evening (PST), and see how many pepe pics get deleted within minutes.

>>1031859

The thing is I stopped shilling it when I actually learned how to use C++ cause Ada is mostly made for formal methods (which I don't use). But yeah, recently some tard on /g/ unironically blamed Unix when I told him pre-11 C didn't have atomics.


 No.1031868

>>1031866

here the pepe pics are usually in threads that have nothing to do with technology. this board does not have real mods so they will stay up


 No.1031869>>1031870

>>1031864

*shrug*. apparently it has a slower release schedule, although adacore-gnat only releases twice a year. All I know is, Fedora 'gnat' doesn't come with the pretty printer (gnatpp), and AdaCore's package does come with it.

>>1031866

>why wasn't my insincere shilling persuasive?

>btw using C++ now

I don't care about formal methods, but I care about a language not being a nightmare to use correctly:

https://www.youtube.com/results?search_query=nightmare+of+C%2B%2B


 No.1031870>>1031877


 No.1031875>>1031966

Is it possible to use ADA to write GUI apps?


 No.1031877>>1031879

>>1031870

Skimmed the slides. As expected, the guy's just doing a bunch of shit nobody ever does and blames the language when it doesn't work or gets incomprehensible. In C too you can do that.

https://people.eecs.berkeley.edu/~necula/cil/cil016.html


 No.1031879>>1031880 >>1031889

>>1031877

ok let me correct my greentext of you

>btw using a random subset of C++ with lots of hidden performance penalties

>I'm not writing template libraries here so I don't need to know how this language works :p


 No.1031880>>1031881 >>1031889 >>1031907

>>1031879

C is not a subset of C++. They are two distinct languages with their own syntax and semantics.


 No.1031881>>1031885

>>1031880

>C is not a subset of C++

WHERE DO YOU SEE ME SAYING LIKE THAT, YOU MORON


 No.1031884>>1031907

What about GPS ide, can I use it for closed source without paying?


 No.1031885>>1031887

>>1031881

It doesn't matter how you say it. All that matters is that you said it, and you did.

Why such butthurt?


 No.1031887>>1031896

>>1031885

You brainless freak, I said that you are using a subset of C++, which is what you are doing. Since all the horror stories in the video are "shit nobody ever uses", you don't even use range-based for loops apparently. I even called your subset "random", i.e., unique to you, probably slightly different from anyone you work with. I never named your personal subset of C++, and certainly didn't call it "C".


 No.1031889>>1031892 >>1031896

>>1031879

Nobody besides Boostfags write all their code in templates.

>>1031880

Next time you try to falseflag as me, refrain from saging. You'll be more credible.


 No.1031892>>1031901

>>1031889

>he doesn't write templates ever because he doesn't understand that part of the language because C++ is a nightmare

>he pretends this shameful observation is really about people writing "all their code in templates"

sage to falseflag as whoever this is


 No.1031893

File: yikes.jpg

No wonder NVidia cards are shit.


 No.1031896>>1031903

>>1031887

>>1031889

How can such butthurt exist?


 No.1031901>>1031902

>>1031892

>if you don’t casually use the hardest part of a language, you don’t understand it

That mentality might be why you’ve never programmed anything relevant in your life.


 No.1031902>>1031904

>>1031901

>he never does a thing because he doesn't understand it because his language is impossible to understand

>he pretends that this shameful observation is really about how *casually* he does a thing


 No.1031903

>>1031896

4um refugees.


 No.1031904

>>1031902

Am I really outside /g/ btw? Cause I don’t see any difference.


 No.1031907>>1031910 >>1031918

>>1031880

>C is not a subset of C++.

That's not quite true, but it's not quite untrue either -- C++ was designed as a C superset and so tried to stay backwards compatible w/ C -- and, for the most part, it does.

>>1031884

> What about GPS ide, can I use it for closed source without paying?

Yes.

The situation w/ GNAT is there's three possible versions:

(1) AdaCore's Community edition -- this is the GPL restricted RTL one.

(2) AdaCore's Pro edition -- this is unrestricted, TTBOMK, and the most up-to-date.

(3) FSF -- this too is unrestricted, but usually lags behind #1 & #2.

There are other Ada implementations as well, RR Software's Janus/Ada, Green Hills, ICC, Verdix, and a few more.


 No.1031909>>1031966 >>1032184

>>1030602

>c/c++

Never heard of this language, what is it?


 No.1031910>>1031966

>>1031907

So in your opinion of those ADA implementations which is the most secure for powerpc and why. Does the ADA implementation write assembly to ADA or does it go assembly > c > ADA? Which is to say what architectures are supported by ADA/its implementations.

After all, if you are writing for security you might as well not bother with the language if all it supports is x86 or ARM or some shit like that.


 No.1031913

>>1030624

>not using good tools because you despise those who made them

That just makes said language better. Learn from your enemy and beat him at his own game.


 No.1031918>>1031966

>>1031907

>C is not a subset of C++

That's better discussed in another thread.


 No.1031966

>>1031851

>Mind explaining why Ada is making a comeback?

Because it offers some good solutions to existing problems: the native TASK; GENERICs that (a) are not Turing-complete, (b) can use other generics, values, and subprograms, and (c) essentially provide the functionality of the proposed C++ "concepts"; the functionality of the proposed C++ "modules" provided by PACKAGEs; the SPARK formal-methods provers; and the upcoming PARALLEL blocks/loops.
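
E.g. the native task needs no external threading library (a minimal sketch; names made up):

with Ada.Text_IO;
procedure Demo is
   task Logger is
      entry Log (Message : String);        -- rendezvous entry
   end Logger;

   task body Logger is
   begin
      accept Log (Message : String) do     -- wait for one message
         Ada.Text_IO.Put_Line (Message);
      end Log;
   end Logger;
begin
   Logger.Log ("started");                 -- plain call syntax; tasking is built in
end Demo;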

>>1031909

>>c/c++

>Never heard of this language, what is it?

Usually a shorthand for "C and/or C++" -- usually grouping the two together based on similarities.

>>1031866

>Ada is mostly made for formal methods (which I don't use).

Not really; but it has a much better foundation to build provers on. (More information on types made explicit.)

>>1031875

>Is it possible to use ADA to write GUI apps?

Yes.

There's Gnoga.com [down right now], GTK, and CLAW from rrsoftware.

>>1031910

>So in your opinion of those ADA implementations which is the most secure for powerpc and why.

I honestly don't know which would be best for PowerPC, as I haven't built any PowerPC programs.

> Does the ADA implementation write assembly to ADA or does it go assembly > c > ADA?

This depends highly on the implementation; there's no reason that assembly or C needs to appear in any of the bootstrapping process at all. (I'm working on an Ada compiler written purely in Ada, so no C or assembly.)

> Which is to say what architectures are supported by ADA/its implementations.

> As if you are writing for security you might as well not bother with the lanuage if all it supports is x86 or ARM or some shit like that.

I know people who have compiled non-trivial programs from *VERY* different architectures with little to no alteration in the source code. (In particular, Randy on comp.lang.ada has some excellent stories about odd architectures and his Janus/Ada compiler.)

>>1031918

>>C is not a subset of C++

>That's better discussed in another thread.

Agreed.


 No.1032115

This is pretty interesting, I think.

I've always had a soft spot for Ada.


 No.1032184>>1032304 >>1032615

>>1031909

The first was an interesting concept that was originally referred to as `portable assembly` and was based around expanding on another language that didn't even have types. The second was a further expansion on the first that took something simple and stupid and made it complicated and stupid.


 No.1032225>>1032304

>>1030587 (OP)

Why not something actually verifiable, like F*?


 No.1032304>>1036698

>>1032225

>Why not something actually verifiable, like F*?

Maybe because for the firmware they want the ability to do low-level programming? Or maybe because they think that transitioning to the functional paradigm will be too much of a hurdle for their existing programmers.

>>1032184

> C = `potable assembly`

But, honestly, Forth does an excellent job of that.


 No.1032615>>1032626 >>1032638 >>1032640

>>1030587 (OP)

It's about time there's some good news in technology. Nvidia replacing C with Ada shows that C is not as hard to replace as the shills want you to believe. OP's article is proof that switching from C to Ada does the opposite of what switching from Ada to C and C++ did for the F-35.

>>1031450

>You're not wrong -- perhaps the biggest problem/difficulty is that the industry standardized on C and C++ and waste untold millions trying to "fix C" or make a "Better C" (Java, C#, Swift, ObjectiveC, etc) and now have such emotional investiture it's like looking at a fully internalized Sunk Cost Fallacy.

C and UNIX are the biggest disaster and waste of effort in computer science or possibly any industry ever. Much smaller teams of programmers produced much better languages and operating systems in a lot less time. This is because of how unproductive and inefficient C and UNIX are. They need that much work just to stay up to date. Without 15,600 programmers, Linux (just the kernel) would not be able to continue to run on newer hardware. Better operating systems have fewer programmers because they don't need as many. If 15,600 programmers were told to work on any other operating system, they would literally have nothing to do.

>>1032184

>The first was an interesting concept that was originally referred to as `portable assembly`

That's revisionism. C was not meant to be portable or assembly. It was meant to be the equivalent of PL/I in Multics and Lisp on Lisp machines. They only started calling it "portable assembly" when they realized that it couldn't compete with real programming languages. That's around the same time they started their "simple and elegant" buzzword describing kludges like using the tape archiver "tar" to copy directory hierarchies. What really sucks is that C is a lot less portable than other languages like Ada, Pascal, PL/I, BASIC, Lisp, Cobol, and so on. Most languages don't care if characters are byte-addressable or any of that other bullshit. Word-addressed CPUs were handling strings just fine before C.

>and was based around expanding on another language that didn't even have types.

This part's right, which is why C sucks so much. Types, arrays, strings, compound literals, objects, exceptions, the preprocessor, and everything else in C and C++ sucks because it was not originally there. There is no coherent design like there is in real programming languages. It's like building a house by having a thousand people nail boards together with no blueprints or idea of what it should look like.

Why am I retraining myself in Ada?  Because since 1979 I
have been trying to write reliable code in C. (Definition:
reliable code never gives wrong answers without an explicit
apology.) Trying and failing. I have been frustrated to
the screaming point by trying to write code that could
survive (some) run-time errors in other people's code linked
with it. I'd look wistfully at BSD's three-argument signal
handlers, which at least offered the possibility of provide
hardware specific recovery code in #ifdefs, but grit my
teeth and struggle on having to write code that would work
in System V as well.

There are times when I feel that clocks are running faster
but the calendar is running backwards. My first serious
programming was done in Burroughs B6700 Extended Algol. I
got used to the idea that if the hardware can't give you the
right answer, it complains, and your ON OVERFLOW statement
has a chance to do something else. That saved my bacon more
than once.

When I met C, it was obviously pathetic compared with the
_real_ languages I'd used, but heck, it ran on a 16-bit
machine, and it was better than 'as'. When the VAX came
out, I was very pleased: "the interrupt on integer overflow
bit is _just_ what I want". Then I was very disappointed:
"the wretched C system _has_ a signal for integer overflow
but makes sure it never happens even when it ought to".


 No.1032626>>1032630 >>1032638 >>1033559

>>1032615

>Lispfag shows up in an Ada thread

>spends most of his post bitching about Unix and C

>only mentions Ada offhand a couple times to make it look like he's on topic

>insists Linux needs 15k contributors when most of those are random programmers sending in a patch every couple years and the regular contributor count is much smaller

What a loser. You've wasted so much of your life bitching about things you hate that you can't even write a single post focusing on the things you like.

Here's a challenge: try writing a single post about something you like without mentioning C or Unix once. Convince people of your favourite language's and OS's advantages through their features and design philosophy instead of simply namedropping things you like between shitting on things you don't like.


 No.1032630>>1032653

>>1032626

Yeah. He does this shit. And when no one responds, he will samefag himself. The mods won't do anything about it because "we don't see it as off topic" but derailing threads has always been against the rules. I think the mods are the fag, or it's one of their friends so they won't do anything about it.


 No.1032638>>1032644 >>1032647

>>1032615

Interesting; thank you for sharing this.

>>1032626

>Convince people of your favourite language's and OS's advantages through their features and design philosophy instead of simply namedropping things you like between shitting on things you don't like.

Disregarding excessive language/OS hate, I think there's a few people who have done so in this thread... though sometimes it is helpful to draw comparisons.

Ex:

In Ada you can have arrays (and slices thereof) as proper parameters whereas in C and C++ you cannot because of how their semantics devolve the array into a pointer/address.

Type Vector is Array(Positive range <>) of Integer;

Function "+"( Object : Vector ) return Long_Long_Integer is
begin
   Return Result : Long_Long_Integer := 0 do
      For Item of Object loop
         Result := Result + Long_Long_Integer(Item);  -- convert: Item is an Integer
      end loop;
   end return;
end "+";

which allows our summation (unary +) to be used on 'Vector' and slices thereof. You simply can't do this with a C-style array because of the aforementioned problems (you would have to include a length parameter, or sentinel-value like NUL in strings).
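
So, given some V : Vector with at least five elements, both of these work:

Total      : constant Long_Long_Integer := +V;            -- the whole array
Part_Total : constant Long_Long_Integer := +V (2 .. 5);   -- just a slice of it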


 No.1032640

>>1032615

>It was meant to be the equivalent to PL/I in Multics

What a load. C's minimalism is a reaction to this exact bullshit. Too bad they took the reaction too far.


 No.1032644>>1032647

>>1032638

Why don't you take your shit and make your own fucking thread instead of trying to derail every other thread to be about you?


 No.1032647>>1032649

>>1030587 (OP)

>>1032638

>>1032644

> Why don't you take your shit and make your own fucking thread instead of trying to derail every other thread to be about you?

What?


 No.1032649

>>1032647

He probably responded to the wrong post.


 No.1032653

>>1032630

>I think the mods are the fag

Nah, they tolerate mikee too, they're just the mods this board deserves


 No.1033507

>>1030610

http://www.ada-auth.org/standards/2xrm/html/RM-4-5-10.html#I3068

function Factorial(N : Natural) return Natural is
   ([for J in 1..N => J]'Reduce("*", 1));

Huge line savings from this. Maybe they're feeling some pressure from Rust?


 No.1033559

>>1032626

>Lispfag shows up in an Ada thread

>spends most of his post bitching about Unix and C

>only mentions Ada offhand a couple times to make it look like he's on topic

He's right tho


 No.1035008

>>1030587 (OP)

I really don't see any problems with this. OEMs need to start worrying more about firmware security. It's probably the weakest link in most modern computer systems today. Everything from device drivers to BIOS/EFI implementations is buggy as all fuck. The next step in the verification of this code is to open source it and allow community review. Combine that with a sane and regular update system and things will slowly but surely improve over time.

>>1030693

Stop comparing military hardware to literally anything else. It's all a massive money pit that government boomers shovel tax dollars into. It's in the interest of the military industrial complex to drag their feet and beg for more cash for years and years.


 No.1035018>>1035024 >>1035026 >>1035028

it's weird they are moving to ada now. even mil shit is moving away from ada recently. no one comes out of their uni knowing it so it's a pain in the ass to teach new employees. also most of the compilers are not as nice as the C compilers and produce slower code. ADA is mostly used when dealing with legacy shit nowadays. It's also OO so it's not as low level as C.


 No.1035024>>1035026 >>1035041

>>1035018

>cniles shitting on Ada because of muh fast

based

With C/C++ you truly can fuck your shit up the fastest.


 No.1035026>>1035031

File: f35_v2 (1).png

>>1035018

And look at how well the F-35 clusterfuck is turning out.

>>1035024

If anyone's senile here, it's the Multicsfag boomer.


 No.1035028>>1035041

>>1035018

nigger, if the language you knew coming out of uni were what mattered, Scheme and Pascal would be the big two languages of industry.

You can get comfortable with Ada in a month at a relaxed pace.

>Its also OO so its not as low level as C.

jesus this LARP


 No.1035030>>1035033

>>1030738

>It also shows how people are incapable of understanding that the problems with overflows and """"""safety"""""" are not intrinsic to software but to hardware

It's a software problem when the software has to account for how the hardware behaves. Most of the egregious security flaws in C come from retards like you expecting the C compiler to compile the program for exactly the hardware sitting in front of you.


 No.1035031>>1035043 >>1035168

>>1035026

>F-35

>F-35

>F-35

That's what you always bring up.

Do you think that using "safe" software on broken hardware will be anywhere near safe?

Are you incapable of knowing what that problem is or are you willfully ignorant of it because it doesn't match your rhetoric?

The military has bigger problems with overcharges and excessive spending on contracts (which is exactly the case here) than using Ada for anything.

If they used Ada here with all the safety checks compiled in, the software would have too much latency. That's why they didn't use Ada. That's also why they need to shoot contractors who steal from them and produce shit work.


 No.1035033>>1035034 >>1035061

>>1035030

No.

It's a hardware problem. How do you not understand this? Overflows can be checked by hardware. Checks don't and should not rely upon software. Unfortunately, every single ISA available requires software to check for these issues when hardware can be designed entirely capable of handling and mitigating those errors.

Keep holding on to your """"""security"""""" blanket. You need to stay in school anyways.


 No.1035034

>>1035033

>have Ada compiler designed with overflow checks in mind

>compiling for platform with fast hardware overflow checks

>"it'll be slow because the checks are in software"

which Wikipedia article did you read just so you could have more bait for your Ada whining?
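
To make it concrete, here's a minimal sketch (procedure name is mine, and I'm assuming GNAT as the compiler -- recent GNAT has overflow checking on by default, older releases wanted -gnato): the language just says overflow raises Constraint_Error, and the compiler maps that onto whatever the target offers, whether that's a flag test or a trapping add.

with Ada.Text_IO; use Ada.Text_IO;

procedure Overflow_Demo is
   X : Integer := Integer'Last;
begin
   --  This addition is required to be checked; the check is part of the
   --  language semantics, not hand-written "software" validation, and the
   --  compiler is free to lean on hardware overflow detection to do it.
   X := X + 1;
   Put_Line (Integer'Image (X));
exception
   when Constraint_Error =>
      Put_Line ("overflow caught by the mandated check");
end Overflow_Demo;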


 No.1035041>>1035049

>>1035024

I say C is faster because of better compiler optimization and less runtime overhead.

>>1035028

Have you gone to university? All the embedded shit you do in college is with C and asm, and yes, for many things outside your shitty desktop you want to be low level, so how is that LARP? You're just a retard who never wrote any program other than some shitty desktop app, or maybe you're a webshitter, I'm not sure.


 No.1035043>>1035069

>>1035031

>muh latency

Ada has been successfully used in other fighters. C++ is not the reason the F-35 sucks balls, but its adoption is a symptom of the overengineered but still broken bullshit plaguing the F-35.


 No.1035049>>1035075

>>1035041

wanting low level isn't LARP. Thinking that OO features prevent you from getting low level is LARP.

How many lines of code do you think it takes to add OO to Forth?

Do you think a feature like being able to specify exactly how a record is laid out in memory gets turned off if there are method calls around?
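
Here's roughly what I mean, as a sketch with made-up names (spec and body shown together): an exact bit-level layout clause on a plain record, sitting right next to a tagged type with a primitive operation. Neither feature switches the other off.

package Low_Level_With_OO is

   --  Exact layout of an 8-bit status register, specified bit by bit.
   type Status_Register is record
      Ready : Boolean;
      Error : Boolean;
      Count : Natural range 0 .. 63;
   end record;

   for Status_Register use record
      Ready at 0 range 0 .. 0;
      Error at 0 range 1 .. 1;
      Count at 0 range 2 .. 7;
   end record;
   for Status_Register'Size use 8;

   --  OO ("methods") right next door; the layout clause above still holds.
   type Device is tagged record
      Status : Status_Register;
   end record;

   procedure Reset (D : in out Device);  --  primitive operation

end Low_Level_With_OO;

package body Low_Level_With_OO is

   procedure Reset (D : in out Device) is
   begin
      D.Status := (Ready => False, Error => False, Count => 0);
   end Reset;

end Low_Level_With_OO;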


 No.1035061>>1035069

>>1035033

>Unfortunately, every single ISA available requires software to check for these issues

Except the ones that offer instructions that trap on overflow. Not that you actually know anything about hardware.


 No.1035069>>1035114

>>1035043

>in other fighters

Were those other fighters equipped with the same equipment? Probably not. When signals get to a high enough frequency, say for radar, latency becomes a huge issue for tracking enemies and keeping a pilot alive.

Since you dismiss latency as an important aspect of designing life-and-death devices, I take it that you are a dumb fuck who doesn't know his head from his ass.

>>1035061

Post them then, faggot. Otherwise you are just pulling shit from you ass like anyone else.


 No.1035075>>1035085 >>1036166

>>1035049

Forth is a high-level language, I don't know why you are bringing it up. "low-level language" means your code's operations are mostly 1-to-1 with assembly, i.e. if you look at something like i = 1 + i, each operation in the statement mostly corresponds to a part of the assembly, at least with no optimization. that's why we call C a low-level language. if you turn off compiler optimization, each operation in C (at least) mostly corresponds to a group of instructions in asm, and compiler optimization mostly does things you can actually comprehend, like inlining and other garbage. once you start adding big-ass data structures like objects and shit beyond the basic struct, it gets less 1-to-1 and thus less low level. although I guess you could get something sort of low level. I know people use a stripped-down C++ compiler where they only leave the basic object shit, so it's C with objects. I guess you could qualify that as low level, but object-oriented programming in general pushes a language to a higher level. it's not about how many lines of code a feature takes to add to a language; that's retarded.

tldr you're a retard desktop/webshitter/not a programmer


 No.1035085>>1035103

>>1035075

>I just read a wikipedia page about Forth and I am here to educate you about it

>I have used C and C++ and C++ sucks so "OO sucks for low level" QED


 No.1035103

>>1035085

>the Forth website says it's high level https://www.forth.com/forth/

I don't think there is anything wrong with being higher level. I like C++ actually, and sometimes it's good for embedded shit, same with Ada. it's just that Ada is a little outdated and it's weird for new shit to be written in it. in order to get a nice Ada compiler with modern features you have to pay for it. that's why places like Lockheed and Boeing are moving away from it unless they are dealing with legacy stuff, and that's why they use a stripped-down C++ compiler when they want OO. plus all the other shit I have said, like no one out of university knows it.


 No.1035114

File: 002207d2e475171⋯.jpg (104.71 KB, 477x352, 477:352, 1443582710916.jpg)

>>1035069

>Were those other fighters equipped with the same equipment?

No, because their equipment actually works.


 No.1035168>>1035270 >>1035277 >>1036166

File: 7951ae422173dbf⋯.jpg (90.63 KB, 1266x905, 1266:905, FWeXJc5a.jpg)

>>1035031

>If they used Ada here with all the safety checks compiled in, the software would have too much latency. That's why they didn't use Ada. That's also why they need to shoot contractors who steal from them and produce shit work.

Then why do NASA and ESA use Ada for rocket nozzle and aileron actuator FBW systems?

Why is most of the software running on whatever milshit SoCs the Eurofighter uses written in Ada?

Why aren't there any video games written in Ada?


 No.1035270>>1035274 >>1035277

>>1035168

>Why aren't there any video games written in Ada?

I hope this is a rhetorical question.


 No.1035274>>1035277

>>1035270

the answer is "nobody's done it yet". There's no significant barrier to it other than that.


 No.1035277

>>1035270

>>1035168

>>1035274

Game development is almost exclusively done in a mixture of C/C++ and assembly, with an interpreted scripting language on the side. Sometimes you'll get Java or C#, even Pascal or Lisp in rare cases, but the majority of libraries and APIs are built with C or its fat cousin in mind even when there are bindings for other languages.

In addition, Ada is very foreign to the game development industry and non-aerospace programmers in general. There are some bindings for stuff like SDL2 and OpenAL but nothing uses them outside some very obscure freetard games and that one guy reimplementing id Tech 4 in Ada.


 No.1036166>>1036194 >>1036209 >>1036422

>>1035075

>forth is a high level language,

Dude, what are you smoking?

Forth is lower level than C.

(It just allows VERY rapid abstraction.)

>>1035168

>Why aren't there any video games written in Ada?

Mostly because in the mid-90s they drank the C/C++ kool-aid more than anybody. There's actually a nifty segment from a documentary/informational/advocacy video series that shows Ada being used for a game/sim:

http://www.adapower.com/files/media/SEG_MM1.MOV

http://www.adapower.com/files/media/SEG_MM2.MOV

http://www.adapower.com/files/media/SEG_MM3.MOV

http://www.adapower.com/files/media/SEG_MM4.MOV

http://www.adapower.com/files/media/SEG_MM5.MOV

http://www.adapower.com/files/media/SEG_MM6.MOV

http://www.adapower.com/files/media/SEG_MM7.MOV

http://www.adapower.com/files/media/SEG_MM8.MOV

http://www.adapower.com/files/media/SEG_MM9.MOV

(I think it's #6.)


 No.1036194>>1036196 >>1036733

>>1036166

Remember that game developers have long focused on speed over safety. It makes sense on limited hardware, but as 3D engines grow in size and complexity, safer languages become more appealing.


 No.1036196>>1036733

>>1036194

less speed's always going to be a hard sell. Trying to wish that away is trying to wish that people will use Common Lisp.

What I'd like to argue is that actually Ada's speed is very good and that the safety saves so much development time that you also have more time to optimize it. But to really argue that I'd have to be a game developer, and I ain't one.

What I have seen is that Ada 2012 pre/post conditions are amazing for sussing out bugs in code, meaning you don't need them for a production build because the bugs are gone by then. Also, Ada's efficiency is good enough that a "debugging build" is not hopelessly less efficient, and so it's something you might acceptably put into production if you're concerned about bugs. I get the impression that some other languages have enormous penalties to compile times or to efficiency depending on your target; Ada (with GNAT) has much smaller penalties.
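
For anyone who hasn't seen the aspects, a small self-contained sketch (all the names are mine): the Pre/Post bits are the contract, and with GNAT you flip the checks on for a debugging build (e.g. -gnata, or pragma Assertion_Policy) and drop them for release.

with Ada.Text_IO; use Ada.Text_IO;

procedure Contracts_Demo is

   subtype Percent is Integer range 0 .. 100;

   --  Contract: the caller must leave room for the increment, and the
   --  result is exactly the sum.  A violation raises Assertion_Error in a
   --  checked build and points straight at the offending call site.
   function Bump (P : Percent; Step : Percent) return Percent
     with Pre  => Step <= 100 - P,
          Post => Bump'Result = P + Step;

   function Bump (P : Percent; Step : Percent) return Percent is
   begin
      return P + Step;
   end Bump;

begin
   Put_Line (Percent'Image (Bump (40, 25)));   --  prints  65
   --  Bump (90, 25) would blow the precondition in a checked build.
end Contracts_Demo;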


 No.1036209

>>1036166

(It's #8)


 No.1036422>>1036733

>>1036166

Does this exist in a better quality somewhere? I can barely make out what they are saying.


 No.1036698

>>1032304

>`potable assembly`

Refreshing.


 No.1036733>>1037524

>>1036194

>Remember that game developers have long focused on speed over safety. It makes sense on limited hardware, but as 3D engines grow in size and complexity, safer languages become more appealing.

This is true, but ignores something vital: more optimization (and not premature optimization) is doable with something like Ada, where you can say "Type Percent is range 0..100;" and have the compiler choose the appropriate low-level/HW representation.
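
A tiny sketch of that (type and names are mine), including asking the compiler what it actually picked -- the point is you state the problem-domain range and the representation choice, packing and all, is the compiler's job:

with Ada.Text_IO; use Ada.Text_IO;

procedure Percent_Demo is

   type Percent is range 0 .. 100;        --  that's the whole definition

   type Samples is array (1 .. 1_000) of Percent
     with Pack;                           --  let the compiler squeeze them

   S : Samples := (others => 0);

begin
   S (1) := 100;

   --  Ask the compiler what representation it chose.
   Put_Line ("Percent'Size =" & Integer'Image (Percent'Size));            --  minimum bits for 0..100
   Put_Line ("Samples'Size =" & Integer'Image (Samples'Size) & " bits");  --  packed array size
   Put_Line ("S (1)        =" & Percent'Image (S (1)));
end Percent_Demo;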

>>1036196

>less speed's always going to be a hard sell.

Less speed is a lie perpetuated by people who don't know what they're talking about -- indeed, you can have more speed and more safety simultaneously.

See attached PDFs.

> Trying to wish that away is trying to wish that people will use Common Lisp.

> What I'd like to argue is that actually Ada's speed is very good and that the safety saves so much development time that you also have more time to optimize it.

This is absolutely true; having the compiler say "hey, you messed up here" at compile-time is far more productive than getting a core dump at run-time.
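
A toy example (types and names are mine) of what "the compiler says you messed up" looks like: distinct numeric types for distinct units, so the mix-up below dies at compile time instead of becoming a wrong-answer bug in the field.

with Ada.Text_IO; use Ada.Text_IO;

procedure Units_Demo is

   type Meters is new Float;   --  distinct types, even though both are floats
   type Feet   is new Float;

   Altitude : Meters := 100.0;
   Reading  : Feet   := 328.0;

begin
   --  Altitude := Reading;                --  rejected at compile time: wrong type
   Altitude := Meters (Reading) * 0.3048;  --  fine: explicit, intentional conversion
   Put_Line (Meters'Image (Altitude));
end Units_Demo;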

> But to really argue that I'd have to be a game developer, and I ain't one.

> What I have seen is that Ada 2012 pre/post conditions are amazing for sussing out bugs in code, meaning you don't need them for a production build because the bugs are gone by then. Also, Ada's efficiency is good enough that a "debugging build" is not hopelessly less efficient, and so it's something you might acceptably put into production if you're concerned about bugs. I get the impression that some other languages have enormous penalties to compile times or to efficiency depending on your target; Ada (with GNAT) has much smaller penalties.

Yes; see the PDFs.

>>1036422

>Does this exist in a better quality somewhere? I can barely make out what they are saying.

I've looked for better quality, but I have been unable to find one. (Though my searching-skills are, admittedly, not the best... esp. w/ new Google et al.)


 No.1037524>>1037543 >>1037638

>>1036733

Serious question, where did you learn all this stuff?


 No.1037543>>1037647

>>1037524

Not him, but old resources are a goldmine. Software as a field has a serious problem with not remembering its own history and chasing after fads.


 No.1037638>>1037640

>>1037524

>Serious question, where did you learn all this stuff?

Part of it was prior to college, part in college, and part of it comes from my interest in Ada and in compilers and language design, which keeps leading me to dig up interesting/useful things.

Probably a big portion was due to teaching myself Turbo Pascal (7 / 1.5 for Windows) using nothing more than the documentation that came with the compiler and the compiler's own error messages -- this paved the way for me to *SEE* just how much shit was being pushed in SW/college with C/C++, and later Java, which I have *very* mixed feelings toward (it's alright as a 2nd or 3rd language, but it's terrible as a first/learning language, its only redeeming quality education-wise being that it's not *as bad* as C/C++).


 No.1037640>>1037726

>>1037638

>Java

>not as bad as C/C++

What are your thoughts on Smalltalk and Limbo?


 No.1037647>>1037726

>>1037543

I guessed so. It's incredibly hard to distinguish actual good sources of information from the snake oil in this particular field nowadays.


 No.1037716

File: 936ab364e641b97⋯.png (308.49 KB, 990x682, 45:31, 1513192498808.png)

>>1030624

>not using holly C


 No.1037726

>>1037640

I haven't heard anything about Limbo.

Smalltalk is kind of interesting, though I honestly don't know a lot about it; I am a fan of the "image" idea.

>>1037647

>I guessed so. It's incredibly hard to distinguish actual good sources of information from the snake oil in this particular field nowadays.

Take a look at this 1987 paper; it shows how to achieve Continuous Integration at a fraction of the cost [money, complexity, bandwidth, and CPU-time] of current systems: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.26.2533&rep=rep1&type=pdf



