
/tech/ - Technology


File: 2560px_School_Begins_Puck_….jpg (1.7 MB, 2560x1699)


No.1084882>>1085002

Can AI be racist???

https://youtu.be/gYC9QoSQx48


 No.1084887>>1084929

No, because AI doesn't discriminate based on race, nor can you tell it to "stop noticing things" the way you can with humans, unless you explicitly program it to do so.


 No.1084929

Doesn't software already make decisions on things like early release from jail/prison?

>>1084887

For governmental functions, are such explicit instructions allowed, the way "affirmative action" is?


 No.1084966

AI is like a graph and needs to be read-only.


 No.1085002

>>1084882 (OP)

Could an algorithm stereotype a race? If that's your definition of the term, then yes an algorithm could be racist. For instance, an AI that made a prediction upon seeing a photo of a person on whether they are likely to be able to solve a question, could add weight to perceived skin tone or bone structure as a feature in its neural network. After sufficient feedback, the AI might perhaps learn that a nigger is expected with high probability to "not know" the answer.

We do the same thing by the way. It's why when you walk into a store with a question, you're met with initial disappointment when the available employee is a nigger. You may even decide it's not even worth bothering to ask at all, or perhaps you'll be quite surprised if they actually turn out to be helpful. When you're walking alone and you see three old ladies coming your way, you relax knowing from a pattern of past behaviour that it is likely to end without incident. If it were a pack of young niggers, your expectation for conflict rises.
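The weighting mechanism described above is easy to demonstrate in the abstract. Below is a minimal sketch (the data is synthetic and the feature names "skill" and "group" are made up for illustration, not taken from any real system): a plain logistic regression trained on outcome labels that are already skewed against one group will assign the group attribute a nonzero weight, so two inputs that are identical except for group membership get different predictions.

```
# Minimal sketch with hypothetical, synthetic data: a classifier trained on
# biased labels learns to weight a group attribute it should ignore.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# "skill" is the genuinely predictive feature; "group" should be irrelevant.
skill = rng.normal(size=n)
group = rng.integers(0, 2, size=n)

# Biased historical labels: outcomes depend on skill, but past decisions were
# also skewed against group 1.
true_logits = 1.5 * skill - 1.0 * group
labels = (rng.random(n) < 1 / (1 + np.exp(-true_logits))).astype(float)

# Fit an ordinary logistic regression by full-batch gradient descent.
X = np.column_stack([np.ones(n), skill, group])   # columns: bias, skill, group
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - labels) / n

print("learned weights [bias, skill, group]:", w.round(2))

# Identical skill, different group: the predictions differ, because the model
# has absorbed the skew baked into its training labels.
same_skill = np.array([[1.0, 0.5, 0.0], [1.0, 0.5, 1.0]])
print("P(positive) for group 0 vs group 1:",
      (1 / (1 + np.exp(-same_skill @ w))).round(2))
```

The point of the sketch is that the model never needs an explicit rule about the group attribute; it simply fits whatever correlations its training feedback contains, which is also how feedback loops can reinforce the pattern over time.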
