
/cyber/ - Cyberpunk & Science Fiction

A board dedicated to all things cyberpunk (and all other futuristic science fiction)

“Your existence is a momentary lapse of reason.”

File: 0e4f908b11d6b82⋯.jpg (13.65 KB,281x179,281:179,index.jpg)

 No.51034

As mentioned in the cyber feel thread, how about we discuss how to properly use VR?

Right now, what we have is virtual monitors in a 3D environment: the most basic user interface, lifted straight from the concept we've been using for decades.

It doesn't do the possibilities justice, though. It still limits you to monitors, just more of them in a beach house, mountain cabin, or whatever environment you design.

Let's discuss how you could actually "dive in". Post a conventional program (file manager, text processor, web browser, IDE, accounting application, online banking, etc.) and how you could take advantage of VR for it. After all, it's all just representing 1s and 0s, right?

Multiple concepts for a single program and feedback welcome.


 No.51035

>>51034

>web browser

Websites are usually optimized for a particular resolution, but one approach would be to build a hierarchy web that surrounds you, sort of like a localized web crawler.

Difficulties would be security and excessive preloading, but we're talking concepts here.

>read a news website

>the frontpage is the web's root

>every headline/subject is also a node left and right from the front page with edges connecting them

>looking at a node and pushing a button will project it behind the frontpage and have you "follow" it

>links, once more, are projected left and right

>pushing another button will cause you to go up in the hierarchy (sort of like a "back"-button)

>multiple pages just mean multiple webs placed all around you

You'll always be limited by websites being designed for relatively small surfaces, but you can use the environment for a better UI instead of >muh beautiful landscape.
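Not anon's actual design, just a minimal sketch of how that hierarchy web could be modeled: each page is a node, links become child nodes fanned out left and right of it, and "back" walks up the tree. The class name, methods, and URLs below are my own assumptions for illustration.

# Sketch of the "hierarchy web" browsing idea: pages as nodes, links as
# children fanned out around the viewer, "back" walks up the hierarchy.
# All names here are made up for illustration.

class PageNode:
    def __init__(self, url, parent=None):
        self.url = url
        self.parent = parent
        self.children = []   # nodes projected left/right of this page

    def follow(self, url):
        """Look at a link and push the button: spawn a child node and dive in."""
        child = PageNode(url, parent=self)
        self.children.append(child)
        return child

    def back(self):
        """Go up one level in the hierarchy, like a back button."""
        return self.parent or self

    def layout(self):
        """Alternate children left/right of the current page for display."""
        placed = []
        for i, child in enumerate(self.children):
            side = "left" if i % 2 == 0 else "right"
            placed.append((side, child.url))
        return placed

# Usage: browse a news site as a web surrounding you (hypothetical URLs).
front = PageNode("https://news.example/")
article = front.follow("https://news.example/a1")   # dive into a headline
article.follow("https://news.example/related")      # its links fan out around you
print(front.layout())
current = article.back()                             # climb back toward the front page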


 No.51036

File: bd8a77e5a16b2fc⋯.jpg (255.25 KB,1200x800,3:2,luleas-rapid-deployment-da….jpg)

File: 97da69791b9db98⋯.jpg (95.91 KB,450x338,225:169,cyberspace.jpg)

File: a251729adfac5de⋯.png (15.46 KB,728x698,364:349,img.png)

After trying to make sense of the representation of cyberspace in "Hackers", I came up with something I call the "Machine Room Metaphor".

Consider the following:

Processes running in the system are represented by arbitrarily sized boxes.

The user can move and manipulate them as they like, but by default they are stacked on top of each other like rack-mounted servers in a data center.

Every box has a front panel and an I/O panel.

By default, the front panel displays the process's name, but the process can change it to display image data and receive events from user interaction (text input, virtual cursor, "touch", maybe something else). This can be used to emulate a traditional virtual monitor or just a collection of buttons, levers, and indicators.

The I/O panel hosts a series of connectors through which the user can attach running processes to each other via virtual cables.

"Portals"/"Doors" connect different "machine rooms" which can be on different servers in the network or concurently running on single computer, and cables can run through them (having the ability to carry processes around would be cool too).

You don't even need a VR helmet for this; it would somewhat work on a standard screen+mouse+keyboard combo with familiar shooter-style controls.
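A minimal data-model sketch of that Machine Room Metaphor, purely to make the pieces concrete. The class names, the connector model, and the way rooms link via doors are my own assumptions, not an existing API.

# Sketch of the "Machine Room Metaphor": processes are boxes stacked like
# rack-mounted servers; each box has a front panel and an I/O panel, and
# boxes are wired together with virtual cables. All names are illustrative.

from dataclasses import dataclass, field

@dataclass
class ProcessBox:
    name: str
    front_panel: str = ""                            # text/image data shown to the user
    connectors: dict = field(default_factory=dict)   # connector name -> cable

    def display(self, content):
        """Process redraws its front panel (emulating a virtual monitor)."""
        self.front_panel = content

@dataclass
class Cable:
    src: "ProcessBox"
    src_port: str
    dst: "ProcessBox"
    dst_port: str

@dataclass
class MachineRoom:
    host: str
    rack: list = field(default_factory=list)    # boxes stacked top to bottom
    doors: list = field(default_factory=list)   # portals to other rooms

    def mount(self, box):
        self.rack.append(box)

    def connect(self, a, a_port, b, b_port):
        """Run a virtual cable between two boxes' I/O panels."""
        cable = Cable(a, a_port, b, b_port)
        a.connectors[a_port] = cable
        b.connectors[b_port] = cable
        return cable

# Usage: wire a shell's stdout into a logger's stdin inside one room.
room = MachineRoom(host="localhost")
shell, logger = ProcessBox("sh"), ProcessBox("logger")
room.mount(shell)
room.mount(logger)
room.connect(shell, "stdout", logger, "stdin")
shell.display("$ _")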


 No.51041

If you think about it, we haven't really leveraged much beyond the usual keyboard/monitor. We still have just a better version of a typewriter with a printer attached: scrolling text. Hypertext has been the only real move forward in generations.

Having used VR desktop programs, I found they really just made it easier to lay out 2D screens, so I could have more surfaces to read on. You're right about that, but it's still just a single step up from printed paper.

I'm a dev, and all the devs and sysadmins still use the terminal all the time. Just little black windows. The only move forward from the '80s is that I can see more than one on my screen at a time.

So, yeah, VR could be leveraged somehow, but honestly I don't know if we ever figured out how to leverage 2D interfaces in any special way either.

It's all just been a way to show us a sheet of words. Until we can directly connect with the brain, I'm not sure we'll get much further than that.


 No.51046

>>51041

This. The problem is how the human brain evolved: there are two major senses that allow us to harvest information from our environment, sight and hearing. Every interface is made of text, which is a 1D graphical representation of words (sound), plus things you can see, e.g. boxes and images. The problem is that everything we see is 2D (with depth), and so are the interfaces we make and our screens.

Making special rooms and 3D interfaces in VR would be a cool thing, but it isn't going to improve anything, because it moves away from the brain. Just look at how computer peripherals have evolved: plugs -> keyboards -> mice -> touchscreens. This is leading us closer to our brains, and because of that I think the next revolution is brain-computer interfaces, so we can think directly at the machine while eliminating everything in between. The other option is changing our brains, so we can develop other senses that aren't so primitive. What we have now evolved just to let us survive. Sorry for the broken English, second-world anon here.


 No.51089

File: 3c6c07fb678668c⋯.jpg (96.56 KB,1125x750,3:2,imsai8080-1112-09-750.jpg)

>>51046

>plugs -> keyboards -> mice -> touchscreens. This is leading us closer to our brains

But, ironically, further away from the machine itself.

If you want to get an interface revolution going, rolling back to blinkenlights days and starting from scratch without accumulated layers of abstractions and assumptions might be a good idea.


 No.51095

>>51089

Maybe, but it's a risky gamble. That would probably alienate all the non-tech-savvy people out there without any benefit in the short term, for what could well turn out to be very insignificant breakthroughs.


 No.51152

>>51089

Agreed; as much time as I spend in terminal interfaces, I sometimes find myself wanting a live graph of output. I won't advocate for going back to the unix philosophy, but maybe it would make sense if programs had variable numbers of input and output streams that could be connected to different things, e.g. outstream1 is a 2D graph, outstream2 is a 3D graph, outstream3 is a text console, etc. Then some other userland application is responsible for figuring out how to display it all to the user.
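A rough sketch of that multi-stream idea under my own assumptions: a program declares named output streams tagged with a kind (2D graph, 3D graph, text), and a separate userland viewer decides how each tag is presented. None of this is an existing API; all names and tags are invented for illustration.

# Sketch: programs expose several tagged output streams; a separate userland
# viewer decides how to render each tag.

from collections import defaultdict

class Program:
    def __init__(self, name):
        self.name = name
        self.streams = defaultdict(list)   # stream name -> buffered records
        self.kinds = {}                    # stream name -> "graph2d" | "graph3d" | "text"

    def declare(self, stream, kind):
        self.kinds[stream] = kind

    def emit(self, stream, record):
        self.streams[stream].append(record)

def render(program):
    """Userland viewer: picks a display strategy per stream kind."""
    for stream, kind in program.kinds.items():
        data = program.streams[stream]
        if kind == "text":
            print(f"[{program.name}/{stream}] " + " ".join(map(str, data)))
        elif kind == "graph2d":
            print(f"[{program.name}/{stream}] would plot {len(data)} points on a 2D graph")
        elif kind == "graph3d":
            print(f"[{program.name}/{stream}] would plot {len(data)} points in 3D space")

# Usage: a monitoring tool with one text console and one live graph.
mon = Program("netmon")
mon.declare("console", "text")
mon.declare("throughput", "graph2d")
mon.emit("console", "link up")
mon.emit("throughput", (0, 12.5))
mon.emit("throughput", (1, 14.1))
render(mon)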

I also think 3D environments that you walk around in and have to wave your arms to interact with are overrated. There's a reason the Wii didn't stick around. Just let me sit in my chair, use the qwerty keyboard I'm used to, and display stuff on top of my vision, AR style.

>>51035

Imagine what websites might look like if they weren't constrained to 2D. At the very least I want to shitpost on a 3D-model board, with animated lolis looking smugly at me.


 No.51177

>>51095

Remember tiling WMs; they work just fine and nothing is wrong with them.

>>51089

Mice are as far from the machine as touchscreens, both being pointing devices.

The best thing we can possibly get with our shitty brains is https://en.wikipedia.org/wiki/Brain-computer_interface , especially combined with AR. Though doing all your computing when your brain itself is the computer would still be infinitely better.


 No.51179

File: b7b595416c0de2b⋯.jpg (432.46 KB,1605x1078,1605:1078,CUTER_7FFF.jpg)

>>51177

>Mice are as far from the machine as touchscreens, both being pointing devices.

I could argue about driver complexity, but eh, I guess those trends aren't directly correlated.

It's just that we went from almost directly tapping into data and address buses to working through so many layers of abstraction that no sane individual can keep track of them, and I don't feel like this is a good trend.


 No.51185

>>51179

True, though this mess is more about proprietary shit and corp/govt fuckery with compatibility (nvidia and linux, for example). You cannot do much with keyboards since they are standardized, and only bluetooth ones have some room for the kind of mess required for backdoors and other security problems. Touchscreens are so terrible because of the systems they are implemented in; remember pocket PCs? They were not that bad even with touchscreens, while today's phones have more in common with consoles than with a PC.


 No.51190

File: 7b6b0e34ca6552b⋯.jpeg (351.94 KB,2048x1536,4:3,01.jpeg)

Obviously, the CityScape meme is a common paradigm for navigating around. Also, I think a sort of clunky, old interface for indexed data (re the fax machine with the hidden message) is a tried and true approach to the problem.


 No.51191

File: 034f8c4112ce486⋯.jpeg (90.39 KB,826x466,413:233,02.jpeg)

File: e97bcdfdcbad321⋯.jpeg (1.21 MB,1920x1040,24:13,03.jpeg)

File: e5a346c36dad355⋯.jpeg (52.57 KB,640x480,4:3,04.jpeg)

File: 55eb0134f049e8d⋯.gif (2.78 MB,640x357,640:357,05.gif)


 No.51195

>>51191

The physical world is the worst possible metaphor for the internet I could imagine. If I want to go to a site to get something, I have to fly there? Ugh …

I remember Second Life having a Dell shop where you could walk around and build your computer out of virtual components. What a fucking terrible interface.

We do things with 'pages' and 'hypertext' because they work better than reality. We can search for something, click on it, and be there immediately without any sense of travel.

For getting work done, that's already so much better than any virtual world I'd have to inhabit.


 No.51214

If you want to enable usable VR headsets and UIs on a very low level, you need to rethink the entire paradigm. Think VMs and containerization.

You should be spooling up a VM with your GPU on passthrough that does nothing but show you the display and pass data back and forth for a secondary VM, which is where all your actual work takes place. And much of that work should actually take place in containers, which would be really fucking nifty to control in VR and would map very intuitively onto the cyber-tower-type structures shown in >>51036
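Not a real passthrough setup, just a small sketch of the container half of that idea: ask the Docker CLI (assuming it is installed and the daemon is reachable) which containers are running, and stack them like the rack-mounted boxes from >>51036 so a VR front end could draw them. The layout numbers and function names are arbitrary assumptions.

# Sketch: list running containers via the Docker CLI and stack them as boxes
# a VR front end could draw, echoing the machine-room idea from >>51036.
# Assumes the `docker` CLI is installed and the daemon is reachable.

import subprocess

def running_containers():
    out = subprocess.run(
        ["docker", "ps", "--format", "{{.ID}}\t{{.Names}}\t{{.Image}}"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.split("\t") for line in out.splitlines() if line]

def stack_layout(containers, box_height=0.5):
    """Place each container as a box in a vertical tower (x, y, z in metres)."""
    tower = []
    for i, (cid, name, image) in enumerate(containers):
        tower.append({"id": cid, "name": name, "image": image,
                      "position": (0.0, i * box_height, 0.0)})
    return tower

if __name__ == "__main__":
    for box in stack_layout(running_containers()):
        print(box)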


 No.51217

File: a8eaad6c55733b0⋯.webm (373.3 KB,1280x720,16:9,glt camera.webm)

File: 17239cbec187bb0⋯.png (109.98 KB,1916x959,1916:959,hamstakilla.png)

I feel like something similar to the webm related would be a good way to display files in 3D space.

There's also this imageboard, http://hamstakilla.com/ , which has an option to browse in 3D.



