Hey, this is really cool, I'll have to try this at lunch. How does the game emulation work inside UE4?
Who are you guys though? A studio, one guy? There's like 0 info on any website or social media channel.
We tried this at the office recently and it is by far one of, if not the, coolest things I've experienced in VR. They absolutely nailed the atmosphere in this as well.
I'd love more of this in the future, like a lobby you can join with friends in. The future's bright for VR.
Xoliul: Thanks! The emulation is done using an open source Game Boy emulator called Gearboy, which is embedded into Unreal and linked to the blueprint system for inputs. We're a new studio.
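The "linked to the blueprint system for inputs" part can be pictured as a thin bridge: engine-side input events set button state that the emulator core polls each frame. A minimal sketch of that idea (the button names and bridge class are illustrative, not Gearboy's actual API):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical Game Boy button set (names are illustrative, not Gearboy's API).
enum class GBButton : uint8_t { Up, Down, Left, Right, A, B, Start, Select };

// Minimal input bridge: the engine side (e.g. a Blueprint-exposed function)
// calls press()/release() on input events, and the emulator core polls the
// packed joypad state once per emulated frame.
class InputBridge {
public:
    void press(GBButton b)   { state_ |= bit(b); }
    void release(GBButton b) { state_ &= ~bit(b); }
    uint8_t joypadState() const { return state_; }  // one bit per button
private:
    static uint8_t bit(GBButton b) { return uint8_t(1u << uint8_t(b)); }
    uint8_t state_ = 0;
};
```

In UE4 this bridge would typically live in an actor component, with `press`/`release` exposed as `UFUNCTION(BlueprintCallable)` nodes; the single-byte state matches how the original hardware's joypad register packs its eight buttons.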
Mask_Salesman: Glad you enjoyed it! Regarding the multiplayer aspect, we agree with you! So far the feedback has been helpful and encouraging for possible future plans.
This is pretty rad, even without VR (just played the build on a normal PC). Personally, I'm more excited about the ability to easily link projects together within UE than about the VR aspect. Embedding the emulator and streaming radio got my brain thinking: I can imagine games where televisions, radios, etc. link to live feeds, or where people could get in-game flavor text scraped from an active Twitter feed, or where real-life stock prices could be used as seeds for random content. Neat-o.
One potential glitch: I never heard any sound coming through the games?
Also, I'm a bit curious about the process: are you guys just capturing the emulator's screen and mapping it to a texture on the models 30 times a second, or is there a more graceful way of doing it?
blankslatejoe: The sound is not emulated due to the time constraints of our development, combined with the significant optimisation that comes from not emulating the audio.
The emulator handles all the processing internally and outputs raw pixel data; we then simply map and update a texture with those new pixel values and draw it like any other texture in the engine. The games are updated internally at 59.75 fps, like the original GB games, and this is also when the output is mapped to the textures. I believe this is the most efficient way of achieving this, rather than direct window capturing.
How is that process different than capturing video? (Doesn't the emulator have to capture video to be converted to a texture?)
I'll take a guess and say that direct window capturing would need the image processing to happen twice: once to create the window and encode the captured images as video, then again to move that into the engine, where the engine would have to decode the video capture for display. This method skips the initial encoding and sends the raw data straight to the engine, which then creates the image itself rather than recreating it from what the emulator renders.
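One more piece implied by the update rate mentioned above: the emulator steps at ~59.75 Hz while the engine renders at whatever rate the headset demands, so the two are usually decoupled with a fixed-timestep accumulator. A generic sketch of that pattern (not the studio's actual code):

```cpp
#include <cassert>

// Decouple the emulated Game Boy's ~59.75 Hz refresh from the engine's render
// rate: each engine tick adds its elapsed time, and the emulator runs one
// frame for every 1/59.75 s accumulated.
struct EmulatorClock {
    static constexpr double kStep = 1.0 / 59.75;  // seconds per emulated frame
    double accumulator = 0.0;

    // Returns how many emulator frames to run for this engine tick.
    int tick(double deltaSeconds) {
        accumulator += deltaSeconds;
        int frames = 0;
        while (accumulator >= kStep) { accumulator -= kStep; ++frames; }
        return frames;
    }
};
```

At a 120 Hz engine tick, this runs zero or one emulator frame per tick (occasionally catching up), which keeps the games running at their original speed regardless of VR frame rate.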