Everything posted by johnparker007

  1. You got it man. Sounds bodgy, but it's how all multiplayer games work... when playing, everything should appear perfectly responsive with zero lag to each individual player - that's the plan anyway.
  2. The player actually playing the machine (and walking around) won't have any delay, as they are playing machines/walking around locally (just like in some of the earlier demo videos on this thread). It's a bit mad to get your head around... but it's generally how multiplayer works on most games - you yourself experience zero delay on your own character/actions, though there's a slight delay on all the remote characters/actions on your screen. You don't normally get to see local and remote side by side like in the tech demo, so you wouldn't normally be able to notice the slight delay on remote characters. You just assume they moved whenever they moved on your screen, even though everyone else is slightly ahead. I think I've done a terrible job of explaining that lol. So - in the video above, on the left side, the cube was moving for me at exactly the same time I was changing its x/z position. But on the right side, it was happening a bit later. Normally, if I was the player on the left side, I'd only be able to see my own screen, so my cube (or avatar, or reels etc.) would move at the correct time with no delay (for me). Likewise, the player on the right side can only see their screen, so they see me moving around, but they have no way to tell that I'm actually moving on their screen with a small delay. So it all generally works and feels like there is no delay, even though there's a little bit of an illusion going on. Definitely looking at text chat... voice chat will depend on how much server bandwidth it chews through.
  3. First test of getting some data across the internet. In this vid, the left and right are two separate instances of the arcade. Both are on my PC, but they could just as easily be on different PCs anywhere (using Amsterdam for the server for now, as we're mainly UK/EU based). You'll notice the block in the right instance moves with some slight delay and smoothing as I move it 'live' in the left instance.
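A minimal sketch of the kind of remote smoothing shown in that video, assuming a Unity component sits on each remotely-controlled object; the class, field, and method names are illustrative only, not the project's actual networking code:

```csharp
using UnityEngine;

// Hypothetical sketch: eases a remotely-controlled object toward the latest
// position received from the server, so small network delays appear as gentle
// motion rather than snapping. Field names are illustrative, not project code.
public class RemoteCubeSmoother : MonoBehaviour
{
    [SerializeField] private float smoothingSpeed = 10f; // higher = snappier

    private Vector3 _targetPosition;

    private void Awake()
    {
        _targetPosition = transform.position;
    }

    // Called by whatever network layer receives the remote player's x/z update.
    public void OnNetworkPositionReceived(float x, float z)
    {
        _targetPosition = new Vector3(x, transform.position.y, z);
    }

    private void Update()
    {
        // Exponential smoothing toward the last known remote position.
        transform.position = Vector3.Lerp(
            transform.position,
            _targetPosition,
            1f - Mathf.Exp(-smoothingSpeed * Time.deltaTime));
    }
}
```

Smoothing like this hides small, variable network delays on the remote copy, while the locally-controlled copy stays perfectly responsive, as described in the posts above.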
  4. Aw thanks mate. It's cool though, I've got something underway this afternoon - I've set up Perforce to run locally to manage the code/assets, then I'm backing that up to BT Cloud (I get 1TB online free with them)... it's all chugging away at the mo, but I think that, along with occasionally making a copy of my work folder to an external hard drive, should do the trick and ensure the project is safely backed up as it continues to grow.
  5. I was thinking that for each machine, there would be a pre-recorded gameplay clip (audio and lamps/reels etc.). Then if that machine is in use, its audio sample would play on the client. Taking into account where the client avatar is standing would then set how loud each sample plays. So hopefully it would sound authentic once there are, say, 10-15 machines being played around the arcade - you would only hear the sounds of Bar X being played off to your left if there actually is a Bar X being played off to your left. This approach could also support attract mode audio. As you walk closer to the Bar X being played, that sample would get louder (and machines being played that you are walking away from would get quieter). If you got close enough to trigger 'observer mode', then it would start streaming the live lamps/reels, and potentially audio (I am not sure how doable streaming the live audio will be just yet), from the PC of the player actually playing the game locally in MFME. There could be bots walking around playing too, though whether they'd really be playing in a streamable way (autoplay style where they just press random buttons to run through credits) will require more thought, as that's going to cost significant CPU time 'somewhere' in the network... just walking around simulating playing without actually running MFME instances will be fine though. When a bot is playing a machine, it would then run a pre-recorded gameplay loop, just like how the idle loops work. In the short term, I have to sort out a better system for backing up this project - it's basically outgrown what GitHub is for (mainly code), and is getting up to commercial game scale now! So I'd best get that sorted, as my project work folder is currently >6GB, and it's only going to keep growing, especially when I add all the other converted machines! I expect the end user install when all this is done will be something like 1GB.
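A small sketch of the distance-based sample volume idea described above, assuming each machine carries a looping AudioSource with its pre-recorded clip. All names are illustrative, and Unity's built-in 3D audio rolloff could achieve a similar effect:

```csharp
using UnityEngine;

// Illustrative sketch only: fades each machine's pre-recorded gameplay/attract
// sample in and out based on how far the local avatar is standing from it.
// Names are assumptions, not project code.
public class MachineAudioAttenuation : MonoBehaviour
{
    [SerializeField] private Transform localAvatar;    // the player walking the arcade
    [SerializeField] private AudioSource sampleSource; // looping gameplay/attract clip
    [SerializeField] private float audibleRadius = 15f; // metres at which the sample is silent

    private void Update()
    {
        float distance = Vector3.Distance(localAvatar.position, transform.position);

        // Linear fade from full volume at the machine to silent at audibleRadius.
        sampleSource.volume = Mathf.Clamp01(1f - distance / audibleRadius);
    }
}
```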
  6. I've been doing some research into customisable avatars, and settled on a system that I think I will use. It's more complex to implement, but I believe it gives the best results. It would've really bugged me if the machines and environment looked realistic but the characters looked cartoonish - I think it would've been a real immersion-breaker! Anyway, here's me playing with the sample scene of the system (it shows how the end user can customise their avatar at runtime). I popped some attract mode Happy Hours in the background, along with our beloved arcade carpet, to give it a bit of the arcade feel.
  7. Thanks for the heads up, I may hit him up depending on how well my foray into 3DS Max goes. I've currently discounted running the emulators on the server itself, as it won't scale - if 100 people were online playing a machine each, I'd need the computing power to run 100 emulator instances... they would also experience a delay when interacting with the machine (their button press has to get to the server and take place on the server's emulator, and then the client is also on a delay when rendering the machine via remotely streamed data). So the plan is to have the end user running a single emulator instance locally (installed as part of the arcade software) whenever they are playing a machine. Then, if people walk up to watch the gameplay, the player initially streams the codec to, say, the first 3 watchers, who in turn stream to up to 3 watchers each, etc. There'll be some small increasing delay for the observers as you move down the chain, but it's a scalable approach, so 1,000 people could all crowd around watching one machine being played and the server wouldn't be overloaded.
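A rough sketch of the fan-out relay idea from that post (purely illustrative names, not project code): each new watcher is attached to the shallowest node with a free slot in a 3-wide tree, so the player only ever uploads to three peers and every extra level of watchers adds one more hop of delay.

```csharp
using System.Collections.Generic;

// Sketch of a 3-ary relay tree: the player streams to up to 3 watchers, each
// of whom relays to up to 3 more, so adding viewers never adds server load.
public class RelayNode
{
    public string PlayerId;
    public List<RelayNode> Children = new List<RelayNode>();
}

public static class RelayTree
{
    private const int MaxChildren = 3;

    // Attach a new watcher to the shallowest node that still has a free slot
    // (breadth-first), keeping the observer delay chain as short as possible.
    public static RelayNode Attach(RelayNode root, string watcherId)
    {
        var queue = new Queue<RelayNode>();
        queue.Enqueue(root);

        while (queue.Count > 0)
        {
            RelayNode node = queue.Dequeue();
            if (node.Children.Count < MaxChildren)
            {
                var watcher = new RelayNode { PlayerId = watcherId };
                node.Children.Add(watcher);
                return watcher; // this node now relays the stream to the new watcher
            }
            foreach (RelayNode child in node.Children)
                queue.Enqueue(child);
        }
        return null; // not reached: a finite tree always has a leaf with free slots
    }
}
```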
  8. Appreciated. My general (somewhat uninformed) strategy for the cab models is to have, say, a straight-on and a side profile photo of the machine to be modelled, e.g.: ...and then with a few rough known dimensions (e.g. cabinet width is 82cm, cabinet height at rear is 196cm)... I should be able to model the cab to accurate scale in 3DS Max. I'm just going to keep using the single cab model I have for now to keep things rolling, but I also have a talented 3D artist friend in mind who might be up for doing some work on this project. He'll still need a few reference pics per cab, though from what I can tell there are far fewer cab variants than actual machine variants. For instance, this JPM grey cabinet seems to have been used for a lot of different machines - great news for us, as then I only need a single JPM cabinet model and can use it for all the JPM machines. I'm pretty sure the same is true for other manufacturers, as it's more cost-effective to have 'standard' cabinets that they then put their various different games into.
  9. Thanks for the info. Currently I only have this one model, which I bought to save time so I could get started on the 3d side. My plan is to get some basic 3DS Max skills, then I reckon I'll be able to tweak/make new cab models as needed, as a lot of them are quite simple really; just chrome/painted metal tubing and black MDF panels - it should be something I can then knock up from photos, so the cabs are roughly authentic. Simple customisable humanoid avatars are what I had in mind - I'll probably just buy a prebuilt avatar creation/rendering system to save dev time. I've not researched that area yet, but there should hopefully be something out there I can 'drop in' without a ton of work. This is a good point - if/when this all gets to that stage, you guys will have to get a list together of, say, 100 machines (and the relevant roms) that we want in the first test arcade, that are not vulnerable to emptiers, for the tournaments. And yeah, I see no problem with having a side room full of unchipped machines - I'll just have a flag on these broken machines that marks them as non-eligible for tournament play (so they wouldn't apply profit/loss to your bankroll).
  10. OK, sounds like I need to convert Rat Race and get a 3d model of a mobility scooter! Yep, walking around is very much the plan, to get that first person perspective going for the immersion. And yes, I think the cat's out of the bag that my plan is to make this into a live arcade... so I'm thinking to make it so you can walk around and watch other people playing games on their machines (plus potential text/voice chat), as well as playing on machines yourself. When you choose to play a free machine, MFME will be launched in the background to run that machine. When you have finished playing on the machine, your server-based bankroll balance will update accordingly based on cash in/out. That will then allow for daily/weekly/monthly tournaments etc., which should be a lot of fun. So as a part of that new work, I've taken a break from the Converter today, and started very basic work on the 'codec'. This will allow streaming of the machines' lamps/reels/VFDs across the net, as I should be able to write some efficient lossless compression. Currently I've just implemented the base recorder/player system, along with the recording/playing of 'simple' (on/off - no fade levels yet) lamps. This will be used for all the free machines to play their attract mode loops without needing to have an MFME instance running for each machine (as that would get impractical very quickly). Here's a demo of 44 machines running their attract mode lamps (at random playback start points to provide variation) via this new system (so I am running zero MFME instances here). These are not perfect loops, I just recorded a few minutes; later I should be able to write something to auto-identify the loops when recording the attract modes. Anyway, enough blurb from me - onto the eye candy (it may be best to watch this directly on YouTube rather than embedded in the forum, then you'll be able to see it in full 1080p HD, as some of the lamps are now tiny!):
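A minimal sketch of what a recorder for 'simple' on/off lamps could look like, storing only timestamped changes rather than every lamp every frame; the types and names here are assumptions for illustration, not the actual codec:

```csharp
using System.Collections.Generic;

// Rough sketch of the recorder idea (simple on/off lamps only). Instead of
// storing every lamp every frame, only lamps that changed are recorded against
// a timestamp - already a big lossless saving for attract loops where most
// lamps are static most of the time. All names here are illustrative.
public struct LampChange
{
    public float Time;    // seconds from the start of the recording
    public int LampIndex; // which lamp in the layout
    public bool On;       // new state
}

public class LampRecorder
{
    private readonly bool[] _lastStates;
    public readonly List<LampChange> Changes = new List<LampChange>();

    public LampRecorder(int lampCount)
    {
        _lastStates = new bool[lampCount];
    }

    // Call once per sampled frame with the current lamp states from the emulator.
    public void Sample(float time, bool[] currentStates)
    {
        for (int i = 0; i < currentStates.Length; i++)
        {
            if (currentStates[i] != _lastStates[i])
            {
                Changes.Add(new LampChange { Time = time, LampIndex = i, On = currentStates[i] });
                _lastStates[i] = currentStates[i];
            }
        }
    }
}
```

Playback would then just apply each stored change at its timestamp and loop back to the start for attract modes, with no emulator instance running.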
  11. Thanks mate. You can move around in this environment; I'm generally using a static camera for these videos as it's just very quick for me to set up. Once I get that side of it more set up, I'll do some walking around videos. Thanks man - and the offer of the donation is very kind... if I get it all properly up and running, I'll put any donations towards server costs for hosting the networked arcade (there'll probably also be an offline mode for those that don't want to play in the live arcade).
  12. No real progress in this vid, just having some fun with the system so far, as I'm on a beer mission after too much work recently. This is a bunch of machines (Happy Hour clones) running their lamps from a single MFME instance:
  13. Hehe indeed - it's just the only one I had to hand that I generated with the latest converter code, so I made some quick copies with a much-needed beer in hand... but hopefully more different machines will follow once I have the last few initial stage (lamp/reel) conversion issues nailed. Good question. This should all be automatic (provided the 'empty' 3d cab model has roughly the correct glass aspect ratio(s) of the original)... For each new machine, I generate the glass panels from the 2d MFME layout as a perfect 1m x 1m flat square, regardless of their original aspect ratio (and size): Then, for each new cab model, the top and bottom generated 1m square glass panels are scaled to the 3d cabinet model's top/bottom glass rectangles: ...so in a nutshell, the plan is that the generated 1m square panels should auto-scale back to their correct aspect ratio once placed in an appropriate aspect 'empty' cab model panel (like these ones do now, as I'm choosing source machines that fit the single cab model I have so far). Later I'm planning to refresh my (very basic) 3ds Max skills and knock up some more cabinets, though if it gets too time-consuming for me to do well, as I only have extremely novice 3d art skills, I have a games industry artist friend in mind I can call on who would hopefully be interested in chipping in with some bespoke tweakable cabinet models once the project starts to shape up. He also has a love for the late 80s/early 90s retro arcades, so I think I could lure him in... P.S. Ignore the incorrectly scaled reels in the example pics above, I'm still working on reels/lamps to implement the proper one-size-fits-all perspective pass.
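A short sketch of that auto-scaling step, under the assumption that each generated glass panel really is a 1m x 1m quad; the method and parameter names are made up for illustration, not the converter's code:

```csharp
using UnityEngine;

// Minimal sketch of the auto-fit step: the converter emits each glass panel as
// a 1m x 1m quad, so placing it just means scaling that unit quad to the
// width/height of the target glass rectangle on the cabinet model.
public static class GlassPanelFitter
{
    public static void FitToCabinetPanel(Transform unitGlassQuad, Transform cabinetPanelAnchor,
                                         float panelWidthMetres, float panelHeightMetres)
    {
        // Parent to the cabinet's panel anchor so position/rotation follow the cab.
        unitGlassQuad.SetParent(cabinetPanelAnchor, worldPositionStays: false);
        unitGlassQuad.localPosition = Vector3.zero;
        unitGlassQuad.localRotation = Quaternion.identity;

        // Because the source quad is exactly 1m x 1m, the local scale *is* the
        // target size, which restores the original aspect ratio automatically
        // whenever the cab's glass rectangle has the right proportions.
        unitGlassQuad.localScale = new Vector3(panelWidthMetres, panelHeightMetres, 1f);
    }
}
```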
  14. Been working on the reels, but auto-placement/scaling needs me to fix the complicated maths stuff with the lamps - and it's been a long week, so I'm having a beer tonight instead, and a play with the 'arcade environment' idea.
  15. After a bit of headscratching, managed to get the transparent reel windows implemented Couple of very minor issues to sort out later with them, but they're good enough to move onto the actual reels next.
  16. Couldn't handle the purple blocks! ...so I have implemented reading Checkboxes from the MFME layout (got away with not doing that until now), and then reading the Transparent value per lamp from the checkbox, along with implementing the alpha support itself. No more purple blocks - now the alpha lamps render with their transparency.
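One possible shape for that per-lamp transparency handling, assuming two material templates (one opaque, one alpha-blended); this is only an illustrative guess at the approach, not the converter's actual code:

```csharp
using UnityEngine;

// Illustrative sketch: lamps whose layout checkbox marks them Transparent get
// a material that respects the texture's alpha channel, so they blend over the
// glass instead of rendering as solid blocks. Material/template names are
// assumptions, not the project's actual assets.
public static class LampMaterialAssigner
{
    public static void Assign(Renderer lampRenderer, Texture2D lampTexture,
                              bool isTransparent, Material opaqueTemplate, Material alphaTemplate)
    {
        // Pick the template based on the Transparent flag read from the layout.
        Material mat = new Material(isTransparent ? alphaTemplate : opaqueTemplate);
        mat.mainTexture = lampTexture;
        lampRenderer.material = mat;
    }
}
```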
  17. Needed to move to a new layout with more detailed reel window overlays (as I need to preserve these to then make the reel windows transparent; the Monopoly Millionaire ones are just plain squares, so no good to test with): Reel overlays from Happy Hour: After fixing some bugs in the layout extractor stage of the converter, we have the new machine running its lamps in 3d (bright pink areas are where I have to implement alpha mask support for the lamps that use it).
  18. Yep seems to play everything I throw at it!
  19. I was meant to be taking a break this week to focus on the day job but got hooked in again. Got a good start on the extraction of lamp textures from the flat lit background quads and the creation of relatively positioned new lamps on the quad panels. There's still quite a bit of work; you'll notice some lamps are visibly slightly incomplete, usually in their bottom-right - there's some grotty maths and whatnot to sort that out, which may take a while. Thought this was worth a quick vid update though, albeit in its somewhat unfinished state.
  20. Video screens should be fine - I had Pacman running before, so there don't seem to be any issues doing 60FPS emulator raster output in general.
  21. Work progresses slowly but steadily on the 'Converter' to generate 3d machines from 2d layouts... It now supports multiple perspective panels on the source 2d layout: ...and after some maths pain, I am finally able to auto-extract 'flat' textures from the perspective panels: Here are the generated backgrounds in a cabinet model, as the base for the lamps to go on next: The 3d buttons on the bottom are wrong (these are just placeholders), but there is a plan for those, along with the other buttons dotted around the glass panels. The note acceptor and coin slot housing in the top-right of the machine are also 3d. Next stage, I'm hoping to get the lamps working.
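For anyone curious about the 'maths pain', here is a self-contained sketch of flattening a perspective panel, assuming the panel's four corners have been picked out in normalised texture coordinates. The projective (homography) mapping is a standard unit-square-to-quad formulation; the class and method names are illustrative, not the converter's code:

```csharp
using UnityEngine;

// Sketch of flattening a perspective panel, in the spirit of a Photoshop/GIMP
// perspective crop. quad corners are normalised source UVs in the order
// bottom-left, bottom-right, top-right, top-left. A projective mapping from
// the unit square to that quad resamples a flat, rectangular texture.
// Note: 'source' must be a readable texture (Read/Write enabled on import).
public static class PanelFlattener
{
    public static Texture2D Flatten(Texture2D source, Vector2[] quad, int outWidth, int outHeight)
    {
        // Unit-square -> quad projective coefficients (Heckbert-style).
        float sx = quad[0].x - quad[1].x + quad[2].x - quad[3].x;
        float sy = quad[0].y - quad[1].y + quad[2].y - quad[3].y;
        float a, b, c, d, e, f, g, h;
        if (Mathf.Approximately(sx, 0f) && Mathf.Approximately(sy, 0f))
        {
            // Affine case (the quad is a parallelogram).
            a = quad[1].x - quad[0].x; b = quad[2].x - quad[1].x; c = quad[0].x;
            d = quad[1].y - quad[0].y; e = quad[2].y - quad[1].y; f = quad[0].y;
            g = 0f; h = 0f;
        }
        else
        {
            float dx1 = quad[1].x - quad[2].x, dx2 = quad[3].x - quad[2].x;
            float dy1 = quad[1].y - quad[2].y, dy2 = quad[3].y - quad[2].y;
            float denom = dx1 * dy2 - dy1 * dx2;
            g = (sx * dy2 - sy * dx2) / denom;
            h = (dx1 * sy - dy1 * sx) / denom;
            a = quad[1].x - quad[0].x + g * quad[1].x;
            b = quad[3].x - quad[0].x + h * quad[3].x;
            c = quad[0].x;
            d = quad[1].y - quad[0].y + g * quad[1].y;
            e = quad[3].y - quad[0].y + h * quad[3].y;
            f = quad[0].y;
        }

        var result = new Texture2D(outWidth, outHeight, TextureFormat.RGBA32, false);
        for (int py = 0; py < outHeight; py++)
        {
            float v = (py + 0.5f) / outHeight;
            for (int px = 0; px < outWidth; px++)
            {
                float u = (px + 0.5f) / outWidth;
                float w = g * u + h * v + 1f;
                float srcU = (a * u + b * v + c) / w;
                float srcV = (d * u + e * f == 0 ? d * u + e * v + f : d * u + e * v + f) / w; // straight-line perspective sample
                result.SetPixel(px, py, source.GetPixelBilinear(srcU, (d * u + e * v + f) / w));
            }
        }
        result.Apply();
        return result;
    }
}
```

This runs on the CPU, so it suits an offline converter step; a shader or RenderTexture blit would be the faster route for very large images.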
  22. [Attached image: 20201030_100226.jpg] Nice high res image!
  23. No eye candy this time, just a tech progress update. I'm having to redo how lamp image extraction will work for the AI upscaling stage of the converter, as it adds AI upscaling 'noise' around the edges if scaling lamp images independently of the background, which is noticeable when a lamp flashes on/off (and it's just not perfect even when lamps are steadily lit). To address this, I now auto-generate a background image for each of the lamp sets (Off, On1, On2, On3 etc...) like this: So when those images (or at least the perspective quads) are upscaled 8x with the AI, I can extract the lamps without having pixel noise around the edges. Currently I'm looking into the 'perspective fixing' (much like what's done in Photoshop/GIMP perspective tools), but implementing it myself so it can be automatic once the quad is defined. Here's the very early WIP editable quad tool in place for the bottom glass, in the Unity Editor: The next step is auto-converting those defined quads (usually two for a machine; one for the top glass, one for the bottom glass) into 'flat' images. Then from the flat images, I can extract the background and lamps with the original alpha masking (not yet implemented) and also the additional perspective masking, so everything will be at the correct 'flat' aspect ratio, ready for 3d rendering. This process (AI upscale, then perspective correct both backgrounds and lamps, then fix positions and scales) was how I (very slowly and completely manually) made the 2p Cash Nudger 3d test layout, so hopefully it'll all eventually come together and look alright. Edit: Just done a test with the lampsOn1 background, should all look good and noise free. Original: 8x AI Upscaled:
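A tiny sketch of the extraction step those per-state backgrounds enable, assuming each lamp's rectangle is known in the upscaled, flattened image; the names here are illustrative only:

```csharp
using UnityEngine;

// Illustrative sketch: because each lamp state (Off, On1, On2...) now has its
// own full background image, a lamp's texture can simply be cropped out of the
// matching upscaled, flattened background, so the AI noise around its edges is
// identical to the surrounding glass and nothing pops when the lamp flashes.
public static class LampCropper
{
    // lampRect is in pixels within the upscaled flat background for one lamp state.
    public static Texture2D CropLamp(Texture2D flatBackgroundForState, RectInt lampRect)
    {
        Color[] pixels = flatBackgroundForState.GetPixels(
            lampRect.x, lampRect.y, lampRect.width, lampRect.height);

        var lampTexture = new Texture2D(lampRect.width, lampRect.height,
                                        TextureFormat.RGBA32, false);
        lampTexture.SetPixels(pixels);
        lampTexture.Apply();
        return lampTexture;
    }
}
```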
  24. Thanks @Amusements. I'm just looking into it a little for the moment, but I may take you up on one of your flat PSDs at some point to try some tests with - thanks very much for the offer. (The AI upscaling technique may not work well on some lower res layouts on the bottom glass panel, since it's already heavily vertically 'squashed' for the pseudo-3d perspective of the source MFME layout image.) I'm thinking that theoretically, once I've 'auto-flattened' an existing MFME layout so it can be used for the 3d panels, I could somehow write something to take a high res flat scan (or flat redraw) and auto-map that into place (and potentially auto-lamp it) - giving a much higher fidelity source texture on the existing 3d layout. I've still got a couple of big chunks of work left to do on the converter to auto-flatten and AI upscale the source layouts, but then investigating this idea further might be a cool side-quest to further enhance the visuals. In the 3d engine, whilst the glass panels can be viewed from any angle: ...they are stored internally as perfectly flat:
  25. Ah cool, good to know it's a standard technique. Yeah, it's a tricky one with the lighting; I tried equalising then tweaking brightness/contrast down, which seems to level it out a bit ready for an 'unlit' glass image... but getting the different stitched-together parts to have even brightness definitely adds more complexity. It's an interesting problem, trying to get back to an evenly lit 1:1 ratio complete 2d scan from a bunch of unevenly lit 3d frames.