Starting from scratch
This is a follow-up to my previous post, where I created a 3D multiplayer game in Godot. I ran into some issues with server authority over physics and decided I needed to restart the project from scratch. Unlike before, this is more of a "where we're at" post as opposed to a "how we got here".
Current Tech Stack
SQL, MongoDB, Node.js, Godot, JavaScript, Python, AWS
Networking Setup
This server was modeled for use with AWS. I am creating instances within VPCs for each server component. Currently, each server is set up manually, though I do have plans to use Terraform and Ansible to automate scaling entirely. (I just feel like that isn't a necessity until I at least start playtesting.)
We essentially have 3 main components: the auth server (authenticates users against the users database), server collections (groups of gameservers that together represent a single world, with IDs such as us-west-1, etc.), and the server manager (which manages the auth servers and server collections).
Server Manager
The server manager accepts connections only from whitelisted IPs (auth servers and collections) on a WebSocket server (over TLS); all other inbound traffic is blocked at the network level via security groups. The server manages the state of all gameserver collections and auth servers, and acts as a medium between the two. It has direct access to a non-relational database for players and manages communication between the gameservers and the DB. Nginx terminates inbound TLS connections and routes them internally to the WebSocket server.
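As a rough sketch of that whitelist gate (the addresses and function names here are my own assumptions, and I've kept it as a plain predicate rather than tying it to any particular WebSocket library):

```javascript
// Hypothetical sketch: reject any inbound socket whose IP is not on the
// whitelist of known auth servers and collections. These addresses are
// made up; in practice they'd come from configuration.
const WHITELIST = new Set(['10.0.1.10', '10.0.2.20']);

// Normalize IPv4-mapped IPv6 addresses (e.g. "::ffff:10.0.1.10"),
// since Node may report the peer address in that form.
function normalizeIp(ip) {
  return ip.startsWith('::ffff:') ? ip.slice(7) : ip;
}

function isAllowed(remoteIp, whitelist = WHITELIST) {
  return whitelist.has(normalizeIp(remoteIp));
}

// In the real server this predicate would gate the WebSocket upgrade,
// destroying the socket when the remote address fails the check.
```

This sits behind the security groups as a second, application-level check; the security groups remain the primary barrier.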
Auth Server
The auth server serves a basic website and connects to the server manager's WebSocket. There are a few endpoints, /register and /login, that perform inserts/checks against the database to create users and log them in. Users send a few extra fields when POSTing to the /login endpoint from the game client to allow for secure sign-in.
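A minimal sketch of what validating that /login body might look like. The field names (`collectionId`, `peerId`) are my assumptions; the post doesn't spell out the exact schema:

```javascript
// Hypothetical /login body check: beyond username/password, the game
// client also sends the collection it is connected to and the Peer ID
// the gameserver assigned it. Field names here are assumptions.
function validateLoginBody(body) {
  if (!body || typeof body !== 'object') {
    return { ok: false, errors: ['missing body'] };
  }
  const errors = [];
  if (typeof body.username !== 'string' || body.username.length === 0) errors.push('username');
  if (typeof body.password !== 'string' || body.password.length === 0) errors.push('password');
  if (typeof body.collectionId !== 'string' || body.collectionId.length === 0) errors.push('collectionId');
  if (!Number.isInteger(body.peerId)) errors.push('peerId');
  return { ok: errors.length === 0, errors };
}
```

The endpoint handler would run this before touching the database and reject the request with a 400 if `ok` is false.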
Collections
Collections (as I've called them) represent a group of game servers. Each game server zone runs on a separate instance within a network, each listening on a different port. Nginx handles inbound connections and routes them to the appropriate gameserver. Since a server world may have multiple zones, I've broken each zone down into its own server, and a collection is the entire group of servers. The server manager stores each gameserver in memory and sorts them into groups based on their collection ID.
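That in-memory grouping could be sketched like this (the object shapes are assumptions, not the actual registry format):

```javascript
// Hypothetical registry: the server manager keeps every known gameserver
// and buckets them by collection ID. Each entry's shape is assumed.
function groupByCollection(gameservers) {
  const collections = new Map();
  for (const gs of gameservers) {
    if (!collections.has(gs.collectionId)) {
      collections.set(gs.collectionId, []);
    }
    collections.get(gs.collectionId).push(gs);
  }
  return collections;
}
```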
Authenticated/Secure Login process
When a client logs in, credentials cannot be sent directly to the gameserver, as that connection is not encrypted (to allow for faster processing of gameserver packets). Thus, we must send an encrypted login request to the auth server and have it routed to the gameserver securely.
The process for a client logging in is essentially this:
- When a client starts the executable, they are greeted with a server selection screen (the client performs a GET request to a page on the auth server).
- This page returns a list of all collections, the server connection information, player count, etc.
- The client will load this into a selectable list which, when a server is chosen and loaded, the client will connect to that game server.
- Once connected, the game server assigns the client a Peer ID and tells the player to load the login screen.
- When the player logs in, they send an encrypted /login post to the auth server, with the server collection they are connected to along with their Peer ID assigned to them by the game server.
- The auth server, on successful login, will send a request to the server manager essentially saying: "Player X wants to connect to Collection X"
- The server manager, on receiving this from the auth server, will check if the user has a player character.
- If they don't, one is created for them and the check is repeated.
- If they do, the server manager tells collection X to load the player with peer_id X at the location pulled from the database.
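The server-manager side of those last three steps might look something like this. The `db` interface and the message shape are entirely my own assumptions for illustration:

```javascript
// Hypothetical handler for the "Player X wants to connect to collection X"
// message from the auth server. `db` stands in for the players database.
async function handleLoginRequest(db, { username, collectionId, peerId }) {
  // Check whether the user already has a player character.
  let player = await db.findPlayer(username);
  if (!player) {
    // If not, create one with a default spawn location, then re-check.
    await db.createPlayer(username, { zone: 'starting-zone', x: 0, y: 0, z: 0 });
    player = await db.findPlayer(username);
  }
  // Tell the collection to load this peer at the stored location.
  return {
    type: 'load_player',
    collectionId,
    peerId,
    location: player.location,
  };
}
```

The returned message would be forwarded over the server manager's WebSocket to the matching collection.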
After the initial login, credentials are cached on the client so that the client can be connected to other gameservers automatically.
Sharding
Initially, one game server would handle every zone in the game, but this was going to be intensive, as all physics, items, players, etc. would need to be simulated on the host at the same time to ensure everything works as the server intends. To ease the load on the game servers, I've broken each zone down into its own server, and then added functionality that lets gameservers hand a player off from gameserver A to gameserver B within the collection without them noticing.
When a client switches to a new gameserver within the collection (such as moving to a zone managed by a different server), their location and position are updated in the database first, and then the process above is repeated so that they are routed to the appropriate server.
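A sketch of that hand-off, reusing the same kind of hypothetical `db` interface as above: persist the new location first, then emit the same routing message so the client reconnects to the right server.

```javascript
// Hypothetical zone transfer: gameserver A reports the player's new zone
// and position; the server manager persists it, then re-routes the player.
async function transferPlayer(db, { username, collectionId, peerId, newLocation }) {
  // Persist the location BEFORE routing, so the target server loads fresh data.
  await db.updateLocation(username, newLocation);
  return {
    type: 'load_player',
    collectionId,
    peerId,
    location: newLocation,
  };
}
```

Ordering matters here: if the routing message went out before the write completed, the target server could load the player at their stale location.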
Video demonstration
Current plans
The project got pretty hectic while setting up sharding, and I have functions in some spots that should be in others. So my current plan is to clean up the game client code and add better comments to make everything easier to understand.