<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: GOALS Engineering</title>
    <description>The latest articles on DEV Community by GOALS Engineering (@goals).</description>
    <link>https://dev.to/goals</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F5307%2F0a14e660-9b72-4f2b-8868-c6cdabe62b45.png</url>
      <title>DEV Community: GOALS Engineering</title>
      <link>https://dev.to/goals</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/goals"/>
    <language>en</language>
    <item>
      <title>Unit testing private functionality in Unreal Engine C++ classes</title>
      <dc:creator>Göran Syberg Falguera</dc:creator>
      <pubDate>Mon, 27 Mar 2023 07:24:48 +0000</pubDate>
      <link>https://dev.to/goals/unit-testing-private-functionality-in-unreal-engine-c-classes-2fig</link>
      <guid>https://dev.to/goals/unit-testing-private-functionality-in-unreal-engine-c-classes-2fig</guid>
<description>&lt;p&gt;Unreal Engine has a pretty extensible suite of &lt;a href="https://docs.unrealengine.com/5.1/en-US/automation-system-in-unreal-engine/"&gt;automation features&lt;/a&gt;. For instance, we run unit tests in our continuous integration system upon merge of new code.&lt;/p&gt;

&lt;p&gt;For these unit tests we utilize UE's &lt;code&gt;IMPLEMENT_SIMPLE_AUTOMATION_TEST&lt;/code&gt; or potentially &lt;code&gt;IMPLEMENT_COMPLEX_AUTOMATION_TEST&lt;/code&gt;. Here is a minimal test from Unreal's documentation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="n"&gt;IMPLEMENT_SIMPLE_AUTOMATION_TEST&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;FPlaceholderTest&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"TestGroup.TestSubgroup.Placeholder Test"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;EAutomationTestFlags&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;EditorContext&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;EAutomationTestFlags&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;EngineFilter&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kt"&gt;bool&lt;/span&gt; &lt;span class="n"&gt;FPlaceholderTest&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;RunTest&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;FString&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;Parameters&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Make the test pass by returning true, or fail by returning false.&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nb"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is all well and good but pretty soon after implementing your first test you realize that you will want to reach &lt;code&gt;private&lt;/code&gt; functionality in the class that you are testing.&lt;/p&gt;

&lt;p&gt;After pondering different more or less ugly solutions, I talked to my colleague &lt;a href="https://www.linkedin.com/in/tomas-hubner/"&gt;Tomas Hübner&lt;/a&gt; and he showed me an old trick that does not make this pretty, but at least not terrible. And once the preprocessor is done, the class we are testing is actually unaffected.&lt;/p&gt;

&lt;p&gt;We do not want to make this private functionality public just for testing, and we do not want to pollute the tested class with, for instance, a friend class that the original class should know nothing about. So, here is the trick:&lt;/p&gt;

&lt;p&gt;We add an empty macro &lt;code&gt;UNIT_TEST_FRIEND&lt;/code&gt; to the class we want to test:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#pragma once
&lt;/span&gt;
&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;"CoreMinimal.h"&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="cp"&gt;#ifndef UNIT_TEST_FRIEND
#define UNIT_TEST_FRIEND
#endif
&lt;/span&gt;
&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;FMyTestedClass&lt;/span&gt; &lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="n"&gt;FCoolUnrealClass&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="nl"&gt;public:&lt;/span&gt;
    &lt;span class="n"&gt;FMyTestedClass&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="nl"&gt;private:&lt;/span&gt;
    &lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;PrivateFloat&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="n"&gt;PrivateMethod&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="n"&gt;UNIT_TEST_FRIEND&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, in the test file, we make sure to define the &lt;code&gt;UNIT_TEST_FRIEND&lt;/code&gt; macro before including the tested header. The test from above then becomes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;"Misc/AutomationTest.h"&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="cp"&gt;#if WITH_AUTOMATION_TESTS
&lt;/span&gt;
&lt;span class="cp"&gt;#ifdef UNIT_TEST_FRIEND
#undef UNIT_TEST_FRIEND
#endif
#define UNIT_TEST_FRIEND friend class FPlaceholderTest
&lt;/span&gt;
&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;"MyTestedClass.h"&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="n"&gt;IMPLEMENT_SIMPLE_AUTOMATION_TEST&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;FPlaceholderTest&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"TestGroup.TestSubgroup.Placeholder Test"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;EAutomationTestFlags&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;EditorContext&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;EAutomationTestFlags&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;EngineFilter&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kt"&gt;bool&lt;/span&gt; &lt;span class="n"&gt;FPlaceholderTest&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;RunTest&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;FString&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;Parameters&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;FMyTestedClass&lt;/span&gt; &lt;span class="n"&gt;TestedClass&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="c1"&gt;// Using private member here&lt;/span&gt;
    &lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;MyValue&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;TestedClass&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PrivateFloat&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="c1"&gt;// Calling private function here&lt;/span&gt;
    &lt;span class="n"&gt;TestedClass&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PrivateMethod&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nb"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note how we include &lt;code&gt;MyTestedClass.h&lt;/code&gt; after defining the macro! Otherwise &lt;code&gt;UNIT_TEST_FRIEND&lt;/code&gt; will still be empty when the compiler parses the &lt;code&gt;MyTestedClass.h&lt;/code&gt; header.&lt;/p&gt;

&lt;p&gt;Tada!&lt;/p&gt;

</description>
      <category>ue5</category>
      <category>unittesting</category>
      <category>programming</category>
      <category>unrealengine</category>
    </item>
    <item>
      <title>Developing games on and for Mac and Linux</title>
      <dc:creator>Göran Syberg Falguera</dc:creator>
      <pubDate>Thu, 16 Feb 2023 08:31:30 +0000</pubDate>
      <link>https://dev.to/goals/developing-games-on-and-for-mac-and-linux-730</link>
      <guid>https://dev.to/goals/developing-games-on-and-for-mac-and-linux-730</guid>
<description>&lt;p&gt;The case for multi-platform game development.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In my current organization we have come to a point where we will have to test our belief in our previous choices. Up until now we have said “We will support Windows, Mac and Linux as long as it does not take up too much of our time”. Now things are heating up and we need to prioritize where we allocate our resources.&lt;/p&gt;

&lt;p&gt;In this article I will drill into the different aspects of multi-platform development and multi-platform development environments for personal computers. I will try to convince you that there is a case for multi-platform development and that, if you can afford it and have the skills and perseverance to follow through, there is much to gain.&lt;/p&gt;

&lt;p&gt;The arguments below all assume planned support for the two biggest console manufacturers on the market, Sony PlayStation and Microsoft Xbox.&lt;/p&gt;

&lt;h2&gt;
  
  
  The costs
&lt;/h2&gt;

&lt;p&gt;When developing an application for multiple platforms there are different aspects that will incur costs. One way to slice it is development, quality verification (testing) and distribution. Depending on the platform and choices made, they may all have equal impact on total cost.&lt;/p&gt;

&lt;h3&gt;
  
  
  Development
&lt;/h3&gt;

&lt;p&gt;This is where extra costs arise from taking into consideration hardware and certification requirements from different vendors or platform owners.&lt;/p&gt;

&lt;p&gt;As for pure feature development, much of this should be solved for us by the chosen game engine, in our case Unreal Engine. We did indeed choose Unreal Engine largely because of its wide platform support. This is a truth with some modifications, as the platforms are far from on par feature-wise in the engine. Due to this, we often stumble on issues on one platform even though things worked on another. Up until now, most differences have been related to rendering and video codecs. This is not surprising, as these are key things that platform owners try to differentiate themselves on: Windows and Xbox -&amp;gt; DirectX, Linux -&amp;gt; Vulkan, Mac -&amp;gt; Metal, etc., plus video codecs that are proprietary and sometimes hardware accelerated with special-purpose on-board circuits.&lt;/p&gt;

&lt;p&gt;On top of pure technical development costs, most platforms have certification requirements that the developer needs to pass to be allowed to release the game on that platform. This could be rules about how menus are used, operations that are not allowed to take too long, etc. The effect of this is that many features need special treatment and development for each platform. And this leads into the next section, where we need to verify all these features.&lt;/p&gt;

&lt;h3&gt;
  
  
  Quality verification
&lt;/h3&gt;

&lt;p&gt;Simply put, all versions and combinations of features of the game need to be tested. The more versions, the more testing and hardware are needed. With cloud services available, and by making automated testing a priority early on, we can minimize the testing effort, but there is no getting around the fact that each platform and configuration needs to be tested.&lt;/p&gt;

&lt;p&gt;Worth noting is that PC platforms are disproportionately expensive to test due to the number of hardware configurations. As an example, the PlayStation 4 only exists in a couple of different hardware configurations, so testing them all becomes possible. A PC could be put together with any compatible hardware, making the permutations that need to be tested almost infinite.&lt;/p&gt;

&lt;h3&gt;
  
  
  Distribution
&lt;/h3&gt;

&lt;p&gt;This one is pretty straightforward. It’s the costs related to getting the game into the hands of the players. Every platform has its own quirks and associated costs (for instance, the cost of generating a code for a digital download), but generally speaking, the biggest cost is the licensing fee, i.e. how big a cut Steam takes on PC or Sony on PlayStation. This cost does not increase with the number of platforms, but it may vary depending on which platforms are chosen.&lt;/p&gt;

&lt;h2&gt;
  
  
  The business case
&lt;/h2&gt;

&lt;p&gt;Since this article looks mostly at game development, the interesting data point is what is commonly referred to as install base. It is a simple-to-understand metric that essentially means: “If you build a game for this platform, how many potential customers do you have on that platform?” As we can see in the diagram, Linux accounts for only 0.5% of the potential install base and Mac only 1%. If it goes really well and the game reaches a really wide audience, say 100 million players, that means potentially 500 thousand and 1 million players respectively for Linux and Mac.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1Q5lXToN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l0fh61xobyhi2em90jot.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1Q5lXToN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l0fh61xobyhi2em90jot.png" alt="Install base" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is an argument to be made for different kinds of games and different audiences. A game that runs on cheaper hardware with fewer resources and is free to play could potentially have a larger install base on, for instance, Linux, which is a free operating system. This is, however, only speculation.&lt;/p&gt;

&lt;p&gt;So, is the case closed? There is no point building for Linux and Mac? Not quite, read on.&lt;/p&gt;

&lt;h2&gt;
  
  
  The technical reasons
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Compiler
&lt;/h3&gt;

&lt;p&gt;If we group the platforms in a slightly different way, namely by compiler, we see that the MSVC C++ compiler used by the Microsoft platforms and the Clang C++ compiler used by all the others are pretty equal. Taking into account the dedicated servers that all run on Linux regardless of client platform, we can consider them roughly equal in terms of numbers. This is interesting not because it gives us any relief in development costs, but because our game code always needs to compile on both the MSVC and Clang C++ compilers regardless of whether we choose to support Mac or Linux. So additional platforms do not add much cost in this regard.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--A_vUnrhI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/71tvk8cyycrhx9jyrw5i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--A_vUnrhI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/71tvk8cyycrhx9jyrw5i.png" alt="Compilers used" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Rendering API
&lt;/h3&gt;

&lt;p&gt;Although the compiler differs depending on which platform you are compiling for and on, much of the technical complexity should be abstracted by Unreal Engine. That is, before Epic (who builds and maintains Unreal Engine) does a release, they make sure everything is working on the supported platforms. So in theory, when we take that release, everything should be working. It is, however, very hard for Epic to have test cases for every feature in all rendering APIs. Our experience from doing one major, three minor and two hot-fix upgrades while the game was in development and live tells us that a lot of problems arise from differences in the rendering APIs and the artifacts/errors they cause. What this means is that if the game is working nicely and we do an engine upgrade, there is no guarantee that the game will run on all platforms after the upgrade, even if it did before. The more rendering APIs we support, the more special treatment we need for each platform. This is where it gets really bad for Mac and Linux. Since DirectX is a proprietary rendering API for Microsoft platforms and Apple stubbornly wants their own (Metal), Linux needs to use the open (and still fairly young) rendering API Vulkan. We end up with the following graph if we look at it from a rendering API perspective:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--inihV59c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ov9hnjkiryenemfnm5lx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--inihV59c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ov9hnjkiryenemfnm5lx.png" alt="Rendering API usage" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since we technically do not need rendering on the game servers (that run on Linux) and Mac is just a small percentage of the install base, it becomes difficult to motivate the increased development effort to maintain the game on Mac and Linux.&lt;/p&gt;

&lt;h2&gt;
  
  
  The practical implications
&lt;/h2&gt;

&lt;p&gt;Currently, 15% of our studio uses Mac and 40% of the game engineers prefer Linux. Linux is more common among game engineers, while Mac is more popular among cloud engineering, art and other departments. What this means is that if we decide not to support Mac, a significant part of the studio will need other computers/consoles to play the game, and a large percentage of the game engineers will have to develop and/or test the game on another platform than the one they feel most comfortable with.&lt;/p&gt;

&lt;p&gt;I’ve worked at Windows-only game development studios that only supported Windows PC clients but ran all servers on Linux. What ended up happening in a studio with several hundred developers was that one developer used Linux to natively debug and profile the server build effectively, while many others used remote debugging from their Windows machines to track down specific issues. In practice, the Linux build, i.e. the server build, was neglected due to the hassle of even running it. The low-hanging fruit can usually be fixed and optimized on a Windows version of the build, but if you want to get really fast or really stable (finding those intermittent bugs), you need to be working natively on the platform.&lt;/p&gt;

&lt;p&gt;Another aspect that is often overlooked is the importance of the engineering team learning early to consider multiple platforms. As mentioned above, automated testing becomes important right from the start, since you do not have time to test every platform manually. The consequence is that the organization makes the multi-platform investments it will eventually need up front, and can start leveraging the benefits right away.&lt;/p&gt;

&lt;p&gt;Technically, when forced to consider multiple platforms, engineers learn early that code needs to be modular and have minimal dependencies. The modularity comes from needing to make platform-specific implementations, and the minimal dependencies from the fact that external dependencies seldom support all the platforms you need them to. Both of the above require more senior developers to pull off.&lt;/p&gt;

&lt;h2&gt;
  
  
  Social responsibility and employer branding
&lt;/h2&gt;

&lt;p&gt;A well-established practice in game development, and software development in general, is exclusive content. What this means is that platform owners ensure, through any means possible, that some games, applications or content are available only on their platform.&lt;/p&gt;

&lt;p&gt;After a while, consumers learn that their desired content is available on a specific platform, and that platform grows. The next step is that some game studios do exactly the calculation above and realize the install base is too small on some platforms, so they do not develop their game for them. The circle of platform lock-in is complete when the platform owners no longer have to pay for exclusive content, since the other platforms are too small to motivate developing games for. Since we cannot see the future, this becomes a chicken-and-egg problem: are the Mac and Linux install bases so small due to missing content, or are they missing content because the install bases are so small?&lt;/p&gt;

&lt;p&gt;The socially responsible thing to do, all other things equal, is to develop the game to be accessible to as many people as possible. Making the game accessible to, for instance, color-blind people (~4%) is something that we take for granted, but making the game accessible to Linux and Mac players (~1.5%) is seldom talked about in relation to accessibility.&lt;/p&gt;

&lt;p&gt;Making a game at the level of ambition that GOALS is trying to achieve takes a lot of good developers. Finding these developers is not easy, but one good way of attracting them may be letting them work on their preferred operating system. According to the Stack Overflow Developer Survey of 2022, the distribution of operating systems among professionals is a bit different from the one for gamers:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TT-yp97i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g4trvrobrobjiaargvg4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TT-yp97i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g4trvrobrobjiaargvg4.png" alt="Image description" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  State of the art
&lt;/h2&gt;

&lt;p&gt;If you decide to go down this road, there are some fantastic tools, applications and libraries whose makers share the vision of a multi-platform gaming ecosystem:&lt;/p&gt;

&lt;h3&gt;
  
  
  Game engine
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Unreal Engine&lt;/strong&gt;: &lt;a href="https://www.unrealengine.com/"&gt;https://www.unrealengine.com/&lt;/a&gt;. Source-available, multi-platform game engine.&lt;/p&gt;

&lt;h3&gt;
  
  
  Version tracking
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Plastic SCM&lt;/strong&gt;: &lt;a href="https://www.plasticscm.com/"&gt;https://www.plasticscm.com/&lt;/a&gt;. Not only multi-platform but specifically built for the needs of a game studio.&lt;/p&gt;

&lt;h3&gt;
  
  
  Integrated Development Environment
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Rider&lt;/strong&gt;: &lt;a href="https://www.jetbrains.com/rider/"&gt;https://www.jetbrains.com/rider/&lt;/a&gt;. An IDE with a large ecosystem of plugins and specific Unreal Engine support.&lt;/p&gt;

&lt;h3&gt;
  
  
  Engine binary distribution
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Gamecure&lt;/strong&gt;: &lt;a href="https://github.com/goalsgame/gamecure"&gt;https://github.com/goalsgame/gamecure&lt;/a&gt;. Our own cross-platform game engine distribution application that we open sourced. Read more here: &lt;a href="https://dev.to/goals/collaboration-on-large-teams-with-gamecure-1dfb"&gt;https://dev.to/goals/collaboration-on-large-teams-with-gamecure-1dfb&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Engine upgrades
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;UEIMPORTER&lt;/strong&gt;: &lt;a href="https://github.com/goalsgame/ueimporter"&gt;https://github.com/goalsgame/ueimporter&lt;/a&gt;. An open-sourced, in-house-built tool set for upgrading Unreal Engine when under Plastic SCM version control.&lt;/p&gt;

&lt;h3&gt;
  
  
  Code generation and network serialization
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Protocol Buffers&lt;/strong&gt;: &lt;a href="https://developers.google.com/protocol-buffers"&gt;https://developers.google.com/protocol-buffers&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Delta patching
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;golongtail&lt;/strong&gt;: &lt;a href="https://github.com/DanEngelbrecht/golongtail"&gt;https://github.com/DanEngelbrecht/golongtail&lt;/a&gt;&lt;br&gt;
Cross-platform library for delta patching. Used by our end-user game launcher and by Gamecure (mentioned above).&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;On the surface, it looks like an easy decision; the numbers just don't add up. Drilling a bit deeper, we realize there is much to gain from sticking to multiple platforms right from the start. Having come such a long way as GOALS, with all the infrastructure in place, my conclusion is that there is no question about what is better and desirable. The question is what the organization can afford and what the people in the organization can manage.&lt;/p&gt;

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam"&gt;https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.statista.com/statistics/1260264/global-gaming-console-installed-base/"&gt;https://www.statista.com/statistics/1260264/global-gaming-console-installed-base/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://survey.stackoverflow.co/2022/"&gt;https://survey.stackoverflow.co/2022/&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Title image credit: &lt;a href="https://midjourney.com"&gt;Midjourney&lt;/a&gt;&lt;/p&gt;

</description>
      <category>gamedev</category>
      <category>linux</category>
      <category>unrealengine</category>
      <category>apple</category>
    </item>
    <item>
      <title>Ball physics in GOALS</title>
      <dc:creator>Göran Syberg Falguera</dc:creator>
      <pubDate>Mon, 06 Feb 2023 09:35:21 +0000</pubDate>
      <link>https://dev.to/goals/ball-physics-in-goals-58g6</link>
      <guid>https://dev.to/goals/ball-physics-in-goals-58g6</guid>
      <description>&lt;p&gt;Title image credit: &lt;a href="https://midjourney.com/"&gt;Midjourney&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;When creating the ball for GOALS, we realized fairly quickly that Unreal’s off-the-shelf physics would not suffice for our ball. Footballs behave in peculiar ways when forces are applied and when they are deformed. We want a really configurable and realistic-looking ball trajectory, but at the same time, we want to be able to leverage Unreal's built-in physics for collisions and interactions with the rest of the world. The current solution uses different approaches depending on which state the ball is in, and we will describe each of them in this article.&lt;/p&gt;

&lt;h2&gt;
  
  
  The different states of the ball
&lt;/h2&gt;

&lt;p&gt;The ball can be in six distinct states, and different calculations are applied in each of them. Starting with the ball being possessed by one of the players:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--I1AX74kZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mfox3vuyrwmp848nzx86.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--I1AX74kZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mfox3vuyrwmp848nzx86.gif" alt="Shot with debug info" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Dribbling - One of the players (or the AI) is performing dribbling animations that affect the ball.&lt;/li&gt;
&lt;li&gt;In air predicted - One of the players is about to shoot and all parameters are known.&lt;/li&gt;
&lt;li&gt;In air simulated - The ball has left the player and its parameters are calculated each game update in our custom ball physics simulation.&lt;/li&gt;
&lt;li&gt;On ground simulated - As soon as the ball hits something, like the frame of the goal, the ground or another player, we let Unreal physics take over and manage the ball according to its physical properties like weight, material, size, etc.&lt;/li&gt;
&lt;li&gt;On ground predicted - For players and the AI to be able to intercept the ball without it looking strange, we need to predict the ball's path.&lt;/li&gt;
&lt;li&gt;At rest - Once the ball is at rest, it is just considered a plain old static mesh with physics in Unreal.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Dribbling
&lt;/h3&gt;

&lt;p&gt;This is when one of the players possesses the ball. In this state, the ball movement is dictated by designer-controlled parameters that give the ball a physical &lt;em&gt;impulse&lt;/em&gt; depending on the stats of the player. This way, we can vary how much control the player has through the impulse. As we progress to more advanced dribbling, we may have to add more sophisticated ball control mechanisms.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--LAXZzU0i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m6zircjf0stkxi2p863l.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--LAXZzU0i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m6zircjf0stkxi2p863l.gif" alt="Image description" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  In air predicted
&lt;/h3&gt;

&lt;p&gt;To be able to put other players in the right state, e.g. getting ready for a volley shot, but also to adjust aiming, we need to know where the ball will end up (if it does not get intercepted on the way).&lt;/p&gt;

&lt;p&gt;We do this by emulating Unreal Chaos physics and creating a path from where the ball is to where it would land unobstructed. This is all done in the frame (game loop update) where a player lets go of one of the shooting buttons. At that point we know which type of shot the player used, as well as the power and the aim; we also know how much we need to adjust the shooting angle for the player to hit the intended target.&lt;/p&gt;
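&lt;p&gt;As a rough, language-agnostic illustration of this kind of prediction (a deliberately simplified 1D model with invented names and values, not our actual Chaos-emulating code), stepping a trajectory forward at the fixed physics rate until it reaches the ground might look like this:&lt;/p&gt;

```python
# Simplified ballistic prediction sketch (illustrative only, not GOALS code).
# We march the vertical state forward in fixed physics steps until landing.

GRAVITY = 981.0   # downward acceleration in cm/s^2, Unreal-style units
DT = 1.0 / 128.0  # fixed physics step matching the simulation rate

def predict_flight_time(height, up_speed):
    """Step a vertical trajectory forward until it returns to the ground."""
    t = 0.0
    while height > 0.0:
        up_speed -= GRAVITY * DT  # gravity slows the rise, then pulls down
        height += up_speed * DT
        t += DT
    return t  # a full version would also return the landing position

flight_time = predict_flight_time(height=100.0, up_speed=500.0)
```

A real implementation would track the full 3D state plus drag and spin, but the structure is the same: run the same integrator the simulation uses, just all at once within a single frame.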

&lt;p&gt;All this is now plugged into the actual shooting function.&lt;/p&gt;

&lt;h3&gt;
  
  
  In air simulated
&lt;/h3&gt;

&lt;p&gt;Once the ball is let go, at each game update we consider several forces that impact the trajectory (speed, air drag, spin force, gravity) to create a natural feeling during gameplay. But we also make sure that the ball in this state is a regular &lt;code&gt;AStaticMeshActor&lt;/code&gt; so that we can have it collide with and bounce off things in the world without having to implement this ourselves. We have built a set of development tools for engineers and designers to see exactly how the ball behaves with different parameters set. In this image we see the spin of the ball illustrated on the ball path, as well as spin force, spin speed, drag and ball speed plotted on screen.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0jYIWXr2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j0tmvlbre0qi3j3k5wng.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0jYIWXr2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j0tmvlbre0qi3j3k5wng.png" alt="Illustration of debug" width="800" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;High-quality ball simulation is accomplished by updating physics at a fixed 128 Hz rate on the server. To minimize client divergence we use sub-stepping, a technique that decouples the physics simulation rate from the frame rate, so that even players whose hardware fails to render the game at 128 frames per second will update physics at roughly the same rate as the server.&lt;/p&gt;
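&lt;p&gt;Sub-stepping is a common game-loop pattern: accumulate each (variable) frame time and consume it in fixed-size physics steps. A simplified sketch, with names of our own invention:&lt;/p&gt;

```python
import operator

FIXED_DT = 1.0 / 128.0  # the server's fixed physics rate

class SubStepper:
    """Run physics at a fixed rate regardless of the rendering frame rate."""

    def __init__(self):
        self.accumulator = 0.0
        self.steps_taken = 0

    def tick(self, frame_dt, step_fn):
        """Called once per rendered frame with that frame's duration."""
        self.accumulator += frame_dt
        # Consume the accumulated time in fixed-size physics steps.
        while operator.ge(self.accumulator, FIXED_DT):
            step_fn(FIXED_DT)
            self.accumulator -= FIXED_DT
            self.steps_taken += 1
```

A client rendering at 32 frames per second would run four physics steps per frame, staying in lockstep with the 128 Hz server.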

&lt;h3&gt;
  
  
  On ground simulated
&lt;/h3&gt;

&lt;p&gt;From our perspective, this is the simplest case, as we let Unreal Engine physics handle most of it for us. We get physical properties for free while the ball interacts with the world, and we can leverage Unreal material properties for friction etc. The one thing we apply ourselves is the drag force, the same force that is applied in the air. We do this with a manually adjusted curve based on research on ball behavior.&lt;/p&gt;
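&lt;p&gt;A manually adjusted curve like this can be represented as a piecewise-linear lookup table. A hypothetical sketch; the sample points below are made up, not our real tuning data:&lt;/p&gt;

```python
import bisect

# Hypothetical (speed, drag force) sample points, sorted by speed.
DRAG_CURVE = [(0.0, 0.0), (200.0, 30.0), (600.0, 60.0), (1200.0, 80.0)]

SPEEDS = [point[0] for point in DRAG_CURVE]

def drag_at(speed):
    """Linearly interpolate the drag curve at the given ball speed."""
    i = bisect.bisect_right(SPEEDS, speed)
    if i == 0:
        return DRAG_CURVE[0][1]
    if i == len(DRAG_CURVE):
        return DRAG_CURVE[-1][1]  # clamp past the last sample point
    (s0, d0), (s1, d1) = DRAG_CURVE[i - 1], DRAG_CURVE[i]
    t = (speed - s0) / (s1 - s0)
    return d0 + t * (d1 - d0)
```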

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--n1jLdiXl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tj2ez9zla9tblbf2sjz5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--n1jLdiXl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tj2ez9zla9tblbf2sjz5.png" alt="Image description" width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this short GIF we can see the ball sliding slightly until there is enough friction for it to start rolling. Then, after a while, the ball comes to rest.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4qFdiiTr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hpxtcq9tpmojhyveq107.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4qFdiiTr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hpxtcq9tpmojhyveq107.gif" alt="Rolling ball" width="800" height="245"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  On ground predicted
&lt;/h3&gt;

&lt;p&gt;To steer AI and support different lock-on scenarios, we need to be able to predict where the ball is going. Otherwise the players will just keep updating their destination and trace the strange round path shown in the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3ISaqrCV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b7rekuba6zomzl9csiu7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3ISaqrCV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b7rekuba6zomzl9csiu7.png" alt="With and without predition" width="594" height="567"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The trick in this scenario is figuring out what Unreal physics will do to the ball (remember that in the air we control the physics ourselves, so it is easier to predict). Up until now we have used a simple constant deceleration to model the behavior, but as we increase the quality this is no longer enough. We are currently testing different models and will update this section once implemented.&lt;/p&gt;
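&lt;p&gt;Under a constant-deceleration model, the stopping point has a closed form: the ball travels a distance of the squared speed divided by twice the deceleration before coming to rest. A sketch, with a made-up deceleration value:&lt;/p&gt;

```python
import math

def predict_stop(position, velocity, deceleration=300.0):
    """Predict where a rolling ball stops under constant deceleration.

    position and velocity are 2D (x, y) tuples; deceleration is a
    hypothetical magnitude in the same units as the velocity, per second.
    """
    speed = math.hypot(velocity[0], velocity[1])
    if speed == 0.0:
        return position
    # Distance covered before stopping: v squared over (2 * a).
    distance = speed * speed / (2.0 * deceleration)
    return (position[0] + velocity[0] / speed * distance,
            position[1] + velocity[1] / speed * distance)
```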

&lt;h2&gt;
  
  
  Configuration and shot types
&lt;/h2&gt;

&lt;p&gt;The system is built to be highly configurable and extendable by the game designers, and adding new shot types is just a matter of configuration and adjusting some curves. Here is an example of a curved shot. The amount of curvature depends not only on the controller input but also on the stats of the current player:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_duDubOD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xc6tjw8u9q351ri24yf9.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_duDubOD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xc6tjw8u9q351ri24yf9.gif" alt="Curved shot" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Networking the ball
&lt;/h2&gt;

&lt;p&gt;The ball is simulated on the server as well as on the client. To ensure fair play, the server is authoritative and any client divergence will be smoothly corrected.&lt;/p&gt;

&lt;p&gt;At the time of writing, we use an off-the-shelf Unreal movement component to network the kinematics of the ball. One big task for GOALS is to find a way to make this component more streamlined and optimized for the GOALS use case, as networking the ball takes up most of our network resources.&lt;/p&gt;

</description>
      <category>gamedev</category>
      <category>unrealengine</category>
      <category>football</category>
      <category>physics</category>
    </item>
    <item>
      <title>Runtime Procedural Character Generation</title>
      <dc:creator>Göran Syberg Falguera</dc:creator>
      <pubDate>Wed, 11 May 2022 13:22:15 +0000</pubDate>
      <link>https://dev.to/goals/runtime-procedural-character-generation-161d</link>
      <guid>https://dev.to/goals/runtime-procedural-character-generation-161d</guid>
      <description>&lt;p&gt;&lt;strong&gt;Or storing 18 quintillion characters in a 64 bit integer.&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;The purpose of this article is to give a high-level survey of a field that I believe will become increasingly important in the future. I aim not only to describe the area but also to spark your imagination about things that can and will be done with this technology.&lt;/p&gt;

&lt;p&gt;Increasing numbers of game developers are using procedural tools to create content for their games. A lot has been done for procedural generation of environments, levels, game modes, etc. Procedural character generation has been done too, but it has not been as popular a use case as, for instance, generating thousands of game levels for replayability or creating vibrant environments. I attribute the relative lack of progress in this area to game designs simply not needing it. For one thing, it turns out people get more attached to characters they create themselves than to generated ones. Even for NPC creation, the characters usually have a place in a story, and an arbitrary character won't be good enough. Maybe a specific gender or geographical heritage is needed for the story.&lt;/p&gt;

&lt;p&gt;There is however a group of games that will not only gain from this technology but require it, and I am working on one of them. After reading this, I think you will have the foundation to not only accept my proposed use cases but maybe think of a few new ones yourself. Maybe you want to come work on this at &lt;a href="https://careers.goals.co/"&gt;GOALS&lt;/a&gt;? &lt;/p&gt;

&lt;h2&gt;
  
  
  Asset Generation
&lt;/h2&gt;

&lt;p&gt;Asset generation, as opposed to asset creation, alludes to some kind of automated process. The point, of course, is that you can create a lot more content, faster. Now, you could go about this in a couple of different ways depending on what you need. For instance, you could generate characters based on random inputs and get a different output each time. This is referred to as stochastic generation of content. Or you could pick a set of parameters and generate, say, 100 different characters based on 100 predefined values of characteristic X. Now, in computer science "randomness" is a tricky thing. For "true" randomness you have to look at some non-deterministic physical process outside the computer, for instance the radioactive decay of an atom. This is overkill for our current topic, and most programming languages supply some kind of pseudo-random number generator. The funny thing with these "random number generators" is that they are fully determined by their "seed" and hence not "random" at all. Put differently, you can generate the same random number every time you want to.&lt;/p&gt;

&lt;p&gt;A seed is something you pass into the generator function to make its output differ from the previous time you ran it. The reason we need a seed is that a computer can't make stuff up. Computers are built specifically to not be random and to reproduce the exact same result given the same input.&lt;/p&gt;

&lt;p&gt;So, if you pass the same seed into the same random number generator, you end up with the exact same "random" number every time. A "pseudo-random number generator" is actually a "procedural number generator"! How fun. More about this in the "Procedural creation of content" section. The point to take with you from this section is that generation can be done in a deterministic, and hence reproducible, way (same result every time), or in a stochastic, unique (most often) way.&lt;/p&gt;
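&lt;p&gt;This determinism is easy to demonstrate with Python's standard pseudo-random generator; the same holds for any seeded PRNG:&lt;/p&gt;

```python
import random

def roll_dice(seed, count=5):
    """Generate 'count' dice rolls that are fully determined by the seed."""
    rng = random.Random(seed)  # independent generator, seeded explicitly
    return [rng.randint(1, 6) for _ in range(count)]

# Same seed, same "random" sequence, every single time.
first = roll_dice(42)
second = roll_dice(42)
```

Here `first == second` always holds; change the seed and you get a different, but equally reproducible, sequence.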

&lt;p&gt;Now, if the seed is a 64-bit unsigned integer (unsigned means only non-negative numbers), the most common integer size in modern computers, we can generate about 18 quintillion (18,446,744,073,709,551,616) different seed values. And if a character, or other piece of procedural content, is fully generated from such a seed, we have 18 quintillion different and fully deterministic (we can recreate them any time we want) versions of that asset.&lt;/p&gt;
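&lt;p&gt;To make this concrete, here is a toy sketch of deriving a character description from a single 64-bit seed. The attribute names and ranges are invented for illustration:&lt;/p&gt;

```python
import random

def generate_character(seed):
    """Derive a deterministic set of character traits from one 64-bit seed."""
    rng = random.Random(seed)
    return {
        "height_cm": rng.randint(160, 205),
        "leg_length_ratio": round(rng.uniform(0.45, 0.55), 3),
        "skin_tone_index": rng.randint(0, 31),
        "hair_style_index": rng.randint(0, 15),
    }

# Every value in the 64-bit seed range maps to one reproducible character.
MAX_SEED = 2 ** 64 - 1
```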

&lt;p&gt;This is where my head exploded the first time I thought about this. The possibilities are just staggering. Keep this in mind for now and let's talk a bit about how these procedural algorithms could work.&lt;/p&gt;

&lt;h2&gt;
  
  
  Procedural creation of content
&lt;/h2&gt;

&lt;p&gt;Imagine a game character in a world with screens only supporting 12x12 pixels:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NzhNDBqW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/crakizfj68462d600y3i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NzhNDBqW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/crakizfj68462d600y3i.png" alt="Image description" width="800" height="749"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To create this character in the traditional (manual) way, we would have to choose the color of each pixel by hand. This gives us a lot of control, but if we want another character, we have to do it all again. Hence, this process scales really badly.&lt;/p&gt;

&lt;p&gt;What if, instead of choosing the color of each pixel, we created a little program that did something like this for the head of the character: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Aligned to the top of the center of the body, draw a symmetrical elliptic shape with the parameters X and Y while X and Y vary between Z and I.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Obviously the "body" would have to be similarly defined before this step, but by varying X and Y between Z and I we can create as much variation as the resolution permits.&lt;/p&gt;
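&lt;p&gt;The rule above translates almost directly into code; a toy version where Z and I become explicit min/max bounds:&lt;/p&gt;

```python
import random

def generate_head(seed, z=3, i=6):
    """Pick ellipse parameters X and Y, each varying between z and i pixels.

    Mirrors the rule: draw a symmetrical elliptic shape with parameters
    X and Y, while X and Y vary between Z and I.
    """
    rng = random.Random(seed)
    return {
        "x_radius": rng.randint(z, i),
        "y_radius": rng.randint(z, i),
    }
```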

&lt;p&gt;If we specify the values we want to vary in a list, like the colors in the image below, we keep quite a lot of control over the process. Or we can vary the RGB values pseudo-randomly and get unexpected colors. Varying randomly can be risky though, and may require more rigorous testing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zVOas62p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mpf0sl98n3exb49o5xrk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zVOas62p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mpf0sl98n3exb49o5xrk.png" alt="Image description" width="735" height="670"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  Game Characters
&lt;/h2&gt;

&lt;p&gt;Now it's time to talk about 3D game characters and what they are made up of, so that we can figure out how to procedurally create them.&lt;/p&gt;

&lt;p&gt;Game characters are generally made up of a skeleton, a model mesh, and skin and textures. Arguably, cloth could be seen as a fourth part in modern game engines; traditionally, cloth and clothing were part of the model mesh.&lt;/p&gt;

&lt;h3&gt;
  
  
  Skeleton or rig
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--w5riCGvs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dyf1fusd3aqs5texeoxd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--w5riCGvs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dyf1fusd3aqs5texeoxd.png" alt="Image description" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The skeleton in a game character is not used to keep the body upright (as in real-life characters); the character will not collapse into a pile of skin without it. The reason to have a skeleton is to be able to reason about the movement of the limbs and the body in relation to each other.&lt;/p&gt;

&lt;p&gt;The definition of a skeleton is commonly just a list or hierarchy of bones starting at the hip and going outwards to the extremities.&lt;/p&gt;

&lt;p&gt;To manipulate a skeleton with a fixed number of joints (joints being the points tying the whole thing together), one might simply vary the bone lengths according to some rules. As long as we do it within certain limits, this fits perfectly with our procedural tools! We can just apply random variation to bone lengths, similar to the pixel character's head above.&lt;/p&gt;
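&lt;p&gt;As a sketch, varying bone lengths within per-bone limits might look like this; the bone names and ranges are illustrative, not Unreal's actual skeleton definition:&lt;/p&gt;

```python
import random

# Hypothetical per-bone length limits in centimeters: (minimum, maximum).
BONE_LIMITS = {
    "spine": (40.0, 55.0),
    "upper_arm": (28.0, 36.0),
    "forearm": (24.0, 30.0),
    "thigh": (40.0, 50.0),
    "shin": (36.0, 46.0),
}

def generate_skeleton(seed):
    """Same joints, varied bone lengths: one reproducible skeleton per seed."""
    rng = random.Random(seed)
    return {bone: round(rng.uniform(lo, hi), 1)
            for bone, (lo, hi) in BONE_LIMITS.items()}
```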

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sgcEXi_D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mzk4k4u88s8iezapwb75.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sgcEXi_D--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mzk4k4u88s8iezapwb75.png" alt="Image description" width="357" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is conceivable to create more arbitrary skeleton configurations, and it has been done in games to create a wonderful variety of creatures. As you can imagine, animating and testing such characters can be a daunting task. For our use case, a humanoid configuration will suffice; maybe with different limits on male and female skeletons, etc.&lt;/p&gt;

&lt;h3&gt;
  
  
  The model mesh
&lt;/h3&gt;

&lt;p&gt;Attached to the skeleton is what is arguably the most complex part of the character. The model, or mesh, is what makes up the shape of the character. It consists of a number of vertices, or points in space. Between these points we draw lines called edges, and together they create surfaces, or faces. In the image below we have a mesh made up of quads, meaning faces with four vertices; you need at least three vertices to create a surface. (That is why you often hear talk about triangles when people discuss graphics in games.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--envc4My5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cju8tzpzu1qqm2r7g75f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--envc4My5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cju8tzpzu1qqm2r7g75f.png" alt="Image description" width="800" height="477"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To manipulate these meshes, you simply move the vertices. This is what applications like 3ds Max, Maya and Blender excel at. Doing this in a procedural context by randomly moving vertices around would probably not make sense, as the outcome is very unpredictable. Instead, one could imagine creating two extreme characters, one light and one heavy. The same for big/small noses, etc. We then vary input parameters to create variations between those extremes. It is easy to picture a classical character creator in an RPG, with the sliders being the randomized values.&lt;/p&gt;
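&lt;p&gt;The "two extremes plus a slider" idea is plain linear interpolation over vertex positions; a minimal sketch:&lt;/p&gt;

```python
def blend_meshes(light_vertices, heavy_vertices, weight):
    """Interpolate each vertex between a light and a heavy extreme mesh.

    weight is 0.0 for fully light, 1.0 for fully heavy; with a seeded
    random weight per character this becomes a procedural body-type dial.
    """
    blended = []
    for (lx, ly, lz), (hx, hy, hz) in zip(light_vertices, heavy_vertices):
        blended.append((lx + weight * (hx - lx),
                        ly + weight * (hy - ly),
                        lz + weight * (hz - lz)))
    return blended
```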

&lt;h3&gt;
  
  
  Skin and textures
&lt;/h3&gt;

&lt;p&gt;After creating our model mesh, we need some color on this gray clay model. That is what skin and textures are for. We could make our lives easy, create something like the Blue Man Group, and then vary the colors just like on the pixel character above. But we probably want something more sophisticated, like this character made by an old colleague of mine, &lt;a href="https://www.artstation.com/bjornarvidsson"&gt;Björn Arvidsson&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--px98tg-K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/35iumsje8sr35fknlibj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--px98tg-K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/35iumsje8sr35fknlibj.png" alt="Image description" width="578" height="665"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking at a character like this, it can be a bit daunting to think about how we could possibly generate something so realistic looking. Luckily for us, thanks to other industries, procedural textures, and specifically procedural skin, have come pretty far. By adding variations such as scars with some shader programming, or tattoos with decals, we can create endless variation here as well.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tools already available
&lt;/h3&gt;

&lt;p&gt;There are other tools on the market to manipulate and create characters. Below is an example from Epic's &lt;a href="https://www.unrealengine.com/en-US/digital-humans"&gt;MetaHuman&lt;/a&gt;. This is not a procedural tool &lt;em&gt;per se&lt;/em&gt;, but it would be trivial to feed MetaHuman a bunch of random variables as input and create a lot of characters. The problem with tools like MetaHuman is that it is a cloud service, and we have no idea how many resources and assets go into creating those characters. Also, if you export these characters to disk they are pretty big, and creating/storing millions of them is just not practical. But this takes us to the first word in the title of this article.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RS_9iBsU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/v4xhv0trvf6db9trn8rh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RS_9iBsU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/v4xhv0trvf6db9trn8rh.png" alt="Image description" width="589" height="667"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Runtime generation and the GOALS use case
&lt;/h2&gt;

&lt;p&gt;At GOALS we need characters that can scale up and down, since we target both high-end and low-end machines. They cannot be stored on disk, because they do not exist (have not been born) when we ship, and we cannot use any manual labor to create them; there are simply too many of them. Also, back to the MetaHuman example: even if we were able to store them in the cloud, it is not a nice user experience to wait for a download before you can play.&lt;/p&gt;

&lt;p&gt;Enter &lt;strong&gt;runtime character generation&lt;/strong&gt;. Everything we have covered so far needs to happen at runtime, meaning while the game is starting or running. This is sometimes referred to as online, as opposed to offline, procedural generation.&lt;/p&gt;

&lt;p&gt;The idea is that when you boot the game, you get a list of &lt;em&gt;seeds&lt;/em&gt; belonging to you. We pass each of them into our procedural box and create skeletons, meshes and textures that make sense for Unreal Engine in the GOALS use case. Then, to roll out cool new features and looks, we just have to update the procedural algorithms. Tada!&lt;/p&gt;
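&lt;p&gt;One simple way to drive several independent generators (skeleton, mesh, texture) from a single player seed is to derive a stable sub-seed per subsystem. A hypothetical sketch, not GOALS' actual scheme:&lt;/p&gt;

```python
import hashlib

def sub_seed(master_seed, subsystem):
    """Derive a stable 64-bit sub-seed for a named subsystem.

    Hashing the master seed together with a subsystem label keeps the
    skeleton, mesh and texture generators decorrelated yet reproducible.
    """
    payload = f"{master_seed}:{subsystem}".encode("utf-8")
    digest = hashlib.sha256(payload).digest()
    return int.from_bytes(digest[:8], "big")

skeleton_seed = sub_seed(123456789, "skeleton")
texture_seed = sub_seed(123456789, "texture")
```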

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GToB0eZz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l3o8t1m64a4shf3xh8kq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GToB0eZz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l3o8t1m64a4shf3xh8kq.png" alt="Image description" width="800" height="222"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Future use cases
&lt;/h2&gt;

&lt;p&gt;Considering what has been described above, imagine if someone were to create engine adapters for multiple engines. Then, provided you have the right to use a particular seed, you could in theory take it with you as you pass between engines. This is the holy grail of the metaverse.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PnAk-Q7I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/de8vwphvqfsju1g14cvb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PnAk-Q7I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/de8vwphvqfsju1g14cvb.png" alt="Image description" width="800" height="302"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If this sounds exciting, I would recommend you come work on it at &lt;a href="https://goals.co"&gt;GOALS&lt;/a&gt; and share your work with the world!&lt;/p&gt;

</description>
      <category>gamedev</category>
      <category>proceduralgeneration</category>
      <category>procedural</category>
      <category>proceduralprogramming</category>
    </item>
    <item>
      <title>Working with Unreal Engine source releases in Plastic SCM</title>
      <dc:creator>Andreas Andersson</dc:creator>
      <pubDate>Thu, 31 Mar 2022 08:27:27 +0000</pubDate>
      <link>https://dev.to/goals/working-with-unreal-engine-source-releases-in-plastic-scm-ihf</link>
      <guid>https://dev.to/goals/working-with-unreal-engine-source-releases-in-plastic-scm-ihf</guid>
      <description>&lt;p&gt;&lt;strong&gt;This guide is for &lt;a href="https://www.plasticscm.com" rel="noopener noreferrer"&gt;Plastic SCM&lt;/a&gt; users, that want to build a game using &lt;a href="https://www.unrealengine.com" rel="noopener noreferrer"&gt;Unreal Engine&lt;/a&gt;, and plan to compile and make changes to the engine code itself, while also upgrading engine releases as they are released by Epic.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Hi, I'm Andreas, a game developer here at GOALS. We are building a football game using Unreal Engine and use Plastic to manage our source code and assets. This article tells the story of how we bridged the gap between Git and Plastic to stay on top of new engine releases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building blocks
&lt;/h2&gt;

&lt;p&gt;The full engine source code is served via &lt;a href="https://github.com/EpicGames/UnrealEngine" rel="noopener noreferrer"&gt;Unreal Engine's own GitHub repository&lt;/a&gt;, to which you get access by registering your GitHub user with Epic. See &lt;a href="https://www.unrealengine.com/en-US/ue4-on-github" rel="noopener noreferrer"&gt;How do I access Unreal Engine 4 C++ source code via GitHub?&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each official engine release is labelled with a Git tag, for example &lt;a href="https://github.com/EpicGames/UnrealEngine/releases/tag/4.27.2-release" rel="noopener noreferrer"&gt;4.27.2-release&lt;/a&gt; or &lt;a href="https://github.com/EpicGames/UnrealEngine/releases/tag/5.0.0-preview-2" rel="noopener noreferrer"&gt;5.0.0-preview-2&lt;/a&gt;. Here you also find a downloadable tarball or zip for the release.&lt;/p&gt;

&lt;p&gt;Then we have a Plastic repository, where we want to develop our game.&lt;/p&gt;

&lt;h2&gt;
  
  
  The idea
&lt;/h2&gt;

&lt;p&gt;The main idea is to see Unreal Engine as a third-party vendor library, albeit a really big one. One strategy for vendor libs is to keep a clean and unmodified copy of it in a separate branch, that is then merged into your development branch.&lt;/p&gt;

&lt;p&gt;This idea isn't new; it's basically exactly what &lt;em&gt;Karl Fogel&lt;/em&gt; describes in &lt;a href="https://durak.org/sean/pubs/software/cvsbook/Tracking-Third_002dParty-Sources-_0028Vendor-Branches_0029.html" rel="noopener noreferrer"&gt;Tracking Third-Party Sources (Vendor Branches)&lt;/a&gt;, a section of his 20+ year old book &lt;em&gt;Open Source Development With CVS&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Sounds simple enough, but how do we apply it to Plastic and Unreal Engine?&lt;/p&gt;

&lt;h2&gt;
  
  
  Branch layout and upgrade flow
&lt;/h2&gt;

&lt;p&gt;Let us start by deciding on a branch layout. The game itself is developed on &lt;code&gt;main&lt;/code&gt;, while we keep unmodified Unreal Engine releases in &lt;code&gt;vendor-unreal-engine&lt;/code&gt;, which is &lt;strong&gt;merged&lt;/strong&gt; down to main for each release.&lt;/p&gt;

&lt;p&gt;Take this example, where we set up an empty repo with &lt;code&gt;4.27.0&lt;/code&gt; that we upgrade to &lt;code&gt;4.27.1&lt;/code&gt; and finally &lt;code&gt;4.27.2&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxykfp32rpbmkj5lx5ps.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxykfp32rpbmkj5lx5ps.png" alt="Branch Layout"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This image speaks a thousand words; in text it says:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Start with an empty Plastic repo&lt;/li&gt;
&lt;li&gt;Add &lt;code&gt;4.27.0&lt;/code&gt; to &lt;code&gt;vendor-unreal-engine&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Make a small tweak to the engine itself.&lt;/li&gt;
&lt;li&gt;Merge it all down into &lt;code&gt;main&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Add your game's main module&lt;/li&gt;
&lt;li&gt;Upgrade &lt;code&gt;vendor-unreal-engine&lt;/code&gt; with release &lt;code&gt;4.27.1&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Merge the new release with &lt;code&gt;main&lt;/code&gt;, and apply fixes
to make your game module compile and work again.&lt;/li&gt;
&lt;li&gt;Merge the new engine into &lt;code&gt;main&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Make local modification to the engine itself&lt;/li&gt;
&lt;li&gt;Upgrade &lt;code&gt;vendor-unreal-engine&lt;/code&gt; with release &lt;code&gt;4.27.2&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Merge it with &lt;code&gt;main&lt;/code&gt;, and apply even more fixes
to your game so that it still works.&lt;/li&gt;
&lt;li&gt;Publish the new release to &lt;code&gt;main&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;And continue making local changes to the engine&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Importing Unreal Engine releases
&lt;/h2&gt;

&lt;p&gt;So, how do we import or upgrade the engine source code in our &lt;code&gt;vendor-unreal-engine&lt;/code&gt; branch?&lt;/p&gt;

&lt;h3&gt;
  
  
  Import by delete and re-adding all files
&lt;/h3&gt;

&lt;p&gt;One crude strategy is to delete all files on the vendor branch and simply copy all files from the new release, and then let Plastic detect which files have been added, removed, modified or moved.&lt;/p&gt;

&lt;p&gt;This should work well. Alas, Plastic's move detection seems to miss most moves; maybe there are too many files involved in an engine upgrade for it to be practical? That would be understandable: upgrading &lt;code&gt;4.27.2&lt;/code&gt; to &lt;code&gt;5.0.0-early-access-1&lt;/code&gt; modifies over 50k files.&lt;/p&gt;

&lt;p&gt;As a result, moved files will be imported as a delete followed by an add.&lt;br&gt;
If you have made changes to a file in its old location on your &lt;code&gt;main&lt;/code&gt; branch, Plastic will not help you merge these changes into the file in its new location. Instead, it will ask you how to resolve your changes to the old, deleted file, forcing you to manually copy your changes into the new location.&lt;/p&gt;

&lt;p&gt;Depending on how widespread your local engine changes are, this strategy might be good enough, for us at GOALS it was not.&lt;/p&gt;
&lt;h3&gt;
  
  
  Import by replicating changes from Git
&lt;/h3&gt;

&lt;p&gt;Luckily, we can do better. The full revision history is available in the main Git repo; it knows which files have been modified, added or removed, and, most importantly, it also knows which files have been renamed or moved.&lt;/p&gt;

&lt;p&gt;The command we use is &lt;code&gt;git diff --name-status&lt;/code&gt;. Here's the output of the diff between &lt;code&gt;4.27.1&lt;/code&gt; and &lt;code&gt;4.27.2&lt;/code&gt;. Note that this is just an excerpt; the real diff contains roughly 460 changes.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;git diff &lt;span class="nt"&gt;--name-status&lt;/span&gt; 4.27.1-release 4.27.2-release
M   Engine/Build/Build.version
A   Engine/Extras/Containers/Dockerfiles/linux/dev-slim/Dockerfile
D   Engine/Source/Programs/Enterprise/Datasmith/DatasmithSolidworksExporter/Private/Animations/AnimationExtractor.cs
R064    Samples/PixelStreaming/WebServers/SignallingWebServer/platform_scripts/cmd/run.bat  Samples/PixelStreaming/WebServers/SignallingWebServer/platform_scripts/cmd/run_local.bat
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The leading column means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;M&lt;/code&gt; - File was modified in place&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;A&lt;/code&gt; - File was added&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;D&lt;/code&gt; - File was deleted&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;R*&lt;/code&gt; - File was renamed or moved.
The number is a similarity score: how closely the new file's content matches the old one's, and hence how certain Git is that the file was in fact moved rather than deleted and re-added. There is some grey area when it comes to moves in Git; sometimes a file is moved but then modified to fit its new location. For example, a moved C++ file may need its include paths tweaked to compile.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Git uses some fuzzy heuristics to discern moves from adds and deletes. Most of the time it seems to make good guesses. When it fails it is not a big deal; the old location will still be deleted, and the name, location and content of the added file will be correct.&lt;/p&gt;
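&lt;p&gt;Parsing this output is straightforward. Here is a simplified sketch of what a tool like ueimporter has to do first (the real tool handles more edge cases; note that the actual Git output separates fields with tabs):&lt;/p&gt;

```python
def parse_name_status(diff_output):
    """Parse 'git diff --name-status' lines into (kind, source, target) tuples.

    kind is one of "modify", "add", "delete" or "move"; for moves both
    paths are set, otherwise target is None.
    """
    kinds = {"M": "modify", "A": "add", "D": "delete"}
    changes = []
    for line in diff_output.splitlines():
        if not line.strip():
            continue
        fields = line.split("\t")
        status = fields[0]
        if status.startswith("R"):
            # e.g. "R064": a rename with a similarity score of 64 percent.
            changes.append(("move", fields[1], fields[2]))
        else:
            changes.append((kinds[status[0]], fields[1], None))
    return changes
```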

&lt;p&gt;Now, it's just a matter of replicating these changes in &lt;code&gt;vendor-unreal-engine&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Dealing with modified files is simple: check the file out in Plastic and copy it from the new release.&lt;/p&gt;

&lt;p&gt;Adding files is almost as easy: just copy their content. But if the target folder does not exist in Plastic, we need to create and add it before copying the file.&lt;/p&gt;

&lt;p&gt;For deletes, the opposite of adds, we start by deleting the file itself.&lt;br&gt;
If the folder it lived in became empty, we remove the folder too, iterating up the hierarchy to delete any now-empty parent directories.&lt;/p&gt;

&lt;p&gt;For moves we start by creating and adding any missing target directories before we tell Plastic to move the file. Finally, we copy the contents of the file from the new release.&lt;/p&gt;
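&lt;p&gt;Put together, the four cases above can be sketched as a small planning function. This is a simplified illustration, not ueimporter's actual code; the &lt;code&gt;cm&lt;/code&gt; sub-commands shown are assumptions about the Plastic CLI, and directory creation/removal and the file copies themselves are left out for brevity:&lt;/p&gt;

```python
# Sketch: turn `git diff --name-status` lines into planned Plastic actions.

def plan_actions(diff_lines):
    """Map each name-status line to an ordered list of commands."""
    plan = []
    for line in diff_lines:
        fields = line.rstrip('\n').split('\t')
        status, paths = fields[0], fields[1:]
        if status == 'M':
            # Modified in place: check out, then copy from the new release
            plan.append(['cm', 'checkout', paths[0]])
        elif status == 'A':
            # Added: copy the file into the workspace, then add it
            plan.append(['cm', 'add', paths[0]])
        elif status == 'D':
            # Deleted: remove the file (and any now-empty parent folders)
            plan.append(['cm', 'rm', paths[0]])
        elif status.startswith('R'):
            # Renamed/moved: move in Plastic, then copy the new content
            plan.append(['cm', 'move', paths[0], paths[1]])
    return plan
```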

&lt;h3&gt;
  
  
  Announcing ueimporter
&lt;/h3&gt;

&lt;p&gt;This git-&amp;gt;plastic import process is very scriptable, so I wrote a command line tool called &lt;code&gt;ueimporter&lt;/code&gt; that takes care of it all. I'm happy to announce that GOALS is now open-sourcing this tool, available on GitHub at &lt;a href="https://github.com/goalsgame/ueimporter" rel="noopener noreferrer"&gt;goalsgame/ueimporter&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Ignoring the elephant in the room
&lt;/h2&gt;

&lt;p&gt;If you have ever tried to store Unreal Engine source inside Plastic, you may have noticed a rather big elephant that I have so far avoided: Plastic's ignore file, and its incompatibility with Git's equivalent.&lt;/p&gt;

&lt;p&gt;Unreal Engine's GitHub repo comes with a rather complex &lt;code&gt;.gitignore&lt;/code&gt; file. Whenever you build or work with the engine, various intermediate and temporary files are scattered all over your workspace, not to mention the thousands of files that are downloaded when you run &lt;code&gt;Setup.bat|sh&lt;/code&gt;.&lt;br&gt;
These files should not be committed into Git, and likewise we do not want them checked into Plastic.&lt;/p&gt;

&lt;p&gt;It is not possible to directly translate Git's &lt;code&gt;.gitignore&lt;/code&gt; file into Plastic's &lt;code&gt;ignore.conf&lt;/code&gt;; the two systems have rather different rules deciding in what order ignore patterns are applied.&lt;/p&gt;

&lt;h3&gt;
  
  
  Ignoring in Git
&lt;/h3&gt;

&lt;p&gt;On one hand, we have Git, where each line specifies a file or directory pattern, and any subsequent matching line overrides preceding matches. A simple philosophy that is relatively easy to understand.&lt;/p&gt;
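&lt;p&gt;The ordering rule can be demonstrated with a toy evaluator. This is only an illustration of "last match wins", not Git's real matcher, which also handles anchoring and directory rules:&lt;/p&gt;

```python
from fnmatch import fnmatch

def is_ignored(path, patterns):
    """Gitignore-style evaluation, heavily simplified: every matching
    line flips the verdict, so the last matching line wins."""
    ignored = False
    for pattern in patterns:
        negated = pattern.startswith('!')
        if fnmatch(path, pattern.lstrip('!')):
            ignored = not negated
    return ignored

# Ignore everything, un-ignore .cpp files, re-ignore intermediates:
patterns = ['*', '!*.cpp', 'Engine/Intermediate/*']
```

Swapping the last two patterns would change the verdict for generated .cpp files; this ordering dependence is exactly what Plastic's evaluation hierarchy breaks.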

&lt;p&gt;If we compress Unreal's &lt;code&gt;.gitignore&lt;/code&gt; into a nutshell, it can be described like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Start by ignoring &lt;strong&gt;all&lt;/strong&gt; files&lt;/li&gt;
&lt;li&gt;Add exceptions to &lt;strong&gt;not ignore&lt;/strong&gt; certain extensions.
For instance, &lt;code&gt;.h&lt;/code&gt; and &lt;code&gt;.cpp&lt;/code&gt; files.&lt;/li&gt;
&lt;li&gt;Add exceptions to those exceptions so that temporary build folders are ignored.
For example, everything under &lt;code&gt;Engine/Intermediate&lt;/code&gt; should be ignored, or else the &lt;code&gt;*.h&lt;/code&gt; and &lt;code&gt;*.cpp&lt;/code&gt; files that &lt;code&gt;UnrealHeaderTool&lt;/code&gt; generates during the build process would be tracked.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;On and on the list goes, with more detailed exceptions and ignore patterns. There are close to 160 rules listed in the ignore file for the &lt;code&gt;5.0.0-preview-2&lt;/code&gt; release.&lt;/p&gt;

&lt;h3&gt;
  
  
  Ignorance is not bliss in Plastic
&lt;/h3&gt;

&lt;p&gt;Then we have Plastic, which prioritizes patterns based on their type rather than the order in which they occur in its &lt;code&gt;ignore.conf&lt;/code&gt;. Two patterns of the same type are applied in the order they appear in the file, and exception patterns take precedence over ignore patterns of the same type.&lt;/p&gt;

&lt;p&gt;So what pattern types are we talking about? Quoting the &lt;a href="https://www.plasticscm.com/book/#_pattern_files" rel="noopener noreferrer"&gt;Pattern evaluation hierarchy&lt;/a&gt; section of the &lt;em&gt;Version Control, DevOps and Agile Development with Plastic SCM&lt;/em&gt; book:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Plastic SCM will try to match the path of an item using the patterns in the file in a predefined way.&lt;br&gt;
This means that some pattern formats take precedence over others rather than processing the patterns&lt;br&gt;
in the order they appear in the file.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Absolute path rules that match exactly&lt;/li&gt;
&lt;li&gt;Catch-all rules&lt;/li&gt;
&lt;li&gt;Name rules applied to the current item&lt;/li&gt;
&lt;li&gt;Absolute path rules applied to the item directory structure&lt;/li&gt;
&lt;li&gt;Name rules applied to the item directory structure&lt;/li&gt;
&lt;li&gt;Extension rules&lt;/li&gt;
&lt;li&gt;Wildcard and Regular expression rules&lt;/li&gt;
&lt;/ol&gt;
&lt;/blockquote&gt;

&lt;p&gt;There are more devils in the details, but that's the gist of it.&lt;/p&gt;

&lt;p&gt;Unreal's &lt;code&gt;.gitignore&lt;/code&gt; uses most of these types, but relies on ordering to accomplish the desired behaviour. Just copy-pasting it into &lt;code&gt;ignore.conf&lt;/code&gt; does not work, because Plastic's evaluation hierarchy wreaks havoc on that order.&lt;/p&gt;

&lt;h3&gt;
  
  
  An acceptable workaround
&lt;/h3&gt;

&lt;p&gt;We at GOALS have wrestled quite a bit with our &lt;code&gt;ignore.conf&lt;/code&gt;, trying to replicate the behaviour we get in Git. So far we haven't nailed it, but we have at least arrived at a config file that we can endure.&lt;/p&gt;

&lt;p&gt;We simply ignore the entire &lt;code&gt;Engine&lt;/code&gt; folder, which gets rid of most of the intermediate and temporary files that are explicitly ignored in &lt;code&gt;.gitignore&lt;/code&gt;.&lt;br&gt;
The main drawback is that we must remember to manually add files to Plastic whenever we add anything to &lt;code&gt;Engine&lt;/code&gt;, or else they will not show up as pending changes that can be checked in. Files that are already checked into Plastic can be edited just fine; they will be detected as changed.&lt;/p&gt;

&lt;p&gt;For our game's own modules and plugins it was relatively easy to write ignore rules, mainly because Unreal writes most files to &lt;code&gt;Engine&lt;/code&gt; during the setup process, and build artefacts all end up in easily identified intermediate folders.&lt;/p&gt;

&lt;h3&gt;
  
  
  Ignore files and our vendor branch
&lt;/h3&gt;

&lt;p&gt;Thankfully, the ignore file is irrelevant on our &lt;code&gt;vendor-unreal-engine&lt;/code&gt; branch.&lt;br&gt;
Here we always want Plastic to detect all files, so that we can check them in and later have them merged into &lt;code&gt;main&lt;/code&gt;. This implies that you must clear out any private files before you start importing a new engine release to the vendor branch. You should not build or do anything else that pollutes your workspace here; do that on an upgrade branch after merging with &lt;code&gt;main&lt;/code&gt; (where an ignore file is present).&lt;/p&gt;

&lt;h2&gt;
  
  
  Parting words
&lt;/h2&gt;

&lt;p&gt;With this setup you have the power to change the engine at will and still stay up to date with new releases. How you wield this power is up to you.&lt;/p&gt;

&lt;p&gt;Consider that any merge conflicts you get with new engine releases, after making local changes, will need to be resolved. This is a manual process that is hard to automate.&lt;/p&gt;

&lt;p&gt;In the past, when working in another big game engine, I have seen many, many, &lt;strong&gt;many&lt;/strong&gt; dev-months sunk into resolving merge conflicts and follow-up issues caused by local modifications when the engine was upgraded. Tears were shed, good nights of sleep were lost, and dev-happiness fled down the drain. It was not pretty.&lt;/p&gt;

&lt;p&gt;Keep your engine changes small and isolated, and tag changed lines with begin/end comments. If a change can be made in your game module or a plug-in, that is the preferred way.&lt;/p&gt;

&lt;p&gt;One benefit of this setup is that you can cherry-pick fixes from Epic's mainline and push them directly into your own &lt;code&gt;main&lt;/code&gt;. Later, when the fix gets included in an official release, the divergence should resolve itself in the upgrade process.&lt;/p&gt;

&lt;p&gt;Finally, upgrade the engine often; the further you diverge from the mainline, the harder it will be to catch up. Take one version at a time, even if you are more than one version behind. In their &lt;a href="https://www.youtube.com/watch?v=AaZrAjkBhlM" rel="noopener noreferrer"&gt;Fish Slapping Dance&lt;/a&gt;, Monty Python teach us that it's better to be slapped with a small pilchard multiple times than it is to be slapped by a big fat halibut just once.&lt;/p&gt;

&lt;p&gt;Take care, stay safe and happy game making.&lt;/p&gt;

</description>
      <category>gamedev</category>
      <category>unrealengine</category>
      <category>plasticscm</category>
      <category>git</category>
    </item>
    <item>
      <title>Collaboration on large teams with GAMECURE</title>
      <dc:creator>Göran Syberg Falguera</dc:creator>
      <pubDate>Thu, 31 Mar 2022 08:09:44 +0000</pubDate>
      <link>https://dev.to/goals/collaboration-on-large-teams-with-gamecure-1dfb</link>
      <guid>https://dev.to/goals/collaboration-on-large-teams-with-gamecure-1dfb</guid>
<description>&lt;p&gt;Introducing &lt;strong&gt;GAMECURE&lt;/strong&gt;! An open source Unreal Editor build distribution system for Windows, Linux and Mac, utilizing Dan Engelbrecht's eminent longtail delta patching, with a bunch of other nifty features to get going quickly with your source-built Unreal Engine in Plastic SCM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f9A-emqo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z2bt00bne1cl7xnwjpny.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f9A-emqo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z2bt00bne1cl7xnwjpny.png" alt="GAMECURE main window" width="800" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Yes, that introduction is a mouthful, so let me break it down for you.&lt;/p&gt;

&lt;p&gt;Most of my career I've spent developing games at EA DICE. This is what would be considered a large studio (not the largest by any means), and we had projects with hundreds and sometimes thousands of individuals contributing to game code and content. Together, these individuals can contribute thousands of changes to the repository every day. The trick to moving this fast is putting a wall between the content creators (artists, game designers, 3D modelers, animators, etc.) that produce content and the coders that improve, fix and, many times, break the game.&lt;/p&gt;

&lt;p&gt;The coders will have a source version of the game engine on their computers. To play the game or use the game editor, the coders need to build (compile, link, etc.) the whole engine locally. The custom computers used by GOALS coders spend 12-13 minutes on this, but a regular gaming computer will spend hours on it! Not only does this take a lot of time, it is also distracting, error prone and complicated. As a content creator you just want to create amazing content, no hassle!&lt;/p&gt;

&lt;p&gt;What we need is a central tool for everyone at GOALS who is not a coder: one that manages the other tools you need, lets you easily give feedback to engineers about what is and isn't working, and lets you quickly step back and forth between Unreal Editor versions.&lt;/p&gt;

&lt;p&gt;Another big studio/project is Epic Games, who produce the game Fortnite. They also happen to be the owners and maintainers of Unreal Engine. There is a great article explaining their workflow and the reasons behind it: &lt;a href="https://cdn2.unrealengine.com/workflow-on-fortnite-whitepaper-final-181633758.pdf"&gt;Workflow on Fortnite - Collaboration on large teams with UnrealGameSync&lt;/a&gt;. And this takes us to an application called "UnrealGameSync", which is essentially what we want! A separation of coder and content creator workflows and concerns. Perfect! But... UnrealGameSync only works with the Perforce version tracker (which is not really intended to be replaced), it downloads full binaries each time (no delta patching), and it only works on Windows &lt;em&gt;sigh&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;For a studio like ours, that is cloud native, allows its developers to work on Windows, Linux or Mac, and uses Plastic SCM for version control, those are three deal breakers right away. It was deemed too much for us to work around, so we decided to create our own tool - GAMECURE!&lt;/p&gt;

&lt;p&gt;One of the core development philosophies at GOALS is simplicity. For us that usually means as few dependencies and as few hoops to jump through as possible. We decided that the maximum amount of setup complexity for a content creator at GOALS should be: get the simplified version of Plastic SCM (called &lt;a href="https://www.plasticscm.com/gluon"&gt;Gluon&lt;/a&gt;) and get GAMECURE from a permanent link. The rest would be automagic. Hence the GAMECURE application could not have a bunch of external dependencies, and a stack made up of .NET 6 and &lt;a href="https://avaloniaui.net/"&gt;AvaloniaUI&lt;/a&gt; that we compile to standalone binaries turned out to be a good combo.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_you6fQv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k0fnglemuxhlgr4pby9i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_you6fQv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k0fnglemuxhlgr4pby9i.png" alt="Plastic Gluon. Simplified version tracking client" width="800" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another important requirement for a cloud native company like GOALS is that we don't download a bunch of unnecessary stuff. Cloud native means that essentially all our infrastructure is in the cloud, and all downloads we do internally, as well as those other people do externally from our CDNs, incur costs. Minimizing the amount downloaded is paramount and, if you ask me, just plain better engineering. For this we turned to a library called &lt;a href="https://github.com/DanEngelbrecht/golongtail"&gt;longtail&lt;/a&gt; by Dan Engelbrecht. It allows us to distribute only what has actually changed to the content creators. &lt;/p&gt;

&lt;h2&gt;
  
  
  What GAMECURE will do for you
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Set up your Plastic SCM workspace with a predefined configuration so it only downloads what is needed to work with the pre-built editor.&lt;/li&gt;
&lt;li&gt;Set up and download Unreal dependencies.&lt;/li&gt;
&lt;li&gt;Single sign-on to our cloud so you don't have to authenticate each time you want to update.&lt;/li&gt;
&lt;li&gt;Delta-downloads of the editor to decrease the time it takes to update your editor.&lt;/li&gt;
&lt;li&gt;Notifications when there's a new version available on the main branch.&lt;/li&gt;
&lt;li&gt;Single click to create a bug report for the editor in Jira.&lt;/li&gt;
&lt;li&gt;Cross-platform packages: an .app bundle for Mac, an MSI installer for Windows and a zip for Linux, with a bundled config that is put together at build time.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NvXb5RF8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zuyd3pmlata1r1sjmkw4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NvXb5RF8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zuyd3pmlata1r1sjmkw4.png" alt="Create workspace dialog in GAMECURE" width="459" height="250"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  A typical workflow
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;A content creator may need new editor functionality or a new C++ blueprint class for instance.&lt;/li&gt;
&lt;li&gt;The coder takes the specifications and writes the feature in code. Once tested and reviewed, she pushes the changes to Plastic SCM and kicks off the CI.&lt;/li&gt;
&lt;li&gt;The coder can now ping the content creator and say "I pushed the changes you need in changeset xxx".&lt;/li&gt;
&lt;li&gt;The content creator will get a notification in GAMECURE saying there is a new version available and can verify that the particular changeset is in there.&lt;/li&gt;
&lt;li&gt;The content creator now hits download on the new editor, and usually in just a few seconds (thanks to the delta patching) the new version is on disk.&lt;/li&gt;
&lt;li&gt;The new editor can now be launched with a button press from GAMECURE.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Some technical details
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;.NET 6 and AvaloniaUI for cross platform GUI application&lt;/li&gt;
&lt;li&gt;WIX installer for an easy to use MSI on Windows&lt;/li&gt;
&lt;li&gt;Custom BuildTool built in .NET 6 to remove the need for multiple scripts for different platforms. This tool is used to generate a config and build the different packages.&lt;/li&gt;
&lt;li&gt;The GUI is separated from the core functionality so that a CLI (or any other UI) can easily be added.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Future work:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Auto-updating when there's a new version of GAMECURE&lt;/li&gt;
&lt;li&gt;Better Jira integration: instead of opening a browser, it should show a dialog inside GAMECURE and attach any log files or screenshots&lt;/li&gt;
&lt;li&gt;Track the current changeset, so that the user doesn't download an editor that is very different from the changeset they are on (this is currently a bit hard and undefined in Plastic)&lt;/li&gt;
&lt;li&gt;PKCE implementation for authentication/authorization with GCloud to remove the need for a Client Secret.&lt;/li&gt;
&lt;li&gt;Progress bars where possible, for example when downloading or copying the editor&lt;/li&gt;
&lt;li&gt;Add support for other version trackers than Plastic SCM.&lt;/li&gt;
&lt;li&gt;Add support for other clouds than Google.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We welcome contributions and we are curious to know how you solved your workflow challenges!&lt;/p&gt;

&lt;p&gt;Head over to our &lt;a href="https://github.com/goalsgame/gamecure"&gt;GitHub&lt;/a&gt; to get started with your GAMECURE - workflow!&lt;/p&gt;

</description>
      <category>unrealengine</category>
      <category>plasticscm</category>
      <category>workflow</category>
      <category>jira</category>
    </item>
    <item>
      <title>Input lag, what's going on?</title>
      <dc:creator>Jens Andersson</dc:creator>
      <pubDate>Wed, 30 Mar 2022 12:24:40 +0000</pubDate>
      <link>https://dev.to/goals/input-lag-whats-going-on-1ka8</link>
      <guid>https://dev.to/goals/input-lag-whats-going-on-1ka8</guid>
      <description>&lt;p&gt;Input lag, input latency, bad servers, and low tick rate are some of the phrases used to describe unresponsive gameplay. Let’s break it down into smaller parts and talk about where and why this happens, and what we can do to improve the experience.&lt;/p&gt;

&lt;p&gt;At GOALS we want to create the best player experience possible, and we know there are no shortcuts to achieving that. GOALS should feel good and responsive to play; there shouldn’t be any delay when you press a button. To make this happen we’re going to build tools that help us measure the delay early on in the development process.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The tests in this article were performed with an Unreal Engine 4 empty project with a dedicated server and one client on a single machine. These values might not be 100% accurate but they will give you some information about where the latency happens. The client was running at 120fps and the server was set to a tick rate of 60. The client FPS and server tick rate for these tests were selected to make it easier to measure and present for the reader.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Hardware
&lt;/h2&gt;

&lt;p&gt;The hardware you use can be a factor in the perceived input lag. We have a controller, mouse, or keyboard connected through Bluetooth, USB, or Wi-Fi. All of these devices have a small delay, and it varies with the specific device, polling rate, and connection type. Both Xbox and PlayStation controllers have a small delay, but they have different default polling rates, which can cause a bigger variation in the delay. The polling rate of a device is the number of times per second the operating system/driver will ask it for changes. For example, an Xbox 360 controller has a default polling rate of 120 times per second and a PlayStation 4 controller of 250 times per second. This means that the Xbox controller has a max delay of 8.3 ms and the PlayStation controller 4 ms.&lt;/p&gt;
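&lt;p&gt;The numbers above are just the inverse of the polling rate; the worst case happens when a button is pressed right after a poll:&lt;/p&gt;

```python
def max_poll_delay_ms(polling_rate_hz):
    """Worst-case delay added by polling: a press that lands just after
    a poll waits one full polling interval before it is seen."""
    return 1000.0 / polling_rate_hz
```

At 120 Hz this gives ~8.3 ms and at 250 Hz it gives 4 ms, matching the controller figures above.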

&lt;p&gt;The delay to render a frame on your monitor or TV can vary between 1 ms and 200 ms. For a new gaming monitor the delay is very low (usually between 1 and 2 ms), and for older monitors it can be around ~2-6 ms. TVs are different; they have a bigger screen and are not optimized for fast rendering, so the delay can vary between 14 ms and up to 200 ms. Some new TVs are GSync compatible and have a gaming mode setting where some of the filters are disabled to reduce the rendering delay. The refresh rate on monitors and TVs differs as well: monitors have a refresh rate between 60 and 360 Hz, while TVs are between 50 and 120 Hz. The refresh rate is how often a frame will be drawn on the screen; higher values make motion appear smoother and more responsive.&lt;/p&gt;

&lt;h4&gt;
  
  
  Client frame rate table
&lt;/h4&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Frame rate&lt;/th&gt;
&lt;th&gt;Time per frame&lt;/th&gt;
&lt;th&gt;Comment&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;30&lt;/td&gt;
&lt;td&gt;33 ms&lt;/td&gt;
&lt;td&gt;Older TVs and consoles&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;60&lt;/td&gt;
&lt;td&gt;16.7 ms&lt;/td&gt;
&lt;td&gt;Most common refresh rate&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;120&lt;/td&gt;
&lt;td&gt;8.3 ms&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;144&lt;/td&gt;
&lt;td&gt;6.9 ms&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;240&lt;/td&gt;
&lt;td&gt;4.2 ms&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  The server tick rate
&lt;/h2&gt;

&lt;p&gt;Let’s take a closer look at game servers, specifically how tick rate works and how it affects responsiveness. &lt;/p&gt;

&lt;p&gt;Tick rate is the number of times a server will process input, update the game state, and send updates back to the clients per second. A server at 60 tick rate will process input every 16.7 ms and for a server set to 120 tick rate, it's every ~8.3 ms. A higher tick rate on the server will decrease the time it takes to process the input from the client and update the game logic. This comes at the cost of more processing power and network traffic since more data has to be synchronized and processed. &lt;/p&gt;

&lt;p&gt;In a game where the game state is big and there are a lot of changes, the server might not be able to process the entire state more than 30 times per second (30 tick rate), while in a game where the state is really small it might be able to process the game state at 240 times per second without any problem.&lt;/p&gt;

&lt;p&gt;The tick rate will affect the input latency between 0 and 1/[tick rate] seconds depending on when the input package arrives at the server. &lt;/p&gt;
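&lt;p&gt;As a quick sketch, the delay the server loop adds can be expressed directly from the tick rate:&lt;/p&gt;

```python
def tick_delay_ms(tick_rate):
    """Delay added before an input packet is processed: zero if it
    arrives just before a tick, one full tick interval at worst."""
    worst = 1000.0 / tick_rate
    return {'best': 0.0, 'worst': worst, 'average': worst / 2.0}
```

For a 60-tick server this is 0-16.7 ms, and for 120 ticks it drops to 0-8.3 ms.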

&lt;h2&gt;
  
  
  Internet and networked communication
&lt;/h2&gt;

&lt;p&gt;Multiplayer games over the internet will always have varying latency, due to the connection type, the number of routes between the client and the server, and the amount of traffic on the physical line. The internet is unreliable and can cause packets to arrive in the wrong order or get lost on the way. When a packet is lost, the client/server has to decide whether it should request a resend or just discard it. This is usually decided upfront in the game engine by marking some packets as reliable or unreliable. Reliable packets require the server/client to respond with an ack (acknowledgment) when the packet has arrived, while the unreliable version is just fire-and-forget and ignores lost packets. &lt;br&gt;
Input should always be sent as a reliable packet, since that data should never be lost, while position updates can be sent as unreliable. Resending an old position of a player might be a waste, because by the time it reaches the destination the data is stale (this varies between different types of games).&lt;/p&gt;

&lt;p&gt;When packets are lost, some kind of delay will happen: in some cases the game might appear frozen, and in other cases your character might not react to your input fast enough.&lt;/p&gt;

&lt;h2&gt;
  
  
  Rendering and synchronization
&lt;/h2&gt;

&lt;p&gt;There are a handful of settings that can change how a frame is rendered on the screen to increase performance, reduce the latency, avoid screen tearing, etc. Here’s a brief overview of how they affect performance and the perceived input lag. &lt;/p&gt;

&lt;p&gt;A frame can reach the buffer that the display reads (the front buffer) either by rendering directly into it, by rendering into a temporary buffer that is copied over when it's ready, or by rendering into several buffers and just swapping which one is displayed. The two latter techniques are called double and triple buffering. Both will increase the performance of the game, since it doesn't write directly to the displayed buffer, but they also introduce a small latency. (See the references for in-depth information about this.)&lt;/p&gt;

&lt;p&gt;Screen tearing is something that happens when the monitor's refresh rate doesn’t match the frame rate of the game. VSYNC is a way to solve this by keeping the frame rate (writing to the back buffer) in sync with the monitor's refresh rate. This gives the player a very smooth experience with no screen tearing, but it introduces a small input latency, since each frame has to wait until the monitor is ready to display it. &lt;/p&gt;

&lt;p&gt;GSync/Freesync is a newer monitor technology that lets the GPU control when the monitor refreshes, instead of the other way around. This avoids both the added input lag and the screen tearing, giving the best possible experience. The downside is that it requires special hardware: both the GPU and the monitor must support either GSync (Nvidia) or Freesync (AMD). &lt;/p&gt;

&lt;h2&gt;
  
  
  The tests in Unreal Engine
&lt;/h2&gt;

&lt;p&gt;We’ve built some simple tools to track the latency from a button press until the server responds with the updates.&lt;/p&gt;

&lt;p&gt;The blue bar is the client, and that’s how long it takes for Unreal Engine to register a button click. In Unreal Engine the input is tied to the frame rate: a high frame rate will poll the input more often, while a lower frame rate will increase the latency. The red bar is the communication to the server, where it will package the data, send the packet, unpack the data and run the method on the server. The yellow bar is the time it took for the response to come back to the client.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv6yvhn0ld753m4yckg2k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv6yvhn0ld753m4yckg2k.png" alt="Image description"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;If we break these down into smaller parts and compare them to what we’ve learned so far: the client is running at 120 fps, which means that the input will be polled every ~8.3 ms, giving us an average input delay of ~4 ms. This can be seen in the graph above.&lt;br&gt;
The server tick rate is set to 60, so each input will be read every 16.7 ms (average ~8.3 ms); this is the red bar in the graph. The server sends the response back to the client, and the client reads that response every 8.3 ms (120 fps); this is the yellow part. &lt;br&gt;
The worst case in this scenario is 30 ms between a button click and the code running on the client, and the best is below 10 ms.&lt;/p&gt;
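&lt;p&gt;Ignoring network and serialization time, the three polling intervals alone give a rough upper bound for this setup; the measured ~30 ms worst case stays just under it:&lt;/p&gt;

```python
# Back-of-the-envelope bound: client at 120 fps, server at 60 ticks.
CLIENT_FRAME_MS = 1000 / 120  # ~8.3 ms to poll the input
SERVER_TICK_MS = 1000 / 60    # ~16.7 ms to process it on the server

# poll the press + wait for a server tick + read the response next frame
worst_case_ms = CLIENT_FRAME_MS + SERVER_TICK_MS + CLIENT_FRAME_MS
average_ms = worst_case_ms / 2
```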

&lt;p&gt;In the graph below we can see a comparison of how long it takes before a button release event happens (this is a normal click with the thumb on a PS4 controller). This means that if the game reacts on button-up instead of button-down, you’ll have a 50-120 ms delay before the input has even been read and processed by the game loop. To reduce the input delay you should trigger the action on button-down events, though this is not always possible. For example, in games where you have a double tap or a press+hold, the game code has to wait a certain amount of time before it can decide which action you’re doing. If you use the same button for a single tap, you’ll still have to wait that amount of time before the game knows which action you are performing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdbigesa6ap20azuizqwe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdbigesa6ap20azuizqwe.png" alt="Image description"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;We’ve talked about the hardware, the software, and the internet. There are a lot of steps between pressing a button and something happening on the screen. In the table below you can see all the steps that happen when you press a button in the test we did. When everything is perfect, most of these steps will be very close to 0; in the worst case they add up to a high number based on the frame rate, tick rate, internet latency, input device, and rendering settings. &lt;/p&gt;

&lt;p&gt;The rendering and display times were not measured by us, these numbers are from tests that can be found in the references section.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Source&lt;/th&gt;
&lt;th&gt;    Possible Latency  &lt;/th&gt;
&lt;th&gt;Comment&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Input device (controller, keyboard, mouse)&lt;/td&gt;
&lt;td&gt;0 - 10 ms&lt;/td&gt;
&lt;td&gt;The delay in the physical device, the drivers, and the Operating System.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Game loop (Read input)&lt;/td&gt;
&lt;td&gt;30 fps = 33 ms&lt;br&gt;60 fps = 16.7 ms&lt;br&gt;120 fps = 8.3 ms&lt;/td&gt;
&lt;td&gt;The frame rate that the game is running at if the engine polls the input once each frame.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Send input to server&lt;/td&gt;
&lt;td&gt;0 - 0.2 ms&lt;/td&gt;
&lt;td&gt;Packaging/Serializing the data on the client.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Internet/Network&lt;/td&gt;
&lt;td&gt;&amp;gt; 0 ms&lt;/td&gt;
&lt;td&gt;This is the time it takes for a packet to reach the server (this is not RTT or ping). It can vary a lot depending on your location and your ISP.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Process input on the server&lt;/td&gt;
&lt;td&gt;30 ticks = 0 - 33 ms&lt;br&gt;60 ticks =  0 - 16.7 ms&lt;br&gt;120 ticks = 0 - 8.3 ms&lt;/td&gt;
&lt;td&gt;The packet will be processed on the next server tick, so the delay varies depending on when it arrives.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Send response to client&lt;/td&gt;
&lt;td&gt;0 - 0.2 ms&lt;/td&gt;
&lt;td&gt;Packaging/Serializing the data on the server.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Internet/Network&lt;/td&gt;
&lt;td&gt;&amp;gt; 0 ms&lt;/td&gt;
&lt;td&gt;Time for a packet to reach the client.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Process message on client&lt;/td&gt;
&lt;td&gt;30 fps = 0 - 33 ms&lt;br&gt;60 fps = 0 - 16.7 ms&lt;br&gt;120 fps = 0 - 8.3 ms&lt;/td&gt;
&lt;td&gt;In most game engines the network packets will be processed at the start of the game loop, and systems can read the values in their tick/update function.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Render&lt;/td&gt;
&lt;td&gt;0-10 ms&lt;/td&gt;
&lt;td&gt;Depending on the rendering technique used this can vary between 0 and a couple of milliseconds. It also depends on when in the game loop graphics are processed.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Display on monitor/TV&lt;/td&gt;
&lt;td&gt;1 - 200 ms&lt;/td&gt;
&lt;td&gt;The time it takes for the monitor to show what the GPU has in the back buffer.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
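The worst-case figures in the table can be added up for a given configuration. Here is a small illustrative sketch (not GOALS code) that sums the table's contributions for a chosen frame rate, server tick rate, and one-way network time; the device, serialization, render, and display figures are assumptions picked from the table's ranges.

```python
# Illustrative only: sum the worst-case contributions from the table above.
def worst_case_latency_ms(fps, tick_rate, one_way_net_ms,
                          input_device_ms=10, serialize_ms=0.2,
                          render_ms=10, display_ms=10):
    frame_ms = 1000.0 / fps       # game loop polls input once per frame
    tick_ms = 1000.0 / tick_rate  # server processes input once per tick
    return (input_device_ms
            + frame_ms            # wait for the next input poll
            + serialize_ms        # package the input on the client
            + one_way_net_ms      # client to server
            + tick_ms             # wait for the next server tick
            + serialize_ms        # package the response on the server
            + one_way_net_ms      # server to client
            + frame_ms            # wait for the next client frame
            + render_ms + display_ms)

# 60 fps client, 60 Hz server, 20 ms one-way network time:
print(round(worst_case_latency_ms(60, 60, 20), 1))  # prints 120.4
```

Even with a fast display and a good connection, the per-frame waits alone put the worst case well above 100 ms at 60 fps/60 ticks.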

&lt;p&gt;There are some things we didn’t cover in this article, such as issues that happen on the server: a short “freeze” on the server that causes processing to take longer than it should, disconnects, or dropped packets due to an overflow of input, etc.&lt;/p&gt;

&lt;h2&gt;
  
  
  What can you do?
&lt;/h2&gt;

&lt;p&gt;There are a few things you can do to reduce the perceived input latency. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Do some testing with VSync and enable/disable double/triple buffering.&lt;/li&gt;
&lt;li&gt;If your device supports a higher polling rate, the manual or an online forum can probably tell you how to increase it.&lt;/li&gt;
&lt;li&gt;If you’ve got an older monitor or TV and are considering an upgrade, look for something made for gaming.&lt;/li&gt;
&lt;li&gt;GSync/FreeSync/GSync-compatible monitors and TVs will make the game look and feel better; make sure your graphics card supports it.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What will GOALS do?
&lt;/h2&gt;

&lt;p&gt;Some of the things we’ve talked about are not things GOALS can do anything about, but there are parts where we’ll do our best to improve the responsiveness of the game.&lt;/p&gt;

&lt;p&gt;The performance of the client and the server is one part that affects input latency, and it’s something we can work on. We’ll add a lot of telemetry points early in the development of the game so we can track the impact of every change and feature we add. This will give us a better understanding of which parts can be further optimized to achieve a higher frame rate. &lt;/p&gt;

&lt;p&gt;There are a lot of different techniques that can improve perceived latency by predicting the player’s move, like starting an animation when the button is pressed instead of waiting for the server to approve it. The downside is that when the client and server disagree on something, you’ll experience a small rubber-band effect.&lt;br&gt;
We’re going to spend a lot of time testing different solutions to this problem so that we can give the player a great experience even when the network conditions aren’t optimal. &lt;/p&gt;
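The prediction-and-reconciliation idea can be sketched as a toy example (illustrative only, not GOALS code): the client applies each input immediately, remembers unconfirmed inputs by sequence number, and on every authoritative server update rewinds to the server's state and replays the unconfirmed inputs. When the server disagrees with a prediction, the position snaps, which is the rubber-band effect.

```python
# Toy 1-D client-side prediction with server reconciliation (illustrative only).
class PredictingClient:
    def __init__(self):
        self.position = 0.0
        self.pending = []  # (sequence number, move) not yet confirmed by the server

    def press(self, seq, move):
        # Predict: apply the move immediately instead of waiting for the server.
        self.position += move
        self.pending.append((seq, move))

    def on_server_state(self, last_seq, server_pos):
        # Rewind to the authoritative state, then replay unconfirmed inputs.
        # If the server rejected a move, the position snaps here (rubber-banding).
        self.pending = [(s, m) for (s, m) in self.pending if s > last_seq]
        self.position = server_pos
        for _, move in self.pending:
            self.position += move

client = PredictingClient()
client.press(1, 1.0)
client.press(2, 1.0)            # predicted position is now 2.0
client.on_server_state(1, 1.0)  # server confirmed input 1
print(client.position)          # prints 2.0: input 2 replayed on the server state
```

If the server instead reported a different position for sequence 2, the replay would land somewhere else than the prediction, and the player would see the correction as a small snap.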

&lt;p&gt;We’ll keep you updated with our progress and share details about how we’re measuring the performance and responsiveness.&lt;/p&gt;

&lt;h2&gt;
  
  
  Links and References
&lt;/h2&gt;

&lt;p&gt;Monitor/TVs&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://hdtelevizija.com/en/input-lag-measurements/" rel="noopener noreferrer"&gt;https://hdtelevizija.com/en/input-lag-measurements/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.rtings.com/monitor/tests/inputs/input-lag" rel="noopener noreferrer"&gt;https://www.rtings.com/monitor/tests/inputs/input-lag&lt;/a&gt; &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Input lag for controllers&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=Cv-OOn7iYio" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=Cv-OOn7iYio&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=ahsO5bhBUtk" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=ahsO5bhBUtk&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Rendering&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Multiple_buffering" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Multiple_buffering&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Screen_tearing" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Screen_tearing&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
    </item>
    <item>
      <title>External code in Unreal Engine game source</title>
      <dc:creator>Göran Syberg Falguera</dc:creator>
      <pubDate>Thu, 24 Feb 2022 14:30:07 +0000</pubDate>
      <link>https://dev.to/goals/external-code-in-unreal-engine-game-source-414o</link>
      <guid>https://dev.to/goals/external-code-in-unreal-engine-game-source-414o</guid>
      <description>&lt;h1&gt;Introduction&lt;/h1&gt;

&lt;p&gt;As many game developers know, the 100MB file size limit on GitHub renders vanilla Git hosting useless for game assets. For us, LFS did not seem like a viable option (we may post our full report on selecting a version tracker later) and we ended up going with Plastic SCM. So what we have now is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A game built on Unreal Engine. We are using the source version of the engine as we need to touch engine code.&lt;/li&gt;
&lt;li&gt;Full game and engine source as well as game data stored in a Plastic SCM repository.&lt;/li&gt;
&lt;li&gt;All cloud services, backend code and shared code stored in our GitHub repository.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To be able to work with faster iteration times outside the engine, use Git workflows and tool sets, and potentially share code with other Goals services, we have introduced the "External Packages Workflow".&lt;/p&gt;

&lt;p&gt;In a nutshell, it is a GitHub repository inside the Goals game source and it allows for a couple of workflows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Iterate rapidly on a library and its tests outside the engine source.&lt;/li&gt;
&lt;li&gt;Iterate on a library, committing to the GitHub repository, while testing it as part of the game source.&lt;/li&gt;
&lt;li&gt;Iterate on a library when using it as part of another Goals service.&lt;/li&gt;
&lt;li&gt;Allow packages to be both Bazel packages and Unreal source at the same time without interfering with each other.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For the remainder of this article, I will use the Goals &lt;code&gt;game-ai&lt;/code&gt; library as an example. It could also be used, for instance, for shared Protobuf definition files or something similar. The key is that you can put anything there that does not interfere with the Unreal build system.&lt;/p&gt;

&lt;h1&gt;Defining which packages to use&lt;/h1&gt;

&lt;p&gt;Since there was no natural place to add a list of external packages that we needed to use, we introduced the &lt;code&gt;Packages.goals&lt;/code&gt; file that lives in the root of the game source. This is a text file with a pretty self-explanatory syntax:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Defines which external Goals packages should be added to the source.
# If the commit hash/release is omitted, the latest will be fetched.
# Format:
# [Package name] [Remote git url] [Commit hash/Release tag]

# Game AI
game-ai https://github.com/goalsgame/game-ai.git 242a3d0ed0ba1cc3668962c90911ec0c8197a360
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
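As a sketch of how a setup script might consume this file (the real GoalsSetup parser is not shown in this post, so the function name here is hypothetical):

```python
# Hypothetical sketch: parse Packages.goals into (name, url, ref) tuples.
def parse_packages(text):
    packages = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip comments and blank lines
        parts = line.split()
        name, url = parts[0], parts[1]
        ref = parts[2] if len(parts) == 3 else None  # None means "fetch latest"
        packages.append((name, url, ref))
    return packages

example = """
# Game AI
game-ai https://github.com/goalsgame/game-ai.git 242a3d0ed0ba1cc3668962c90911ec0c8197a360
"""
print(parse_packages(example))
```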



&lt;h1&gt;Wrapping Setup.sh/bat&lt;/h1&gt;

&lt;p&gt;To achieve this without adding an extra thing you have to do each time you sync we introduced &lt;code&gt;GoalsSetup.sh/bat&lt;/code&gt;. What &lt;code&gt;GoalsSetup&lt;/code&gt; does is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Wrap the Setup.sh/bat files shipped with Unreal and only run them if needed.&lt;/li&gt;
&lt;li&gt;Check if any external code repositories are missing (as defined by the Packages.goals file) and, if so, fetch or update them.&lt;/li&gt;
&lt;li&gt;Check if any of the external repositories has been touched locally (i.e. &lt;code&gt;git status --porcelain&lt;/code&gt; reports changes) and, if so, leave it alone.&lt;/li&gt;
&lt;/ul&gt;
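The per-package decision above can be sketched roughly like this (illustrative pseudologic, not the real GoalsSetup implementation):

```python
# Sketch of the per-package decision GoalsSetup makes (illustrative only).
def package_action(is_present, git_status_porcelain_output):
    if not is_present:
        return "fetch"   # missing: clone it at the ref pinned in Packages.goals
    if git_status_porcelain_output.strip():
        return "leave"   # locally touched: do not overwrite the developer's work
    return "update"      # clean checkout: sync it to the pinned ref

print(package_action(False, ""))           # prints fetch
print(package_action(True, " M foo.cpp"))  # prints leave
print(package_action(True, ""))            # prints update
```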

&lt;p&gt;So, as a coder, the only thing you have to do after you sync the main game repository is run &lt;code&gt;GoalsSetup&lt;/code&gt; and everything should be dandy. It is our goal to not have any other dependencies needed on your machine apart from the ones dictated by working with Unreal (which is essentially a compiler).&lt;/p&gt;

&lt;h1&gt;Hooking into Unreal build system&lt;/h1&gt;

&lt;p&gt;The easiest way to hook into the Unreal build system is to treat all external packages as one Packages module. This way, the Unreal build system can parse the source for the needed files and add them as dependencies of the game. So, create a new module by adding a build file to your source, for example &lt;code&gt;Packages.Build.cs&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using UnrealBuildTool;

public class Packages : ModuleRules
{
    public Packages(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;

        PrivateDependencyModuleNames.AddRange(new string[] {  });
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, in your game, add this module as a dependency. For example in &lt;code&gt;Game.Build.cs&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;using UnrealBuildTool;

public class Game : ModuleRules
{
    public Game(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;

        PublicDependencyModuleNames.AddRange(new string[] { 
            ...
            "Packages",
        });

        ...

    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, each time you build your game project, the source of the &lt;code&gt;Packages&lt;/code&gt; module will be parsed and the needed files added to the project. Here is what it could look like in your source:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frclmiywmmh4zlo3kvncd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frclmiywmmh4zlo3kvncd.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
Note that the external package &lt;code&gt;game-ai&lt;/code&gt; has some Git- and Bazel-related files in it, but that is fine. Now, for each package we add to the &lt;code&gt;Packages.goals&lt;/code&gt; text file, a new directory is created next to the &lt;code&gt;game-ai&lt;/code&gt; directory above and its files are included in the game build.&lt;/p&gt;

&lt;h1&gt;Left to do and caveats&lt;/h1&gt;

&lt;p&gt;We still need to be able to handle more complex packages. These could be, for instance, packages with source files that should not be included (test files) or files that are disruptive to the Unreal build system. This could be handled in &lt;code&gt;Packages.Build.cs&lt;/code&gt; above by excluding certain directories, for example ones that include the word "test". See how this is done in, for instance, &lt;a href="https://github.com/GameInstitute/RuntimeMeshLoader/blob/master/Source/RuntimeMeshLoader/RuntimeMeshLoader.Build.cs" rel="noopener noreferrer"&gt;RuntimeMeshLoader&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If we want to include packages that depend on other packages, we would have to make sure the include paths are the same and the libraries are available, but ideally all the external packages are atomic and do not rely on other things.&lt;/p&gt;

</description>
      <category>unrealengine</category>
      <category>plasticscm</category>
      <category>gamedev</category>
      <category>github</category>
    </item>
    <item>
      <title>Building docker images in docker, and dynamic sequential Jenkins stages</title>
      <dc:creator>Jonas Bergström</dc:creator>
      <pubDate>Thu, 17 Feb 2022 09:51:29 +0000</pubDate>
      <link>https://dev.to/goals/building-docker-images-in-docker-and-dynamic-sequential-jenkins-stages-2fni</link>
      <guid>https://dev.to/goals/building-docker-images-in-docker-and-dynamic-sequential-jenkins-stages-2fni</guid>
      <description>&lt;p&gt;A couple of things I thought would be super easy but turned out to require a few hours of research and trial-and-error...&lt;/p&gt;

&lt;p&gt;At &lt;a href="https://goals.co/"&gt;GOALS&lt;/a&gt; we run &lt;a href="https://plugins.jenkins.io/kubernetes/"&gt;Jenkins in Kubernetes&lt;/a&gt; for various reasons.&lt;br&gt;
Some build jobs generate Docker images as artefacts, for example when we build new versions of a backend service.&lt;br&gt;
However, &lt;a href="https://levelup.gitconnected.com/kubernetes-is-deprecating-docker-in-2021-fa8317f9f070"&gt;Kubernetes is deprecating Docker&lt;/a&gt;, and we are running all our nodes on &lt;a href="https://cloud.google.com/kubernetes-engine/docs/concepts/using-containerd"&gt;containerd&lt;/a&gt;.&lt;br&gt;
So, how to build Docker images then?&lt;/p&gt;

&lt;p&gt;Turns out that Google has a project for this, named &lt;a href="https://github.com/GoogleContainerTools/kaniko"&gt;Kaniko&lt;/a&gt;. And since we're running GKE and have &lt;a href="https://github.com/GoogleContainerTools/kaniko#pushing-to-gcr-using-workload-identity"&gt;Workload Identity&lt;/a&gt; properly configured, it should be super easy to build Docker images and push them to our GCP-managed Docker repo, right?&lt;/p&gt;

&lt;p&gt;No. I get this error:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws_credentials.go:77] while getting AWS credentials NoCredentialProviders: no valid providers in chain. Deprecated.
For verbose messaging see aws.Config.CredentialsChainVerboseErrors
error checking push permissions -- make sure you entered the correct tag name, and that you are authenticated correctly, and try again: checking push permission for "europe-west1-docker.pkg.dev/XXX": creating push check transport for europe-west1-docker.pkg.dev failed: GET https://europe-west1-docker.pkg.dev/v2/token?YYY: UNAUTHORIZED: authentication failed
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;AWS credentials&lt;/em&gt; - wth?&lt;/p&gt;

&lt;p&gt;After some testing I found &lt;a href="https://github.com/GoogleContainerTools/kaniko/issues/1287#issuecomment-1036638533"&gt;this&lt;/a&gt;, which solved the issue :). For an example Jenkinsfile, see below.&lt;/p&gt;

&lt;p&gt;Another issue I had was that I wanted to generate dynamic build stages in Jenkins, and &lt;em&gt;execute them sequentially&lt;/em&gt;. There are a lot of examples of how to execute dynamically generated stages in parallel, but that's not what I wanted.&lt;br&gt;
Turned out to be super simple in the end ofc, but it ain't simple until you've learned.&lt;/p&gt;
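In plain terms, the trick is to build a map of stage closures and invoke them one at a time instead of handing the map to `parallel()`. A Python analogue of the Groovy pattern (illustrative only, with made-up stage names):

```python
# Build a dict of stage closures, then call them sequentially instead of
# passing them to a parallel runner.
def make_stage(cfg):
    def run():
        return f"Building {cfg['name']}"
    return run

configs = [{"name": "Sara"}, {"name": "Mona"}, {"name": "Lotta"}]
stages = {cfg["name"]: make_stage(cfg) for cfg in configs}

# Sequential execution: iterate the values and call each closure in order.
results = [stage() for stage in stages.values()]
print(results)  # prints ['Building Sara', 'Building Mona', 'Building Lotta']
```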

&lt;p&gt;Here's an example Jenkinsfile that demonstrates both Kaniko and dynamic sequential build steps:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def stageConfigs = [
  [name: "Sara", age: 20, color: "blue"],
  [name: "Mona", age: 10, color: "green"],
  [name: "Lotta", age: 8, color: "red"]
]

def generateBuildStage(stageConfig) {
  return {
    stage("Building ${stageConfig.name}") {
      container('build-container') {
        echo "Building ${stageConfig.name}"
        writeFile(
          file: "${stageConfig.name}.generated",
          text: "${stageConfig.name} is ${stageConfig.age} years old and loves ${stageConfig.color} things")
      }
    }
  }
}

def buildStages = stageConfigs.collectEntries {
    ["${it}" : generateBuildStage(it)]
}

podTemplate(
  inheritFrom: 'linux',
  containers: [
    containerTemplate(name: "build-container", image: "busybox", command: "sleep", args: "infinity"),
    containerTemplate(name: "kaniko", image: "gcr.io/kaniko-project/executor:343f78408c891ef7a85bab1ecbf2dd69367a58bc-debug", command: "sleep", args: "infinity", runAsUser: "0", ttyEnabled: true)])
  {
  node(POD_LABEL) {
    stage("Checkout") {
      checkout(scm)
    }

    stage("Build application") {
      // execute builds in parallel
      // parallel(buildStages)
      // execute builds sequentially
      for (stage in buildStages.values()) {
        stage.call()
      }
      container('build-container') {
        writeFile(
          file: "Dockerfile",
          text: '''
            FROM busybox
            COPY *.generated ./
          ''')
        sh "ls -al"
      }
    }

    stage('Build image') {
      container('kaniko') {
        sh "/kaniko/executor --context `pwd` --dockerfile `pwd`/Dockerfile --destination europe-west1-docker.pkg.dev/XXX"
      }
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>jenkins</category>
      <category>docker</category>
      <category>kubernetes</category>
    </item>
    <item>
      <title>HOWTO; Connect to a private GKE cluster using a site-to-site VPN between a GCP VPC...</title>
      <dc:creator>Jonas Bergström</dc:creator>
      <pubDate>Sat, 12 Feb 2022 16:50:05 +0000</pubDate>
      <link>https://dev.to/goals/howto-connect-to-a-private-gke-cluster-using-a-site-to-site-vpn-between-a-gcp-vpc-2ak</link>
      <guid>https://dev.to/goals/howto-connect-to-a-private-gke-cluster-using-a-site-to-site-vpn-between-a-gcp-vpc-2ak</guid>
      <description>&lt;p&gt;&lt;strong&gt;... and a double NAT'ed Ubiquiti Dream Machine Pro&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Wait, what? I just want to access a Kubernetes cluster in a secure way ... should be easy.&lt;/p&gt;

&lt;p&gt;[2022-03-10] Update from Google! I just got this message: &lt;em&gt;"We are writing to let you know that starting June 15, 2022, we will remove the restriction on the Internet Key Exchange (IKE) identity of peer Cloud VPN gateways."&lt;/em&gt; &lt;br&gt;
This means in practice that NAT'ed setups will become easier to manage because GCP will not require remote IP to match remote id.&lt;/p&gt;

&lt;p&gt;At &lt;a href="https://goals.co" rel="noopener noreferrer"&gt;GOALS&lt;/a&gt; we are cloud native, and we are serious about security. As a consequence our Kubernetes clusters are provisioned with &lt;a href="https://cloud.google.com/kubernetes-engine/docs/how-to/private-clusters" rel="noopener noreferrer"&gt;private endpoints only&lt;/a&gt;, where all nodes have internal IP addresses. This is great, but the question immediately arrises - how do we operate such a cluster, when we cannot access it from outside the VPC?&lt;/p&gt;

&lt;p&gt;Here is a high level overview of what we have:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi3gc8466jr10ktcza4i9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi3gc8466jr10ktcza4i9.png" alt="Private GKE and office"&gt;&lt;/a&gt;&lt;br&gt;
At the top of the diagram we see the private Google-managed Kubernetes (GKE) cluster. A Kubernetes cluster consists of a control plane and worker nodes. In the case of GKE, Google manages the control plane (API server, etcd nodes, etc.), the underlying VMs the control plane runs on, and the underlying VMs the worker nodes run on.&lt;br&gt;
We have set up our own VPC and a subnet where the worker nodes are running, and Google creates a managed VPC where the control plane is running. Google automatically peers the control plane VPC with our VPC.&lt;/p&gt;

&lt;p&gt;At the bottom of the diagram we see an overview of our office space network. Since we are a startup, we rent office space and share the network with other tenants. We have connected a Ubiquiti Dream Machine Pro to the office space network and created our own GOALS network where we connect our workstations.&lt;/p&gt;

&lt;p&gt;Obviously, running eg &lt;code&gt;kubectl describe nodes&lt;/code&gt; from my workstation in our office network doesn't work since &lt;code&gt;kubectl&lt;/code&gt; needs access to the cluster's api server. So, how can we connect our office network to our VPC in a secure way, and enable management of the GKE cluster using &lt;code&gt;kubectl&lt;/code&gt;?&lt;/p&gt;

&lt;h2&gt;
  
  
  A note on our infrastructure
&lt;/h2&gt;

&lt;p&gt;We use &lt;a href="https://www.terraform.io/" rel="noopener noreferrer"&gt;Terraform&lt;/a&gt; to provision all GCP resources. Google provides opinionated Terraform modules to manage GCP resources &lt;a href="https://github.com/terraform-google-modules" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;br&gt;
Our infra leverages a &lt;a href="https://cloud.google.com/vpc/docs/shared-vpc" rel="noopener noreferrer"&gt;shared VPC&lt;/a&gt; and we use the &lt;a href="https://github.com/terraform-google-modules/terraform-google-project-factory" rel="noopener noreferrer"&gt;project factory&lt;/a&gt; module to create the host project and the service projects.&lt;br&gt;
The VPN will be provisioned in the host project that owns the VPC.&lt;br&gt;
Our GKE clusters are created with the &lt;a href="https://github.com/terraform-google-modules/terraform-google-kubernetes-engine/tree/master/modules/private-cluster" rel="noopener noreferrer"&gt;private cluster&lt;/a&gt; terraform module.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites and preparations
&lt;/h2&gt;

&lt;p&gt;There are a few things we need to ensure and some information we must gather before we can start.&lt;br&gt;
First, we need admin access to the Ubiquiti Dream Machine (UDM) and a Google user with the Network Management Admin role.&lt;br&gt;
Now you need to gather the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The office space external IP address, eg &lt;strong&gt;123.45.67.89&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;The Goals network subnet range, eg &lt;strong&gt;192.168.1.0/24&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Ensure that the Kubernetes cluster is prepared
&lt;/h3&gt;

&lt;p&gt;Two configuration entries need to be set correctly when setting up GKE to enable VPN access to the cluster.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;master_authorized_networks&lt;/code&gt; needs to include the office network subnet range, ie &lt;strong&gt;192.168.1.0/24&lt;/strong&gt; in our case.&lt;/li&gt;
&lt;li&gt;VPC peering must be configured to &lt;strong&gt;export custom routes&lt;/strong&gt; - in this particular case it means that the custom network route that the VPN creates to enable communication from the VPC to the office network will also be available in the Google-managed VPC that hosts the GKE control plane. This is enabled by adding the following Terraform configuration to the GKE setup:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;module "kubernetes_cluster" {&lt;br&gt;
  source = "terraform-google-modules/kubernetes-engine/google//modules/private-cluster"&lt;br&gt;
  version = "18.0.0"&lt;br&gt;
  ...&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;resource "google_compute_network_peering_routes_config" "peering_gke_routes" {&lt;br&gt;
  peering = module.kubernetes_cluster.peering_name&lt;br&gt;
  network = var.vpc_id&lt;br&gt;
  import_custom_routes = false&lt;br&gt;
  export_custom_routes = true&lt;br&gt;
}&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
### Shared VPN secret
Generate a shared secret that will be used for authentication of the VPN peers and put it into a new secret in GCP Secret Manager, name it **office-vpn-shared-secret**.

### Create a Cloud VPN Gateway
For this part we will use the [VPN](https://github.com/terraform-google-modules/terraform-google-vpn) module, setting it up in the host project like this:

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;locals {
  name = "office-vpn"
}

resource "google_compute_address" "vpn_external_ip_address" {
  project      = var.project_id
  name         = local.name
  network_tier = "PREMIUM"
  region       = var.region
  address_type = "EXTERNAL"
}

data "google_secret_manager_secret_version" "office_vpn_shared_secret" {
  project = var.project_id
  secret  = "office-vpn-shared-secret"
}

module "office_site_to_site" {
  source  = "terraform-google-modules/vpn/google"
  version = "2.2.0"

  project_id         = var.project_id
  network            = var.vpc_id
  region             = var.region
  gateway_name       = local.name
  tunnel_name_prefix = local.name
  shared_secret      = data.google_secret_manager_secret_version.office_vpn_shared_secret.secret_data
  ike_version        = 2
  peer_ips           = [ var.office_public_ip ]
  remote_subnet      = var.office_subnet_ranges
  vpn_gw_ip          = resource.google_compute_address.vpn_external_ip_address.address
}

resource "google_compute_firewall" "allow_office_traffic" {
  project     = var.project_id
  name        = "${local.name}-allow-office-traffic"
  network     = var.vpc_id
  description = "Allow traffic from the office network"
  allow { protocol = "icmp" }
  allow {
    protocol = "udp"
    ports    = [ "0-65535" ]
  }
  allow {
    protocol = "tcp"
    ports    = [ "0-65535" ]
  }
  source_ranges = var.office_subnet_ranges
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;`google_compute_address.vpn_external_ip_address` creates the external static IP address which becomes the VPN endpoint on the GCP end.
`google_secret_manager_secret_version.office_vpn_shared_secret` fetches the shared secret used to authenticate the VPN peers.
The `office_site_to_site` module creates a "classic" Cloud VPN. The UDM does not support BGP yet which means that we cannot create a "HA Cloud VPN" variant.
`google_compute_firewall.allow_office_traffic` allows traffic originating from the office subnet (`192.168.1.0/24`) to enter our VPC.

After applying the configuration the VPN tunnel will be in an error state because it cannot connect to it's peer. This is expected since we have not set up the UDM side yet.

### Create a VPN network on the UDM Pro
The UDM Pro does not support configuration by code as far as I know, so here we need to resort to manually use the management GUI.

Go to Settings-&amp;gt;Networks-&amp;gt;Add New Network, choose a name and select VPN-&amp;gt;Advanced-&amp;gt;Site-to-site-&amp;gt;Manual IPSec.
**Pre-shared Secret Key** is the `office-vpn-shared-secret` from above.
**Public IP Address (WAN)** is the IP address the UDM has on the office space network, ie it is __not__ the public IP our office space provider has. For example `192.168.10.150`.
In the **Remote Gateway/Subnets** section, add the subnet ranges in your VPC that you want to access from the office, eg `10.0.0.0/8` and `172.16.0.0/16`.
The **Remote IP Address** is the public static IP that was created for the VPN endpoint in GCP, eg `123.45.67.99`.
Expand the Advanced section and choose `IKEv2`. Leave PFS and Dynamic routing enabled.
Save the new network.

Unfortunately we are not ready yet, because with the current configuration the UDM will identify itself using the WAN IP we have configured, which doesn't match the IP it connects to GCP with.
To fix this last part of configuration we need to `ssh` into the UDM Pro. Once you are on the machine, we can update the ipsec configuration.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cd /run/strongswan/ipsec.d/tunnels
$ vi .config
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Add the following line just below the `left=192.168.10.150` line:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;leftid=123.45.67.99&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;This makes the UDM identify itself using it's actual public IP when it connects to the VPN on the GCP end.
Finally, refresh the IPSec configuration:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;$ ipsec update&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
## Thats it!
Verify that the connection on the UDM is up and running by invoking:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;$ swanctl --list-sas&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;The output should list information about the tunnel, and the tunnel should be in {% raw %}`ESTABLISHED` state.
Now the VPN tunnel state in GCP should move into a green state as well.
And, finally, accessing the Kubernetes cluster from a workstation in the Goals office network is possible:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;$ kubectl get nodes&lt;br&gt;
NAME                                      STATUS   ROLES    AGE   VERSION&lt;br&gt;
gke-XXXXX-controller-pool-28b7a87b-9ff2   Ready       17d   v1.21.6-gke.1500&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;We now have a setup looking like below:

![Private GKE and office with VPN](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zlornjjrpq8ya2eldfkx.png)

## Troubleshooting
While being connected to the UDM, run
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;$ tcpdump -nnvi vti64&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;to see all traffic routed via the VPN tunnel.

Sometimes routes are cached on the workstations, eg on Mac you can run
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;$ sudo route -n flush&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;a couple of times and disable/enable Wifi to make sure that your routing configuration is up to date.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

</description>
      <category>gcp</category>
      <category>kubernetes</category>
      <category>devops</category>
      <category>terraform</category>
    </item>
  </channel>
</rss>
