<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Ronen Lahat</title>
    <description>The latest articles on DEV Community by Ronen Lahat (@ronenl).</description>
    <link>https://dev.to/ronenl</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F563875%2F7dae820b-29ad-4470-829a-029788aeeb2e.jpg</url>
      <title>DEV Community: Ronen Lahat</title>
      <link>https://dev.to/ronenl</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ronenl"/>
    <language>en</language>
    <item>
      <title>The Thirteen-Factor Team</title>
      <dc:creator>Ronen Lahat</dc:creator>
      <pubDate>Wed, 02 Mar 2022 13:01:54 +0000</pubDate>
      <link>https://dev.to/ronenl/the-thirteen-factor-team-44o7</link>
      <guid>https://dev.to/ronenl/the-thirteen-factor-team-44o7</guid>
      <description>&lt;p&gt;Originally published on &lt;a href="https://medium.com/att-israel/the-thirteen-factor-team-191c36d6ae85"&gt;Medium&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Three years ago I joined the AT&amp;amp;T R&amp;amp;D center as a full-stack developer. I was welcomed directly into a brand-new project, where I coincidentally witnessed a restructuring of the company's engineering practices. As I was getting my photo ID and laptop, seating arrangements changed, the department rebranded, QAs were trained as developers, and developers started assuming operational tasks.&lt;/p&gt;

&lt;p&gt;During these three years we moved fast and morale was high, even though we had to assume new responsibilities and learn new skills, often within a short time period. We grew professionally while witnessing rewarding metrics from almost 18 million devices in production.&lt;/p&gt;

&lt;p&gt;As the dust settled, I thought about the maxims and methodologies that allowed us to build a good, testable product, easy to refactor without breaking, and with a fast deployment to production. These are some of them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4cPKzzEG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/02asklvg6mmljfbf4m4x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4cPKzzEG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/02asklvg6mmljfbf4m4x.png" alt="developer icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Developers as Feature Owners
&lt;/h2&gt;

&lt;p&gt;A developer is not just a stop in the moving assembly line of a product, but a feature artisan from conception to delivery. The developer should be given technological autonomy and a sense of ownership of the feature. This begins with developers themselves slicing a feature into tasks, and with contributing in all stages of the software lifecycle, including observation and maintenance. They know best how to tackle the feature's challenges and bring it to fruition.&lt;/p&gt;

&lt;p&gt;Given autonomy, we chose the latest technologies, versions, and conventions, which above all kept us engaged and motivated to continuously refactor.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JGDKmwhH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bfoqv5m466txb0hodtc4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JGDKmwhH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bfoqv5m466txb0hodtc4.png" alt="code icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Contribute in All Areas
&lt;/h2&gt;

&lt;p&gt;After developers build a feature, they are the most knowledgeable about it and can contribute across its whole lifecycle: testing, deployment, and maintenance. The more they contribute, the more they'll understand the big picture and take ownership of features.&lt;/p&gt;

&lt;p&gt;Beyond full-stack work, a developer can contribute to operational tasks by managing cloud services, administering repositories, orchestrating clusters, writing scripts to automate changes, running deployments, writing libraries for CI tools, and creating metrics widgets and dashboards.&lt;/p&gt;

&lt;p&gt;We might have a lot of learning resources available, but jumping on board is what really makes us feel professional progress and accomplishment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BSgKPKV_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wh8w83qp7ufdd3opehkz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BSgKPKV_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wh8w83qp7ufdd3opehkz.png" alt="infrastructure icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Infrastructure as Code
&lt;/h2&gt;

&lt;p&gt;The way to give developers the power to contribute in all areas is by making everything code. Define all infrastructure as code, versioned and committed into the same repository as the application code, and give developers the credentials to work on it like all other code. This includes cloud resources, templates and configuration files for the Kubernetes cluster, the CI pipeline, and build scripts.&lt;/p&gt;
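&lt;p&gt;As a hedged sketch (all names, labels, and the image reference are hypothetical), a minimal Kubernetes manifest versioned next to the application code could look like this:&lt;/p&gt;

```yaml
# deployment.yaml - lives in the same repository as the application code.
# Names, labels, and the image below are illustrative, not a real setup.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: device-api
spec:
  replicas: 3
  selector:
    matchLabels:
      app: device-api
  template:
    metadata:
      labels:
        app: device-api
    spec:
      containers:
        - name: device-api
          image: registry.example.com/device-api:1.4.2
          ports:
            - containerPort: 8080
```

&lt;p&gt;Because the manifest is plain text in the repo, it goes through the same pull requests and reviews as any other code.&lt;/p&gt;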

&lt;p&gt;We had a hard-working DevOps team to support developers, but gradually we picked up many operational and infrastructure responsibilities.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cQLaH8nx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/62s4ly4c9ex5ffr40rpm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cQLaH8nx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/62s4ly4c9ex5ffr40rpm.png" alt="automation icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Automate All Processes
&lt;/h2&gt;

&lt;p&gt;Deployment from one environment to another should be automated, defined in code, and never dependent on someone's memory (and never done right before a weekend). The same is true for creating and updating services, databases, and the like. This can be encouraged with multiple staging environments, where deployments from one to the next serve as practice and quality assurance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8EwPN4Ga--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nzn51wj469xva4dewev2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8EwPN4Ga--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nzn51wj469xva4dewev2.png" alt="pipeline icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Let the Pipeline and Application Code Co-Evolve
&lt;/h2&gt;

&lt;p&gt;Development must begin with a good CI/CD pipeline, and developers should fix and improve it as their application's home. When developers work on the pipeline, they can find creative and elegant solutions to issues by tweaking the application, thus making it easier to build, test and deploy.&lt;/p&gt;

&lt;p&gt;Our pipeline originated from a dedicated DevOps team which worked very closely with us, and committed its code into the application's repo. With time, our pipeline and application code co-evolved into ever greater robustness and synergy.&lt;/p&gt;

&lt;p&gt;Pipelines are notoriously fragile, and keeping them stable can be frustrating. Fixing failures is part of the process: through them, developers learn to debug their apps and discover their quirks, avoiding similar issues in production.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wXpL20TN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8i0e3h88f5agy3x1psz3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wXpL20TN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8i0e3h88f5agy3x1psz3.png" alt="microservices icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Smash the Monolith
&lt;/h2&gt;

&lt;p&gt;When the project was just a blueprint and a few PoCs, the architects defined an API for us. The project had many parties and teams, but we all spoke a shared vocabulary. Each could test the API on their end and decouple themselves by mocking it.&lt;/p&gt;

&lt;p&gt;Splitting your app into components and your back end into micro-services and serverless functions encourages decoupling, which reduces team dependencies. When in doubt, split: it's easier to merge components later than to separate them once conjoined. Think about creative ways to split your front end into separate deployments as well, with micro-frontends.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2IZ4A2E8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/759dbx3gswetz9yjec2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2IZ4A2E8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/759dbx3gswetz9yjec2w.png" alt="git repo icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Monolith No, Monorepo Yes
&lt;/h2&gt;

&lt;p&gt;Ironically, although we decoupled many components, we still want to keep them together in source. We started with one repo for each microservice and for each client, but we found ourselves cloning them in one big folder and writing awkward scripts to link and build them together. As features spanned separate repositories, we had to open multiple pull requests which in turn triggered multiple builds.&lt;/p&gt;

&lt;p&gt;This is what monorepo tools are designed to solve. We use Lerna for all our JavaScript repos, and we're never going back.&lt;/p&gt;
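&lt;p&gt;As a sketch, a lerna.json at the monorepo root looks roughly like this (the package folder layout is hypothetical, not our exact repo):&lt;/p&gt;

```json
{
  "version": "independent",
  "npmClient": "npm",
  "packages": ["packages/*", "clients/*"]
}
```

&lt;p&gt;With that in place, one pull request can span a microservice and its client, and a command like lerna run build executes across every package in the repository.&lt;/p&gt;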

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bQ1oN8uK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fe6d77ece5ft07iichbs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bQ1oN8uK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fe6d77ece5ft07iichbs.png" alt="pull request icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Bite-Sized Pull Requests, Short-Lived Branches
&lt;/h2&gt;

&lt;p&gt;Some call it trunk-based development. It's easier and faster to merge changes when changes are small. Small pull requests are easier for the developer, the reviewers, and other developers pulling changes. Drastic code changes breed bugs, and often come with technical debt that no one wants to pay.&lt;/p&gt;

&lt;p&gt;Knowing pull requests are a continuous work in progress helps in code reviews. Nevertheless, all pull requests should include tests and feature toggles, to avoid technical debt.&lt;/p&gt;

&lt;p&gt;In our case, we deleted the branch after merges. Before shipping, we would create a release branch with a version number and test it thoroughly. Nothing would get merged to it except bug fixes. After release we created a git tag for future reference.&lt;/p&gt;
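&lt;p&gt;The branch-and-tag flow above can be sketched with plain git commands (the repository, messages, and version number are hypothetical):&lt;/p&gt;

```shell
# Sketch of the release flow: small commits, a release branch, then a tag.
set -e
cd "$(mktemp -d)"
git init -q app && cd app
git -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty -m "feat: small, reviewable change"
# Cut a release branch for thorough testing; only bug fixes are merged to it.
git checkout -q -b release/1.2.0
git -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty -m "fix: bug found while testing the release"
# After release, tag for future reference; short-lived branches get deleted.
git tag v1.2.0
git tag --list
```

&lt;p&gt;Running this prints the single tag, v1.2.0, while the short-lived feature branches never outlive their merge.&lt;/p&gt;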

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--l7ZC56bB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/19z4ntg01i5bzys134sx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--l7ZC56bB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/19z4ntg01i5bzys134sx.png" alt="feedback loop icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Reduce Hoops, Shorten Feedback Loops
&lt;/h2&gt;

&lt;p&gt;As the software development cycle advances, changes become harder and time estimations grow longer. What a developer can detect immediately in their local environment can take hours to surface in CI builds, days once deployed, and sometimes weeks.&lt;/p&gt;

&lt;p&gt;Test as early and as often as possible, using quality gates in the pipeline. The "merge" button was greyed out for us unless two reviewers approved and the build returned green. Gates include code analysis, lint, security scans, unit tests, end-to-end automation, mutation tests, and the like, with coverage thresholds. All of these can also run locally, and most third-party code-scanning services offer a CLI or IDE plugin.&lt;/p&gt;
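&lt;p&gt;For instance, in a Jest-based JavaScript project (an assumption for illustration, not something the article's stack prescribes), a coverage threshold in package.json becomes a hard gate; the test run fails when coverage drops below it:&lt;/p&gt;

```json
{
  "jest": {
    "collectCoverage": true,
    "coverageThreshold": {
      "global": { "branches": 80, "functions": 80, "lines": 80, "statements": 80 }
    }
  }
}
```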

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--b5zdfD16--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gr4ycimrxexvichlip8p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--b5zdfD16--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gr4ycimrxexvichlip8p.png" alt="container icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Containerize
&lt;/h2&gt;

&lt;p&gt;For micro-services, this is one of the best ways to shorten feedback loops. The runtime that works on the developer's machine is the one that gets tested by the CI/CD pipeline, and is eventually shipped to production in the form of Kubernetes pods. If a developer says "it works on my machine" and they use a container, we can be confident that it'll behave the same way in production.&lt;/p&gt;

&lt;p&gt;Backend developers should see the container as the application's deliverable unit, not the folders of compiled files within. Meanwhile, both backend and client developers can isolate their application's environment in a declarative way and share that environment with the rest of the team. In one of our projects we used Nix for this purpose.&lt;/p&gt;
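&lt;p&gt;A minimal sketch of the container as the deliverable unit, assuming a Node.js service (the base image and file names are illustrative):&lt;/p&gt;

```dockerfile
# The image built from this file, not the compiled folders inside it,
# is what travels from the developer's machine through CI to production.
FROM node:16-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
CMD ["node", "index.js"]
```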

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cYXqSLok--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6i3z222jpx5c1r138tij.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cYXqSLok--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6i3z222jpx5c1r138tij.png" alt="scoreboard icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Gamify Team Excellence, We Like to Show Off
&lt;/h2&gt;

&lt;p&gt;The gamification of the office TV dashboards was accidental. We displayed pipeline build results - which were either green or red - together with various quality metrics, organized by teams. It turned into a competition which also improved team cohesion. Teams also showcased their work at demos, which was an opportunity for all involved to take pride in their contributions and congratulate each other.&lt;/p&gt;

&lt;p&gt;When we started working from home we lost the effect of the shared office TV dashboards. Build results and metrics turned into individualized alerts and notifications, but the demos and the congratulations didn't change.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--D7dNwk1r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ft0atvd5qgve5xasqog.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--D7dNwk1r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ft0atvd5qgve5xasqog.png" alt="book icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  We're All Learners, We Make Rookie Mistakes
&lt;/h2&gt;

&lt;p&gt;We all want to excel professionally, but this sometimes comes in the form of perfectionism. One of the subtle signs of perfectionism is the fear of under-performing in fields in which we're inexperienced. A seasoned Java developer may refrain from writing JavaScript, as they might not know its quirks and feel awkward. This harms your ability to contribute to all processes and see the big picture.&lt;/p&gt;

&lt;p&gt;Forgive yourself and others, and allow yourself to be a rookie again; we have your back.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CBCZHvQX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/75rjljlbl52i0p6ovyfb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CBCZHvQX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/75rjljlbl52i0p6ovyfb.png" alt="thumb up icon" width="128" height="128"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Give Kudos, Show You Care
&lt;/h2&gt;

&lt;p&gt;This can't be automated, but it can become second nature. It's a basic human need to be recognized and to feel that one's work is meaningful. It rings true when busy managers join demos and give feedback. A short email saying "the client loved this feature" goes a long way. Likewise, think about congratulating a team member or welcoming someone new.&lt;/p&gt;

&lt;p&gt;All the automation scripting and pipeline fixing can be tedious and mechanical, but it's the people who make it humane, leaving us feeling recognized and meaningful at the end of the day.&lt;/p&gt;




&lt;h2&gt;
  
  
  Future
&lt;/h2&gt;

&lt;p&gt;As a developer, I'd like to "commit" what works. But it's impossible to define "methodology as code" the way we do infrastructure; it's a soft science involving people. Behavior cannot be programmed, but it can be measured, fostered, and encouraged.&lt;/p&gt;

&lt;p&gt;There are many ways to improve our current working practices: some involve data-driven decision making, and some add even more automation, often in creative ways.&lt;/p&gt;

&lt;p&gt;With data, we can research trends and predict targets to aid time and effort estimations, identify bottlenecks, and find ways to further decouple dependencies, automate, and shorten feedback loops.&lt;/p&gt;

&lt;p&gt;Recently, we improved pull-request pickup time by automatically pinging the relevant reviewers. This reduced how much we had to nudge each other to review pull requests, and clicking on the message conveniently redirects us to the code review. The message even includes an estimated review time based on the types of files changed and the changes themselves.&lt;/p&gt;

&lt;p&gt;This wasn't a script someone on our team wrote, but a dedicated project in its own right, currently scaling up. The project is also working to combine all dashboards, metrics, scores, and quality-scan results into one big-data framework, a project in which I'm currently participating. We aim to foster a common language for the organization, a language with actionable feedback for teams, hopefully resulting in a healthier software lifecycle and developers motivated to give their best work.&lt;/p&gt;

&lt;p&gt;Twitter: &lt;a class="mentioned-user" href="https://dev.to/ronenl"&gt;@ronenl&lt;/a&gt;&lt;/p&gt;

</description>
      <category>programming</category>
      <category>architecture</category>
      <category>agile</category>
      <category>motivation</category>
    </item>
    <item>
      <title>How Nix-Shell Saved Our Team’s Sanity</title>
      <dc:creator>Ronen Lahat</dc:creator>
      <pubDate>Wed, 27 Oct 2021 15:00:23 +0000</pubDate>
      <link>https://dev.to/ronenl/how-nix-shell-saved-our-teams-sanity-101k</link>
      <guid>https://dev.to/ronenl/how-nix-shell-saved-our-teams-sanity-101k</guid>
      <description>&lt;p&gt;&lt;em&gt;Originaly published at: &lt;a href="https://medium.com/att-israel/how-nix-shell-saved-our-teams-sanity-a22fe6668d0e" rel="noopener noreferrer"&gt;https://medium.com/att-israel/how-nix-shell-saved-our-teams-sanity-a22fe6668d0e&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We're developing a large React Native app which relies heavily on native components already written in Java, C++, and Objective-C. This means that we needed to develop, build, and test on many different platforms, with complex developer environments and build tools that change often with platform updates.&lt;/p&gt;

&lt;p&gt;This became a burden for our teams, with knowledge spread across many developers, installation guides, readme files, and internal wiki pages. It became expected that installations would take several days, and even a minor change in a dependency version resulted in inconsistent builds with obscure error messages.&lt;/p&gt;

&lt;p&gt;Some suggested Dockerizing the environment, but after several attempts, Nix became our tool of choice. Nix allows us to share the same development environment across Linux and macOS with exact dependencies for tools such as CMake, Ninja, the Android NDK, etc. With Nix installed, when opening the repository the developer is greeted by all the required dependencies, available in their shell. We use Linux for Android builds and macOS for both Android and Apple builds.&lt;/p&gt;

&lt;h2&gt;
  
  
  So, What’s Nix?
&lt;/h2&gt;

&lt;p&gt;Nix is both a package manager and a build tool. Generally, these two are separate things, such as RPM and Make. This unity becomes useful with Nix’s source deployment model, in which packages are built from source. Most of the time the package is substituted transparently for a cached binary from a server (as long as the hash of the build instructions is the same).&lt;/p&gt;

&lt;p&gt;Nix prioritizes consistency, and to achieve this it forces you to declare all dependencies and inputs explicitly while sandboxing the build environment from your shell environment and the internet. Not only is the package built from source, but so are its dependencies and their dependencies, which can depend on each other as nodes in a graph.&lt;/p&gt;

&lt;h2&gt;
  
  
  Nix-env, the Package Manager
&lt;/h2&gt;

&lt;p&gt;With nix-env you can manage user environments. nix-env creates an abstraction layer over the bin directories in your PATH, with symlinks into /nix/store. Because it uses symlink references, it can do several important things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It keeps track of versions of your environment, and in O(1) time it can roll back to a different version by changing the symlink to a previous profile.&lt;/li&gt;
&lt;li&gt;Installations and uninstallations are atomic: the newer version isn’t referenced until the installation is complete.&lt;/li&gt;
&lt;li&gt;As dependencies are not installed in a global folder, multiple users on a machine cannot override or compromise each other’s dependencies, and are therefore allowed to install packages without privileges.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is possible because each version of a package is installed in a different directory under /nix/store and erasing a dependency doesn’t remove it from disk until it is completely de-referenced and garbage-collected.&lt;/p&gt;
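&lt;p&gt;The profile generations can be inspected and switched directly; for example:&lt;/p&gt;

```
nix-env --list-generations     # every version of the environment so far
nix-env --rollback             # O(1): point the symlink at the previous generation
nix-env --switch-generation 42 # jump to a specific generation (42 is illustrative)
```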

&lt;p&gt;Nix takes versioning into its own hands by hashing the build instructions and its input. Even the slightest change constitutes a new version, as the hash is different. Components reside in the Nix Store, alongside all their dependencies, as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/nix/store/f2rrk276criwxn19bf82cglym4dkv9gr-ninja-1.9.0.drv
/nix/store/iwm3knkdi294rj50w9ai5rdwaglgr362-ninja-1.9.0/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The last characters are the human-readable name attribute. Nix-env is managed with the nix-env command and the .nix-profile directory.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installation Issue on Mac
&lt;/h2&gt;

&lt;p&gt;Nix can either be installed for a single user (who owns /nix) or as multi-user (in which root owns /nix). However, on a Mac neither will work out of the box anymore, as the root filesystem (anything under /) has been read-only since macOS 10.15. Nix can’t trivially change the path of the Nix Store, as all of its binary cache has been compiled with /nix/store as its path. The current workaround is to keep the path but mount it as an unencrypted (encrypted at rest) APFS volume.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sh &amp;lt;(curl -L https://nixos.org/nix/install) --darwin-use-unencrypted-nix-store-volume --daemon
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The installation will explain what it will do, and will request super-user access, which it will ask for dozens of times. This is how the Nix Store volume looks in Disk Utility:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxctyekvflz5qol0mjfq0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxctyekvflz5qol0mjfq0.png" alt="Nix Store APFS Volume in Disk Utility"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And here it is in Finder:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvmvodbemf3zuindhbrs7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvmvodbemf3zuindhbrs7.png" alt="Nix Store Volume in Finder"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Nix Store Volume in Finder. For some reason, the Unix timestamp is at 0 (and I’ve given out my timezone).&lt;/p&gt;

&lt;h2&gt;
  
  
  Nix-shell, the Virtual Environment
&lt;/h2&gt;

&lt;p&gt;It was nix-shell, however, that made the biggest impact for us. With nix-shell, we can create virtual environments per project, without having to install dependencies at the user or system level with nix-env.&lt;/p&gt;

&lt;p&gt;Just add a shell.nix file to your project. Then, when you enter nix-shell, the environment and all the dependencies are ready for use. This file is, of course, committed to source control and shared among all developers. The file lists dependencies, environment variables, and shell hooks to be run when the environment loads.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Example shell.nix file with two different Nixpkgs sources:&lt;/em&gt;&lt;/p&gt;
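&lt;p&gt;A hedged sketch of such a file (the channel URLs and package choices are hypothetical, not our exact setup):&lt;/p&gt;

```nix
# shell.nix - committed to source control; entering nix-shell loads all of this.
let
  # Two different Nixpkgs sources: a stable channel, plus unstable for newer tools.
  stable = import (fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/nixos-21.05.tar.gz") { };
  unstable = import (fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/nixpkgs-unstable.tar.gz") { };
in
stable.mkShell {
  # Exact dependencies for every developer, on both Linux and macOS.
  buildInputs = [ stable.cmake stable.ninja unstable.nodejs ];
  # Shell hooks run when the environment loads.
  shellHook = ''
    echo "dev environment ready"
  '';
}
```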

&lt;p&gt;This can be further integrated into the shell with Direnv, which automatically activates the environment when the directory changes, and Lorri, a daemon process that monitors the project’s shell.nix for changes and automatically reloads the environment when the file changes. Niv eases the dependency management of a project with a sources.json file, like a higher-order package manager for Nix-shell.&lt;/p&gt;

&lt;p&gt;Some prefer the use of Nix-shell over Nix-env for entire user-level environments, as it can be controlled in an isolated, declarative way. Home Manager enables the configuration of user-specific (non-global) packages and “dot-files.” See what you can do in the NixOS wiki. Finally, nix-darwin enables the configuration of your Mac the way NixOS does, with a configuration.nix file.&lt;/p&gt;

&lt;p&gt;Nix-shell can be extended into your OS with the tools above, but it can also be used in a narrower, specific way. It’s possible to run commands in Nix-shell without entering its interactive shell with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;nix-shell --run "node ./index.js"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And it’s possible to specify Nix-shell as an interpreter for a file with a shebang at the top of the file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#! /usr/bin/env nix-shell
#! nix-shell -i real-interpreter -p packages...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above file will be executed inside of nix-shell, along with its environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Nix-build, the Build Tool
&lt;/h2&gt;

&lt;p&gt;Nix-build is a build manager with correctness as its top priority. That is, all builds will be identical given the same build tools and inputs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftmm2xbrl6jb2hyrxof8d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftmm2xbrl6jb2hyrxof8d.png" alt="The task of a Build Manager"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Build managers take sources, such as source code and dependencies, and invoke generators such as compilers, to create derivates such as binaries. Both sources and derivates are components. This is the task of tools like Make, CMake, Ant, or Gradle.&lt;/p&gt;

&lt;p&gt;Nix builds are based on a derivation, a set that lists exact (hashed) dependencies and exact (hashed) build scripts, and looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Derive([("out","/nix/store/winl36i87aydwj5qgrz0nbc7kq3w0yzi-user-environment","","")],[],["/nix/store/kygr761f08l1nanw27lfxkg8qibf0qn1-env-manifest.nix"],"builtin","builtin:buildenv",[],[("allowSubstitutes",""),("builder","builtin:buildenv"),("derivations","true 5 1 /nix/store/9nqninr2aaicvmq83q10d5a1hwagbzyc-hello-2.10 true 5 1 /nix/store/df26nnjiw55rvv6mxy4kapps9h4kfvw7-niv-0.2.19-bin true 5 1 /nix/store/f3swypnb5zi5yd3w7k2ycwyv6b3sv8fa-direnv-2.28.0 true 5 1 /nix/store/vgdizqicd30k4183ssq7g6i07dvys6xl-home-manager-path true -10 1 /nix/store/4023c0ymrxsg1x36jxmnircqjl1y9fkq-nodejs-14.17.6"),("manifest","/nix/store/kygr761f08l1nanw27lfxkg8qibf0qn1-env-manifest.nix"),("name","user-environment"),("out","/nix/store/winl36i87aydwj5qgrz0nbc7kq3w0yzi-user-environment"),("preferLocalBuild","1"),
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Nix Expressions, the Language
&lt;/h2&gt;

&lt;p&gt;The above is a minified form of its human-readable version, written functionally as a Nix expression:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://gist.github.com/ronenlh/c2c9ca9ed319bfadd212f2eb15e29629#file-default-nix" rel="noopener noreferrer"&gt;https://gist.github.com/ronenlh/c2c9ca9ed319bfadd212f2eb15e29629#file-default-nix&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The entire file is a single function. Lines 1 to 6 describe a set, passed as the only parameter; the set declares all the dependencies needed to build the component. The : in line 6 marks the beginning of the function’s body.&lt;/p&gt;

&lt;p&gt;The entire body is a call to stdenv.mkDerivation, which will reduce the data into the derivation written above. rec is a keyword that makes the set recursive, allowing the definition of values in terms of other keys in the same set.&lt;/p&gt;

&lt;p&gt;For didactic purposes, the syntax could be rewritten as a JavaScript lambda as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;({ stdenv, ... }) =&amp;gt; stdenv.mkDerivation(rec({ ... }))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
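&lt;p&gt;Continuing the didactic JavaScript analogy (a sketch of the semantics, not how Nix actually evaluates), the behavior of rec can be modeled with lazy getters:&lt;/p&gt;

```javascript
// Didactic sketch (not how Nix is implemented): `rec` lets the keys of a
// set refer to each other, modeled here with lazy JavaScript getters.
const rec = (defs) => {
  const set = {};
  for (const key of Object.keys(defs)) {
    Object.defineProperty(set, key, {
      get: () => defs[key](set), // each value may read the full set
      enumerable: true,
    });
  }
  return set;
};

const drv = rec({
  pname: () => "hello",
  version: () => "2.10",
  // `name` is defined in terms of other keys, which is what `rec` enables
  name: (self) => `${self.pname}-${self.version}`,
});

console.log(drv.name); // "hello-2.10"
```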



&lt;p&gt;The value for src is retrieved from a remote URL and validated with a hash. src is the key expected by the standard environment, whose default builder performs the usual autoconf steps (./configure ; make ; make install).&lt;/p&gt;

&lt;p&gt;It’s possible to experiment with the Nix language in its interactive shell.&lt;/p&gt;

&lt;h2&gt;
  
  
  Nixpkgs, the Package Repository
&lt;/h2&gt;

&lt;p&gt;The above function is not yet callable, as we don’t have its parameters. We can achieve the same result with another rec, which recursively defines the needed components and their parameters, e.g.:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rec {
  lib1 = import package1/default.nix { };
  program2 = import package2/default.nix { inherit lib1; };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This turns all dependencies into a dependency graph, and as long as it is acyclic, Nix can build all of them. This set can be abstracted with the callPackage function. This is how it is done in the Nix Packages Collection, in the amazing file all-packages.nix.&lt;/p&gt;

&lt;p&gt;This file is queried implicitly when we install a package in the form:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;nix-env -i hello
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is the equivalent of:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;nix-env -f .../all-packages.nix -i hello
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Both will build and install hello. Nix will represent all the dependencies as a graph and build them as needed. It’s important to note that Nix is lazy: parameters are not evaluated until called, which means that dependencies will not be built until (or unless) needed.&lt;/p&gt;
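&lt;p&gt;In JavaScript terms (again a sketch, not Nix internals), this laziness is like wrapping each dependency in a memoized thunk that is only forced when something reaches it:&lt;/p&gt;

```javascript
// Sketch of Nix's laziness: each dependency is a memoized thunk that is
// only "built" when something actually forces it.
const built = [];

const thunk = (name, build) => {
  let cached;
  return () => {
    if (cached === undefined) {
      built.push(name); // record that this build actually ran
      cached = build();
    }
    return cached;
  };
};

const lib1 = thunk("lib1", () => "lib1-output");
const heavyDep = thunk("heavyDep", () => "heavy-output"); // never forced

const program = thunk("program", () => "linked with " + lib1());

console.log(program()); // forces program and lib1, but never heavyDep
console.log(built);     // only program and lib1 were built
```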

&lt;p&gt;The file for all-packages can be changed using the nix-channel command. Channels are sorted by stability status.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Can I Install a Specific Version of a Package with Nix?
&lt;/h2&gt;

&lt;p&gt;The Nixpkgs repository includes the latest versions of packages (according to the selected stability branch). Packages depend on each other and are built as a whole. To pin a specific version of a dependency, you must switch to a different revision of Nixpkgs altogether. A great utility to reverse-search a Nixpkgs revision according to a package’s version is Lazamar’s Nix Package Search.&lt;/p&gt;

&lt;p&gt;It’s best practice to always pin your build dependencies to a specific revision of Nixpkgs, for consistency (as you’d do with Docker), and to update to the latest version of Nixpkgs on Nix-env, according to your selected Nix-channel (as you’d do with Homebrew).&lt;/p&gt;
&lt;h2&gt;
  
  
  Other Nix Tools
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;    NixOS — using the primitives listed above, builds and configures an entire Linux distribution. The whole of NixOS is defined inside the Nixpkgs repository, which is incredible.&lt;/li&gt;
&lt;li&gt;    NixOps — related to Cloud deployment, deploys NixOS system configurations to remote machines, as well as provisions cloud resources.&lt;/li&gt;
&lt;li&gt;    Hydra — CI tool that periodically checks out the source code of a project, builds it, tests it, and produces reports for developers. Hydra is used to check the stability status of the Nix channels.&lt;/li&gt;
&lt;li&gt;    Flakes — an upcoming feature that will remove much of the hassle of pinning dependencies with syntactic sugar. Each dependency’s commit hash will be stored inside a flake.lock file. This is intuitive for NPM/Yarn or Cargo users.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  So, Why Not Docker?
&lt;/h2&gt;

&lt;p&gt;Nix and container engines such as Docker are two very different tools. One is a package and build manager, the other is a resource isolation mechanism that virtualizes the host’s operating system. Both have great caching mechanisms behind them, and both can be used for consistent environments on Linux machines. See below for how Replit migrated from Docker to Nix.&lt;/p&gt;

&lt;p&gt;The main abstraction of Docker is the Container: a loosely isolated, lightweight, portable, and encapsulated environment that contains everything needed to run the application. The container — which is runnable — is described by a read-only Image. The image is created by a Dockerfile, where each directive creates a separate Layer, tagged by its cryptographic hash and cached.&lt;/p&gt;

&lt;p&gt;Like layers, images can be built one on top of the other and vertically stacked, e.g., the official Node image is built on top of the tiny Alpine Linux image. Your node app would probably be stacked on top of the node image.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vypd5d7wr9brlvljsmk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vypd5d7wr9brlvljsmk.png" alt="Layers of Docker node image"&gt;&lt;/a&gt;&lt;br&gt;
Layers of Docker node image (node:slim) from Docker Hub&lt;/p&gt;

&lt;p&gt;Docker defines the implementation of an image or a layer in terms of another, its parent. Nix instead creates new functionality by assembling or composing dependencies: Nix requires dependencies to be explicit, and these dependencies are black-boxed and consumed through their interfaces.&lt;/p&gt;

&lt;p&gt;However, Dockerfiles don’t have to be linear. Multi-stage builds introduce a new abstraction: the stage. Docker’s new BuildKit traverses stages from the bottom (of the target stage) to top in a graph data structure, skipping unneeded ones, and building stages concurrently where applicable.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F639mkf42w54kmftaeelo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F639mkf42w54kmftaeelo.png" alt="Graph of BuildKit’s Multi-stage build"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Graph of BuildKit’s Multi-stage build, starting from the bottom (the target stage) to the top, discarding unneeded stages. From ‘Dockerfile Best Practices’ talk: &lt;a href="https://youtu.be/JofsaZ3H1qM?t=1169" rel="noopener noreferrer"&gt;https://youtu.be/JofsaZ3H1qM?t=1169&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Favor Composition Over Inheritance
&lt;/h2&gt;

&lt;p&gt;It’s difficult to change layers in Docker, as we’re not sure what each component does or how it will affect the layers beneath. Developers are also disincentivized from changing early layers, as doing so invalidates the cache of every layer that follows in the Dockerfile. Without BuildKit this is a performance bottleneck too, as the classic builder builds stages in sequence, and unneeded stages are built and then discarded.&lt;/p&gt;

&lt;p&gt;Docker has a great advantage which is immediately familiar to developers and ops alike. Nix originated as a Ph.D. thesis and it sometimes feels like that. But a design that doesn’t take change into account risks major redesign in the future. Docker hashes machine states, Nix hashes the precise components of a build. As explained earlier, the two tools serve different purposes.&lt;/p&gt;

&lt;p&gt;In our case, we were building a library for a client app, so there was no need to ship a machine container as would’ve been the case when developing a Node microservice in Kubernetes. We just needed to share a consistent build environment to create replicable builds. Furthermore, with nix-shell, we can still use our local XCode and the rest of macOS’s walled garden for our tvOS and iOS builds.&lt;/p&gt;
&lt;h2&gt;
  
  
  The Case of Replit
&lt;/h2&gt;

&lt;p&gt;Replit is a collaborative in-browser IDE with support for a huge number of languages. Replit started with a separate Docker image for each language, but concluded that it was simpler and more efficient to use a single monolithic image: Polygott. This in turn became a huge burden to maintain; in their own words, “every new package creates a new exciting way things can break.”&lt;/p&gt;

&lt;p&gt;With Nix, Replit users themselves can define infinite combinations of sandboxed environments without the need to maintain a monolithic Docker image. Each machine has /nix/store (with all the binaries cached) mounted, so the instantiation of their environment is immediate.&lt;/p&gt;
&lt;h2&gt;
  
  
  How Does it Compare with Homebrew?
&lt;/h2&gt;

&lt;p&gt;Homebrew is an incredible tool which has become second nature for most macOS users. Installations work out of the box, and it is intuitive to use.&lt;/p&gt;

&lt;p&gt;Like Nix, Homebrew builds from source unless it finds a “bottle,” that is, a pre-built binary. Similarly — and for the same reason — Homebrew has to be installed into a default path (/opt/homebrew on Apple Silicon or /usr/local on Intel) to enjoy pre-built binaries. This folder is referred to as the cellar.&lt;/p&gt;

&lt;p&gt;Homebrew uses Ruby for its formulae; a formula provides the instructions and metadata Homebrew needs to install a piece of software. A formula is defined as a class that inherits from Formula. This follows the object-oriented paradigm, unlike the functional Nix derivations, which are defined with a function.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class Wget &amp;lt; Formula
  homepage "https://www.gnu.org/software/wget/"
  url "https://ftp.gnu.org/gnu/wget/wget-1.15.tar.gz"
  sha256 "52126be8cf1bddd7536886e74c053ad7d0ed2aa89b4b630f76785bac21695fcd"

  def install
    system "./configure", "--prefix=#{prefix}"
    system "make", "install"
  end
end
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Homebrew can be used on Linux (formerly Linuxbrew), although Linux distributions often have their own popular package managers. Similar to nix-channels, brew uses “Taps,” which are third-party repositories.&lt;/p&gt;

&lt;p&gt;The immense popularity of Homebrew on the Mac gives it an edge over Nix, despite Nix’s build reliability and thoughtful dependency graph. Most installations are pre-built and “just work.”&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;From a marketing perspective, I find that Nix lacks branding and distinctive names for its tools (except for Hydra and Flakes), which makes it difficult to search for documentation. Nix has merged the Nix and NixOS documentation, so trivial beginner searches about nix-env easily lead to solutions involving configuration.nix, which is only applicable to NixOS.&lt;/p&gt;

&lt;p&gt;The use of /nix/store is a bit unconventional on the part of Nix, as it breaks the FHS guidelines; it would arguably have been more appropriate to put it somewhere under /var. I don’t think macOS follows the FHS, but the root (/) level is now read-only in macOS, and the Nix maintainers had to find workarounds.&lt;/p&gt;

&lt;p&gt;Nix isn’t as intuitive as other build tools, but it excels at correctness. It aims for the rigor of science and shows the hard work of academia. It has been embraced by the communities of functional languages such as Haskell, and NixOS has piqued the interest of the entire Linux community.&lt;/p&gt;

</description>
      <category>reactnative</category>
      <category>javascript</category>
      <category>devops</category>
      <category>docker</category>
    </item>
    <item>
      <title>How I Switched from TypeScript to ReScript</title>
      <dc:creator>Ronen Lahat</dc:creator>
      <pubDate>Wed, 20 Jan 2021 19:59:02 +0000</pubDate>
      <link>https://dev.to/ronenl/how-i-switched-from-typescript-to-rescript-1g34</link>
      <guid>https://dev.to/ronenl/how-i-switched-from-typescript-to-rescript-1g34</guid>
      <description>&lt;p&gt;&lt;em&gt;A glimpse into a more civilized (yet challenging) tool in the JavaScript ecosystem&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Article originally published at &lt;a href="https://medium.com/att-israel/how-i-switched-from-typescript-to-rescript-637aa5ef8d3" rel="noopener noreferrer"&gt;Medium&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;This is not evangelism of ReScript or a one-to-one comparison with TypeScript. I love TypeScript. I decided to rewrite a small TypeScript+React+Jest side project into ReScript.&lt;/p&gt;

&lt;p&gt;ReScript is not new. In a way it’s as old as JavaScript itself. ReScript is a rebranding of ReasonML (&lt;em&gt;Facebook&lt;/em&gt;) and BuckleScript (&lt;em&gt;Bloomberg&lt;/em&gt;), which wrap &lt;a href="https://ocaml.org/" rel="noopener noreferrer"&gt;OCaml&lt;/a&gt; on both ends. The former is an interface of the OCaml syntax, while the latter makes sure to compile the AST into JavaScript. ReasonML was created by Jordan Walke, the creator of React. &lt;a href="https://reasonml.github.io/" rel="noopener noreferrer"&gt;ReasonML&lt;/a&gt; still exists as a parallel project to ReScript, with a slightly different syntax and mission.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F1156%2F1%2AyRmSBblfxfX81txBJYTcEg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F1156%2F1%2AyRmSBblfxfX81txBJYTcEg.png" alt="Image for post"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;ReScript syntax compiling into OCaml Abstract-Syntax-Tree, and BuckleScript compiling into readable, optimized JavaScript&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;ReScript is not just a rebranding: it’s a ReasonML which freed itself of the yoke of the OCaml ecosystem. By doing so, it forfeited compilation to native code and OCaml library interop, but gained a freer syntax which further resembles JavaScript to embrace its developers, eager for better tools.&lt;/p&gt;

&lt;h1&gt;
  
  
  First Impression
&lt;/h1&gt;

&lt;p&gt;My first attempt was to just install ReScript on my project, start the watcher, rename an easy file into &lt;code&gt;.res&lt;/code&gt; and be guided by the errors. I immediately learned that refactoring into ReScript is not “breadth-first” but “depth-first.” Simply renaming the file extension won’t work, as the compiler stops completely at type errors.&lt;/p&gt;

&lt;p&gt;In TypeScript one can gradually assign types and interfaces to dynamic types, while tagging some as &lt;code&gt;unknown&lt;/code&gt; or &lt;code&gt;any&lt;/code&gt;. Depth-first means that you start with one small function, or one small React component, and write it properly. If all the types are right — and with mathematical precision — your code will compile into JavaScript.&lt;/p&gt;

&lt;p&gt;While TypeScript often transpiles into unreadable code, it’s good practice to keep an open tab on the auto-generated &lt;code&gt;js&lt;/code&gt; file from ReScript. You’ll be pleasantly surprised by the speed of transpilation, the conciseness and readability of the code, and the performance of such code. If the ReScript code compiled, it means its types are safe and sound, so it can optimize away all the noise.&lt;/p&gt;

&lt;p&gt;The only exception I saw to readability and performance of the generated JavaScript was in curried functions. All functions in ReScript are curried by default, and some of them generate code which imports a Currying library. This didn’t happen often, and currying can be disabled.&lt;/p&gt;
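&lt;p&gt;In plain JavaScript terms, a curried function takes its arguments one at a time — a sketch, not ReScript’s generated output, which usually optimizes the currying away:&lt;/p&gt;

```javascript
// Didactic sketch: a curried add takes its arguments one at a time,
// which is what every ReScript function does by default.
const add = (a) => (b) => a + b;

const addFive = add(5);  // partial application: waits for the second argument
console.log(addFive(3)); // 8
console.log(add(1)(2));  // 3
```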

&lt;p&gt;But what about TypeScript? Inter-operation with JavaScript code is trivial, but importing and exporting types from TypeScript (or Flow) can be more complex, and it creates two sources of truth: one for ReScript types and another for TypeScript.&lt;/p&gt;

&lt;p&gt;GenType, described below, auto-generates a typed &lt;code&gt;tsx&lt;/code&gt; file from your ReScript code which you can import into other modules. This helped for exporting ReScript types, but it’s not possible to import TypeScript ones. The automation of type conversions &lt;a href="https://www.javierchavarri.com/adopting-reason-strategies-dual-sources-of-truth-and-why-gentype-is-a-big-deal/" rel="noopener noreferrer"&gt;eased the problem of the two sources of truth.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Furthermore, the generated ts code uses CommonJs &lt;code&gt;require&lt;/code&gt; syntax, which breaks when using native ECMAScript module support. I also had to tweak my &lt;code&gt;tsc&lt;/code&gt; config to not transpile the auto-generated &lt;code&gt;tsx&lt;/code&gt; into a fourth (!) source file:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;.res&lt;/code&gt; ReScript source code.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;.bs.js&lt;/code&gt; compiled JavaScript, which you can ignore in your source control&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;.gen.tsx&lt;/code&gt; auto-generated by GenType, which import the compiled JavaScript code and re-export it with proper types. Also add to your &lt;code&gt;.gitignore&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;.gen.jsx&lt;/code&gt; accidentally transpiled by TypeScript, delete it and reconfigure your &lt;code&gt;tsconfig.json&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I first rewrote my algorithms, since they didn’t have any third-party imports to inter-operate with, and the import syntax was daunting for me at first. Some teams go for a &lt;a href="https://www.javierchavarri.com/adopting-reason-strategies-dual-sources-of-truth-and-why-gentype-is-a-big-deal/" rel="noopener noreferrer"&gt;data-first strategy, or a UI-first one&lt;/a&gt; (as Facebook did in 2017 for Messenger.com, &lt;a href="https://reasonml.github.io/blog/2017/09/08/messenger-50-reason.html" rel="noopener noreferrer"&gt;rewriting 50% of the codebase&lt;/a&gt;).&lt;/p&gt;

&lt;h1&gt;
  
  
  Types
&lt;/h1&gt;

&lt;p&gt;ReScript is part of the &lt;strong&gt;statically typed functional programming language family&lt;/strong&gt;, which means it’s not compiling. Just kidding, it means it uses the Hindley-Milner type algorithm, which deduces types with 100% certainty and can prove it mathematically, as long as your variables are immutable (and a few other language design choices hold). TypeScript, on the other hand, tries its best to find a common type for all your usages.&lt;/p&gt;

&lt;p&gt;This might blow your mind as a TypeScript user, but the following ReScript function is fully statically typed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let add = (a, b) =&amp;gt; a + b
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;ReScript knows with provable certainty that &lt;code&gt;a&lt;/code&gt; and &lt;code&gt;b&lt;/code&gt; are both &lt;code&gt;int&lt;/code&gt; and that the function returns an &lt;code&gt;int&lt;/code&gt;. This is because the &lt;code&gt;+&lt;/code&gt; operator only works on two &lt;code&gt;int&lt;/code&gt; and returns an &lt;code&gt;int&lt;/code&gt;. To concatenate two strings you’d use &lt;code&gt;++&lt;/code&gt; and for two floats use &lt;code&gt;+.&lt;/code&gt;. To combine two different types you need to convert either of them. Also, no semicolons.&lt;/p&gt;

&lt;p&gt;If you’re like me and like to type your code as you prototype, you can do so as you’d expect:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let add = (a: int, b: int): int =&amp;gt; a + b
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The generated JavaScript code in both cases is the same (ReScript v8.4.2):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;'use strict';
function add(a, b) {  
    return a + b | 0;  
}
exports.add = add;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice how I didn’t specify any module exports, yet the resulting code did. This shows how everything in the module/file is exported by default. The JavaScript function itself is not type safe, so importing it in a JavaScript module and using it there won’t have all the advantages of ReScript.&lt;/p&gt;

&lt;p&gt;You can try it for yourself in the official &lt;a href="https://rescript-lang.org/try?code=DYUwLgBAhgJjEF4IAooBoICMCUiB80EA1FkA" rel="noopener noreferrer"&gt;playground&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Generating TypeScript
&lt;/h1&gt;

&lt;p&gt;To interoperate with TypeScript with proper type information you’ll use third-party &lt;a href="https://github.com/reason-association/genType" rel="noopener noreferrer"&gt;genType&lt;/a&gt;. Add it as a devDependency and annotate the module export you want to generate with &lt;code&gt;@genType&lt;/code&gt; (in previous versions you’d surround annotations with square brackets).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// MyModule.res
@genType  
let add = (a,b) =&amp;gt; a + b
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will result in the following TypeScript. Notice how the generated TypeScript imports the generated JavaScript &lt;code&gt;MyModule.bs.js&lt;/code&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// MyModule.gen.tsx
const MyModuleBS = require('./MyModule.bs');
export const add: (_1:number, _2:number) =&amp;gt; number = MyModuleBS.add;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;GenType generates a one-line re-export of your generated &lt;code&gt;.bs.js&lt;/code&gt; file, with proper TypeScript typing. From this example you’ll notice two more things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Every file is a module.&lt;/li&gt;
&lt;li&gt;  Everything is exported.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here’s an &lt;a href="https://github.com/reason-association/genType/tree/master/examples/typescript-react-example" rel="noopener noreferrer"&gt;example&lt;/a&gt; repo genTyping to TypeScript with React.&lt;/p&gt;

&lt;p&gt;For using TypeScript types, see “Importing TypeScript Types” below.&lt;/p&gt;

&lt;h1&gt;
  
  
  Records
&lt;/h1&gt;

&lt;p&gt;There is only one type which does need a type declaration: the &lt;a href="https://rescript-lang.org/docs/manual/latest/record" rel="noopener noreferrer"&gt;record&lt;/a&gt; type. A type declaration looks like this and produces no JavaScript code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;type student = {  
  age: int,  
  name: string  
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Types must begin with a lowercase letter! If we prepend it with &lt;code&gt;@genType&lt;/code&gt;, the generated TypeScript will look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// tslint:disable-next-line:interface-over-type-literal_  
export type student = {  
    readonly age: number;  
    readonly name: string  
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you’re wincing at the lower-cased type breaking all your conventions you can rename the type on conversion with &lt;code&gt;@genType.as("Student")&lt;/code&gt;. This will add another line of code below the previous one:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export type Student = student;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It also includes a &lt;code&gt;tslint&lt;/code&gt; ignore line; I hope they switch it to &lt;code&gt;eslint&lt;/code&gt; soon, as the former is deprecated.&lt;/p&gt;

&lt;p&gt;These are record types, not ReScript &lt;a href="https://rescript-lang.org/docs/manual/latest/object" rel="noopener noreferrer"&gt;objects&lt;/a&gt; (don’t misuse the &lt;code&gt;string&lt;/code&gt; type on them). As soon as you type something like &lt;code&gt;foo.age&lt;/code&gt;, ReScript will know that &lt;code&gt;foo&lt;/code&gt; is of type &lt;code&gt;student&lt;/code&gt;. If another record also has an &lt;code&gt;age&lt;/code&gt; field, it will infer the last one declared; in that case you might want to annotate the type explicitly.&lt;/p&gt;

&lt;p&gt;If you don’t want that much ceremony, you can use the &lt;code&gt;object&lt;/code&gt; type and index it with a string: &lt;code&gt;student["age"]&lt;/code&gt;; then you don’t need to declare a type.&lt;/p&gt;

&lt;p&gt;Furthermore, you can use &lt;code&gt;student&lt;/code&gt; as a variable name, so &lt;code&gt;student.age&lt;/code&gt; is a valid expression; TypeScript would scream at something like this. Variables (that is, bindings) and types live in separate namespaces, so a &lt;code&gt;student&lt;/code&gt; of type &lt;code&gt;student&lt;/code&gt; can be written as &lt;code&gt;student: student&lt;/code&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Nominal Typing
&lt;/h1&gt;

&lt;p&gt;Record types have “&lt;em&gt;nominal typing&lt;/em&gt;” similar to Java or C#, as opposed to TypeScript’s “&lt;em&gt;structural typing.&lt;/em&gt;” This is why interfaces are so important in TypeScript, and are used much more than Types. TypeScript doesn’t really care about “&lt;em&gt;what you are&lt;/em&gt;”, it cares about “&lt;em&gt;how you look&lt;/em&gt;.”&lt;/p&gt;

&lt;p&gt;For instance, if there’s another type, say, &lt;code&gt;teacher&lt;/code&gt; with the same fields of a &lt;code&gt;student&lt;/code&gt;, you cannot assign a &lt;code&gt;student&lt;/code&gt; to somewhere expecting a &lt;code&gt;teacher&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// defined first  
type student = {  
  age: int,  
  name: string  
}

// defined last  
type teacher = {  
    age: int,  
    name: string  
}

// t is a teacher  
let t = {  
    age: 35,  
    name: "Ronen"  
}

let s: student = t // Error!
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You’d get a colored error saying:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;We've found a bug for you!
//...This has type: teacher
Somewhere wanted: student  
FAILED: cannot make progress due to previous errors.  
&amp;gt;&amp;gt;&amp;gt;&amp;gt; Finish compiling(exit: 1)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Unlike TypeScript’s &lt;code&gt;tsc&lt;/code&gt; compiler, &lt;code&gt;bsb&lt;/code&gt; won’t begrudgingly continue its transpilation work into working JavaScript. It will stop with a non-zero exit code, and you have to fix the issue in order to make any progress.&lt;/p&gt;

&lt;h1&gt;
  
  
  Optionals
&lt;/h1&gt;

&lt;p&gt;One of the features I like most in modern TypeScript (or future JavaScript) is optionals. They make working with nullable types easy and concise:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;const something: string = foo?.bar?.baz ?? "default";&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;something&lt;/code&gt; will be the content of &lt;code&gt;baz&lt;/code&gt; if it reached that far, or be &lt;code&gt;"default"&lt;/code&gt;.&lt;/p&gt;
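&lt;p&gt;Spelled out in plain JavaScript, the two operators behave like this (nothing ReScript-specific; &lt;code&gt;foo2&lt;/code&gt; is just an illustrative second value):&lt;/p&gt;

```javascript
// `?.` short-circuits to undefined on a missing property;
// `??` supplies a fallback only for null/undefined.
const foo = { bar: {} }; // baz is missing
const something = foo?.bar?.baz ?? "default";
console.log(something); // "default"

const foo2 = { bar: { baz: "hello" } };
console.log(foo2?.bar?.baz ?? "default"); // "hello"
```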

&lt;p&gt;There are no &lt;code&gt;null&lt;/code&gt; or &lt;code&gt;undefined&lt;/code&gt; in ReScript, but we can work with nullable values using the &lt;a href="https://rescript-lang.org/docs/manual/latest/null-undefined-option" rel="noopener noreferrer"&gt;option&lt;/a&gt; Variant. But how can we get the elegance of the above TypeScript code? I tried to answer this question, but currently we can’t. Not enough sugar.&lt;/p&gt;

&lt;p&gt;As with other functional languages, we can use a myriad of interesting library functions. Some of &lt;a href="https://rescript-lang.org/docs/manual/latest/api/belt" rel="noopener noreferrer"&gt;Belt utility functions&lt;/a&gt; are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;Belt.Option.map&lt;/code&gt; will execute a function on the optional value if it exists, or return &lt;code&gt;None&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;Belt.Option.getWithDefault&lt;/code&gt; will return a default if the optional is &lt;code&gt;None&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;Belt.Array.keepMap&lt;/code&gt; will trim away all &lt;code&gt;None&lt;/code&gt; values from an array.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But for this case, the best option is with &lt;a href="https://rescript-lang.org/docs/manual/latest/pattern-matching-destructuring" rel="noopener noreferrer"&gt;Pattern Matching&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let baz = switch foo {  
   | Some({ bar: Some({ baz: baz })}) =&amp;gt; baz  
   | None =&amp;gt; None  
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There isn’t yet a sugared syntax for optionals; the optional operators are very new to TypeScript as well.&lt;/p&gt;

&lt;p&gt;The important quality of pattern matching is that the compiler will complain if there’s any case — no matter how deeply nested — you haven’t addressed. It’s best practice for most cases.&lt;/p&gt;

&lt;h1&gt;
  
  
  Pipes
&lt;/h1&gt;

&lt;p&gt;Pipes are great. They compile this code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;person  
  -&amp;gt;parseData  
  -&amp;gt;getAge  
  -&amp;gt;validateAge
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Into this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;validateAge(getAge(parseData(person)));
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Previous versions used a triangle operator &lt;code&gt;|&amp;gt;&lt;/code&gt;. The difference is in where to shove the data: as the first parameter, as the arrow does, or as the last parameter, as the deprecated triangle does. &lt;a href="https://www.javierchavarri.com/data-first-and-data-last-a-comparison/" rel="noopener noreferrer"&gt;More about this.&lt;/a&gt;&lt;/p&gt;
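&lt;p&gt;A plain JavaScript sketch of the two conventions (the &lt;code&gt;mapFirst&lt;/code&gt;/&lt;code&gt;mapLast&lt;/code&gt; helpers are hypothetical illustrations, not real APIs) shows where the piped value lands once a function takes more than one argument:&lt;/p&gt;

```javascript
// Data-first vs data-last: the difference is where the piped value is
// inserted when the function takes additional arguments.
const mapFirst = (data, fn) => data.map(fn); // data-first, like Belt
const mapLast = (fn, data) => data.map(fn);  // data-last, like OCaml lists

const double = (x) => x * 2;

// myArray->Belt.Array.map(double) pipes into the FIRST parameter:
console.log(mapFirst([1, 2, 3], double)); // [2, 4, 6]

// myArray |> List.map(double) would pipe into the LAST parameter:
console.log(mapLast(double, [1, 2, 3])); // [2, 4, 6]
```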

&lt;p&gt;Notice that in the case of a one-parameter function we don’t write the &lt;code&gt;unit&lt;/code&gt;, that is &lt;code&gt;()&lt;/code&gt;. This is a common beginner’s mistake. In the case of multiple parameters, the value gets passed as the first one and the other parameters begin with the second one.&lt;/p&gt;

&lt;p&gt;This is especially important in a functional language, since we lose some of the elegance of calling methods in objects.&lt;/p&gt;

&lt;p&gt;What would be a JavaScript method call such as map:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;myArray.map(value =&amp;gt; console.log(value));
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Has to be written functionally in ReScript as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Belt.Array.map(myArray, value =&amp;gt; Js.log(value))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;But can be rewritten as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;myArray -&amp;gt; Belt.Array.map(value =&amp;gt; Js.log(value))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As a newcomer I try to find a use for it anywhere I can, which can lead to the bad practice of rewriting code around it to impress my coworkers. To use it on JavaScript libraries you’ll have to &lt;a href="https://rescript-lang.org/docs/manual/latest/pipe#js-method-chaining" rel="noopener noreferrer"&gt;write the correct bindings&lt;/a&gt; for them. This is one thing I’d like to see in JavaScript. &lt;a href="https://github.com/tc39/proposal-pipeline-operator/wiki" rel="noopener noreferrer"&gt;Here are a few stage-1 proposals&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;By the way, if you’re not using &lt;a href="https://github.com/tonsky/FiraCode" rel="noopener noreferrer"&gt;Fira Code&lt;/a&gt; then you’re missing out on a lot of the pipe’s aesthetics.&lt;/p&gt;

&lt;h1&gt;
  
  
  Promises
&lt;/h1&gt;

&lt;p&gt;This was very frustrating for me. I love using the modern &lt;code&gt;async&lt;/code&gt; and &lt;code&gt;await&lt;/code&gt; syntax in my code, which ReScript hadn’t implemented yet. I had to go back to thinking in terms of &lt;code&gt;then&lt;/code&gt; and &lt;code&gt;resolve&lt;/code&gt;, which made simple code look complex.&lt;/p&gt;

&lt;p&gt;The following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const getName = async (id: number): Promise&amp;lt;string&amp;gt; =&amp;gt; {  
    const user = await fetchUser(id);  
    return user.name;  
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Is de-sugared into:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const getName = async (id: number): Promise&amp;lt;string&amp;gt; =&amp;gt;   
    fetchUser(id).then(user =&amp;gt; user.name);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now consider &lt;code&gt;then&lt;/code&gt; to be a function in the &lt;code&gt;Js.Promise&lt;/code&gt; module instead of a method, one which accepts &lt;code&gt;fetchUser(id)&lt;/code&gt; as its last parameter, and you can write it like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let getName = (id) =&amp;gt;  
    Js.Promise.then_(  
        user =&amp;gt; Js.Promise.resolve(user.name),  
        fetchUser(id))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Typed as &lt;code&gt;Js.Promise.t&amp;lt;string&amp;gt;&lt;/code&gt;, and with arrow pipe syntax for readability, the above function can be written as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let getName = (id): Js.Promise.t&amp;lt;string&amp;gt; =&amp;gt;  
    fetchUser(id) |&amp;gt; Js.Promise.then_(  
        user =&amp;gt; Js.Promise.resolve(user.name))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Promise library still uses the old convention of passing the data as the last argument, so in order to use the newer arrow pipe (&lt;code&gt;-&amp;gt;&lt;/code&gt;), an underscore placeholder has to be placed in the data’s position.&lt;/p&gt;
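&lt;p&gt;The data-last shape of &lt;code&gt;Js.Promise.then_&lt;/code&gt; can be mimicked in TypeScript; a minimal sketch (the &lt;code&gt;then_&lt;/code&gt;, &lt;code&gt;fetchUser&lt;/code&gt;, and &lt;code&gt;getName&lt;/code&gt; names here are illustrative, not the real API):&lt;/p&gt;

```typescript
// A curried, data-last "then": the callback comes first and the
// promise comes last, mirroring the argument order of Js.Promise.then_.
const then_ = (fn: (value: any) => any) => (promise: any) => promise.then(fn);

// A stubbed fetchUser for the sketch.
const fetchUser = (id: number) => Promise.resolve({ id, name: "Ada" });

// Equivalent of: fetchUser(id) |> Js.Promise.then_(user => resolve(user.name))
const getName = (id: number) => then_((user: any) => user.name)(fetchUser(id));
```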

&lt;p&gt;Here are &lt;a href="https://gist.github.com/lilactown/a8d1dc6aa2043efa62b23e559291053e" rel="noopener noreferrer"&gt;examples&lt;/a&gt; for Promises written in the (almost-identical) ReasonML syntax.&lt;/p&gt;

&lt;p&gt;The ReScript team &lt;a href="https://rescript-lang.org/docs/manual/latest/promise" rel="noopener noreferrer"&gt;promised&lt;/a&gt; (no pun intended) to implement a Promise API revamp with their own &lt;code&gt;async&lt;/code&gt; and &lt;code&gt;await&lt;/code&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Import JavaScript Modules
&lt;/h1&gt;

&lt;p&gt;If you’re writing only in ReScript you don’t need to bother with imports or exports; this is handled under the hood. Every file is a module and everything in it is exported. If you only want specific things exported, you do so with an interface file. To import JavaScript modules, however, the syntax can get complicated.&lt;/p&gt;
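&lt;p&gt;For example, a hypothetical &lt;code&gt;Math.res&lt;/code&gt; module could hide a helper by listing only &lt;code&gt;add&lt;/code&gt; in its interface file (the file and binding names here are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Math.res: every binding here is a module member
let double = x =&amp;gt; x * 2
let add = (a, b) =&amp;gt; a + b

// Math.resi: only bindings listed here are exported,
// so "double" stays private to the module
let add: (int, int) =&amp;gt; int
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;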

&lt;p&gt;To import &lt;code&gt;dirname&lt;/code&gt; from the &lt;code&gt;path&lt;/code&gt; module, you’d write:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@bs.module("path") external dirname: string =&amp;gt; string = "dirname"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F2402%2F1%2AThMyXVEUdIQOQ5LGARvwHg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F2402%2F1%2AThMyXVEUdIQOQ5LGARvwHg.png" alt="Image for post"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;the elements of an import from JavaScript files&lt;/p&gt;

&lt;p&gt;Then use it accordingly:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let root = dirname("/User/github") // returns "User"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For ReasonReact this became particularly tiresome, as I had to define inline modules for each React Component, and reexport the default export as the “make” function, paying attention to named parameters such as “children.” Here I imported the Container from &lt;a href="https://react-bootstrap.github.io/" rel="noopener noreferrer"&gt;react-bootstrap&lt;/a&gt; and used it in ReasonReact:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module Container = {  
    @bs.module("react-bootstrap/Container")  
    @react.component  
    external make: (~children: React.element) =&amp;gt; React.element = "default"  
}
@react.component  
let make = () =&amp;gt; &amp;lt;Container&amp;gt; ...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Redex
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F2736%2F1%2AcFbCBs3cNQnvsGsh12Nlpg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F2736%2F1%2AcFbCBs3cNQnvsGsh12Nlpg.png" alt="REDEX: Reason Package Index"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For this case I can get the bindings from &lt;a href="https://redex.github.io/" rel="noopener noreferrer"&gt;redex,&lt;/a&gt; and add it as a dependency both to my &lt;code&gt;package.json&lt;/code&gt; and my &lt;code&gt;bsconfig.json&lt;/code&gt;. I can then import it with &lt;code&gt;open ReactBootstrap&lt;/code&gt; at the top of my file. This is similar to &lt;a href="https://definitelytyped.org/" rel="noopener noreferrer"&gt;DefinitelyTyped,&lt;/a&gt; where you can find high-quality type definitions for TypeScript.&lt;/p&gt;

&lt;p&gt;For this case however I ran into an error, as the package I needed was not updated to the latest version. I had to fork it and &lt;a href="https://dev.to/erykpiast/reasonreact-migration-guide-to-jsx-3-and-hooks-37op"&gt;manually update&lt;/a&gt; it to &lt;code&gt;react-jsx&lt;/code&gt; version 3.&lt;/p&gt;

&lt;h1&gt;
  
  
  Importing TypeScript Types
&lt;/h1&gt;

&lt;p&gt;You can’t import a type from TypeScript and use it in ReScript, you have to re-declare it. However, you can link the type you created to the original TypeScript one for correct inter-operation. Here’s an example with Node.js’ &lt;code&gt;fs&lt;/code&gt; module:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@genType.import(("fs", "Dirent"))  
type dirent
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice that I passed a &lt;code&gt;tuple&lt;/code&gt; to &lt;code&gt;import&lt;/code&gt;, not an argument list. This will link my type &lt;code&gt;dirent&lt;/code&gt; to &lt;code&gt;fs.Dirent&lt;/code&gt;, and will generate the following TypeScript:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import {Dirent as $$dirent} from 'fs';_// tslint:disable-next-line:interface-over-type-literal_  
export type dirent = $$dirent;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can declare the entire type, in case you need to use its properties, or leave it as is.&lt;/p&gt;

&lt;p&gt;Because of the syntax overhead of TypeScript-ReScript inter-operation, I recommend doing it as little as possible, using each language in separate areas of your app.&lt;/p&gt;

&lt;h1&gt;
  
  
  ReasonReact
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F1376%2F1%2A7i6JxB1WnC6d7kGojw98NQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F1376%2F1%2A7i6JxB1WnC6d7kGojw98NQ.png" alt="Image for post"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;ReasonML (now ReScript) was created by Jordan Walke, the creator of React. Reason+React pushes the React philosophy further by utilizing the language syntax and features for ReactJS’s programming patterns.&lt;/p&gt;

&lt;p&gt;ReasonReact provides smooth JS interop and uses built-in language features to integrate into UI framework patterns left unaddressed by ReactJS, such as routing and data management. Using them feels like “&lt;em&gt;just using Reason.&lt;/em&gt;”&lt;/p&gt;

&lt;p&gt;The documentation for ReasonReact still uses the old syntax, so things like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[@react.component]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Needs to be changed into:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@react.component
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you want to use the old syntax, just change the file extension to &lt;code&gt;.re&lt;/code&gt; instead of &lt;code&gt;.res&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;ReasonReact is stricter than ReactJS, mainly in its use of types (e.g., strings need to be wrapped with &lt;code&gt;React.string()&lt;/code&gt; in JSX). Beyond that, &lt;code&gt;React.useState&lt;/code&gt; returns a proper &lt;code&gt;tuple&lt;/code&gt; instead of an array, the way it was originally intended. Finally, React components are rendered through a &lt;code&gt;make&lt;/code&gt; function, annotated with &lt;code&gt;@react.component&lt;/code&gt; (I added &lt;code&gt;@genType&lt;/code&gt; as well for TypeScript generation).&lt;/p&gt;
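&lt;p&gt;A minimal sketch of such a component (a hypothetical &lt;code&gt;Demo.res&lt;/code&gt; with a &lt;code&gt;name&lt;/code&gt; prop; illustrative, not the original code):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Demo.res: a hypothetical component rendered through "make"
@genType
@react.component
let make = (~name: string) =&amp;gt; {
  &amp;lt;div&amp;gt; {React.string("Hello, " ++ name)} &amp;lt;/div&amp;gt;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;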

&lt;p&gt;For the example, I imported this component into a React TypeScript file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// index.tsx
import { make as Demo } from "./pages/Demo.gen";
// ...&amp;lt;Demo name={"Foo"} /&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Which, when rendered, looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F420%2F1%2Ax7eXd21SMWeESmc9GHugjw.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F420%2F1%2Ax7eXd21SMWeESmc9GHugjw.gif" alt="Image for post"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In case we don’t want GenType for TypeScript generation, we just import &lt;code&gt;Demo.bs&lt;/code&gt; instead.&lt;/p&gt;

&lt;h1&gt;
  
  
  Testing
&lt;/h1&gt;

&lt;p&gt;In order to write tests in ReScript, and thus test your code directly, you can use &lt;a href="https://github.com/glennsl/bs-jest" rel="noopener noreferrer"&gt;bs-jest,&lt;/a&gt; which provides ReScript bindings to Jest. If you prefer, you can also use the slightly less mature &lt;a href="https://github.com/reasonml-community/bs-mocha" rel="noopener noreferrer"&gt;bs-mocha&lt;/a&gt;. You can also test the generated JavaScript or TypeScript files with no extra configuration.&lt;/p&gt;

&lt;p&gt;Since ReScript lives in the JavaScript ecosystem it makes little sense to create specialized testing tools for it, and the direction seems to be toward developing bindings for JavaScript testing tools.&lt;/p&gt;

&lt;p&gt;With bs-jest, you can’t name your file &lt;code&gt;foo.spec.res&lt;/code&gt;; it must be a valid module name, such as &lt;code&gt;foo_spec.res&lt;/code&gt;. Jest will run on the compiled folder, by default &lt;code&gt;lib/js&lt;/code&gt;. Also, assertions are not executed immediately; they are returned by the test function and run at the end of the suite. It’s a functional way to think about tests. Consequently, you can only write one assertion per test, which is best practice anyway.&lt;/p&gt;
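&lt;p&gt;A minimal spec file under these constraints might look like this (a sketch based on bs-jest’s documented API; the suite and test names are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// foo_spec.res: note the underscore in the module name
open Jest

describe("math", () =&amp;gt; {
  open Expect
  // The assertion is *returned*, not executed on the spot,
  // and runs at the end of the suite; one assertion per test.
  test("adds two numbers", () =&amp;gt; expect(1 + 1) |&amp;gt; toBe(2))
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;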

&lt;h1&gt;
  
  
  Tooling
&lt;/h1&gt;

&lt;p&gt;The ReScript devs did well to prioritize the &lt;a href="https://github.com/rescript-lang/rescript-vscode" rel="noopener noreferrer"&gt;plugin for VSCode,&lt;/a&gt; which works really well. With ReScript’s watcher running, you’ll see your type errors underlined in red, with a descriptive bubble on hover. You also get type hints, formatting, and jump-to-definition. There’s also official support for &lt;a href="https://github.com/rescript-lang/vim-rescript" rel="noopener noreferrer"&gt;Vim&lt;/a&gt; (both plain Vim and the Coc language server) and &lt;a href="https://github.com/rescript-lang/rescript-sublime" rel="noopener noreferrer"&gt;Sublime&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F1200%2F1%2ASpbTRW8aFP85FlHbC3SIEQ.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmiro.medium.com%2Fmax%2F1200%2F1%2ASpbTRW8aFP85FlHbC3SIEQ.gif" alt="Image for post"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Screen capture from &lt;a href="https://marketplace.visualstudio.com/items?itemName=chenglou92.rescript-vscode" rel="noopener noreferrer"&gt;rescript-vscode&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  The Community
&lt;/h1&gt;

&lt;p&gt;A few times in my coding career I had to work with small communities, and I always loved it. I developed smart contracts in &lt;a href="https://en.wikipedia.org/wiki/Solidity" rel="noopener noreferrer"&gt;Solidity&lt;/a&gt;, some database queries in the functional language &lt;a href="https://en.wikipedia.org/wiki/Q_(programming_language_from_Kx_Systems)" rel="noopener noreferrer"&gt;Q,&lt;/a&gt; and Roku channels in &lt;a href="https://developer.roku.com/docs/references/brightscript/language/brightscript-language-reference.md" rel="noopener noreferrer"&gt;BrightScript&lt;/a&gt;. You end up working with Slack/Discord/Gitter open, coding together with the few others going through the same problems. You don’t even bother checking StackOverflow for answers.&lt;/p&gt;

&lt;p&gt;This forces you to read and reread the official documentation and examples, as you don’t want to look dumb in the chatroom. Also, you’re part of a community maintained by real people, where you can always contribute something interesting, and even shape its development.&lt;/p&gt;

&lt;p&gt;Not all communities are alike, of course. I personally found the ReasonML/ReScript community to be welcoming. ReScript has an &lt;a href="https://forum.rescript-lang.org/" rel="noopener noreferrer"&gt;official forum&lt;/a&gt; where you can communicate asynchronously, with a permanent record you can search. The core team consists of a &lt;a href="https://forum.rescript-lang.org/" rel="noopener noreferrer"&gt;handful of developers&lt;/a&gt; with public Twitter accounts, and there’s an &lt;a href="https://rescript-lang.org/blog" rel="noopener noreferrer"&gt;official blog&lt;/a&gt;. I found, however, that the community hangs around in &lt;a href="https://discord.gg/mVY4NBdRvU" rel="noopener noreferrer"&gt;ReasonML’s Discord server&lt;/a&gt;, in an unofficial ReScript room.&lt;/p&gt;

&lt;p&gt;Finally, there’s &lt;a href="https://anchor.fm/reason-town" rel="noopener noreferrer"&gt;ReasonTown&lt;/a&gt;, “a podcast about the ReasonML language and the community that makes it good,” &lt;a href="https://www.youtube.com/channel/UCtFP_Hn5nIbZY4Xi47qfHhw/videos" rel="noopener noreferrer"&gt;ReasonConf&lt;/a&gt;’s YouTube channel, and &lt;a href="https://redex.github.io/" rel="noopener noreferrer"&gt;Redex&lt;/a&gt;, to find bindings for your libraries.&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;The switch is not easy; refactoring an existing app is even harder, given the compiler’s fatal stop on the first error. This will certainly hinder adoption. Popular transpilers such as TypeScript, SCSS, or CoffeeScript garnered adoption through their ease: just copy-paste your code, or rename your file, and you’re done.&lt;/p&gt;

&lt;p&gt;This is different. ReScript, as with other statically typed functional languages, aims at changing the way code is approached at a fundamental level. I believe we’ll see a greater adoption of functional programming in the future, eventually becoming the default for some industries. This is due to the mathematical approach to types, formal verification of a program’s correctness, and given immutability: less moving pieces and mental mapping.&lt;/p&gt;

&lt;p&gt;We’re already at the first stage of adopting a “functional style” in the ecosystem, with the &lt;code&gt;map&lt;/code&gt;, &lt;code&gt;filter&lt;/code&gt;, and &lt;code&gt;reduce&lt;/code&gt; functions in JavaScript. ReScript represents the next, hybrid stage: a properly functional language from the ML family that compiles to the industry’s standard JavaScript.&lt;/p&gt;

&lt;p&gt;Functional programming at its core takes itself seriously. It’s mathematical, formal, and doesn’t tolerate hacks. It aspires to deal with truths, not processes. Writing in a “functional style” in JavaScript only whets one’s appetite for more, as the language drags one’s good intentions down, not up. ReScript, while frustrating at times, might be the precision tool for a more civilized future in the ecosystem.&lt;/p&gt;

</description>
      <category>typescript</category>
      <category>javascript</category>
      <category>react</category>
      <category>rescript</category>
    </item>
  </channel>
</rss>
