<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Maciej Swiderski</title>
    <description>The latest articles on DEV Community by Maciej Swiderski (@mswiderski).</description>
    <link>https://dev.to/mswiderski</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F886796%2F6fe13918-96fb-4bea-a79c-c8281f8f79ca.jpeg</url>
      <title>DEV Community: Maciej Swiderski</title>
      <link>https://dev.to/mswiderski</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mswiderski"/>
    <language>en</language>
    <item>
      <title>Apache Camel or workflows or both?</title>
      <dc:creator>Maciej Swiderski</dc:creator>
      <pubDate>Fri, 24 Feb 2023 06:57:02 +0000</pubDate>
      <link>https://dev.to/mswiderski/apache-camel-or-workflows-or-both-5e5g</link>
      <guid>https://dev.to/mswiderski/apache-camel-or-workflows-or-both-5e5g</guid>
      <description>&lt;p&gt;In this post I would like to take up for discussion a topic that comes up quite often in coversation that oscilate around workflows. Especially common when you talk with Java developers that have interest or experience in that field.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So, what can you expect from this article?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A high level comparison of Apache Camel and workflows - what they have in common, what the differences between them are, and how they can be combined without the need to choose one or the other, taking the best of both to bring an ultimate solution to your business problems.&lt;/p&gt;

&lt;h2&gt;
  
  
  Apache Camel
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://camel.apache.org/" rel="noopener noreferrer"&gt;Apache Camel&lt;/a&gt; is an Open Source integration framework that empowers you to quickly and easily integrate various systems consuming or producing data&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;It is really popular, it does a great job, it has an active community and it evolves all the time. I could spend almost a complete article on Camel and its strengths in the integration area, but I would like to focus mainly on the capabilities that are in many cases compared with workflows.&lt;/p&gt;

&lt;p&gt;Let's start with what is at the heart of Camel - routes. These are definitions of the integration logic to be executed. A route can be as simple as receiving a message on one channel and then putting that message on another, but it can also be way more complex than that: Camel implements the Enterprise Integration Patterns, which cover a significant area of system integration. It also comes with components - a sort of connectors (consumers, producers or both) that allow routes to connect to the outside world.&lt;/p&gt;
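&lt;p&gt;To make the idea concrete, here is a plain Java sketch of what a route expresses - consume from one channel, apply a step, produce on another. This is deliberately not the Camel API (real routes are declared with Camel's own RouteBuilder DSL); it only illustrates the consume-transform-produce pattern using standard library queues.&lt;/p&gt;

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.UnaryOperator;

// Plain Java illustration of the consume-transform-produce pattern that a
// route expresses. Hypothetical sketch only - real Camel routes are written
// with Camel's RouteBuilder DSL, e.g. from(...).process(...).to(...).
public class RouteSketch {

    static void route(Queue source, Queue target, UnaryOperator step) {
        Object message = source.poll();          // consumer side of the route
        while (message != null) {
            target.offer(step.apply(message));   // producer side of the route
            message = source.poll();
        }
    }

    public static void main(String[] args) {
        Queue in = new ArrayDeque();
        Queue out = new ArrayDeque();
        in.offer("order-1");
        route(in, out, m -> "processed:" + m);
        System.out.println(out);                 // prints [processed:order-1]
    }
}
```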

&lt;p&gt;These capabilities also make Camel a good fit for one of the use cases where workflows are usually considered - service orchestration. And yes, it can do that very well, especially if the orchestration is stateless or short lived. It covers it pretty well - including error handling, retries, routing, and so on.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So if it is so good, why hasn't it replaced workflows?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The main reason for that is state - as soon as the use case requires persisting the state of the execution, Camel does not provide that out of the box. And if you ask me, it's actually a good design decision to keep it like that. Workflows are better prepared for this, so let's explore them in more detail.&lt;/p&gt;

&lt;h2&gt;
  
  
  Workflows
&lt;/h2&gt;

&lt;p&gt;Workflows, on the other hand, are not really meant for integration. Or to rephrase it - not only for integration. The biggest misconception about workflows is that they are only for service orchestration. While that is one of the common use cases, it is not the only one. Quite the opposite - the full power of workflows lies in another one: &lt;strong&gt;business entity life cycle management&lt;/strong&gt;. This might sound fancy, but the idea behind the term is simply to use workflows as a definition of the complete life cycle of an entity of your business.&lt;/p&gt;

&lt;p&gt;To give an example, imagine the automotive industry and product development, where one of the most important business entities is a &lt;strong&gt;part&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Everything is made of parts&lt;/li&gt;
&lt;li&gt;Parts have a defined life cycle&lt;/li&gt;
&lt;li&gt;Parts are managed by systems and humans&lt;/li&gt;
&lt;li&gt;Parts are composed and used in various situations&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Being able to clearly define the life cycle of a part - so that it can be tracked at any point in time and easily communicated across external parties - is what brings the biggest value to the organization. And this is where workflows shine the most.&lt;/p&gt;

&lt;p&gt;As can already be deduced, the best landscape for workflows is a stateful environment, meaning there is a need to keep the data of a given instance after execution is completed. This brings the concept of long lived execution instances, aka workflow instances. Workflows are built of two main parts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;definition - a sort of blueprint that defines what can happen&lt;/li&gt;
&lt;li&gt;instance - an individual representation of the blueprint&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There can be many instances of the same definition living at the same time, but they are in different states, hold different data sets and are in complete isolation from each other.&lt;br&gt;
You might ask: can workflows be used for stateless use cases? And the answer is - of course they can, and in many situations they are, but that does not mean they should always be used. As with everything, use the right tool for the job... which brings us to the next section - why not combine both for the ultimate solution?&lt;/p&gt;
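&lt;p&gt;The definition versus instance relationship can be sketched in a few lines of plain Java (an illustration with hypothetical names, not an actual workflow engine API):&lt;/p&gt;

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (hypothetical, plain Java - not a workflow engine API):
// one definition (the blueprint) and many isolated instances, each holding
// its own state and its own data set.
public class WorkflowSketch {

    record Definition(String id) {
        Instance newInstance() { return new Instance(this); }
    }

    static class Instance {
        final Definition definition;
        final Map data = new HashMap();  // every instance owns its data
        String state = "active";

        Instance(Definition definition) { this.definition = definition; }
    }

    public static void main(String[] args) {
        Definition orders = new Definition("order-handling");
        Instance first = orders.newInstance();
        Instance second = orders.newInstance();

        first.data.put("customer", "A");
        first.state = "completed";

        // complete isolation: the second instance is untouched
        System.out.println(second.data.isEmpty() + " " + second.state);  // prints true active
    }
}
```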

&lt;h2&gt;
  
  
  Combining Apache Camel and workflows
&lt;/h2&gt;

&lt;p&gt;We explored both Apache Camel and workflows from the point of view of their capabilities (very briefly, as both can do a lot), so let's look at an idea to combine both for their strengths, those being:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Apache Camel for its fantastic capabilities to connect to essentially everything&lt;/li&gt;
&lt;li&gt;Workflow for its excellent capabilities of state management and human actors interaction&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To provide a little bit more detail: workflows define the business logic and delegate to Camel whenever there is a need to interact with external systems (of any kind). Most important is that this should be explicit - meaning that by looking at the workflow definition you directly know there is an integration. Let Camel do its magic at integration and let workflows keep the state and data under control.&lt;/p&gt;

&lt;p&gt;That's exactly what &lt;a href="https://automatiko.io/" rel="noopener noreferrer"&gt;Automatiko&lt;/a&gt; does - it leverages Camel for integration purposes via the messaging concept that is well defined in workflows. To be more precise about the technicalities, Automatiko uses the Reactive Messaging specification to combine Apache Camel and workflow messaging. Let's take a look at a simple example that uses the Camel file component to watch a folder for files; for each file a new instance of the workflow is started.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1gv6j7jy2365rhhojimi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1gv6j7jy2365rhhojimi.png" alt="Start workflow based on incoming file watched by Camel."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As can be seen in the screenshot above, the Camel endpoint is defined as an attribute on the message defined in the workflow. More details on how messaging (and the Camel integration) works can be found in the &lt;a href="https://docs.automatiko.io/main/0.0.0/components/messaging.html" rel="noopener noreferrer"&gt;Automatiko documentation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Combining both gives a lot of power to build sophisticated business solutions that bring visibility to the team and make use of the battle tested integration capabilities of Camel, while at the same time not abusing it for stateful use cases where state and data need to be kept.&lt;/p&gt;

&lt;h2&gt;
  
  
  When to use which?
&lt;/h2&gt;

&lt;p&gt;That's essentially the main question for this article to answer. The simple and short answer would be: if you have a pure integration use case that does not involve state, then go ahead and use Apache Camel - it was designed and made for that. But if you need to support long running instances, retries over time (like after a day, a week, etc.) or you need to involve human actors, then workflows will certainly fit better.&lt;/p&gt;

&lt;p&gt;Last but not least, I'd like to point out once again: do not consider workflows to be for orchestration only, even though there is a lot of buzz about that use case from many vendors. Consider them more as a way of expressing your business logic as you might already know it - workflows are also known as business processes, and they are well understood by a lot of business stakeholders.&lt;/p&gt;

&lt;h2&gt;
  
  
  Interested to know more?
&lt;/h2&gt;

&lt;p&gt;Here are a few links that can help you learn more about the concepts described here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.automatiko.io/main/0.0.0/examples/batch.html" rel="noopener noreferrer"&gt;Document processor example&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/mswiderski/devoxxpl-2022" rel="noopener noreferrer"&gt;Business entity life cycle presentation at Devoxx Poland 2022&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>java</category>
      <category>apachecamel</category>
      <category>workflow</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Define workflows with Java DSL</title>
      <dc:creator>Maciej Swiderski</dc:creator>
      <pubDate>Thu, 06 Oct 2022 07:03:24 +0000</pubDate>
      <link>https://dev.to/mswiderski/define-workflows-with-java-dsl-97c</link>
      <guid>https://dev.to/mswiderski/define-workflows-with-java-dsl-97c</guid>
      <description>&lt;p&gt;Workflows are usually seen as a way to orchestrate systems. But that is just one of the uses cases. Workflows can be really good at expressing business logic to increase visibility into the logic implemented.&lt;/p&gt;

&lt;p&gt;But then the question is how to define workflows?&lt;/p&gt;

&lt;p&gt;In many cases, workflows are represented as models, e.g. flow charts. That requires modelling tools as well, which is not always what developers want to use.&lt;/p&gt;

&lt;p&gt;So to help developers enter the world of workflows, &lt;a href="https://automatiko.io" rel="noopener noreferrer"&gt;Automatiko&lt;/a&gt; (an open source toolkit to build services and functions based on workflows) recently introduced a Workflow Java DSL.&lt;/p&gt;

&lt;p&gt;This domain specific language (DSL) comes with a handy fluent API to define workflows. It integrates well with any IDE to provide an easy entry point for any developer.&lt;/p&gt;

&lt;p&gt;Let's have a quick look at the basics:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@Workflows
public class MyWorkflows {

    public WorkflowBuilder splitAndJoin() {

        WorkflowBuilder builder = WorkflowBuilder.newWorkflow("splitAndJoin", "Sample workflow with exclusive split and join");

        String x = builder.dataObject(String.class, "x");
        String y = builder.dataObject(String.class, "y");

        SplitNodeBuilder split = builder.start("start here").then()
                .log("log values", "X is {} and Y is {}", "x", "y")
                .thenSplit("split");

        JoinNodeBuilder join = split.when(() -&amp;gt; x != null).log("first branch", "first branch").thenJoin("join");

        split.when(() -&amp;gt; y != null).log("second branch", "second branch").thenJoin("join");

        join.then().log("after join", "joined").then().end("done");

        return builder;
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;A few important aspects to note here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a class annotated with &lt;code&gt;@Workflows&lt;/code&gt; instructs Automatiko to create a service API for every public method that returns &lt;code&gt;WorkflowBuilder&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
the &lt;code&gt;splitAndJoin&lt;/code&gt; method defines a simple workflow that

&lt;ul&gt;
&lt;li&gt;declares two data objects &lt;code&gt;x&lt;/code&gt; and &lt;code&gt;y&lt;/code&gt; of type string&lt;/li&gt;
&lt;li&gt;first logs the values of both data objects&lt;/li&gt;
&lt;li&gt;then splits into different paths depending on the values of the data objects&lt;/li&gt;
&lt;li&gt;then joins them back&lt;/li&gt;
&lt;li&gt;logs a message&lt;/li&gt;
&lt;li&gt;and ends the workflow&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Based on this workflow definition, Automatiko at build time will generate a fully featured (REST) service API that will expose this workflow as a service.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffs9ppw4btr91ttznqdrk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffs9ppw4btr91ttznqdrk.png" alt="Service API built from workflow definition"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In addition, a workflow diagram will also be generated based on the above workflow definition.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmk770y9jc05nmicudjuv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmk770y9jc05nmicudjuv.png" alt="Workflow definition diagram"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With just a few lines of workflow DSL, developers not only gain the possibility to define business logic in a descriptive way, but also get an out of the box service API and a visualisation of the execution for every instance.&lt;/p&gt;

&lt;p&gt;This is just a very simple introduction to the Workflow Java DSL; a more complete getting started guide can be found in the &lt;a href="https://docs.automatiko.io/main/0.19.0/getting-started-code.html" rel="noopener noreferrer"&gt;Automatiko documentation&lt;/a&gt;, which I encourage you to have a look at and give a try.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;If you have some interesting use cases that could challenge workflows, I would be more than happy to try them out! Let's see how far workflows can be taken to cover business logic.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Lastly, a short video demonstrating the Automatiko Workflow Java DSL:&lt;br&gt;
&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/FcXn5CLcHwY"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>workflow</category>
      <category>java</category>
      <category>dsl</category>
      <category>opensource</category>
    </item>
    <item>
      <title>An alternative look at workflows in cloud era</title>
      <dc:creator>Maciej Swiderski</dc:creator>
      <pubDate>Mon, 04 Jul 2022 06:52:52 +0000</pubDate>
      <link>https://dev.to/mswiderski/an-alternative-look-at-workflows-in-cloud-era-3dfb</link>
      <guid>https://dev.to/mswiderski/an-alternative-look-at-workflows-in-cloud-era-3dfb</guid>
      <description>&lt;p&gt;Workflows have been around software development for many decades. With its peak popularity around the SOA (service oriented architecture) times. Although it was always positioned as something on top, something superior. In fact, it led to smaller adoption as it became quite complex to build solutions on centralized workflow platforms. &lt;/p&gt;

&lt;p&gt;Currently workflows are becoming more and more popular again, though their main use case is around service orchestration. In my opinion, this is the biggest misconception around workflows - they are actually way more than just service orchestration.&lt;/p&gt;

&lt;p&gt;What are workflows good at then?&lt;br&gt;
Workflows serve very well in the following use cases:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;business entity life cycle&lt;/strong&gt; - a common use case in various enterprises or domains is to build systems that are responsible for the end to end life cycle of a business entity, e.g. parts in the automotive industry. Parts have a well defined life cycle that goes through a number of phases&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;event streams&lt;/strong&gt; - execute business logic on top of the event stream. More and more common are IoT based use cases where workflows can run very close to the sensors without the need to push data to some cloud offerings for processing&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;batch processing&lt;/strong&gt; - workflows are perfect fit for defining batch processors that are usually triggered by time events (run every night at 10pm) or by incoming message&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;human centric systems&lt;/strong&gt; - workflows come with out of the box features that allow to model and implement advanced interactions with human actors such as reassignment, notifications, escalations and more&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Kubernetes operators&lt;/strong&gt; - the operator pattern in Kubernetes is an excellent example where workflows have a natural place. Operator logic is a constantly repeating set of steps to reach the desired state of a resource; workflows can easily define the steps and then repeat them consistently for each resource deployed to the Kubernetes cluster&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
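&lt;p&gt;The operator case can be made a bit more tangible. Below is a minimal, hypothetical plain Java sketch of the reconcile idea (not a real operator framework): the same set of steps is repeated until the observed state matches the desired state.&lt;/p&gt;

```java
// Minimal sketch of the reconcile loop behind Kubernetes operators
// (hypothetical plain Java, not a real operator framework): repeat the same
// steps until the current state matches the desired state of the resource.
public class ReconcileSketch {

    static int reconcile(int desiredReplicas, int currentReplicas) {
        while (currentReplicas != desiredReplicas) {
            if (desiredReplicas > currentReplicas) {
                currentReplicas = currentReplicas + 1;  // scale up one step
            } else {
                currentReplicas = currentReplicas - 1;  // scale down one step
            }
        }
        return currentReplicas;
    }

    public static void main(String[] args) {
        System.out.println(reconcile(3, 0));  // prints 3
    }
}
```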

&lt;p&gt;The other somewhat misleading aspect of the common approach to workflows is that they usually sit outside of the service or system. While this makes sense for service orchestration use cases, it does not bring much value in the above mentioned scenarios. Instead, workflows could play an essential role inside the service or system being built. Let’s explore what this could look like and how it corresponds to the cloud.&lt;/p&gt;

&lt;p&gt;With that, let’s introduce three new concepts around workflows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;workflow as a service&lt;/li&gt;
&lt;li&gt;workflow as a function&lt;/li&gt;
&lt;li&gt;workflow as a function flow&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Workflow as a service
&lt;/h2&gt;

&lt;p&gt;Workflow as a service aims at using workflows as the base for creating a service on top of them. This means that developers use workflows as a sort of programming language - yet another one in their toolbox - to avoid the need to develop boilerplate code. Let’s break this down a little bit.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--LGqEAvYw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/px5pw66r8ahosjbc2cru.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--LGqEAvYw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/px5pw66r8ahosjbc2cru.png" alt="Workflow definition" width="880" height="768"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the above workflow definition we can find a few important aspects:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a set of activities that tells what will actually happen&lt;/li&gt;
&lt;li&gt;a data model (partNumber, info, status and valid) that represents the complete set of information associated with a given resource&lt;/li&gt;
&lt;li&gt;metadata of the workflow like id, name, version, etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All of that is considered source information to build up a service. Essentially, the workflow definition represents a CRUD (create, read, update, delete) service interface. The data model represents the resource behind the service interface - it’s the entity that the CRUD operations are managing. The set of activities extends the CRUD service interface with additional capabilities, giving consumers of the service a richer interaction model.&lt;/p&gt;
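&lt;p&gt;As an illustration, the derived interface for the part example could look roughly like the sketch below (hypothetical Java names - Automatiko actually generates REST endpoints from the workflow, not this exact interface):&lt;/p&gt;

```java
// Hypothetical sketch of the service interface derived from a workflow
// definition: the data model becomes the managed resource, CRUD operations
// manage it, and each workflow activity contributes an extra operation.
public class CrudSketch {

    record Part(String partNumber, String info, String status, boolean valid) {}

    interface PartService {
        Part create(Part part);            // POST   /parts
        Part read(String partNumber);      // GET    /parts/{partNumber}
        Part update(Part part);            // PUT    /parts/{partNumber}
        void delete(String partNumber);    // DELETE /parts/{partNumber}
        Part validate(String partNumber);  // extra operation from an activity
    }

    public static void main(String[] args) {
        Part part = new Part("P-100", "bolt", "new", false);
        System.out.println(part.partNumber() + " " + part.status());  // prints P-100 new
    }
}
```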

&lt;p&gt;The workflow as a service concept aims at using the workflow definition as input and transforming it into a fully functional service interface with a complete business logic implementation, instead of just having a set of stubs generated. The main idea behind it is to allow developers to focus on what is important - the business logic that is specific to the domain they work in - rather than developing things that can easily be derived from the business model, such as the CRUD service interface.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--n36rrUji--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i5aphu58iqbgqmm89zof.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--n36rrUji--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i5aphu58iqbgqmm89zof.png" alt="Service interface definition with OpenAPI" width="880" height="462"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Workflow as a function
&lt;/h2&gt;

&lt;p&gt;Workflow as a function aims at taking more advantage of the various cloud offerings to offload developers from taking care of infrastructure. Cloud functions have become quite popular; the most famous is AWS Lambda, but it is certainly not the only one. Azure Functions and Google Cloud Functions are also popular, and others are starting to pop up as well.&lt;/p&gt;

&lt;p&gt;Workflows can be used to model business logic that will then be deployed as a function to one of the cloud offerings. What is important is that the workflow acts as an abstraction layer that, again, allows one to focus on the business needs rather than on the plumbing code needed to run it as an AWS Lambda or Azure Function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lJL5cIWx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udl5wzcaxmvkqs9f4v0q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lJL5cIWx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udl5wzcaxmvkqs9f4v0q.png" alt="Workflow as a function" width="880" height="279"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The above workflow, a user registration use case, clearly defines the business logic behind it, which is specific to a given domain. Workflow as a function means that it will be treated as a function, with well defined input and output. Deployment to the cloud function environment becomes a secondary aspect that is taken care of automatically.&lt;/p&gt;

&lt;h2&gt;
  
  
  Workflow as a function flow
&lt;/h2&gt;

&lt;p&gt;Last but not least is workflow as a function flow. It expands on the idea of workflow as a function, where the main principle is to model the business use case as completely as possible, but break it down into a number of functions that are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;self contained - each represents a given piece of business logic&lt;/li&gt;
&lt;li&gt;independent - are not aware of any other function&lt;/li&gt;
&lt;li&gt;invokable at any time - can be triggered at any point in time regardless of the other functions&lt;/li&gt;
&lt;li&gt;scalable - functions can be easily scaled to accommodate the traffic needs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Va3NSub1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bo92s920zk9dyij4yu9s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Va3NSub1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bo92s920zk9dyij4yu9s.png" alt="Workflow as a function flow" width="880" height="279"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Business logic is then represented at runtime as individual functions that are triggered by events. Each function has input (an event) and can produce zero or many outputs (events). In turn, produced events can trigger other functions. What is important to mention is that functions do not trigger (or call) other functions explicitly, they simply produce events that other functions can consume.&lt;/p&gt;
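&lt;p&gt;A tiny, hypothetical plain Java sketch of this dispatch model (not the Knative or Funqy API): each function consumes one event and may emit zero or more events, and emitted events are routed to whichever function handles them - the functions never call each other directly.&lt;/p&gt;

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// Hypothetical sketch of event-driven function chaining (plain Java, not the
// Knative or Funqy API): functions only consume and produce events; the
// dispatch loop routes produced events to the next function.
public class FunctionFlowSketch {

    static List run(String initialEvent) {
        Queue events = new ArrayDeque();
        List handled = new ArrayList();
        events.offer(initialEvent);
        while (!events.isEmpty()) {
            String event = (String) events.poll();
            handled.add(event);
            // each branch acts as a self-contained "function" for one event type
            if (event.equals("user.registered")) {
                events.offer("email.requested");  // one output event
            }
            // "email.requested" produces zero output events: the flow ends
        }
        return handled;
    }

    public static void main(String[] args) {
        System.out.println(run("user.registered"));  // prints [user.registered, email.requested]
    }
}
```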

&lt;p&gt;This approach opens the doors for greater scalability as events can be efficiently distributed across many replicas of the functions.&lt;/p&gt;

&lt;p&gt;Regardless of the concept used, workflows come with many features that make them useful for building core business logic. Something that is usually overlooked is the isolation characteristic that workflows bring by design: workflows are built as a definition - a sort of blueprint - and this blueprint is then instantiated to represent individual instances. Each instance exists in complete isolation, including its data, state, etc. Other important features are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;reliable persistence, &lt;/li&gt;
&lt;li&gt;distributed timer/job scheduling and execution, &lt;/li&gt;
&lt;li&gt;messaging integration and many more.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Enough theory - can this actually run?&lt;br&gt;
Having been introduced to the concepts, an obvious question is: can it actually run? Is there anything that implements those concepts?&lt;/p&gt;

&lt;p&gt;The short answer is &lt;strong&gt;YES&lt;/strong&gt;!&lt;/p&gt;

&lt;p&gt;This leads us to an open source framework called &lt;a href="https://automatiko.io/"&gt;Automatiko&lt;/a&gt; that aims at building services and functions based on workflows. It implements all three concepts in a unique way. Let’s explore the framework itself a little bit.&lt;/p&gt;

&lt;p&gt;Automatiko is built with Java, it’s fully open source under Apache 2 license and can be freely used for any type of use cases. It is built on top of &lt;a href="https://quarkus.io/"&gt;Quarkus&lt;/a&gt;, a cloud native Java toolkit for building services of any kind. Automatiko seamlessly integrates with Quarkus to allow developers to be effective and to have the best developer experience.&lt;/p&gt;

&lt;p&gt;So how does Automatiko deliver workflow as a service, function and function flow?&lt;br&gt;
First and foremost, it follows the Quarkus philosophy of performing as much as possible at build time, so it does all the heavy lifting there. The main part of this hard work is transforming the workflows into a service, function or function flow. Let’s dive into each concept's implementation to understand it better.&lt;/p&gt;

&lt;h2&gt;
  
  
  Workflow as a service in Automatiko
&lt;/h2&gt;

&lt;p&gt;Automatiko will, at build time, look at workflow definitions and transform them into a service interface - a REST service interface with a complete definition based on OpenAPI. It can also create a GraphQL service interface, which opens up more advanced use cases by taking advantage of GraphQL's handling of under and over fetching.&lt;/p&gt;

&lt;p&gt;As mentioned before, it integrates with Quarkus and allows users to pair any feature of Quarkus with the needs of workflows. Taking it even further, Automatiko discovers what is available and binds to it without much hassle. An example is the integration with data stores or messaging brokers, which can easily be used from workflows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/automatiko-io/automatiko-examples/tree/main/event-streams-sensors"&gt;A complete and ready to run example&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Workflow as a function in Automatiko
&lt;/h2&gt;

&lt;p&gt;To implement workflow as a function, Automatiko relies on &lt;a href="https://quarkus.io/guides/funqy"&gt;Funqy&lt;/a&gt;, the Quarkus approach to building portable Java functions. Similar to how it is done for workflow as a service, Automatiko at build time examines workflows and creates functions from them. As the aim of workflow as a function is to be completely agnostic of the deployment platform, there is no need to change a single line of code to make the function runnable on AWS Lambda, Azure Functions or Google Cloud Functions. It only requires project configuration (like dependencies), which Automatiko provides as configuration profiles.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/automatiko-io/automatiko-examples/tree/main/user-registration-function"&gt;A complete and ready to run example&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Workflow as a function flow in Automatiko
&lt;/h2&gt;

&lt;p&gt;Lastly, workflow as a function flow in Automatiko is also based on Funqy, but this time it leverages the &lt;a href="https://knative.dev/"&gt;Knative&lt;/a&gt; project to build function chaining based on events (&lt;a href="https://cloudevents.io/"&gt;CloudEvents&lt;/a&gt;). Knative is a Kubernetes based platform to deploy and manage modern serverless workloads. In particular, Knative Eventing comes with universal subscription, delivery and management of events, which allows building modern applications by attaching business logic on top of the data stream - the events.&lt;/p&gt;

&lt;p&gt;Again, at build time Automatiko breaks down the workflows into a set of functions and creates all the Knative manifest files required to deploy it. It comes with a trigger setup that binds all the pieces together (Knative eventing and the functions) so developers can easily move this from development to production.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y1zjXIDb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hk8f3fr9x6dws1gah5uo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y1zjXIDb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hk8f3fr9x6dws1gah5uo.png" alt="Knative usage with workflow as a function flow" width="640" height="482"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note that Knative is a container based serverless platform. Automatiko follows this approach and packages all functions into a single container image that can be scaled without problems. Given the characteristics of the functions - self contained and invokable at any time - this container can be scaled to any number of replicas to provide maximum throughput, as each replica has the same runtime responsibility and can take over execution based on incoming events.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/automatiko-io/automatiko-examples/tree/main/user-registration"&gt;A complete and ready to run example&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A more complete article on workflow as a function flow is also available in &lt;a href="https://knative.dev/blog/articles/workflow-as-function-flow/"&gt;Knative blog&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;This article was intended to shed a slightly different light on workflows, especially in relation to the cloud. The main takeaway is to give readers a bit of food for thought: workflows are much more than service orchestration, and using them as part of the core business logic has a lot to offer. The concepts introduced, and their implementation, should be good proof that using workflows to represent business logic is not just possible but also efficient from a development and maintenance standpoint.&lt;/p&gt;

</description>
      <category>java</category>
      <category>workflow</category>
      <category>automation</category>
    </item>
  </channel>
</rss>
