<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Santhosh Thomas </title>
    <description>The latest articles on DEV Community by Santhosh Thomas  (@sats268842).</description>
    <link>https://dev.to/sats268842</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F424298%2F550e7fca-3e77-4919-ad6e-314b9945f56d.png</url>
      <title>DEV Community: Santhosh Thomas </title>
      <link>https://dev.to/sats268842</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sats268842"/>
    <language>en</language>
    <item>
      <title>Best Practices For Kafka in Python</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Mon, 14 Mar 2022 00:36:55 +0000</pubDate>
      <link>https://dev.to/sats268842/best-practices-for-kafka-in-python-2me5</link>
      <guid>https://dev.to/sats268842/best-practices-for-kafka-in-python-2me5</guid>
      <description>&lt;p&gt;Specifically, this blog will assist you in introducing tiny terminology and providing a brief introduction of technology. But, based on my experience, my major motivation here is to give a better configuration that aids in the efficient use of Kafka.&lt;/p&gt;

&lt;h1&gt;
  
  
  What is an Event Driven Architecture, and how does it work?
&lt;/h1&gt;

&lt;p&gt;In an event-driven architecture, which is prevalent in today's microservices applications, events are used to trigger and communicate between decoupled services. An event is a state change or update, such as an item being purchased on an e-commerce website; this produces events like productCreated, cartCreated, and cartUpdated. The main advantage of an event-driven architecture is that you don't have to wait for a database response or for data processing to finish: because the services are decoupled, the user interface can respond quickly, giving the best possible user experience. Like every architecture, though, event-driven design has drawbacks as well as benefits.&lt;/p&gt;

&lt;h1&gt;
  
  
  What is Apache Kafka?
&lt;/h1&gt;

&lt;p&gt;Apache Kafka is an open-source stream-processing software platform created by LinkedIn in 2011 to handle high-throughput, low-latency transmission and processing of streams of records in real time.&lt;/p&gt;

&lt;p&gt;It has the following significant capabilities, which make it ideal for users:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;A high-throughput system. Kafka can handle high-velocity and high-volume data even if it doesn't have a lot of hardware.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Low latency&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Fault tolerance&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Durability&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scalability: a Kafka cluster can be scaled up or down as load changes&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  How to Integrate Kafka with Python
&lt;/h1&gt;

&lt;p&gt;There are numerous Python libraries for Apache Kafka, including kafka-python, confluent-kafka, and pykafka. I have used two of them: kafka-python and confluent-kafka.&lt;/p&gt;

&lt;p&gt;In my experience, kafka-python was simple to set up, and there are numerous tutorials and blogs about it. However, once in use, the producer would occasionally lose its connection to the cluster while writing messages; I believe this was due to a broker compatibility issue. I also received a lot of duplicate messages, and adjusting the configuration on the Python side did not resolve it. The root cause is that the kafka-python library does not support idempotence, which is precisely the feature needed for the library to assign an id to each message.&lt;/p&gt;

&lt;p&gt;Confluent's Python client is one of the best Python libraries for working with Kafka. It supports idempotence, and by using it I was able to reduce the number of duplicate messages the producer created to a minimum. There are numerous blogs on how to connect to Kafka, but few on how to use this library effectively, so I'll share a few samples.&lt;/p&gt;

&lt;h3&gt;
  
  
  Producer Sample Code
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from confluent_kafka import Producer
import socket

 config= {
            'bootstrap.servers':  '**********',
            'security.protocol':  '**********',
            'ssl.ca.location': '**********',
            'client.id': socket.gethostname(),
            'enable.idempotence': True,
            'acks': 'all',
            'retries': 10,
            'compression.type': 'gzip',
            'max.in.flight.requests.per.connection': 5,
            'compression.codec': 'gzip'
            }

payload={
"message": "hello"
}
producer = Producer(config)
producer.produce(topic="sample_topic", key=str(uuid.uuid4().hex), value=json.dumps(payload).encode('utf-8'))
producer.poll(1)
producer.flush(1)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here is what each parameter means:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;topic (str) – Topic to produce message to.&lt;/li&gt;
&lt;li&gt;key (object, optional) – Message key.&lt;/li&gt;
&lt;li&gt;value (object, optional) – Message payload.&lt;/li&gt;
&lt;li&gt;bootstrap.servers - Initial list of brokers as a CSV list of broker host &lt;/li&gt;
&lt;li&gt;client.id - Client identifier&lt;/li&gt;
&lt;li&gt;enable.idempotence - When set to true, the producer will ensure that messages are successfully produced exactly once and in the original produce order. The following configuration properties are adjusted automatically (if not modified by the user) when idempotence is enabled: max.in.flight.requests.per.connection=5 (must be less than or equal to 5), retries=INT32_MAX (must be greater than 0), acks=all, queuing.strategy=fifo. Producer instantiation will fail if user-supplied configuration is incompatible.&lt;/li&gt;
&lt;li&gt;acks - This field indicates the number of acknowledgements the leader broker must receive from ISR brokers before responding to the request: 0=Broker does not send any response/ack to client, -1 or all=Broker will block until message is committed by all in sync replicas (ISRs). If there are less than min.insync.replicas (broker configuration) in the ISR set the produce request will fail.&lt;/li&gt;
&lt;li&gt;retries - How many times to retry sending a failing Message. Note: retrying may cause reordering unless enable.idempotence is set to true.&lt;/li&gt;
&lt;li&gt;compression.type - Compression codec to use for compressing message sets (compression.codec is an alias for the same setting, so only one of the two needs to be set).&lt;/li&gt;
&lt;li&gt;max.in.flight.requests.per.connection - Maximum number of in-flight requests per broker connection. This is a generic property applied to all broker communication, however it is primarily relevant to produce requests. In particular, note that other mechanisms limit the number of outstanding consumer fetch request per broker to one.&lt;/li&gt;
&lt;/ol&gt;
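Since acks, retries, and idempotence only guarantee anything when you actually check each message's outcome, it helps to pass a delivery callback to produce(). Here is a minimal sketch of such a callback (the return value and message fields shown are illustrative, not from the sample above):

```python
def delivery_report(err, msg):
    # Invoked once per message from producer.poll() or producer.flush();
    # err is None when the broker has acknowledged the write.
    if err is not None:
        status = "Delivery failed: {}".format(err)
    else:
        status = "Delivered to {} [{}] at offset {}".format(
            msg.topic(), msg.partition(), msg.offset())
    print(status)
    return status
```

With confluent-kafka you would wire it up as producer.produce(topic, value=..., callback=delivery_report) and then call producer.poll() or producer.flush() so the callback actually runs.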

&lt;h3&gt;
  
  
  Consumer Sample Code
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def create_consumer():

    def commit_callback(kafka_error, topic_partition):
        response = {
            "kafka_error": kafka_error,
            "topic_partition": topic_partition
        }
        logging.info("Commit info: "+ str(response))

    """Tries to establing the Kafka consumer connection"""
    try:
        brokers = '*******************'
        logger.info(f"Creating new kafka consumer using brokers: str(brokers) + ' and topic  is {configure.get_topic_name()}" )
        config= {
        'bootstrap.servers': brokers,
        'group.id': 'sample-consumer-group',
        'enable.auto.commit': False,
        'auto.offset.reset': 'earliest',
        'group.instance.id': socket.gethostname(),
        'security.protocol': 'SSL', 
        'on_commit': commit_callback,
        'ssl.ca.location': '*******************',
        'client.id': socket.gethostname()
        }
        return Consumer(config)  

    except Exception as e:
        logger.error("Error when connecting with kafka consumer: " + str(e))


consumer.subscribe(['sample_topic'])
try: 
     while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                if msg.error().code() == KafkaError._PARTITION_EOF:
                    # End of partition event
                    logger.error('%% %s [%d] reached end at offset %d\n' % (msg.topic(), msg.partition(), msg.offset()))
                elif msg.error():
                    raise KafkaException(msg.error())
                raise KafkaException(msg.error())
            else:
                consumer.commit(message=msg, asynchronous=False)
                in_kafka_message = msg.value().decode('utf-8')
except Exception as e:
        logger.error(str(e))
finally:
        consumer.close()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;group.id - Client group id string. All clients sharing the same group.id belong to the same group.&lt;/li&gt;
&lt;li&gt;group.instance.id - Enable static group membership. Static group members are able to leave and rejoin a group within the configured session.timeout.ms without prompting a group rebalance. This should be used in combination with a larger session.timeout.ms to avoid group rebalances caused by transient unavailability (e.g. process restarts). Requires broker version &amp;gt;= 2.3.0&lt;/li&gt;
&lt;li&gt;enable.auto.commit - Automatically and periodically commit offsets in the background. Note: setting this to false does not prevent the consumer from fetching previously committed start offsets. To circumvent this behaviour set specific start offsets per partition in the call to assign().&lt;/li&gt;
&lt;li&gt;auto.offset.reset - Action to take when there is no initial offset in offset store or the desired offset is out of range: 'smallest','earliest' - automatically reset the offset to the smallest offset, 'largest','latest' - automatically reset the offset to the largest offset, 'error' - trigger an error (ERR__AUTO_OFFSET_RESET) which is retrieved by consuming messages and checking 'message-&amp;gt;err'.&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>python</category>
      <category>kafka</category>
      <category>eventdriven</category>
    </item>
    <item>
      <title>Convert Keys in Json to Snake Case</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Tue, 18 Jan 2022 07:22:44 +0000</pubDate>
      <link>https://dev.to/sats268842/convert-json-to-snake-case-1c14</link>
      <guid>https://dev.to/sats268842/convert-json-to-snake-case-1c14</guid>
      <description>&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;re&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;convert_to_snakecase&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;

    &lt;span class="n"&gt;transformed_dict&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
    &lt;span class="n"&gt;array_items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="nb"&gt;isinstance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;list&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
      &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;keys&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
          &lt;span class="n"&gt;value&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;re&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;sub&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;r&lt;/span&gt;&lt;span class="s"&gt;'(?&amp;lt;!^)(?=[A-Z])'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;'_'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="n"&gt;lower&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
          &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="nb"&gt;isinstance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="nb"&gt;list&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
              &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="nb"&gt;isinstance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
                  &lt;span class="n"&gt;transformed_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;convert_to_snakecase&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
              &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                  &lt;span class="n"&gt;transformed_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
          &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; 

              &lt;span class="n"&gt;array_items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
              &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nb"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;])):&lt;/span&gt;
                  &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="nb"&gt;isinstance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
                      &lt;span class="n"&gt;array_items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;convert_to_snakecase&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
                      &lt;span class="n"&gt;transformed_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;  &lt;span class="n"&gt;array_items&lt;/span&gt;
                  &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                      &lt;span class="n"&gt;transformed_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;array_items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;original_dict&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
          &lt;span class="n"&gt;array_items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;convert_to_snakecase&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="n"&gt;transformed_dict&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;array_items&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;transformed_dict&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
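To show what the conversion does end to end, here is a compact recursive equivalent of the function above (rewritten slightly for illustration: it uses a two-group regex instead of a lookbehind, and recurses through dicts and lists uniformly) together with a sample payload:

```python
import re

def to_snake(obj):
    # Rename camelCase dict keys to snake_case, recursing into nested
    # dicts and lists; scalar values pass through unchanged.
    if isinstance(obj, dict):
        return {re.sub(r'([a-z0-9])([A-Z])', r'\1_\2', k).lower(): to_snake(v)
                for k, v in obj.items()}
    if isinstance(obj, list):
        return [to_snake(item) for item in obj]
    return obj

data = {"firstName": "Ada", "homeAddress": {"zipCode": "600001"},
        "phoneNumbers": [{"phoneType": "mobile"}]}
print(to_snake(data))
```

Running this prints the payload with every key converted, e.g. firstName becomes first_name and the nested zipCode becomes zip_code.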



</description>
      <category>python</category>
      <category>json</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Billie | The Best Way to Create a Professional Invoice</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Sat, 08 Jan 2022 14:28:56 +0000</pubDate>
      <link>https://dev.to/sats268842/billie-the-most-best-way-to-create-a-professional-invoice-71f</link>
      <guid>https://dev.to/sats268842/billie-the-most-best-way-to-create-a-professional-invoice-71f</guid>
      <description>&lt;h3&gt;
  
  
  Overview of My Submission
&lt;/h3&gt;

&lt;p&gt;Billie is a user-friendly, customizable, free online invoicing tool that lets you create professional invoices in seconds. It requires no setup, installation, or upkeep! By simply altering the templates, Billie can also be adapted to other business activities. A dashboard for customer and invoice management has been added, and you can send invoice emails to customers directly from it. This project uses MongoDB serverless as its database. &lt;/p&gt;

&lt;p&gt;Features:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Scheduled Invoice Sending&lt;/li&gt;
&lt;li&gt;Customer Management&lt;/li&gt;
&lt;li&gt;Manage Invoices&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Submission Category:
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Own Adventure
&lt;/h4&gt;

&lt;h3&gt;
  
  
  Link to Code
&lt;/h3&gt;

&lt;h4&gt;
  
  
  &lt;a href="https://github.com/sats268842/billie-invoice" rel="noopener noreferrer"&gt;https://github.com/sats268842/billie-invoice&lt;/a&gt;
&lt;/h4&gt;

&lt;h3&gt;
  
  
  Additional Resources / Info
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Here is a working live demo :  &lt;a href="https://billie.digital" rel="noopener noreferrer"&gt;https://billie.digital&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Screenshots
&lt;/h3&gt;

&lt;p&gt;Landing Page&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frvk2eigo2gbh0o7zjwln.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frvk2eigo2gbh0o7zjwln.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Dashboard&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuk053cxkxwvpiyldeo2x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuk053cxkxwvpiyldeo2x.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzu6klnkl0a4v4m65baqs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzu6klnkl0a4v4m65baqs.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Built with
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://angular.io/" rel="noopener noreferrer"&gt;Angular&lt;/a&gt; - Angular is a platform for building mobile and desktop web applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.mongodb.com/" rel="noopener noreferrer"&gt;Mongodb&lt;/a&gt; - MongoDB is a source-available cross-platform document-oriented database program.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://fastapi.tiangolo.com/" rel="noopener noreferrer"&gt;FastApi&lt;/a&gt; - FastAPI is a modern, fast (high-performance), web framework for building APIs with Python 3.6+ based on standard Python type hints.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://pypi.org/project/pdfkit/" rel="noopener noreferrer"&gt;PDFKit&lt;/a&gt; - Wkhtmltopdf python wrapper to convert html to pdf using the webkit rendering engine and qt&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://aws.amazon.com/ses/" rel="noopener noreferrer"&gt;AWS SES&lt;/a&gt; - Amazon Simple Email Service (SES) is a cost-effective, flexible, and scalable email service that enables developers to send mail from within any application. You can configure Amazon SES quickly to support several email use cases, including transactional, marketing, or mass email communications. Amazon SES's flexible IP deployment and email authentication options help drive higher deliverability and protect sender reputation, while sending analytics measure the impact of each email. With Amazon SES, you can send email securely, globally, and at scale.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://auth0.com/" rel="noopener noreferrer"&gt;auth0&lt;/a&gt; - Auth0 is an easy to implement, adaptable authentication and authorization platform.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>atlashackathon</category>
      <category>sideprojects</category>
      <category>mongodb</category>
      <category>showdev</category>
    </item>
    <item>
      <title>How To Store Certificate (.pem) in Azure Keyvault using Secrets and fetch values from secrets into pem file using python</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Sat, 01 Jan 2022 21:33:12 +0000</pubDate>
      <link>https://dev.to/sats268842/how-to-store-certificate-pem-in-azure-keyvault-using-secrets-and-fetch-values-from-secrets-into-pem-file-using-python-56c7</link>
      <guid>https://dev.to/sats268842/how-to-store-certificate-pem-in-azure-keyvault-using-secrets-and-fetch-values-from-secrets-into-pem-file-using-python-56c7</guid>
      <description>&lt;h3&gt;
  
  
  Convert .pem Certificate file into base64 using certutil
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;certutil -encode filename.cer newfilename.cer
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Go to the Azure portal&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the Key Vault service&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a new key vault&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select Secrets under Settings in the side panel&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a new secret&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Paste the base64 output into the secret value and save it&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Python code to fetch the certificate value from Key Vault and store it in a .pem file
&lt;/h3&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
credentials = DefaultAzureCredential()
secret_client = SecretClient(vault_url=key_vault_url, credential=credentials)
cert_value =  secret_client.get_secret("Certificate").value

with open('certificate.pem','w') as fopen:
        fopen.write(base64.b64decode(cert_value).decode())
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>azure</category>
      <category>python</category>
      <category>keyvault</category>
      <category>certificate</category>
    </item>
    <item>
      <title>Convert DER .cer (certificate) format to Base64 .CER</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Fri, 31 Dec 2021 21:02:43 +0000</pubDate>
      <link>https://dev.to/sats268842/convert-der-cer-certificate-format-to-base64-cer-1kfc</link>
      <guid>https://dev.to/sats268842/convert-der-cer-certificate-format-to-base64-cer-1kfc</guid>
      <description>&lt;p&gt;To convert from a DER to a base64, you can use &lt;strong&gt;certutil&lt;/strong&gt; :&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;certutil -encode filename.cer newfilename.cer
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
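If certutil is not available (it is a Windows tool), the same DER-to-base64 conversion can be sketched with just the Python standard library; the PEM header below assumes the input is a certificate:

```python
import base64
import textwrap

def der_to_pem(der_bytes):
    # Base64-encode the DER bytes and wrap them in PEM armor,
    # 64 characters per line, which is what certutil -encode produces.
    b64 = base64.b64encode(der_bytes).decode("ascii")
    lines = textwrap.wrap(b64, 64)
    return "\n".join(["-----BEGIN CERTIFICATE-----"] + lines +
                     ["-----END CERTIFICATE-----"]) + "\n"
```

Read filename.cer in binary mode, pass the bytes to der_to_pem, and write the returned string to newfilename.cer.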



</description>
      <category>pem</category>
      <category>certificate</category>
      <category>keyvault</category>
      <category>azure</category>
    </item>
    <item>
      <title>Create And Delete Kafka Topic Using Python</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Fri, 31 Dec 2021 21:00:03 +0000</pubDate>
      <link>https://dev.to/sats268842/create-and-delete-kafka-topic-using-python-3ni</link>
      <guid>https://dev.to/sats268842/create-and-delete-kafka-topic-using-python-3ni</guid>
      <description>&lt;h3&gt;
  
  
  Install kafka-python package
&lt;/h3&gt;

&lt;p&gt;Python client for the Apache Kafka distributed stream processing system.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install kafka-python
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from kafka.admin import KafkaAdminClient, NewTopic


admin_client = KafkaAdminClient(bootstrap_servers=[ipaddress:port])


topic_names = ['topic1', 'topic2', 'topic3' , 'topic3']

def create_topics(topic_names):

    existing_topic_list = consumer.topics()
    print(list(consumer.topics()))
    topic_list = []
    for topic in topic_names:
        if topic not in existing_topic_list:
            print('Topic : {} added '.format(topic))
            topic_list.append(NewTopic(name=topic, num_partitions=3, replication_factor=3))
        else:
            print('Topic : {topic} already exist ')
    try:
        if topic_list:
            admin_client.create_topics(new_topics=topic_list, validate_only=False)
            print("Topic Created Successfully")
        else:
            print("Topic Exist")
    except TopicAlreadyExistsError as e:
        print("Topic Already Exist")
    except  Exception as e:
        print(e)

def delete_topics(topic_names):
    try:
        admin_client.delete_topics(topics=topic_names)
        print("Topic Deleted Successfully")
    except UnknownTopicOrPartitionError as e:
        print("Topic Doesn't Exist")
    except  Exception as e:
        print(e)


consumer = KafkaConsumer(
    bootstrap_servers = "ip_address",
    )
create_topics(topic_names)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>kafka</category>
      <category>python</category>
      <category>eventdriven</category>
    </item>
    <item>
      <title>Flask Upload Image Directly To S3 Without Saving In Server</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Fri, 01 Oct 2021 20:35:26 +0000</pubDate>
      <link>https://dev.to/sats268842/flask-upload-image-directly-to-s3-without-saving-in-server-2eb9</link>
      <guid>https://dev.to/sats268842/flask-upload-image-directly-to-s3-without-saving-in-server-2eb9</guid>
      <description>&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3
s3 = boto3.client( "s3", aws_access_key_id= "*************", aws_secret_access_key="**********" )


@login_blueprint.route('/uploadfile', methods=['POST']) 

file= request.files['file'] 

try: 
    filename = secure_filename(file.filename) 
    acl="public-read" 
    s3.upload_fileobj( file, 'bucket-name', file.filename, ExtraArgs={ "ACL": acl, "ContentType": file.content_type } ) 

except Exception as e:
    resp = jsonify(e)
    resp.status_code =200 
    return resp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>flask</category>
      <category>backend</category>
      <category>aws</category>
      <category>python</category>
    </item>
    <item>
      <title>Deploy angular docker app to own server using GitLab pipeline</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Fri, 01 Oct 2021 20:31:48 +0000</pubDate>
      <link>https://dev.to/sats268842/deploy-angular-docker-app-to-own-server-using-gitlab-pipeline-54pl</link>
      <guid>https://dev.to/sats268842/deploy-angular-docker-app-to-own-server-using-gitlab-pipeline-54pl</guid>
      <description>&lt;p&gt;Here in test environment, it will deploy into aws s3 bucket and in production evironment, it will deploy docker image into own server.&lt;/p&gt;

&lt;p&gt;The GitLab container registry is used to build and store the Docker image, which is finally pulled onto the server.&lt;/p&gt;

&lt;p&gt;.gitlab-ci.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;image: node:14

variables:
  GIT_DEPTH: '0'
  DOCKER_HOST: tcp://docker:2375/
before_script:
  - apt-get update
  - apt-get install zip

stages:
  - install_dependencies
  - build
  - docker-image-creation
  - deploy


install_dependencies:
  stage: install_dependencies
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/
      - dist/
  script:
    - npm ci
  only:
    changes:
      - package-lock.json

build:
  stage: build
  script:
    - npm run build:ssr
    - cd dist/${PROJECT_NAME}
    - ls -al -F
    - echo "BUILD SUCCESSFUL"
  dependencies:
    - install_dependencies
  cache: 
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/
    policy: pull
  only:
    refs:
      - develop
      - main
    changes:
      - src/*
      - angular.json
  artifacts:
    when: on_success
    paths:
      - dist/${PROJECT_NAME}

test-deploy_to_s3_bucket:
  image: python:latest  
  stage: deploy
  cache:
    policy: pull
  dependencies:
    - build
    - install_dependencies
  before_script:
    - pip install awscli 
  script:
    - ls -lh
    - find  dist/${PROJECT_NAME}/browser \( -name '*.*' \) -exec gzip --verbose --best --force {} \; -exec mv "{}.gz" "{}" \; # Gzip recursively all files in the directory.
    - aws s3 rm s3://${TEST_S3_BUCKET_NAME} --recursive
    - aws s3 cp ./dist/${PROJECT_NAME}/browser s3://${TEST_S3_BUCKET_NAME}/ --recursive --acl public-read --content-encoding gzip
    - echo "Deployed Successfully"
  only:
    - develop
  environment:
    name: test

deploy_production_to_server:
  stage: deploy
  cache:
    policy: pull
  before_script:
    - 'which ssh-agent || ( apt-get update -y &amp;amp;&amp;amp; apt-get install openssh-client -y )'
    - eval $(ssh-agent -s)
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    - chmod 400 $PRIVATE_KEY
    - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" &amp;gt; ~/.ssh/config
    - apt-get update -y
    - apt-get -y install rsync
  script:
    - ssh -i $PRIVATE_KEY ubuntu@$SERVER_IP_ADDRESS 'echo "SSH connection OK"' # verify connectivity; a bare ssh with no command would hang the job
    - rsync -zvhr -auv -e "ssh -i $PRIVATE_KEY" dist/${PROJECT_NAME}/browser ubuntu@$SERVER_IP_ADDRESS:/var/www/html/angular/
  only: ['main']
  environment:
    name: production


docker_image-creation:
  image: docker:git # image with docker installed to execute docker commands
  stage: docker-image-creation # new stage that runs before deploy
  cache:
    policy: pull
  services:
    - docker:dind #used to be able to execute docker commands inside of a docker container
  before_script:
    - docker ps #overrides previous docker script
  script:
    - docker login -u $CI_REGISTRY_USER -p $DOCKER_CI_TOKEN registry.gitlab.com # logs into the gitlab docker registry; make sure these variables are defined
    - docker build -t registry.gitlab.com/****/*** . # creates a docker image
    - docker push registry.gitlab.com/*****/*** # pushes the created docker image to the registry
  dependencies:
    - build 


deploy_docker_image_to_server:
  image: ubuntu
  cache:
    policy: pull
  before_script: #checks if ssh installed and if not, attempts to install it
    - "which ssh-agent || ( apt-get update -y &amp;amp;&amp;amp; apt-get install openssh-client git -y )"
    - eval $(ssh-agent -s)
    # Inject the remote's private key
    - echo "$PRIVATE_KEY" | tr -d '\r' | ssh-add - &amp;gt; /dev/null #adding a ssh private key from variables, pair of the one registered on digital ocean
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    # Append keyscan output into known hosts
    - ssh-keyscan $SERVER_IP_ADDRESS &amp;gt;&amp;gt; ~/.ssh/known_hosts
    - chmod 644 ~/.ssh/known_hosts
  stage: deploy #new stage after release
  script:
    - ssh $SERVER_USERNAME@$SERVER_IP_ADDRESS  ls
    - ssh $SERVER_USERNAME@$SERVER_IP_ADDRESS "docker login -u ${CI_REGISTRY_USER} -p ${DOCKER_CI_TOKEN} registry.gitlab.com;
     docker stop app_name;
     docker rm app_name;
     docker rmi \$(docker images -aq);
     docker pull registry.gitlab.com/${PROJECT_NAME};
     docker run --name app_name -d -p 80:4000 ${PROJECT_NAME}"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>angular</category>
      <category>devops</category>
      <category>docker</category>
      <category>gitlab</category>
    </item>
    <item>
      <title>Gitlab CI/CD for multiple environment</title>
      <dc:creator>Santhosh Thomas </dc:creator>
      <pubDate>Wed, 15 Sep 2021 17:07:05 +0000</pubDate>
      <link>https://dev.to/sats268842/gitlab-ci-cd-for-multiple-environment-54dj</link>
      <guid>https://dev.to/sats268842/gitlab-ci-cd-for-multiple-environment-54dj</guid>
      <description>&lt;p&gt;The problem arises when you want to use variables in different situations. A excellent example would be if you wanted to include the URLs of your production, staging, and development databases in the same task but didn't want to write separate jobs for each environment.&lt;/p&gt;

&lt;p&gt;We have a problem when a single procedure (such as deploying to S3) requires a separate job for each environment, because managing multiple jobs for one procedure takes time. So I came up with a solution based on workflow rules: using a workflow block with rule conditions, we can set variable values based on the branch. As a result, a single job can handle the credentials for every environment instead of needing one job per environment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;If you have any questions about this topic, please contact me at &lt;a href="mailto:santhoshthomas015@gmail.com"&gt;santhoshthomas015@gmail.com&lt;/a&gt;.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A sample of code is provided below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;image: node:latest
variables:
  GIT_DEPTH: '0' 

stages:
  - build
  - deploy

workflow:
    rules:
      - if: $CI_COMMIT_REF_NAME ==  "develop"
        variables:
          DEVELOP: "true"
          ENVIRONMENT_NAME: Develop
          WEBSITE_URL: $DEVELOP_WEBSITE_URL
          S3_BUCKET: (develop-s3-bucket-name)
          AWS_REGION: ************** develop
          AWS_ACCOUNT: ********develop

      - if: $CI_COMMIT_REF_NAME == "main" 
        variables:                                 
          PRODUCTION:  "true"
          ENVIRONMENT_NAME: PRODUCTION
          WEBSITE_URL: $PROD_WEBSITE_URL
          S3_BUCKET: $PROD_S3_BUCKET_NAME
          AWS_REGION: ************** (prod-region)
          AWS_ACCOUNT: ***********(prod-acct)
      - when: always 

build-app:
  stage: build
  script:
     #build-script
  environment: 
    name: $ENVIRONMENT_NAME

deploy-app:
  stage: deploy
  script:
     #deploy-script
  environment: 
    name: $ENVIRONMENT_NAME
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;NB: Experts, please let me know if this procedure is correct.&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>gitlab</category>
      <category>devops</category>
      <category>pipeline</category>
      <category>cicd</category>
    </item>
  </channel>
</rss>
