<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Ademar Tutor</title>
    <description>The latest articles on DEV Community by Ademar Tutor (@iamademar).</description>
    <link>https://dev.to/iamademar</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1251448%2Fe571449b-b52a-4f7b-b755-b4d230ef52f7.png</url>
      <title>DEV Community: Ademar Tutor</title>
      <link>https://dev.to/iamademar</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/iamademar"/>
    <language>en</language>
    <item>
      <title>Observability with OpenTelemetry, Jaeger and Rails</title>
      <dc:creator>Ademar Tutor</dc:creator>
      <pubDate>Thu, 22 Feb 2024 15:53:48 +0000</pubDate>
      <link>https://dev.to/iamademar/observability-with-opentelemetry-jaeger-and-rails-3g8m</link>
      <guid>https://dev.to/iamademar/observability-with-opentelemetry-jaeger-and-rails-3g8m</guid>
      <description>&lt;p&gt;In a traditional monolithic architecture, the application's behavior is relatively straightforward because all components reside within a single process. You can easily trace requests from their entry to their exit points.&lt;/p&gt;

&lt;p&gt;However, in a microservices architecture, a single transaction or request might pass through many services hosted on different machines or across different data centers. This dispersion makes it hard to "observe."&lt;/p&gt;

&lt;p&gt;Does it apply to a Rails monolith?&lt;/p&gt;

&lt;p&gt;Yes, it does! &lt;/p&gt;

&lt;p&gt;Let's say you are working on an e-commerce application. You are fixing a production bug for the checkout workflow on this application.&lt;/p&gt;

&lt;p&gt;The workflow includes the following actions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Charging credit cards via a third-party payment gateway (Stripe, Braintree, etc.)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Sending a customer an email notification (Sendgrid, Twilio)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Sending updates to the order management system responsible for your inventory&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The bug only arises during peak hours. &lt;/p&gt;

&lt;p&gt;This likely means that one or more of these processes took too long to respond, causing the entire workflow to fail. &lt;/p&gt;

&lt;p&gt;You must not only monitor your Rails application's internal components, but also its interactions with these external services.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fju8pw6mdbaovvg3u0ryu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fju8pw6mdbaovvg3u0ryu.png" alt="Other processes hidden" width="800" height="791"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is where OpenTelemetry comes in.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1pcv6uinsfb0smkkj3cw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1pcv6uinsfb0smkkj3cw.png" alt="Other processes are seen with observability" width="800" height="606"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is OpenTelemetry?
&lt;/h2&gt;

&lt;p&gt;OpenTelemetry is an open-source observability framework that helps you gather metrics, traces, and logs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why do I need that? I already have New Relic, Skylight, and AppSignal.
&lt;/h3&gt;

&lt;p&gt;OpenTelemetry is vendor-neutral. It doesn't lock you into a specific observability platform. You can switch to another vendor. Plus, OpenTelemetry is incubated by &lt;a href="https://www.cncf.io/projects/"&gt;CNCF&lt;/a&gt; and is supported by &lt;a href="https://opentelemetry.io/ecosystem/vendors/"&gt;industry leaders&lt;/a&gt;. This means it's here to stay!&lt;/p&gt;

&lt;h3&gt;
  
  
  Would OpenTelemetry replace New Relic, Skylight, and AppSignal?
&lt;/h3&gt;

&lt;p&gt;OpenTelemetry specifies how data should be structured, collected, and sent over the network, but it doesn't provide a backend system to store, visualize, or analyze this data. &lt;/p&gt;

&lt;p&gt;You can send this data to New Relic or AppSignal to store, visualize, or analyze. &lt;/p&gt;

&lt;p&gt;But you can set up open-source solutions to view your data. This is where Jaeger comes in.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Jaeger maps the flow of requests and data as they traverse a distributed system. These requests may make calls to multiple services, which may introduce their own delays or errors.&lt;br&gt;
&lt;a href="https://www.jaegertracing.io/"&gt;https://www.jaegertracing.io/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  How do you set up OpenTelemetry + Jaeger with a Ruby on Rails application?
&lt;/h2&gt;

&lt;p&gt;Here is the step-by-step process for integrating OpenTelemetry and Jaeger with your Ruby on Rails app.&lt;/p&gt;

&lt;h3&gt;
  
  
  1) Install the necessary gems
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;gem "opentelemetry-sdk"
gem "opentelemetry-exporter-otlp"
gem "opentelemetry-instrumentation-all"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
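&lt;p&gt;The article doesn't show the SDK configuration itself; following the opentelemetry-ruby getting-started docs, a typical &lt;code&gt;config/initializers/opentelemetry.rb&lt;/code&gt; looks something like this (the service name here is illustrative):&lt;/p&gt;

```ruby
# config/initializers/opentelemetry.rb
require "opentelemetry/sdk"
require "opentelemetry/instrumentation/all"

OpenTelemetry::SDK.configure do |c|
  c.service_name = "ecommerce-app" # illustrative; this shows up in Jaeger's service list
  c.use_all                        # enable all installed instrumentation (Rails, Net::HTTP, ...)
end
```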



&lt;h3&gt;
  
  
  2) Run the Rails app with the trace exporter set to console
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;env OTEL_TRACES_EXPORTER=console rails server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now check your server logs. They should show something like this:&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/A1MtzHeAPm8"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h3&gt;
  
  
  3) Set up a Jaeger backend to receive the trace data
&lt;/h3&gt;

&lt;p&gt;You can do that by running this Docker image in your terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -d --name jaeger \
  -e COLLECTOR_ZIPKIN_HOST_PORT=:9411 \
  -p 5775:5775/udp \
  -p 6831:6831/udp \
  -p 6832:6832/udp \
  -p 5778:5778 \
  -p 16686:16686 \
  -p 14268:14268 \
  -p 14250:14250 \
  -p 9411:9411 \
  jaegertracing/all-in-one:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You will see the Jaeger container in Docker Desktop. Make sure it is running.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvtr6c6nwgr8aposryjh9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvtr6c6nwgr8aposryjh9.png" alt="Image description" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once it's running, you should be able to access the Jaeger UI in your browser via &lt;a href="http://localhost:16686/search"&gt;http://localhost:16686/search&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqol580dl082f5xbppxfl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqol580dl082f5xbppxfl.png" alt="Image description" width="800" height="634"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, traces are sent to an OTLP endpoint listening on localhost:4318, and your Rails app targets it automatically. However, if you need a different endpoint, you can set it manually with an ENV variable.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;env OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318" rails server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should be able to see the request on Jaeger UI:&lt;br&gt;
&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/pQAalwVuguE"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Originally posted on: &lt;a href="https://blog.ademartutor.com/p/observability-with-opentelemetry"&gt;https://blog.ademartutor.com/p/observability-with-opentelemetry&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Hello there!&lt;/p&gt;

&lt;p&gt;Do you have a startup idea or an exciting project you're passionate about? I'd love to bring your vision to life!&lt;/p&gt;

&lt;p&gt;I'm a software developer with 13 years of experience building apps for startups, specializing in Rails + Hotwire/React. &lt;/p&gt;

&lt;p&gt;Whether you're looking to innovate, grow your business, or bring a creative idea to the forefront, I'm here to provide tailored solutions that meet your unique needs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/in/ademar-tutor-0a95972a/"&gt;Let's collaborate to make something amazing!&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sincerely,&lt;br&gt;
Ademar Tutor&lt;br&gt;
&lt;a href="mailto:hey@ademartutor.com"&gt;hey@ademartutor.com&lt;/a&gt;&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>ruby</category>
      <category>rails</category>
    </item>
    <item>
      <title>Callbacks are evil! 😈</title>
      <dc:creator>Ademar Tutor</dc:creator>
      <pubDate>Thu, 22 Feb 2024 14:47:00 +0000</pubDate>
      <link>https://dev.to/iamademar/callbacks-are-evil-2mfn</link>
      <guid>https://dev.to/iamademar/callbacks-are-evil-2mfn</guid>
      <description>&lt;p&gt;I’m working on a super secret app to revolutionize service field businesses 😜. Yesterday, the codebase had the opportunity to be viewed by eyes from the outside world for the first time and got feedback.&lt;/p&gt;

&lt;p&gt;One of the things that stood out was the code related to converting Requests to Quotes. This workflow:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0nc8x6x6i7p1z1ouhyn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0nc8x6x6i7p1z1ouhyn.png" alt="Requests to Quotes workflow" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This particular code:&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5k7gg75jmjrifdtal492.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5k7gg75jmjrifdtal492.png" alt="Code using after_create callback" width="800" height="520"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This involved the callback &lt;code&gt;after_create&lt;/code&gt;!&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7yf0hmwld2of8zvcqwl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7yf0hmwld2of8zvcqwl.png" alt="Code that involved the callback  raw `after_create` endraw " width="395" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I know, I know. &lt;a href="https://medium.com/planet-arkency/the-biggest-rails-code-smell-you-should-avoid-to-keep-your-app-healthy-a61fd75ab2d3"&gt;Callbacks are evil&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;Well, my lame excuse at the time of development was that I could not find the correct language for the conversion process when developing the &lt;a href="https://thedomaindrivendesign.io/developing-the-ubiquitous-language/"&gt;ubiquitous language&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Based on the stakeholders, whenever they need the quote to be produced, they just say to the employee: “Palihug ko pa.quote ani dong/dai.”&lt;/p&gt;

&lt;p&gt;This can be loosely translated as: “Hey [name], can you please create a quote for this customer’s request?”&lt;/p&gt;

&lt;p&gt;At that time of development, I could not find the right words for it. Based on what I was hearing from the stakeholders, they just said to create a quote.&lt;/p&gt;

&lt;p&gt;So, I opted for the easiest solution available, callbacks!&lt;/p&gt;

&lt;h2&gt;
  
  
  Why are callbacks bad?
&lt;/h2&gt;

&lt;p&gt;It is probably not that concerning in this particular case since the workflow is pretty simple.&lt;/p&gt;

&lt;p&gt;However, I’ve been burned by this before! When I was working on an e-commerce application, we used an &lt;code&gt;after_save&lt;/code&gt; callback that sent app notifications to listing owners whenever their listings were updated. Then we ran a script intended to silently update thousands of listings on a Sunday evening. Long story short, it was not a very silent Sunday 😅.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F26b8uw6wci5rxp4qkfta.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F26b8uw6wci5rxp4qkfta.png" alt="Meme about failed deploys" width="650" height="602"&gt;&lt;/a&gt;&lt;/p&gt;
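&lt;p&gt;The failure mode is easy to reproduce in a toy model (plain Ruby, no ActiveRecord; the class and method names below are illustrative, not the actual e-commerce code): the callback fires on &lt;em&gt;every&lt;/em&gt; save, including saves made by maintenance scripts.&lt;/p&gt;

```ruby
# Toy illustration: an after_save-style hook fires unconditionally.
class Listing
  @@notifications = []

  def self.notifications
    @@notifications
  end

  def initialize(title)
    @title = title
  end

  def save
    # ...imagine the record being persisted here...
    after_save # the callback fires on every save, no exceptions
  end

  private

  def after_save
    @@notifications << "Notified owner of #{@title}"
  end
end

# A "silent" bulk-update script:
1000.times { |i| Listing.new("listing-#{i}").save }

puts Listing.notifications.size # => 1000 unwanted notifications
```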

&lt;h2&gt;
  
  
  So, how do we move away from callbacks?
&lt;/h2&gt;

&lt;p&gt;Developing a better ubiquitous language for the domain you are working in helps you write better code.&lt;/p&gt;

&lt;p&gt;In my case, I could not find the right technical words while conversing with the stakeholders.&lt;/p&gt;

&lt;p&gt;However, now that I started developing the user interface, it seems it’s pretty straightforward:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjh3o6xbzkt3h8ysrsgwn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjh3o6xbzkt3h8ysrsgwn.png" alt="User interface" width="263" height="133"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  So, what does the change look like in the code?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdxqor8b867l4lm426u6l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdxqor8b867l4lm426u6l.png" alt="Changes to code" width="749" height="625"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Create a PORO to handle the business logic&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl0rwobssc8ziu2q2aywi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl0rwobssc8ziu2q2aywi.png" alt="PORO to handle business logic" width="800" height="1051"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Things to note on the code above:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I created a PORO (Plain Old Ruby Object) for the Request to Quote Conversion business logic.&lt;/li&gt;
&lt;li&gt;Later, if we want to send email or mobile notifications to admins or employees, this is the class to update.&lt;/li&gt;
&lt;li&gt;If anything fails, this class should handle the failure gracefully, especially if it involves payments, etc.&lt;/li&gt;
&lt;li&gt;Now that it’s a class, creating unit tests for this specific business logic would be easier, ensuring that this process is more robust when change requests come in later.&lt;/li&gt;
&lt;/ul&gt;
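&lt;p&gt;Since the code above appears only as screenshots, here is a minimal, hypothetical sketch of such a service object in plain Ruby (the &lt;code&gt;QuoteCreator&lt;/code&gt; name and the &lt;code&gt;Struct&lt;/code&gt; stand-ins for the ActiveRecord models are assumptions, not the article's actual code):&lt;/p&gt;

```ruby
# Stand-ins for the ActiveRecord models in the real app.
Request = Struct.new(:id, :customer_name, :line_items)
Quote   = Struct.new(:request_id, :total)

# PORO that owns the Request-to-Quote conversion business logic.
class QuoteCreator
  def initialize(request)
    @request = request
  end

  # Returns the created Quote.
  def call
    total = @request.line_items.sum { |item| item[:price] * item[:quantity] }
    # Later: notify admins/employees, handle payment failures gracefully, etc.
    Quote.new(@request.id, total)
  end
end

request = Request.new(1, "Maria", [{ price: 100, quantity: 2 }, { price: 50, quantity: 1 }])
quote = QuoteCreator.new(request).call
puts quote.total # => 250
```

&lt;p&gt;Because the logic lives in one small class, it can be unit-tested in isolation and extended without touching the model's lifecycle.&lt;/p&gt;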

&lt;p&gt;&lt;strong&gt;Step 2: Update controllers&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can then use the PORO on the controller like this:&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi67u8fqsnw9n4v74ecqb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi67u8fqsnw9n4v74ecqb.png" alt="Using the PORO in controller" width="800" height="518"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Optional Steps:&lt;br&gt;
You can take it a step further and follow &lt;a href="https://www.youtube.com/playlist?list=PL3m89j0mV0pdNAg6x9oq6S8Qz_4C-yuwj"&gt;DHH’s approach&lt;/a&gt;, creating a method on the request model:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fog2opfj2n8l58isidwe0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fog2opfj2n8l58isidwe0.png" alt="Update on request model" width="800" height="581"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, you can simply call this method in the controller like so:&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2lrx9khu2xgoxfgyhoy4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2lrx9khu2xgoxfgyhoy4.png" alt="Using the PORO" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Question: Are callbacks evil?
&lt;/h2&gt;

&lt;p&gt;There are two camps for this: people who are &lt;a href="https://dev.37signals.com/globals-callbacks-and-other-sacrileges/"&gt;for&lt;/a&gt; and &lt;a href="https://medium.com/planet-arkency/the-biggest-rails-code-smell-you-should-avoid-to-keep-your-app-healthy-a61fd75ab2d3"&gt;against&lt;/a&gt; callbacks.&lt;/p&gt;

&lt;p&gt;My take is:&lt;br&gt;
As developers, we constantly understand the domains that we are working on and develop a unique &lt;a href="https://thedomaindrivendesign.io/developing-the-ubiquitous-language/"&gt;ubiquitous language&lt;/a&gt; for them.&lt;/p&gt;

&lt;p&gt;Initially, we may not have the right words, so we reach for convenient “generic” tools such as callbacks. But as soon as you learn more about the domain and build a better ubiquitous language, the design of your code should follow.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Originally posted on: &lt;a href="https://blog.ademartutor.com/p/callbacks-are-evil"&gt;https://blog.ademartutor.com/p/callbacks-are-evil&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Hello there!&lt;/p&gt;

&lt;p&gt;Do you have a startup idea or an exciting project you're passionate about? I'd love to bring your vision to life!&lt;/p&gt;

&lt;p&gt;I'm a software developer with 13 years of experience building apps for startups, specializing in Rails + Hotwire/React. &lt;/p&gt;

&lt;p&gt;Whether you're looking to innovate, grow your business, or bring a creative idea to the forefront, I'm here to provide tailored solutions that meet your unique needs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/in/ademar-tutor-0a95972a/"&gt;Let's collaborate to make something amazing!&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sincerely,&lt;br&gt;
Ademar Tutor&lt;/p&gt;

</description>
      <category>ruby</category>
      <category>rails</category>
    </item>
    <item>
      <title>Dynamic AI Prompts with PromptTemplates, Ruby, Langchain.rb and OpenAI</title>
      <dc:creator>Ademar Tutor</dc:creator>
      <pubDate>Mon, 15 Jan 2024 10:21:24 +0000</pubDate>
      <link>https://dev.to/iamademar/dynamic-ai-prompts-with-prompttemplates-ruby-langchainrb-and-openai-4bi9</link>
      <guid>https://dev.to/iamademar/dynamic-ai-prompts-with-prompttemplates-ruby-langchainrb-and-openai-4bi9</guid>
      <description>&lt;p&gt;This tutorial shows how to build a Ruby chatbot with Langchain.rb, PromptTemplate, and OpenAI.&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;a href="https://www.youtube.com/watch?si=yfXkq4C5xdRNWOEy&amp;amp;v=xQRpFtDctkE&amp;amp;feature=youtu.be" rel="noopener noreferrer"&gt;
      youtube.com
    &lt;/a&gt;
&lt;/div&gt;


&lt;p&gt;Previously, we built a basic terminal chatbot with Ruby. Now, we are going to focus on PromptTemplates.&lt;/p&gt;

&lt;h2&gt;
  
  
  What are PromptTemplates?
&lt;/h2&gt;

&lt;p&gt;PromptTemplates in Langchain.rb are game-changers. They allow you to create flexible, dynamic prompts with placeholders, which can be filled in with user input or other data at runtime.&lt;/p&gt;

&lt;p&gt;This adaptability is critical in prompt engineering, where the goal is to generate prompts that elicit the most accurate and relevant responses from an AI model.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Prompt Engineering?
&lt;/h2&gt;

&lt;p&gt;Prompt engineering is the art of crafting prompts that guide AI models to provide the desired output. The way you phrase a prompt can significantly impact the model's response.&lt;/p&gt;

&lt;p&gt;PromptTemplates offer a way to experiment with different phrasings and structures without hard-coding every possible prompt.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building an example chatbot
&lt;/h2&gt;

&lt;p&gt;Imagine building a chatbot that explains to preschoolers what a specific profession is and what skills they need to succeed in it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1
&lt;/h3&gt;

&lt;p&gt;Require the necessary libraries: OpenAI and Langchain.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s2"&gt;"openai"&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s2"&gt;"langchain"&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 2
&lt;/h3&gt;

&lt;p&gt;Set Langchain's logger level to info so it logs informational messages. This step isn't required, but it's always good to know what's happening in the background with Langchain.rb.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="no"&gt;Langchain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;level&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;  &lt;span class="ss"&gt;:info&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 3
&lt;/h3&gt;

&lt;p&gt;Initialize a new &lt;code&gt;Langchain::LLM::OpenAI&lt;/code&gt; object. This object is used to interact with OpenAI's API.&lt;br&gt;
You need to provide your API key and other options.&lt;br&gt;
&lt;code&gt;llm_options&lt;/code&gt; is an empty hash here, but it can be used to pass additional options.&lt;br&gt;
&lt;code&gt;default_options&lt;/code&gt; sets the default language model to GPT-4 for chat completions.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;openai&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Langchain&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;LLM&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;OpenAI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="ss"&gt;api_key: &lt;/span&gt;&lt;span class="no"&gt;ENV&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"OPENAI_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="ss"&gt;llm_options: &lt;/span&gt;&lt;span class="p"&gt;{},&lt;/span&gt;
  &lt;span class="ss"&gt;default_options: &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="ss"&gt;chat_completion_model_name: &lt;/span&gt;&lt;span class="s2"&gt;"gpt-4"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 4
&lt;/h3&gt;

&lt;p&gt;Create a new conversation object. This object will manage the conversation flow using the OpenAI model.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Langchain&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;Conversation&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;llm: &lt;/span&gt;&lt;span class="n"&gt;openai&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 5
&lt;/h3&gt;

&lt;p&gt;Create a new PromptTemplate object. This template has a placeholder {profession} that can be filled in. &lt;code&gt;input_variables&lt;/code&gt; defines the placeholders expected in the template.&lt;/p&gt;
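&lt;p&gt;To make the behavior concrete, here is what formatting such a template does, sketched in plain Ruby (the &lt;code&gt;format_prompt&lt;/code&gt; helper is an illustration of the idea, not Langchain.rb's implementation):&lt;/p&gt;

```ruby
template = "Explain to a pre-schooler what a {profession} is."

# Each {placeholder} in the template is replaced with the supplied value.
def format_prompt(template, variables)
  variables.reduce(template) do |result, (name, value)|
    result.gsub("{#{name}}", value.to_s)
  end
end

puts format_prompt(template, profession: "firefighter")
# => "Explain to a pre-schooler what a firefighter is."
```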

&lt;h3&gt;
  
  
  Step 6
&lt;/h3&gt;

&lt;p&gt;Prompt the user for the profession.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Please enter the profession:"&lt;/span&gt;
&lt;span class="n"&gt;user_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;gets&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chomp&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 7
&lt;/h3&gt;

&lt;p&gt;Format the prompt template with the user's input and send the message to the OpenAI model. The response from the model is then printed out.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;profession: &lt;/span&gt;&lt;span class="n"&gt;user_input&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Full Code
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s2"&gt;"openai"&lt;/span&gt;
&lt;span class="nb"&gt;require&lt;/span&gt; &lt;span class="s2"&gt;"langchain"&lt;/span&gt;

&lt;span class="no"&gt;Langchain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;level&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="ss"&gt;:info&lt;/span&gt;

&lt;span class="n"&gt;openai&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Langchain&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;LLM&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;OpenAI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="ss"&gt;api_key: &lt;/span&gt;&lt;span class="no"&gt;ENV&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"OPENAI_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="ss"&gt;llm_options: &lt;/span&gt;&lt;span class="p"&gt;{},&lt;/span&gt;
  &lt;span class="ss"&gt;default_options: &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="ss"&gt;chat_completion_model_name: &lt;/span&gt;&lt;span class="s2"&gt;"gpt-4"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Langchain&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;Conversation&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;llm: &lt;/span&gt;&lt;span class="n"&gt;openai&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;prompt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Langchain&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;Prompt&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;PromptTemplate&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="ss"&gt;template: &lt;/span&gt;&lt;span class="s2"&gt;"Explain to a pre-schooler what an {profession} is and what they need to in order to be successful in the field."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;input_variables: &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt; &lt;span class="s2"&gt;"profession"&lt;/span&gt; &lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"Please enter the profession:"&lt;/span&gt;
&lt;span class="n"&gt;user_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;gets&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chomp&lt;/span&gt;

&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;profession: &lt;/span&gt;&lt;span class="n"&gt;user_input&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>ruby</category>
      <category>langchain</category>
      <category>openai</category>
    </item>
    <item>
      <title>Hello AI world with Ruby!</title>
      <dc:creator>Ademar Tutor</dc:creator>
      <pubDate>Sat, 13 Jan 2024 07:08:02 +0000</pubDate>
      <link>https://dev.to/iamademar/hello-ai-world-with-ruby-200m</link>
      <guid>https://dev.to/iamademar/hello-ai-world-with-ruby-200m</guid>
      <description>&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/QhiKOhzuYr0"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;In this tutorial, we'll delve into creating your first AI-powered apps using Ruby, the Langchain.rb library, and OpenAI's GPT-4 API.&lt;/p&gt;

&lt;h2&gt;
  
  
  Objective
&lt;/h2&gt;

&lt;p&gt;We'll create a terminal-based chatbot that lets you chat with OpenAI's GPT-4. This chatbot will be able to process user inputs and generate relevant, context-aware responses.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;First, we'll need to create a Ruby file. Let's call this chatbot.rb&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;What we want to do next is import the necessary libraries&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;require "openai"
require "langchain"
require "reline"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;OpenAI&lt;/strong&gt;: The openai library is a Ruby client for interacting with OpenAI's API.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Langchain&lt;/strong&gt;: Langchain is a Ruby library designed for building language applications. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;reline&lt;/strong&gt;: Reline is a Ruby library used for readline-compatible input in command-line applications.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;Create an instance of &lt;code&gt;Langchain::LLM::OpenAI&lt;/code&gt;. This class is a wrapper around OpenAI's API, specifically tailored for large language models (LLMs).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Side Note&lt;/strong&gt;: The great thing about Langchain.rb is that it supports other models too, offering the flexibility to switch between different AI models based on requirements.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Chat Interaction
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# What we'll do next is to Define a method to prompt for and handle user messages
def prompt_for_message
  # Print instructions for multiline input
  puts "(multiline input; type 'end' on its own line when done. or exit to exit)"

  # Read multiline input from the user, checking if the last word is a termination keyword
  user_message = Reline.readmultiline("Question: ", true) do |multiline_input|
    last = multiline_input.split.last # Get the last word of the input
    DONE.include?(last) # Check if the last word is a termination keyword
  end

  # Return a no-operation symbol if there is no user message
  return :noop unless user_message

  # Process the multiline input to handle the termination keyword
  lines = user_message.split("\n")
  if lines.size &amp;gt; 1 &amp;amp;&amp;amp; DONE.include?(lines.last)
    # Remove the "done" keyword from the message
    user_message = lines[0..-2].join("\n")
  end

  # Check if the user wants to exit the program
  return :exit if DONE.include?(user_message.downcase)

  # Return the processed user message
  user_message
end
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The method &lt;code&gt;prompt_for_message&lt;/code&gt; handles multiline user input and implements commands for ending the input or exiting the program. The commands are: "done", "end", "eof", and "exit".&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Interactive Loop
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Our next step is to create infinite loop to continuously handle user input
begin
  loop do
    user_message = prompt_for_message # We get the user message using the defined method above

    # Handle different cases of user message
    case user_message
    when :noop # If no operation, continue to the next iteration of the loop
      next
    when :exit # If exit, break the loop to end the program
      break
    end

    # Using Langchain's instance method 'message', we send the user message to OpenAI and print its response
    puts chat.message(user_message)
  end
# Lastly, we handle the Interrupt exception to gracefully exit the program if interrupted (e.g., Ctrl+C)
rescue Interrupt
  exit 0
end
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code continuously fetches user messages and uses &lt;code&gt;chat.message&lt;/code&gt; to obtain and display responses from the AI, creating a dynamic conversation flow.&lt;/p&gt;

&lt;h2&gt;
  
  
  Full code of chatbot.rb
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;require "openai"
require "langchain"
require "reline"

Langchain.logger.level = :info

openai = Langchain::LLM::OpenAI.new(
  api_key: ENV["OPENAI_API_KEY"], # Use the OpenAI API key from the environment variable
  llm_options: {}, # Options for the large language model (empty in this case)
  default_options: { chat_completion_model_name: "gpt-4" } # Set GPT-4 as the default chat model
)

chat = Langchain::Conversation.new(llm: openai)
chat.set_context("You are a chatbot from the future") # Set the context of the conversation

DONE = %w[done end eof exit].freeze

puts "Welcome to the chatbot from the future!"

def prompt_for_message
  puts "(multiline input; type 'end' on its own line when done, or 'exit' to quit)"

  user_message = Reline.readmultiline("Question: ", true) do |multiline_input|
    last = multiline_input.split.last # Get the last word of the input
    DONE.include?(last) # Check if the last word is a termination keyword
  end

  return :noop unless user_message

  lines = user_message.split("\n")
  if lines.size &amp;gt; 1 &amp;amp;&amp;amp; DONE.include?(lines.last)
    user_message = lines[0..-2].join("\n")
  end

  return :exit if DONE.include?(user_message.downcase)

  user_message
end

begin
  loop do
    user_message = prompt_for_message

    case user_message
    when :noop
      next
    when :exit
      break
    end
    puts chat.message(user_message)
  end

rescue Interrupt
  exit 0
end
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>openai</category>
      <category>ruby</category>
      <category>langchain</category>
    </item>
    <item>
      <title>Understanding Big-O Notation with Ruby</title>
      <dc:creator>Ademar Tutor</dc:creator>
      <pubDate>Mon, 08 Jan 2024 15:18:59 +0000</pubDate>
      <link>https://dev.to/iamademar/understanding-big-o-notation-with-ruby-19i1</link>
      <guid>https://dev.to/iamademar/understanding-big-o-notation-with-ruby-19i1</guid>
      <description>&lt;h1&gt;
  
  
  Big-O Notation in Ruby
&lt;/h1&gt;

&lt;p&gt;In college, Big-O notation was taught in one of our courses. I really could not connect the dots back then; Big-O notation felt too theoretical.&lt;/p&gt;

&lt;p&gt;So, as a professional developer, I relied upon performance tools like rack-mini-profiler, Benchmark, bullet, etc. &lt;/p&gt;

&lt;p&gt;When third-party monitoring apps like AppSignal and New Relic became good at monitoring performance, I began creating rules for writing code to avoid performance issues later, when my code hits production.&lt;/p&gt;

&lt;p&gt;These are some of my "performance" rules:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1) Be Mindful with Loops&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Rule: Reduce database hits inside loops.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Bad
User.all.each do |user|
  user.update_attribute(:status, "active")
end

# Good
User.update_all(status: "active")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2) Optimize Database Queries&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Rule: Avoid N+1 queries by eager loading associated records.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Bad
User.all.each do |user|
  puts user.profile.name
end

# Good
User.includes(:profile).each do |user|
  puts user.profile.name
end
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note: If you use the Russian Doll caching strategy, this might not apply.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3) Background Jobs for Time-Consuming Tasks&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Rule: If all else fails, move long-running tasks to background jobs. Pray that the user does not need the data ASAP 🙏&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Using Sidekiq or similar
LongTaskWorker.perform_async(user_id)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I have realized you can get by, performance-wise, just by following those simple rules.&lt;/p&gt;

&lt;p&gt;However, it might be a good idea to have in your bag a solid foundation that has been in use since 1894.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Big-O notation?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--roBeVFSX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7jocr464zpswvbq0ygsm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--roBeVFSX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7jocr464zpswvbq0ygsm.png" alt="Wikipedia definition of Big-O notation" width="796" height="414"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In plain words, Big-O notation describes the performance of your code using algebraic terms. Performance in this context refers to either the speed of execution or the amount of RAM the code consumes. &lt;/p&gt;

&lt;p&gt;In computer science, these concepts are more formally known as 'time complexity' for speed and 'space complexity' for RAM usage. Big-O notation is used for both, but we're going to focus on speed here to simplify our discussion.&lt;/p&gt;

&lt;h2&gt;
  
  
  Big-O in real-life
&lt;/h2&gt;

&lt;p&gt;In a web application, handling a process that involves sorting or searching through 100 data entries will naturally be slower compared to dealing with just 10 entries. &lt;/p&gt;

&lt;p&gt;The key question is: how much slower is it? Does the time taken increase 10x, a 100x, or perhaps even a 1000x?&lt;/p&gt;

&lt;p&gt;This variance in processing speed might not be evident when dealing with smaller data sets. &lt;/p&gt;

&lt;p&gt;However, the situation changes if your application's performance degrades significantly with the addition of each new entry in the database. &lt;/p&gt;

&lt;p&gt;Such a decline in speed becomes a major concern, particularly as the volume of data grows larger in the app.&lt;/p&gt;

&lt;p&gt;Understanding Big O notation is crucial in this context. It helps developers predict how their code will scale as data volumes increase, allowing them to choose the most efficient algorithms and data structures. &lt;/p&gt;

&lt;p&gt;By anticipating performance issues before they occur, developers can optimize their applications for speed and efficiency, ensuring a smoother user experience even as data grows. This foresight is invaluable in maintaining a high-performing web application over time.&lt;/p&gt;

&lt;p&gt;Here's a super accurate emoji-based representation of Big-O notations to give you an idea of how fast they are as your data scales:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚀 O(1)&lt;/strong&gt; - Speed does not depend on the scale of the dataset&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚄 O(log n)&lt;/strong&gt; - If you have ten times more data, processing only takes a small, fixed amount longer.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚗 O(n)&lt;/strong&gt; - If the amount of data increases tenfold, the processing time also increases tenfold.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚲 O(n log n)&lt;/strong&gt; - If you increase the data by ten times, the processing time will roughly increase by twenty times.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🏃‍♂️ O(n^2)&lt;/strong&gt; - If the amount of data is multiplied by ten, the processing time increases by a factor of one hundred. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚶‍♂️ O(2^n)&lt;/strong&gt; - Adding just one more element doubles the processing time, so it blows up very quickly.&lt;/p&gt;




&lt;h2&gt;
  
  
  Big-O Notations In-depth
&lt;/h2&gt;

&lt;h3&gt;
  
  
  🚀 O(1)
&lt;/h3&gt;

&lt;p&gt;O(1) indicates that the execution time of an operation remains constant, regardless of the size of the data set.&lt;/p&gt;

&lt;p&gt;Example of accessing an element in a hash; the time taken is independent of the hash's size:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Access time remains 
# constant for these

hash_with_100_items[:a]
hash_with_1000_items[:a]
hash_with_10000_items[:a]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As a result, hash lookups generally beat searching through an array for large datasets. Accessing an array element by index, however, is also O(1). Here's another Ruby example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Retrieving an item 
# from arrays of varying 
# sizes

array_with_100_items[50]
array_with_1000_items[50]
array_with_10000_items[50]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this case, fetching an element at a specific index (like the 50th position) takes the same amount of time, regardless of the array's size.&lt;/p&gt;
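&lt;p&gt;To make the O(1) vs O(n) contrast concrete, here is a small benchmark sketch (the dataset size, iteration count, and variable names are my own; exact timings will vary by machine) comparing a hash key lookup with scanning an array for a value:&lt;/p&gt;

```ruby
require 'benchmark'

# Build a large array and an equivalent hash keyed by value
n = 1_000_000
array = (1..n).to_a
hash  = array.to_h { |x| [x, true] }

Benchmark.bm(12) do |bm|
  # Scanning an array checks elements one by one: O(n)
  bm.report('array scan') { 100.times { array.include?(n) } }
  # A hash lookup jumps straight to the right bucket: O(1)
  bm.report('hash lookup') { 100.times { hash.key?(n) } }
end
```

&lt;p&gt;On a typical machine, the array scan is orders of magnitude slower, and the gap widens as the dataset grows.&lt;/p&gt;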

&lt;h3&gt;
  
  
  🚄 O(log n)
&lt;/h3&gt;

&lt;p&gt;O(log n) describes an algorithm where the time it takes to complete a task increases logarithmically in relation to the size of the dataset. &lt;/p&gt;

&lt;p&gt;This means that each time the dataset doubles in size, the number of steps required increases by a relatively small, fixed amount. &lt;/p&gt;

&lt;p&gt;O(log n) operations are efficient and much faster than linear time operations for large datasets.&lt;/p&gt;

&lt;p&gt;Here is a Ruby example illustrating O(log n) complexity:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Binary Search:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A binary search algorithm in a sorted array is a classic example of O(log n) complexity. At each step, it divides the dataset in half, significantly reducing the number of elements to search through.&lt;/p&gt;

&lt;p&gt;Here’s a Ruby implementation of binary search. We use the Benchmark module to compare execution times as the dataset grows. Code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;require 'benchmark'

def binary_search(array, key)
  low = 0
  high = array.length - 1

  # This loop runs as long 
  # as the 'low' index is 
  # less than or equal to  
  # the 'high' index
  while low &amp;lt;= high
    # Find the middle 
    # index of the 
    # current range
    mid = (low + high) / 2
    value = array[mid]

    # If the value is less 
    # than the key, narrow 
    # the search to the 
    # upper half of the 
    # array
    if value &amp;lt; key
      low = mid + 1
    # If the value is 
    # greater than the 
    # key, narrow the 
    # search to the 
    # lower half of 
    # the array
    elsif value &amp;gt; key
      high = mid - 1
    # If the key is found, 
    # return its index
    else
      return mid
    end
  end

  # If the key is not 
  # found in the 
  # array, return nil
  nil
end

# Example usage with an 
# array of 100 million 
# elements
array = (1..100000000).to_a
puts Benchmark.measure {
  binary_search(array, 30)
}

# Example usage with an 
# array of 200 million 
# elements
array = (1..200000000).to_a
puts Benchmark.measure {
  binary_search(array, 30)
}

# Example usage with an 
# array of 300 million 
# elements
array = (1..300000000).to_a
puts Benchmark.measure {
  binary_search(array, 30)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this code, the binary search algorithm has a time complexity of O(log n) because it divides the search interval in half with each step. &lt;/p&gt;

&lt;p&gt;In each iteration of the while loop, it either finds the element or halves the number of elements to check. &lt;/p&gt;

&lt;p&gt;This halving process continues until the element is found or the interval is empty. &lt;/p&gt;

&lt;p&gt;As a result, the time to complete the search increases logarithmically with the size of the array (n), making binary search efficient for large datasets.&lt;/p&gt;
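&lt;p&gt;To put a number on "logarithmic": the worst case for binary search is roughly ceil(log2(n)) comparisons. A quick back-of-the-envelope check (this snippet is my own addition) shows that even 300 million elements need fewer than 30 comparisons:&lt;/p&gt;

```ruby
# Worst-case comparisons for binary search grow with log2(n), not n
[100_000_000, 200_000_000, 300_000_000].each do |n|
  puts format('n = %d needs at most about %d comparisons', n, Math.log2(n).ceil)
end
```

&lt;p&gt;That is why the benchmark timings barely move as the array triples in size.&lt;/p&gt;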

&lt;p&gt;Result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PX9iscpq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u04m9ci2qt790ymy70wa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PX9iscpq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u04m9ci2qt790ymy70wa.png" alt="Result of binary search algorithm with Benchmark gem" width="690" height="167"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It even got faster with 300 million elements 😅&lt;/p&gt;

&lt;h3&gt;
  
  
  🚗 O(n)
&lt;/h3&gt;

&lt;p&gt;O(n) describes the time complexity of an algorithm where the time to complete a task grows linearly and in direct proportion to the size of the input data set. This means that if you double the number of elements in your data set, the algorithm will take twice as long to complete.&lt;/p&gt;

&lt;p&gt;Here is a Ruby example illustrating O(n) complexity:&lt;/p&gt;

&lt;p&gt;Summing all elements in an array:&lt;/p&gt;

&lt;p&gt;In this example, the time to sum all elements increases linearly with the number of elements in the array.&lt;/p&gt;

&lt;p&gt;Code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;require 'benchmark'

def sum_array(array)
  sum = 0
  # Iterating over each 
  # element in the array
  array.each do |element|
    # Adding each element 
    # to the sum
    sum += element  
  end
  # Returning the 
  # total sum
  sum  
end

# Summing an array of 
# 100 million elements
array = (1..100000000).to_a
puts Benchmark.measure {
  # The time taken here is 
  # proportional to the 
  # array's size
  sum_array(array)  
}

# Summing an array of 
# 200 million elements
array = (1..200000000).to_a
puts Benchmark.measure {
  # Time taken will be roughly 
  # twice that of the 100 
  # million array
  sum_array(array)  
}

# Summing an array of 
# 300 million elements
array = (1..300000000).to_a
puts Benchmark.measure {
  # Time taken will be roughly 
  # three times that of the 
  # 100 million array
  sum_array(array)  
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this code, the sum_array function has an O(n) complexity because it involves a single loop that iterates over each element of the array exactly once. &lt;/p&gt;

&lt;p&gt;The time taken to execute this function scales linearly with the number of elements in the array (n). &lt;/p&gt;

&lt;p&gt;Therefore, if you double the size of the array, the time to sum the array also doubles, which is characteristic of linear time complexity.&lt;/p&gt;

&lt;p&gt;Result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ex1y41Vo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2oaciyx00u630i2pmv4h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ex1y41Vo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2oaciyx00u630i2pmv4h.png" alt="Benchmark of sum_array demonstrating O(n)" width="690" height="167"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Observe how the execution time scales linearly as we add 100 million more elements each time.&lt;/p&gt;

&lt;h3&gt;
  
  
  🚲 O(n log n)
&lt;/h3&gt;

&lt;p&gt;O(n log n) describes the time complexity of an algorithm where the time to complete a task increases at a rate proportional to the number of elements in the data set (n) multiplied by the logarithm of the number of elements (log n).&lt;/p&gt;

&lt;p&gt;Here is a Ruby example illustrating O(n log n) complexity:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Merge Sort:&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Merge Sort is a classic example of an algorithm with O(n log n) complexity. It divides the array into halves, sorts each half, and then merges them back together.&lt;/p&gt;

&lt;p&gt;Code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;require 'benchmark'

def merge_sort(array)
  # Base case: a single 
  # element is already 
  # sorted
  if array.length &amp;lt;= 1
    array
  else
    # Find the middle 
    # of the array
    mid = array.length / 2

    # Recursively sort 
    # the left half
    left = merge_sort(array[0...mid])
    # Recursively sort 
    # the right half
    right = merge_sort(array[mid..-1])

    # Merge the two 
    # sorted halves
    merge(left, right)
  end
end

def merge(left, right)
  sorted = []
  # Until one of the arrays 
  # is empty,pick the smaller 
  # element from the front 
  # of each array
  until left.empty? || right.empty?
    if left.first &amp;lt;= right.first
      sorted &amp;lt;&amp;lt; left.shift
    else
      sorted &amp;lt;&amp;lt; right.shift
    end
  end

  # Concatenate the remaining 
  # elements (one of the arrays 
  # may have elements left)
  sorted.concat(left).concat(right)
end


# Benchmarking with 
# 10 million elements
array = (1..10000000).to_a
puts Benchmark.measure {
  merge_sort(array)
}

# Benchmarking with 
# 20 million elements
array = (1..20000000).to_a
puts Benchmark.measure {
  merge_sort(array)
}

# Benchmarking with 
# 30 million elements
array = (1..30000000).to_a
puts Benchmark.measure {
  merge_sort(array)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The merge_sort function exhibits O(n log n) complexity because it involves a divide-and-conquer approach. &lt;/p&gt;

&lt;p&gt;The array is repeatedly divided into halves (resulting in a logarithmic number of divisions, i.e., log n), and each half is sorted and then merged. &lt;/p&gt;

&lt;p&gt;The merging process for each level is linear in the size of the input (i.e., n), leading to a total time complexity of O(n log n). &lt;/p&gt;

&lt;p&gt;This makes merge sort much more efficient for large datasets compared to algorithms with quadratic time complexities (like bubble sort or insertion sort), especially as the size of the data increases.&lt;/p&gt;

&lt;p&gt;Result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_Mpgabtm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/128vz9gsk691zouijy2o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_Mpgabtm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/128vz9gsk691zouijy2o.png" alt="Benchmark of merge_sort" width="690" height="167"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The slowdown in merge sort's execution time becomes really obvious once it is handling tens of millions of elements.&lt;/p&gt;

&lt;h3&gt;
  
  
  🏃‍♂️ O(n^2)
&lt;/h3&gt;

&lt;p&gt;O(n^2) describes the time complexity of an algorithm where the time to complete a task is proportional to the square of the number of elements in the input dataset. &lt;/p&gt;

&lt;p&gt;This means that if the number of elements doubles, the time to complete the task increases by four times. Algorithms with O(n^2) complexity are generally less efficient for large datasets compared to O(n log n) or O(n) algorithms.&lt;/p&gt;

&lt;p&gt;Here is a Ruby example illustrating O(n^2) complexity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bubble Sort:&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Bubble Sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. The pass through the list is repeated until the list is sorted.&lt;/p&gt;

&lt;p&gt;Code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;require 'benchmark'

def bubble_sort(array)
  n = array.length
  # Loop until no more swaps are needed
  loop do
    swapped = false
    # Iterate through the array. The 'n-1' is because we compare
    # each element with the one after it.
    (n-1).times do |i|
      # Compare adjacent elements
      if array[i] &amp;gt; array[i+1]
        # Swap if they are in the wrong order
        array[i], array[i+1] = array[i+1], array[i]
        # Indicate a swap occurred
        swapped = true
      end
    end
    # Break the loop if no swaps occurred in the last pass
    break unless swapped
  end
  # Return the sorted array
  array
end

# Benchmarking with 10,000 elements
array = (1..10000).to_a.shuffle
puts Benchmark.measure {
  bubble_sort(array)
}

# Benchmarking with 20,000 elements
array = (1..20000).to_a.shuffle
puts Benchmark.measure {
  bubble_sort(array)
}

# Benchmarking with 30,000 elements
array = (1..30000).to_a.shuffle
puts Benchmark.measure {
  bubble_sort(array)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this bubble_sort method, the O(n^2) complexity arises from the nested loop structure:&lt;/p&gt;

&lt;p&gt;a) The loop construct potentially iterates n times (in the worst case where the array is in reverse order).&lt;/p&gt;

&lt;p&gt;b) Inside this loop, there's a (n-1).times loop that also iterates up to n-1 times for each outer loop iteration.&lt;/p&gt;

&lt;p&gt;This nesting of loops, where each loop can run 'n' times, leads to the time complexity of O(n^2).&lt;/p&gt;
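&lt;p&gt;One way to see the quadratic growth directly is to count comparisons instead of timing. This instrumented variant (the counting is my own addition, not part of the original method) shows that doubling a worst-case (reversed) input roughly quadruples the work:&lt;/p&gt;

```ruby
# Bubble sort with a comparison counter; a reversed array is the worst case
def bubble_sort_count(array)
  a = array.dup
  comparisons = 0
  loop do
    swapped = false
    (a.length - 1).times do |i|
      comparisons += 1
      if a[i] > a[i + 1]
        a[i], a[i + 1] = a[i + 1], a[i]
        swapped = true
      end
    end
    break unless swapped
  end
  comparisons
end

puts bubble_sort_count((1..100).to_a.reverse)  # 9900 comparisons
puts bubble_sort_count((1..200).to_a.reverse)  # 39800, about four times as many
```

&lt;p&gt;Doubling n from 100 to 200 quadruples the comparison count, exactly the n^2 behavior the timings show.&lt;/p&gt;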

&lt;p&gt;Result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--N_hbHPmR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mf4i35q0n169i84v2ila.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--N_hbHPmR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mf4i35q0n169i84v2ila.png" alt="Benchmark of bubble_sort" width="690" height="167"&gt;&lt;/a&gt;&lt;br&gt;
Notice that I maxed out the elements at 30,000. It was pretty slow already 😅&lt;/p&gt;
&lt;h3&gt;
  
  
  🚶‍♂️ O(2^n)
&lt;/h3&gt;

&lt;p&gt;O(2^n) describes the time complexity of an algorithm where the time to complete a task doubles with each additional element in the input data set. &lt;/p&gt;

&lt;p&gt;This kind of complexity is typical in algorithms that involve recursive solutions to problems where the solution involves creating two or more subproblems for each problem. &lt;/p&gt;

&lt;p&gt;Such algorithms are often seen as inefficient for large datasets due to their exponential time growth.&lt;/p&gt;

&lt;p&gt;Here is a Ruby example illustrating O(2^n) complexity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Fibonacci Sequence (Recursive Implementation):&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A classic example of O(2^n) complexity is the recursive calculation of the Fibonacci sequence, where each number is the sum of the two preceding ones.&lt;/p&gt;

&lt;p&gt;Code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;require 'benchmark'

def fibonacci(n)
  return n if n &amp;lt;= 1
  fibonacci(n - 1) + fibonacci(n - 2)
end

puts Benchmark.measure {
  fibonacci(10)
}

puts Benchmark.measure {
  fibonacci(20)
}

puts Benchmark.measure {
  fibonacci(30)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The number of operations grows exponentially with the input size, characteristic of O(2^n) time complexity. &lt;/p&gt;

&lt;p&gt;This exponential growth makes these algorithms impractical for large input sizes due to the rapid increase in computation time.&lt;/p&gt;

&lt;p&gt;Result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--De4EJITX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3hnavw9tqqtynnilhz36.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--De4EJITX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3hnavw9tqqtynnilhz36.png" alt="Benchmark of fibonacci to demonstrate O(2^n)" width="690" height="167"&gt;&lt;/a&gt;&lt;br&gt;
I did not dare to use numbers above 100. It would be super slow 🐢&lt;/p&gt;
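&lt;p&gt;The exponential blow-up comes from recomputing the same subproblems again and again. A common fix is memoization; this sketch (the memo hash is my own addition, not part of the snippet above) caches each Fibonacci number and brings the complexity down to O(n):&lt;/p&gt;

```ruby
# Memoized Fibonacci: each value is computed once and cached,
# collapsing the O(2^n) call tree down to O(n)
def fib_memo(n, memo = {})
  return n if n == 0 || n == 1
  memo[n] ||= fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
end

puts fib_memo(30)  # 832040, same answer as the plain recursion, but instant
puts fib_memo(90)  # far beyond what the plain recursive version could handle
```

&lt;p&gt;With the cache in place, &lt;code&gt;fib_memo(90)&lt;/code&gt; returns almost instantly, while the plain recursive version would not finish in any reasonable time.&lt;/p&gt;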

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;You should not use these code examples in production apps. There are built-in &lt;a href="https://www.rubyguides.com/2017/07/ruby-sort/"&gt;Ruby sort&lt;/a&gt; and search methods. And if you are sorting records (in PostgreSQL, for instance), please use the &lt;a href="https://www.postgresqltutorial.com/postgresql-tutorial/postgresql-order-by/"&gt;database's ORDER BY feature&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I hope you can appreciate Big-O notation and apply this to your daily work. 🚀&lt;/p&gt;

</description>
      <category>ruby</category>
    </item>
  </channel>
</rss>
