<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Dhruv B Shetty</title>
    <description>The latest articles on DEV Community by Dhruv B Shetty (@dhruvbshetty).</description>
    <link>https://dev.to/dhruvbshetty</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1595403%2Fc2c73ae6-3b97-4d4c-bf91-0c7e2723dc26.jpeg</url>
      <title>DEV Community: Dhruv B Shetty</title>
      <link>https://dev.to/dhruvbshetty</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/dhruvbshetty"/>
    <language>en</language>
    <item>
      <title>Learning Go: Why Goroutines Aren’t Just Coroutines</title>
      <dc:creator>Dhruv B Shetty</dc:creator>
      <pubDate>Sun, 13 Jul 2025 16:25:10 +0000</pubDate>
      <link>https://dev.to/dhruvbshetty/learning-go-why-goroutines-arent-just-coroutines-224i</link>
      <guid>https://dev.to/dhruvbshetty/learning-go-why-goroutines-arent-just-coroutines-224i</guid>
      <description>&lt;p&gt;I have been learning Golang this week as it is going to be required in one of my upcoming projects. I came across a brilliant video by Alex Mux, which is perfect for developers experienced in another language who want to learn the syntax quickly as well as understand what makes Go unique and how to use it.&lt;/p&gt;

&lt;p&gt;When I was introduced to goroutines, Go's way of running functions concurrently, I was tempted to treat them like coroutines in Python or JavaScript, but the similarities end pretty quickly.&lt;/p&gt;

&lt;p&gt;Python and JavaScript both, by design, execute bytecode on a single thread and rely on an event loop to implement concurrency. Go isn’t limited in this way: its runtime can schedule goroutines across multiple OS threads and use all the cores of the machine.&lt;/p&gt;

&lt;p&gt;The control flow, i.e. the pausing and resuming of concurrent functions, is handled automatically by the Go runtime, which can preempt goroutines. In contrast, Python and JavaScript handle this cooperatively: the await syntax marks the explicit suspension points, and the event loop controls when coroutines are paused and in what order they run.&lt;/p&gt;

&lt;p&gt;The above makes writing an efficient concurrent API in Go smooth and straightforward, whereas writing the same in Python or JavaScript is painful because I have to:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Classify the task as I/O-bound or CPU-bound, because CPU-bound tasks block the thread and need a workaround with multiple processes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Understand how the event loop affects ordering and scheduling in Python (asyncio) and JavaScript.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Manually handle pausing, resuming, awaiting and buffering the coroutines (see the sketch after this list), when all of this is native in Go.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Keep in mind that after all this effort, it still won’t come anywhere close to Go in performance :(&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
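
&lt;p&gt;To make point 3 concrete, here is a minimal Python (asyncio) sketch of the manual, cooperative model; the coroutine names and delays are placeholders:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# illustrative sketch: two placeholder coroutines scheduled cooperatively
import asyncio

async def fetch(name, delay):
    # await is the explicit pause point; a CPU-bound loop here
    # would block every other coroutine on the single thread
    await asyncio.sleep(delay)
    return name

async def main():
    # the event loop interleaves these only at their await points
    results = await asyncio.gather(fetch("a", 1.0), fetch("b", 0.5))
    print(results)  # ['a', 'b']

asyncio.run(main())
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;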

&lt;p&gt;Hopefully, this experience of trying out a new programming language and comparing it with languages I’m already experienced in has made me a more well-rounded developer.&lt;/p&gt;

</description>
      <category>go</category>
      <category>python</category>
      <category>concurrency</category>
      <category>learning</category>
    </item>
    <item>
      <title>Building a Smart Search MVP for a Take-Home</title>
      <dc:creator>Dhruv B Shetty</dc:creator>
      <pubDate>Mon, 07 Jul 2025 08:54:18 +0000</pubDate>
      <link>https://dev.to/dhruvbshetty/building-a-smart-search-mvp-for-a-take-home-17mh</link>
      <guid>https://dev.to/dhruvbshetty/building-a-smart-search-mvp-for-a-take-home-17mh</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7db13cas4g20ozj4x95.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7db13cas4g20ozj4x95.gif" alt="AI_Search_MVP" width="631" height="847"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A potential employer gave me a reasonably well-scoped take-home test for an AI developer role, and I actually found myself enjoying it.&lt;/p&gt;

&lt;p&gt;I had to build an MVP that displayed a catalogue of products with a basic search feature, along with a smart search feature where users could describe what kind of products they were interested in and get relevant results. All within 2 hours.&lt;/p&gt;

&lt;p&gt;I spun up the React frontend and the basic search feature quickly, and chose the OpenAI API for the backend that receives the smart-search requests.&lt;/p&gt;

&lt;p&gt;The hardest part of the task was crafting the prompt and choosing the right parameters carefully so that the model would either return relevant results or an empty list.&lt;/p&gt;

&lt;p&gt;I injected the list of products into the model’s context as a CSV-style table with headers and came up with this prompt after a bit of experimenting.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
 "role":"system",
 "content":"### Given a search statement please return the exact names of the product that match as a list, if not available, please return an empty list as well for any irrelevant prompts the output will either be the name of a relevant product or an empty list. Do not output anything else. reply only with JSON with key as 'message'### \
Example: Please find me a bottle that keeps drinks hot or cold \
Answer: ['Stainless Steel Water Bottle']"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This prompt limits the model output to only the products that exist in the catalogue, ensuring no unexpected results.&lt;/p&gt;
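
&lt;p&gt;For context, here is a minimal sketch of how such a backend call might look with the openai Python client; the model name, function name, and JSON parsing are illustrative assumptions, not the exact code from the repo:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# illustrative sketch; see the GitHub repos for the actual implementation
import json

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def smart_search(query, system_prompt):
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        temperature=0,        # keep the output deterministic
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": query},
        ],
    )
    # the prompt asks for JSON with key 'message' holding the list
    # of matching product names (or an empty list)
    return json.loads(resp.choices[0].message.content)["message"]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;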

&lt;p&gt;The frontend then receives the response and filters the catalogue down to the exact products relevant to the user.&lt;/p&gt;

&lt;p&gt;You can check out the full code and implementation details on GitHub: &lt;a href="https://github.com/DhruvBShetty/aidevtest" rel="noopener noreferrer"&gt;Frontend&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/DhruvBShetty/aidevtest_backend" rel="noopener noreferrer"&gt;Backend&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you’re curious about prompt crafting, AI integration, or want to discuss how to build similar projects, feel free to reach out or check out the full code on GitHub!&lt;/p&gt;

</description>
      <category>openai</category>
      <category>genai</category>
      <category>python</category>
      <category>react</category>
    </item>
    <item>
      <title>FastAPI + aioboto3: Async File Uploads</title>
      <dc:creator>Dhruv B Shetty</dc:creator>
      <pubDate>Mon, 30 Jun 2025 08:19:30 +0000</pubDate>
      <link>https://dev.to/dhruvbshetty/fastapi-aioboto3-async-file-uploads-2n3a</link>
      <guid>https://dev.to/dhruvbshetty/fastapi-aioboto3-async-file-uploads-2n3a</guid>
      <description>&lt;p&gt;This week, I refactored file uploads to AWS S3 to work asynchronously using asyncio and aioboto3 for one of the existing FastAPI apps that I had deployed.&lt;/p&gt;

&lt;p&gt;aioboto3 is an async wrapper over boto3, the AWS SDK for Python, which I use here for S3 storage.&lt;/p&gt;

&lt;p&gt;The largest bottleneck in the app was file uploads: users could upload a large batch of them, and handling the files sequentially was inefficient and made for a poor user experience.&lt;/p&gt;

&lt;p&gt;Uploads are mostly I/O-bound, so they benefit from being done asynchronously, which is efficient and low-overhead compared to, say, multiple threads or processes.&lt;/p&gt;

&lt;p&gt;However, it was not so simple.&lt;/p&gt;

&lt;p&gt;Initially, I read the entire file into memory with await file.read() and wrapped it in BytesIO. However, this approach scales poorly for large files or batches. A better solution is to pass file.file (a streamed file-like object provided by FastAPI) directly to upload_fileobj, which streams efficiently using multipart uploads under the hood.&lt;/p&gt;

&lt;p&gt;When batches of very small files were uploaded, the performance would be nearly as good as the theoretical minimum, which is the time to upload the largest file. But when the files were larger (&amp;gt;10MB), the performance would get worse, and as they got larger I wouldn’t even see a difference between the sync and async file uploads.&lt;/p&gt;

&lt;p&gt;Here’s why:&lt;/p&gt;

&lt;p&gt;Network Bandwidth - Uploading many large files asynchronously won’t necessarily improve performance if your internet upload bandwidth is the bottleneck. In such cases, the total throughput is capped by your available bandwidth. For example, if you upload five 100 MB files concurrently with a 100 Mbps (megabits per second) upload speed (which is about 12.5 MBps), the combined data transfer rate is still limited to 12.5 MB per second. So, whether you upload them concurrently or sequentially, the total time taken will be similar — you're just dividing the same bandwidth across multiple uploads.&lt;/p&gt;
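
&lt;p&gt;A quick back-of-the-envelope check of that example:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;files = 5
size_mb = 100          # per file
bandwidth_mbs = 12.5   # 100 Mbps is roughly 12.5 MB/s

# the link is saturated either way, so concurrency doesn't help
total_seconds = files * size_mb / bandwidth_mbs
print(total_seconds)   # 40.0, whether sequential or concurrent
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;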

&lt;p&gt;Coroutines - By default, the number of coroutines started concurrently equals the number of files being uploaded. But this is not really safe: when the number of files and their sizes are large, it can overload the RAM, disk I/O, network, and asyncio event loop, which can make it even slower than handling each file sequentially.&lt;/p&gt;

&lt;p&gt;To prevent overwhelming system resources, we can use a semaphore, which limits how many operations (in this context, coroutines) can run at the same time. I used asyncio.Semaphore for this.&lt;/p&gt;

&lt;p&gt;Finding the right concurrency limit for your use case is a bit of a trial-and-error process, as it depends on your system details, bandwidth, and expected file and batch sizes.&lt;/p&gt;

&lt;p&gt;Anyway, I hope this article is useful and helps readers implement asynchronous code in their own codebases. Please find the code below.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Upload coroutine&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import asyncio

from fastapi import UploadFile

# cap how many uploads run at once (see the semaphore discussion above)
upload_sem = asyncio.Semaphore(settings.CONCURRENT_UPLOADS)

# settings and user_id come from the surrounding app context
async def upload_file(file: UploadFile, aioclient):
    async with upload_sem:
        file_name = f"{user_id}/media/{file.filename}"

        # file.file is the streamed file-like object; aioboto3
        # performs multipart uploads under the hood
        await aioclient.upload_fileobj(
            file.file,
            settings.AWS_S3_BUCKET_NAME,
            file_name,
            ExtraArgs={
                "ACL": "public-read",
                "ContentType": file.content_type,
            },
        )

        return file.filename

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Concurrent uploads&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    # inside the FastAPI route handler; files is the list of
    # UploadFile objects from the request. Assumes:
    #   from botocore.exceptions import NoCredentialsError
    #   from fastapi.responses import JSONResponse
    try:
        async with aioboto_session.client(
            service_name="s3",
            region_name=settings.AWS_REGION,
            aws_access_key_id=settings.AWS_ACCESS_KEY,
            aws_secret_access_key=settings.AWS_SECRET_KEY,
        ) as aioclient:
            # one upload coroutine per file; the semaphore bounds
            # how many actually run at once
            result = await asyncio.gather(
                *(upload_file(file, aioclient) for file in files)
            )

    except NoCredentialsError:
        return JSONResponse(
            content={"error": "AWS credentials not found"}, status_code=500
        )
    except Exception as e:
        return JSONResponse(content={"error": str(e)}, status_code=500)

    return result
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
    </item>
    <item>
      <title>From MySQL to PostgreSQL: Enforcing Row-Level Security</title>
      <dc:creator>Dhruv B Shetty</dc:creator>
      <pubDate>Sun, 22 Jun 2025 17:09:28 +0000</pubDate>
      <link>https://dev.to/dhruvbshetty/from-mysql-to-postgresql-enforcing-row-level-security-1gcp</link>
      <guid>https://dev.to/dhruvbshetty/from-mysql-to-postgresql-enforcing-row-level-security-1gcp</guid>
      <description>&lt;p&gt;Recently, I was looking to add a generative AI feature to one of my older applications and then I came upon a glaring security risk from doing so, based on the current software architecture I had. &lt;/p&gt;

&lt;p&gt;I wanted to feed the data from my MySQL database to the LLM, so that users could ask questions like “How many futuristic, dystopian movies have I watched?” and have it search their list of finished movies.&lt;/p&gt;

&lt;p&gt;However, a user may also ask about what other users watched, or for aggregations over what others watched, and they shouldn’t be able to access this data.&lt;/p&gt;

&lt;p&gt;The LLM has access to all users’ data, and MySQL doesn’t support native row-level security (i.e. restricting users to their own rows), which means maintaining intelligent prompts and abstractions to handle this.&lt;/p&gt;

&lt;p&gt;In my case it was pretty clear that I should migrate from MySQL to PostgreSQL: native row-level security, better support for complex queries and data types, and a low-effort migration for my own project.&lt;/p&gt;

&lt;p&gt;Here is a solution that I came up with:&lt;/p&gt;

&lt;p&gt;Having chosen PostgreSQL as my database, I first create a new user with only the required permissions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;moviedb=# CREATE USER llmuser WITH PASSWORD 'yourpassword';
moviedb=# GRANT USAGE ON SCHEMA public TO llmuser;
moviedb=# GRANT SELECT ON user_movies TO llmuser;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;p&gt;Now let's have a look at the data:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftwqskev34a2u4itfzsg4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftwqskev34a2u4itfzsg4.png" alt="Description" width="344" height="123"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We want llmuser to access data only for whichever user makes the request from the frontend. To do this, we can get the user’s id from the app and set it with a PostgreSQL custom configuration parameter.&lt;/p&gt;

&lt;p&gt;For this we need to enable row-level security and create a policy on user_id:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;moviedb=# ALTER TABLE user_movies ENABLE ROW LEVEL SECURITY;
moviedb=# CREATE POLICY USER_DATA ON user_movies USING (user_id = current_setting('my.user_id')::int);

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, when a user makes a request to the Gen AI feature from the frontend, we first set my.user_id before fetching any data to feed to the LLM:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;moviedb=# SET my.user_id = 2;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, when we connect as llmuser and retrieve data from the table with a select statement, we only get the rows for user_id = 2:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;moviedb=&amp;gt; select * from user_movies;
 user_id | movie_id | watched | done
---------+----------+---------+------
       2 |      101 | f       | f
       2 |      103 | t       | t
(2 rows)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
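
&lt;p&gt;From the application side, the per-request flow might look like this minimal sketch; it assumes the psycopg 3 driver and a fetch_user_movies helper, both of which are illustrative rather than the app’s actual code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# illustrative sketch; assumes psycopg 3, not necessarily the app's driver
import psycopg

def fetch_user_movies(user_id):
    # connect as the restricted role, never as the table owner:
    # row-level security policies do not apply to the owner
    with psycopg.connect(
        "dbname=moviedb user=llmuser password=yourpassword"
    ) as conn:
        with conn.cursor() as cur:
            # set_config is the parameterisable form of SET;
            # the final true scopes it to the current transaction
            cur.execute(
                "SELECT set_config('my.user_id', %s, true)",
                (str(user_id),),
            )
            cur.execute("SELECT * FROM user_movies")
            return cur.fetchall()  # only this user's rows
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;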






&lt;p&gt;This is just one approach I came up with, and I’d love to hear your thoughts or any alternative solutions you might suggest to tackle this issue. Feel free to share your ideas in the comments!&lt;/p&gt;

</description>
      <category>database</category>
      <category>security</category>
      <category>postgres</category>
      <category>genai</category>
    </item>
  </channel>
</rss>
