<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: viky</title>
    <description>The latest articles on DEV Community by viky (@vikyw89).</description>
    <link>https://dev.to/vikyw89</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1083090%2F3ea1cd7c-a89c-45c4-b74f-15d6731caa4b.jpg</url>
      <title>DEV Community: viky</title>
      <link>https://dev.to/vikyw89</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/vikyw89"/>
    <language>en</language>
    <item>
      <title>🚀 Yo devs, wanna build collab apps that SLAP? Meet Jazz: the Real-Time Collab Framework That’s Straight Fire 🔥</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Fri, 23 May 2025 10:51:10 +0000</pubDate>
      <link>https://dev.to/vikyw89/yo-devs-wanna-build-collab-apps-that-slap-meet-jazz-the-real-time-collab-framework-thats-23o</link>
      <guid>https://dev.to/vikyw89/yo-devs-wanna-build-collab-apps-that-slap-meet-jazz-the-real-time-collab-framework-thats-23o</guid>
      <description>&lt;p&gt;What’s up, code fam? If you’re tired of boring, laggy apps and want your squad to vibe together in real time (securely, obvi), you gotta peep Jazz by Garden Co. This open-source baddie is here to make your next project hit different. Let’s get into it:&lt;/p&gt;

&lt;p&gt;✨ Why Jazz is a Whole Mood&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Real-Time Sync, No Cap&lt;/strong&gt;: Your data updates instantly across all devices. No more “wait, did you save?” drama. It’s all live, all the time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Offline? Still Vibing&lt;/strong&gt;: Even if your WiFi ghosts you, Jazz keeps your changes safe and syncs up when you’re back online. Resilient AF.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Squad Goals = Collab&lt;/strong&gt;: Multiple peeps can work together, edit, and see changes in real time. Think Google Docs energy, but for anything you wanna build.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;End-to-End Encryption&lt;/strong&gt;: Your tea stays private. Jazz encrypts everything so haters and hackers can’t slide in.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;React &amp;amp; React Native Hooks&lt;/strong&gt;: Plug Jazz right into your React apps with hooks and context. Mobile or web, you’re covered.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Granular Permissions&lt;/strong&gt;: Give your users roles and group access like a Discord mod—total control, no chaos.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Authentication for Days&lt;/strong&gt;: Passkeys, Clerk, anonymous—Jazz got options so your users can log in how they want.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🛠️ What Can You Build? (Spoiler: Anything)&lt;/p&gt;

&lt;p&gt;Jazz comes stacked with example apps to get your creative juices flowing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Chat App&lt;/strong&gt;: Slide into those DMs, real-time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Todo List&lt;/strong&gt;: Collaborative productivity, so no one slacks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Music Player&lt;/strong&gt;: Queue up bangers with your friends.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Password Manager&lt;/strong&gt;: Secure AF, keep those logins on lock.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pet Manager&lt;/strong&gt;: Because your fur babies deserve the cloud.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;💡 How Does It Work? (For the Nerds)&lt;/p&gt;

&lt;p&gt;At its core, Jazz uses CoJSON—fancy talk for “no more merge conflicts, ever.” It’s got CoMap, CoList, CoStream, and CoRichText for all your data needs. WebSocket transport keeps things snappy, and storage adapters (IndexedDB, SQLite) mean you’re set whether you’re coding for browser, mobile, or Node.&lt;/p&gt;

&lt;p&gt;🔗 Ready to Ship? Check the Docs &amp;amp; Examples&lt;/p&gt;

&lt;p&gt;Don’t just take my word—peep the docs, sample code, and start building something that’ll make your dev friends say “sheeeesh.”&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/garden-co/jazz" rel="noopener noreferrer"&gt;https://github.com/garden-co/jazz&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;TL;DR: Jazz is the glow-up your collab apps need. It’s secure, real-time, and straight-up easy to drop into your stack. Go build something legendary. 🚀&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>programming</category>
      <category>javascript</category>
      <category>database</category>
    </item>
    <item>
      <title>I got suspended on GitHub</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Thu, 01 May 2025 11:04:25 +0000</pubDate>
      <link>https://dev.to/vikyw89/i-got-suspended-on-github-1hdp</link>
      <guid>https://dev.to/vikyw89/i-got-suspended-on-github-1hdp</guid>
      <description>&lt;p&gt;Yesterday, I did something I thought was harmless—left two comments on an open source project’s discussion thread. One comment was a reasonable OpenAPI proposal for supporting Server-Sent Events (SSE), the other offered a counter-proposal to my own suggestion. No malicious code, no spam links, nothing but honest feedback intended to improve the project.  &lt;/p&gt;

&lt;p&gt;Fast forward to today: I fire up GitHub, only to find myself mysteriously logged out. I shrug it off, click “Sign in,” and…  &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Account suspended&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
“Access to your account has been suspended due to a violation of our Terms of Service. Please contact support for more information.”  &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm32wppvmowgy2koai521.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm32wppvmowgy2koai521.png" alt="account suspended" width="478" height="377"&gt;&lt;/a&gt;&lt;br&gt;
A wash of disbelief hit me. A violation? What violation? All I did was participate in a community discussion.  &lt;/p&gt;




&lt;h2&gt;
  
  
  The Comments That Sparked It All
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;My OpenAPI SSE Proposal&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
I suggested extending the project’s API spec to support Server-Sent Events, aiming to improve real-time updates for clients.  &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The Counter-Proposal&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Realizing my first suggestion might introduce backward-compatibility issues, I proposed an alternate design that preserved existing behavior while enabling optional SSE support.  &lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Both comments were civil, clear, and collaborative. I even linked to a short proof-of-concept—no spam, no ads.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Suspension Notice
&lt;/h2&gt;

&lt;p&gt;When I saw “Account suspended,” my heart sank. I clicked &lt;strong&gt;Contact Support&lt;/strong&gt;, only to discover that you must be signed in to open a ticket. Irony alert: my account is suspended, so I can’t contact support with it.  &lt;/p&gt;




&lt;h2&gt;
  
  
  Creating a New Account—Déjà Vu?
&lt;/h2&gt;

&lt;p&gt;Left with no other choice, I created a brand-new GitHub account just to submit a support request. I explained:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Who I am (same email, same contributions)
&lt;/li&gt;
&lt;li&gt;What happened (logged out, suspended notice)
&lt;/li&gt;
&lt;li&gt;What I’d done (two comments)
&lt;/li&gt;
&lt;li&gt;My plea (please reinstate access)
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I clicked &lt;strong&gt;Submit&lt;/strong&gt;, and now I wait.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Allegation: “Spamming”
&lt;/h2&gt;

&lt;p&gt;In their terse email acknowledgement, GitHub claimed I was “spamming.” Here’s the kicker: I still have no clue when or how I spammed. No bulk messages, no unsolicited links—just two thoughtful comments.  &lt;/p&gt;




&lt;h2&gt;
  
  
  What’s Next?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Await Support Reply&lt;/strong&gt;
I’ve opened the ticket. Now it’s out of my hands (mostly).
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Document Everything&lt;/strong&gt;
Sadly, I couldn't even access the comments I left. GitHub holds all the evidence, while I have none.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Share My Experience&lt;/strong&gt;
That’s why you’re reading this—so you know the potential pitfalls of open source participation and platform policies.
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Lessons &amp;amp; Takeaways
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Always Keep Records&lt;/strong&gt;: Screenshots and local copies of your work can be lifesavers.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Be Prepared to Appeal&lt;/strong&gt;: Even “harmless” community contributions can trigger automated moderation.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Know Your Rights&lt;/strong&gt;: Familiarize yourself with GitHub’s Terms of Service and Code of Conduct.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GitHub as OAuth Provider&lt;/strong&gt;: Think twice before using GitHub for OAuth sign-in; a suspension can cut off your access to every service that depends on it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Back Up Your Repositories&lt;/strong&gt;: Without local backups, all of my existing repositories would be lost.&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;Stay tuned—once GitHub gets back to me, I’ll share the full story: how long it took, whether they reinstate me, and what (if anything) I could have done differently. In the meantime, remember: open source is a community built on mutual respect, but even well-meaning contributions can sometimes land you in hot water.  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Have you ever been unexpectedly suspended or moderated on a platform?&lt;/strong&gt; Drop a comment (while you still can 😉) and let’s talk about keeping our online communities fair and transparent.&lt;/p&gt;

</description>
      <category>github</category>
      <category>programming</category>
      <category>opensource</category>
    </item>
    <item>
      <title>26 Key Takeaways from Building Hundreds of AI Agents</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Wed, 05 Mar 2025 17:31:08 +0000</pubDate>
      <link>https://dev.to/vikyw89/26-key-takeaways-from-building-hundreds-of-ai-agents-1dhe</link>
      <guid>https://dev.to/vikyw89/26-key-takeaways-from-building-hundreds-of-ai-agents-1dhe</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;We have built hundreds of AI agents under the new "agents as a service" model over the last several months. This article shares 26 key takeaways we had to learn the hard way - lessons that cost us dissatisfied clients, time, and money, so you don't have to repeat our mistakes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Agent Fundamentals
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. AI Agents Are Not Your Employees
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Agents are neither automations nor employees&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automations&lt;/strong&gt;: Every step is hardcoded with exact sequence known in advance&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Employees&lt;/strong&gt;: Have more autonomy than agents&lt;/li&gt;
&lt;li&gt;Instead of thinking about agents in terms of roles, think in terms of SOPs (Standard Operating Procedures)&lt;/li&gt;
&lt;li&gt;Typically one agent can handle one SOP well, while one employee handles 5+ SOPs&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  2. Start From Well-Documented Processes
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;SOPs (Standard Operating Procedures) make training agents significantly simpler&lt;/li&gt;
&lt;li&gt;Well-documented processes contain most of what you need to train agents&lt;/li&gt;
&lt;li&gt;Find onboarding materials and SOPs first before collecting data manually&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  3. Business Owners Will Never Build Their Own Agents
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Even with agents that build other agents, business owners will still need specialists&lt;/li&gt;
&lt;li&gt;Similar to how no-code tools created no-code developers, not the end of developers&lt;/li&gt;
&lt;li&gt;AI agent platforms will spike demand for AI agent developers&lt;/li&gt;
&lt;li&gt;Determining which agents to build is harder than building them&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  4. Business Owners Have No Idea Which Agents They Need
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;~50% of initial client ideas are not the most valuable agents for their business&lt;/li&gt;
&lt;li&gt;Consulting is a huge part of the service&lt;/li&gt;
&lt;li&gt;Start by mapping customer journeys to identify automation opportunities&lt;/li&gt;
&lt;li&gt;Don't assume client ideas are the best - use them as feedback only&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  5. You Don't Need 20+ Agents
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Building too many agents makes systems more complex&lt;/li&gt;
&lt;li&gt;More agents means:

&lt;ul&gt;
&lt;li&gt;Harder maintenance&lt;/li&gt;
&lt;li&gt;More complex debugging&lt;/li&gt;
&lt;li&gt;Increased costs&lt;/li&gt;
&lt;li&gt;Longer response times&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Start with the smallest possible agent, then add more as needed&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  6. Data With Actions Deliver Results
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;GIGO (Garbage In, Garbage Out) applies to AI agents&lt;/li&gt;
&lt;li&gt;The biggest impact comes from combining data with relevant actions&lt;/li&gt;
&lt;li&gt;Combining knowledge (e.g., Facebook marketing strategies) with actions (Facebook API control) achieves significantly better results&lt;/li&gt;
&lt;li&gt;Scrape both internal and external sources to enhance agent performance&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Agent Development
&lt;/h2&gt;

&lt;h3&gt;
  
  
  7. Prompt Engineering Is an Art
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Prompt engineering is already a real job&lt;/li&gt;
&lt;li&gt;As models become larger and smarter, prompt engineering becomes more important&lt;/li&gt;
&lt;li&gt;Tips for effective prompts:

&lt;ul&gt;
&lt;li&gt;Provide examples - one example is worth a thousand words&lt;/li&gt;
&lt;li&gt;Order matters - arrange prompts with most important parts at the end&lt;/li&gt;
&lt;li&gt;Iterate and test constantly&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  8. Integrations Are Just as Important as Functionality
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Don't over-focus on agent capabilities at the expense of integration&lt;/li&gt;
&lt;li&gt;If it's not convenient for users, even powerful agents won't provide value&lt;/li&gt;
&lt;li&gt;Integrate agents into the same systems employees use daily&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  9. Agent Reliability Has Been Solved
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;If an agent isn't reliable, it's a developer problem, not an agent problem&lt;/li&gt;
&lt;li&gt;Pydantic (data validation library) can validate all agent inputs/outputs&lt;/li&gt;
&lt;li&gt;With proper validation logic in place, malformed or harmful agent actions are rejected before they execute&lt;/li&gt;
&lt;li&gt;Pydantic allows building agents for any use case&lt;/li&gt;
&lt;/ul&gt;
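&lt;p&gt;As a minimal sketch of this pattern (the &lt;code&gt;RefundAction&lt;/code&gt; schema and its limits are invented for illustration, not taken from any particular framework), Pydantic can reject a bad tool call before it causes any side effects:&lt;/p&gt;

```python
from pydantic import BaseModel, Field, ValidationError


class RefundAction(BaseModel):
    """Schema an agent's tool call must satisfy before execution."""
    order_id: str
    # Reject zero, negative, or oversized refund amounts outright.
    amount_usd: float = Field(gt=0, le=500)


# Well-formed agent output passes validation.
ok = RefundAction(order_id="A-123", amount_usd=42.0)

# A hallucinated amount is rejected before anything runs.
blocked = False
try:
    RefundAction(order_id="A-123", amount_usd=99999)
except ValidationError:
    blocked = True
```

&lt;p&gt;Anything the model emits that doesn't fit the schema raises a &lt;code&gt;ValidationError&lt;/code&gt;, which the agent loop can catch and retry.&lt;/p&gt;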

&lt;h3&gt;
  
  
  10. Tools Are the Most Important Component
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;The three most important components: instructions, knowledge, and actions&lt;/li&gt;
&lt;li&gt;70% of work goes into building actions (tools)&lt;/li&gt;
&lt;li&gt;Standard chatbots generate value through responses&lt;/li&gt;
&lt;li&gt;Agents generate value through actions - they should do things, not just advise&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  11. No More Than 4-6 Tools Per Agent
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;More than 4-6 tools causes hallucination&lt;/li&gt;
&lt;li&gt;Agents start to confuse which tools to use or the proper sequence&lt;/li&gt;
&lt;li&gt;If an agent starts hallucinating, split it into multiple agents&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  12. Model Costs Don't Matter
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;If your use case makes sense, you'll almost always make tremendous ROI&lt;/li&gt;
&lt;li&gt;Example: Process reduced from $300 and 3 days to $1-2 and 20 minutes&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  13. Clients Don't Care About Which Model You Use
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Open source models aren't necessary&lt;/li&gt;
&lt;li&gt;Businesses care about value, not the model providing it&lt;/li&gt;
&lt;li&gt;For strict data privacy, use Azure OpenAI (runs models in private Azure instances)&lt;/li&gt;
&lt;li&gt;OpenAI remains the provider of choice due to developer experience&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  14. Don't Automate Until Value Has Been Established
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Don't automate businesses that don't exist yet&lt;/li&gt;
&lt;li&gt;First establish the process manually to ensure it works&lt;/li&gt;
&lt;li&gt;Development costs are the main concern, not model costs&lt;/li&gt;
&lt;li&gt;Hire someone on platforms like Upwork to test processes before automating&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  15. Don't Think About Use Cases, Think About ROI
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;ROI formula: (Rate × Hours - Operational Costs) ÷ Development Costs&lt;/li&gt;
&lt;li&gt;Rate × Hours = Employee hourly rate × Total process hours&lt;/li&gt;
&lt;li&gt;Operational costs = Model costs + Server costs (typically negligible)&lt;/li&gt;
&lt;li&gt;Example: $50/hour × 10 hours/week with $5,000 development cost ≈ 5.2× ROI after the first year&lt;/li&gt;
&lt;/ul&gt;
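&lt;p&gt;The arithmetic is easy to check directly, assuming a 52-week year and negligible operational costs:&lt;/p&gt;

```python
rate = 50            # employee hourly rate in USD
hours_per_week = 10  # hours of work the agent replaces
weeks = 52
op_costs = 0         # model + server costs, treated as negligible
dev_costs = 5_000

annual_savings = rate * hours_per_week * weeks - op_costs
roi = annual_savings / dev_costs  # first-year ROI multiple
print(roi)  # 5.2
```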

&lt;h3&gt;
  
  
  16. Agent Development Is an Iterative Process
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Like data science competitions, testing the most variations often wins&lt;/li&gt;
&lt;li&gt;Try different architectures and compare side by side&lt;/li&gt;
&lt;li&gt;Build multiple variations when agents underperform to find what works best&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  17. Use Divide and Conquer Approach
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Break complex problems into manageable tasks&lt;/li&gt;
&lt;li&gt;Deliver solutions incrementally:

&lt;ul&gt;
&lt;li&gt;Find an agent that can work independently&lt;/li&gt;
&lt;li&gt;Build and deliver that agent first&lt;/li&gt;
&lt;li&gt;Only proceed after client confirmation&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Automate by departments first before connecting systems&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Scaling and Deployment
&lt;/h2&gt;

&lt;h3&gt;
  
  
  18. Evals Are a Big Deal (But Only for Big Companies)
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Evals track agent KPIs and performance over time&lt;/li&gt;
&lt;li&gt;Benefits for large companies:

&lt;ul&gt;
&lt;li&gt;Helps stay ahead of the competition&lt;/li&gt;
&lt;li&gt;Continuously improves solutions&lt;/li&gt;
&lt;li&gt;New clients benefit from previous solutions&lt;/li&gt;
&lt;li&gt;Enables future self-improvement&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;SMBs may not need evals due to lower request volume&lt;/li&gt;

&lt;li&gt;Evals provide the last 20% of performance optimization&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  19. There Are Two Types of Agents
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Pure agents: Fully agentic systems&lt;/li&gt;
&lt;li&gt;Workflows: Processes with predetermined steps but agentic execution

&lt;ul&gt;
&lt;li&gt;Steps and sequence are fixed&lt;/li&gt;
&lt;li&gt;Individual steps have agentic capabilities&lt;/li&gt;
&lt;li&gt;Example: Lead research with specific search patterns&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  20. Agents Need to Adapt Based on Feedback
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Agents must interact with their environment and receive feedback&lt;/li&gt;
&lt;li&gt;Add tools that allow agents to analyze their results&lt;/li&gt;
&lt;li&gt;Example: Don't just build database update capability; add ability to read records to verify task completion&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  21. Don't Build Around Limitations
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Models continuously improve - don't optimize for current limitations&lt;/li&gt;
&lt;li&gt;Example: Complex systems built to avoid context token limits became obsolete when 128k context models were released&lt;/li&gt;
&lt;li&gt;Avoid building obvious general use cases that major AI companies might develop themselves&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  22. Deploying Agents Is Harder Than Building Them
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Building an agent might take 2-3 days&lt;/li&gt;
&lt;li&gt;Deploying and integrating it takes another 3 days&lt;/li&gt;
&lt;li&gt;Consider specialized platforms for agent deployment&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  23. Waterfall Projects Don't Work
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Agentic projects are too agile for fixed scopes&lt;/li&gt;
&lt;li&gt;Start with one-time fees but transition to subscription models&lt;/li&gt;
&lt;li&gt;Work as a partner, not just an outsourced team&lt;/li&gt;
&lt;li&gt;The goal is to automate business processes, not just build agents&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  24. Include a Human in the Loop for Mission-Critical Agents
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Some agents have no margin for error&lt;/li&gt;
&lt;li&gt;Add human review steps for high-stakes actions&lt;/li&gt;
&lt;li&gt;Example: Having clients review marketing campaigns in Notion&lt;/li&gt;
&lt;li&gt;Remove human review only after consistent approval and fine-tuning&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  25. 2025 Is the Year of Vertical AI Agents
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Vertical agents specialize in specific use cases for specific industries&lt;/li&gt;
&lt;li&gt;Benefits:

&lt;ul&gt;
&lt;li&gt;Easier to scale&lt;/li&gt;
&lt;li&gt;Higher pricing potential&lt;/li&gt;
&lt;li&gt;Better problem-solving for specific businesses&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Start with horizontal agents, then identify patterns to create vertical agents&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  26. Agents Don't Replace People, They Help Businesses Scale
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Business owners typically don't fire people after automation&lt;/li&gt;
&lt;li&gt;Automation helps business owners think bigger and scale faster&lt;/li&gt;
&lt;li&gt;Employees can focus on higher-level tasks they enjoy&lt;/li&gt;
&lt;li&gt;Ultimately leads to a new age of abundance and prosperity&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
    </item>
    <item>
      <title>Unleashing Performance: A Deep Dive into FastAPI's Asynchronous Magic</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Mon, 05 Aug 2024 16:09:06 +0000</pubDate>
      <link>https://dev.to/vikyw89/unleashing-performance-a-deep-dive-into-fastapis-asynchronous-magic-243p</link>
      <guid>https://dev.to/vikyw89/unleashing-performance-a-deep-dive-into-fastapis-asynchronous-magic-243p</guid>
      <description>&lt;p&gt;As developers, we are constantly seeking ways to optimize our applications to handle more requests, faster. Today, I'm excited to share the results of a benchmark I conducted on FastAPI, a modern, fast (high-performance) web framework for building APIs with Python. This benchmark sheds light on the stark differences between synchronous and asynchronous endpoints, and how leveraging asynchronous programming can significantly boost your application's performance.&lt;/p&gt;

&lt;p&gt;The Benchmark Setup:&lt;br&gt;
To provide a comprehensive comparison, I set up two types of endpoints in FastAPI: one synchronous and one asynchronous. Both endpoints simulate a delay of one second to mimic I/O-bound operations. I then used Apache Benchmark (ab) to measure the requests per second (RPS) under different server configurations.&lt;/p&gt;

&lt;p&gt;Key Findings:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Asynchronous Triumphs&lt;/strong&gt;: The asynchronous endpoint consistently outperformed its synchronous counterpart by a wide margin. With a concurrency level of 10,000 and 10,000 total requests, the asynchronous endpoint achieved an impressive 7,593.36 RPS, while the synchronous endpoint managed only 155.80 RPS.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Worker Impact&lt;/strong&gt;: Increasing the number of workers with Gunicorn had a positive effect on both types of endpoints. However, the asynchronous endpoint saw a more dramatic improvement, further highlighting the benefits of asynchronous processing.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Real-World Implications&lt;/strong&gt;: These results are particularly relevant for applications that rely on I/O-bound operations, such as database queries, file I/O, or network requests. By adopting an asynchronous approach, developers can ensure their applications are not only faster but also more scalable.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Conclusion:&lt;br&gt;
The benchmark clearly demonstrates the power of asynchronous programming in FastAPI. For developers looking to build high-performance APIs, embracing asynchronous endpoints is a no-brainer. Not only does it lead to better performance, but it also allows for more efficient resource utilization, which is crucial for modern, scalable applications.&lt;/p&gt;

&lt;p&gt;If you're intrigued by these results and want to explore how you can apply asynchronous programming in your FastAPI projects, I encourage you to check out the full benchmark details on my GitHub repository. Let's push the boundaries of what's possible with FastAPI and Python!&lt;/p&gt;

&lt;p&gt;GitHub Repository: &lt;a href="https://github.com/vikyw89/fastapi-deployment-benchmark" rel="noopener noreferrer"&gt;fastapi-deployment-benchmark&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Happy coding, and may your APIs be fast and your responses swift!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Embracing Type Safety and Database Pulling with Prisma Client Python</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Sun, 04 Aug 2024 14:43:50 +0000</pubDate>
      <link>https://dev.to/vikyw89/embracing-type-safety-and-database-pulling-with-prisma-client-python-gjj</link>
      <guid>https://dev.to/vikyw89/embracing-type-safety-and-database-pulling-with-prisma-client-python-gjj</guid>
      <description>&lt;p&gt;As modern software development evolves, the tools we use to interact with databases are more critical than ever. Among these tools, &lt;strong&gt;Prisma Client Python&lt;/strong&gt; has emerged as a powerful ORM that prioritizes type safety and efficient database operations, specifically providing features that traditional ORM libraries like SQLAlchemy might lack, such as seamless database pulling.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Importance of Type Safety
&lt;/h3&gt;

&lt;p&gt;In programming, especially when dealing with databases, ensuring type safety can help prevent many common errors and inconsistencies. Type safety allows developers to define strict schemas that the database must adhere to, reducing the chances of runtime errors due to mismatched data types. &lt;/p&gt;

&lt;p&gt;Prisma Client Python embraces this need by leveraging Python's type hinting capabilities. As a result, when you define your data models in the Prisma schema, you gain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Compile-time Checks&lt;/strong&gt;: Identify mistakes during development rather than at runtime.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enhanced Developer Experience&lt;/strong&gt;: With autocompletion support through Pylance/Pyright, writing queries becomes more intuitive, reducing the cognitive load on developers and allowing them to focus on building features rather than debugging type errors.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Imagine trying to create a new user entry in your database:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;user&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;prisma&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;user&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Alice&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;email&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;alice@example.com&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, if you later change the model definition to make &lt;code&gt;email&lt;/code&gt; a non-nullable field or change its type, the static type checkers will alert you before you even run your application, minimizing the potential for bugs that arise from improper data handling.&lt;/p&gt;

&lt;h3&gt;
  
  
  Efficient Database Pull
&lt;/h3&gt;

&lt;p&gt;One of the standout features of Prisma Client Python is its &lt;strong&gt;database pull&lt;/strong&gt; capability. Database pulling allows you to introspect your database schema and generate the corresponding Prisma client automatically. This feature is particularly valuable for scenarios where your database schema evolves over time or when you are working with an existing database.&lt;/p&gt;

&lt;p&gt;In contrast, SQLAlchemy operates more on the premise of defining models that map to your database tables in code, requiring additional steps to synchronize changes with the actual database. With Prisma Client Python, you can simply run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;prisma db pull
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command fetches the current state of your database, updating the Prisma schema and generating or updating the client accordingly. This seamless integration ensures that your application's data models are always in sync with the underlying database structure without manual intervention.&lt;/p&gt;

&lt;h3&gt;
  
  
  Advantages Over SQLAlchemy
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Simplicity and Clarity&lt;/strong&gt;: Prisma Client Python allows developers to define their data schema in a clear, explicit manner. In contrast, SQLAlchemy's ORM model can sometimes lead to confusion with complex relationships and mapping configurations.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Automatic Synchronization&lt;/strong&gt;: The &lt;code&gt;prisma db pull&lt;/code&gt; command is a game-changer for maintaining consistency. You don't have to worry about manually adjusting your models whenever you make changes to the database. SQLAlchemy requires manual migration scripts and potential downtime to ensure everything is in sync.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Type Safety with Ease&lt;/strong&gt;: While SQLAlchemy offers some degree of type checking, it doesn't provide the same level of safety as Prisma Client Python. Type hinting in Prisma can catch errors at type-check time rather than leaving them to runtime, thus improving overall code reliability.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;For developers seeking a robust ORM solution that emphasizes type safety and efficient database management, Prisma Client Python stands out as a superior option compared to traditional libraries like SQLAlchemy. Its innovative approach to database pulling and type safety not only enhances productivity but also fosters cleaner and more maintainable code.&lt;/p&gt;

&lt;p&gt;In an era where reliability and speed are paramount, why settle for anything less? Embrace Prisma Client Python, and take your database interactions to the next level, ensuring that your applications are built on a solid foundation of type safety and adaptability. Happy coding!&lt;/p&gt;

</description>
      <category>python</category>
      <category>fastapi</category>
      <category>prisma</category>
      <category>postgres</category>
    </item>
    <item>
      <title>Concurrent Futures in Python: Launching Parallel Tasks with Ease</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Sun, 04 Aug 2024 14:32:11 +0000</pubDate>
      <link>https://dev.to/vikyw89/concurrent-futures-in-python-launching-parallel-tasks-with-ease-4eim</link>
      <guid>https://dev.to/vikyw89/concurrent-futures-in-python-launching-parallel-tasks-with-ease-4eim</guid>
      <description>&lt;p&gt;Achieving optimal performance through parallel execution is essential. Python, a versatile programming language, provides several tools for concurrent execution. One of the most powerful and user-friendly modules is &lt;code&gt;concurrent.futures&lt;/code&gt;, which allows developers to run calls asynchronously. In this article, we'll explore the functionality of this module and how to leverage it for various tasks, including file operations and web requests.&lt;/p&gt;

&lt;h2&gt;
  
  
  Overview of Concurrent Futures
&lt;/h2&gt;

&lt;p&gt;The &lt;code&gt;concurrent.futures&lt;/code&gt; module offers an abstract class known as &lt;code&gt;Executor&lt;/code&gt;, which facilitates the execution of calls asynchronously. Although it should not be used directly, developers can utilize its concrete subclasses, such as &lt;code&gt;ThreadPoolExecutor&lt;/code&gt; and &lt;code&gt;ProcessPoolExecutor&lt;/code&gt;, to perform tasks concurrently.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Submit Method&lt;/strong&gt;: The &lt;code&gt;submit&lt;/code&gt; method is where the magic happens. It schedules a callable function to be executed asynchronously and returns a &lt;code&gt;Future&lt;/code&gt; object. The callable is executed with provided arguments, allowing developers to run background tasks seamlessly.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;   &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nc"&gt;ThreadPoolExecutor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;max_workers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;executor&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
       &lt;span class="n"&gt;future&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;executor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;submit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;pow&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;323&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1235&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
       &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;future&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;result&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this example, we use a &lt;code&gt;ThreadPoolExecutor&lt;/code&gt; to raise a number to a power in a separate thread.&lt;/p&gt;
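
&lt;p&gt;The returned &lt;code&gt;Future&lt;/code&gt; offers more than &lt;code&gt;result()&lt;/code&gt;. As a small self-contained sketch:&lt;/p&gt;

```python
from concurrent.futures import ThreadPoolExecutor

def divide(a, b):
    return a / b

with ThreadPoolExecutor(max_workers=2) as executor:
    ok = executor.submit(divide, 10, 2)
    bad = executor.submit(divide, 10, 0)

    # result() blocks until the call finishes and re-raises any exception;
    # exception() blocks too, but returns the exception as a value instead.
    print(ok.result())                     # 5.0
    print(type(bad.exception()).__name__)  # ZeroDivisionError
    print(ok.done(), bad.done())           # True True
```

&lt;p&gt;&lt;code&gt;exception()&lt;/code&gt; is handy when you want to inspect failures without wrapping every &lt;code&gt;result()&lt;/code&gt; call in a &lt;code&gt;try&lt;/code&gt; block.&lt;/p&gt;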

&lt;ol start="2"&gt;
&lt;li&gt;
&lt;strong&gt;Map Method&lt;/strong&gt;: The &lt;code&gt;map&lt;/code&gt; method is another fantastic feature that allows executing a function across multiple input iterables concurrently. It collects the iterables immediately and executes the calls asynchronously.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;   &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;executor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;load_url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;URLS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timeout&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This functionality is particularly useful when you have a list of tasks that you want to run in parallel.&lt;/p&gt;
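
&lt;p&gt;A minimal self-contained example of &lt;code&gt;map&lt;/code&gt; on a pure function, which also shows that results come back in input order:&lt;/p&gt;

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    return n * n

with ThreadPoolExecutor(max_workers=4) as executor:
    # map yields results in the order of the inputs,
    # regardless of which worker finishes first.
    results = list(executor.map(square, [1, 2, 3, 4, 5]))

print(results)  # [1, 4, 9, 16, 25]
```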

&lt;h2&gt;
  
  
  Practical Application: Copying Files
&lt;/h2&gt;

&lt;p&gt;Consider a scenario where you need to copy multiple files efficiently. The following code snippet demonstrates how to use a &lt;code&gt;ThreadPoolExecutor&lt;/code&gt; to copy files concurrently:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;concurrent.futures&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;shutil&lt;/span&gt;

&lt;span class="n"&gt;files_to_copy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;src2.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dest2.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;src3.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dest3.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;src4.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dest4.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;concurrent&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;futures&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;ThreadPoolExecutor&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;executor&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;futures&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;executor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;submit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;shutil&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copy&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;src&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dst&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;src&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dst&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;files_to_copy&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;future&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;concurrent&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;futures&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;as_completed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;futures&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;future&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;result&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This example leverages the &lt;code&gt;shutil.copy&lt;/code&gt; function to perform file copies in parallel, which can improve throughput when copying many files, particularly to or from network storage where each copy spends most of its time waiting on IO.&lt;/p&gt;

&lt;h2&gt;
  
  
  Handling Web Requests Concurrently
&lt;/h2&gt;

&lt;p&gt;Another exciting application of the &lt;code&gt;concurrent.futures&lt;/code&gt; module is retrieving content from multiple URLs at once. Below is a simple implementation using &lt;code&gt;ThreadPoolExecutor&lt;/code&gt; to fetch web pages:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;concurrent.futures&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;urllib.request&lt;/span&gt;

&lt;span class="n"&gt;URLS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://www.foxnews.com/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://www.cnn.com/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://europe.wsj.com/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://www.bbc.co.uk/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;http://nonexistant-subdomain.python.org/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;load_url&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;urllib&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;urlopen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timeout&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;conn&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;conn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;read&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;concurrent&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;futures&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;ThreadPoolExecutor&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;executor&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;executor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;load_url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;URLS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timeout&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This code is a straightforward way to retrieve web content concurrently. One caveat: any exception raised inside &lt;code&gt;load_url&lt;/code&gt; (the deliberately unreachable last URL, for example) is re-raised when you iterate over &lt;code&gt;results&lt;/code&gt;, so wrap the loop in a &lt;code&gt;try&lt;/code&gt;/&lt;code&gt;except&lt;/code&gt; when some fetches may fail.&lt;/p&gt;
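
&lt;p&gt;When individual fetches may fail, the usual pattern is &lt;code&gt;submit&lt;/code&gt; plus &lt;code&gt;as_completed&lt;/code&gt;, catching each task's exception separately. The sketch below substitutes a local function for &lt;code&gt;urllib.request.urlopen&lt;/code&gt; so it runs offline; &lt;code&gt;load_url&lt;/code&gt; and the failing URL are stand-ins:&lt;/p&gt;

```python
import concurrent.futures

URLS = [
    "http://www.example.com/",
    "http://also.example.com/",
    "http://bad.example.invalid/",
]

def load_url(url, timeout):
    # Stand-in for urllib.request.urlopen(url, timeout=timeout).read()
    # so this example works without network access.
    if ".invalid" in url:
        raise OSError("name or service not known")
    return b"page body for " + url.encode()

with concurrent.futures.ThreadPoolExecutor() as executor:
    future_to_url = {executor.submit(load_url, url, 2): url for url in URLS}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
        try:
            data = future.result()
            print(f"{url}: {len(data)} bytes")
        except OSError as exc:
            print(f"{url}: failed ({exc})")
```

&lt;p&gt;Because each &lt;code&gt;Future&lt;/code&gt; is handled on its own, one failed URL no longer aborts the whole batch.&lt;/p&gt;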

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The &lt;code&gt;concurrent.futures&lt;/code&gt; module provides a powerful way to execute tasks asynchronously in Python, simplifying the process of achieving parallelism in your applications. Through its &lt;code&gt;Executor&lt;/code&gt; class and methods like &lt;code&gt;submit&lt;/code&gt; and &lt;code&gt;map&lt;/code&gt;, developers can efficiently manage background tasks, whether they involve file operations, web requests, or any other I/O-bound processes.&lt;/p&gt;

&lt;p&gt;By incorporating these techniques into your programming practices, you'll be able to create more responsive and efficient applications, enhancing both performance and user experience. Happy coding!&lt;/p&gt;

</description>
      <category>programming</category>
      <category>python</category>
      <category>developers</category>
      <category>fastapi</category>
    </item>
    <item>
      <title>Exploring Performance with the Concurrency-Parallel-Benchmark Repository</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Sun, 04 Aug 2024 13:55:32 +0000</pubDate>
      <link>https://dev.to/vikyw89/exploring-performance-with-the-concurrency-parallel-benchmark-repository-44mi</link>
      <guid>https://dev.to/vikyw89/exploring-performance-with-the-concurrency-parallel-benchmark-repository-44mi</guid>
      <description>&lt;p&gt;The efficiency with which our applications handle tasks can make or break a user experience. The &lt;code&gt;concurrency-parallel-benchmark&lt;/code&gt; repository offers a comprehensive framework for measuring and comparing the performance of different concurrency models in Python 3.12, focusing on both IO-bound and CPU-bound tasks. Whether you are an experienced developer or someone looking to optimize your Python applications, understanding these performance differences is essential.&lt;/p&gt;

&lt;h2&gt;
  
  
  Discover the Benchmark Features
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Task Types
&lt;/h3&gt;

&lt;p&gt;The benchmark evaluates the performance for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Sequential Tasks&lt;/strong&gt;: Traditional, one-at-a-time execution.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Concurrent Tasks with ThreadPool&lt;/strong&gt;: Utilizing threads to handle multiple tasks concurrently.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Concurrent Tasks with Async/Await&lt;/strong&gt;: Embracing the async capabilities of Python for efficient IO operations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Concurrent Tasks Using Multiprocessing&lt;/strong&gt;: Using multiple processes to bypass Python's Global Interpreter Lock (GIL) for CPU-bound tasks.&lt;/li&gt;
&lt;/ul&gt;
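
&lt;p&gt;The gap between these models is easy to reproduce in miniature. The sketch below is my own, not the repository's harness; it times &lt;code&gt;N&lt;/code&gt; simulated IO waits (a short &lt;code&gt;sleep&lt;/code&gt; standing in for a network call) under three of the four models:&lt;/p&gt;

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

N = 20
DELAY = 0.05  # seconds; stand-in for a network or disk wait

def io_task():
    time.sleep(DELAY)

async def aio_task():
    await asyncio.sleep(DELAY)

def bench_sequential():
    start = time.perf_counter()
    for _ in range(N):
        io_task()
    return time.perf_counter() - start

def bench_threadpool():
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=N) as executor:
        for _ in range(N):
            executor.submit(io_task)
        # leaving the with-block waits for all submitted tasks
    return time.perf_counter() - start

def bench_async():
    async def runner():
        await asyncio.gather(*(aio_task() for _ in range(N)))
    start = time.perf_counter()
    asyncio.run(runner())
    return time.perf_counter() - start

print(f"sequential: {bench_sequential():.2f}s")
print(f"threadpool: {bench_threadpool():.2f}s")
print(f"async:      {bench_async():.2f}s")
```

&lt;p&gt;With the waits overlapped, both concurrent variants finish in roughly one &lt;code&gt;DELAY&lt;/code&gt; instead of &lt;code&gt;N&lt;/code&gt; of them.&lt;/p&gt;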

&lt;h2&gt;
  
  
  TL;DR of Findings
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Incredible Speed With Asynchronous IO&lt;/strong&gt;: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For IO-bound tasks, using &lt;strong&gt;async&lt;/strong&gt; is a game-changer. It is about &lt;strong&gt;twice as fast&lt;/strong&gt; as the thread pool and roughly &lt;strong&gt;36 times faster&lt;/strong&gt; than sequential execution (0.63s versus 22.69s in the results below).&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Multiprocessing for the Win on CPU-Bound Tasks&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Multiprocessing&lt;/strong&gt; achieves &lt;strong&gt;6 times the performance&lt;/strong&gt; of other techniques (performance can vary based on CPU cores).&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Subpar Performance of Async in CPU Bound Tasks&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;While async can handle IO-bound tasks adeptly, it is the &lt;strong&gt;least effective&lt;/strong&gt; for CPU-bound tasks, showing only minor performance differences compared to synchronous execution.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Thread Pool is Unnecessary&lt;/strong&gt;: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Given these results, thread pools were never the fastest option for either workload: async wins for IO-bound tasks and multiprocessing wins for CPU-bound tasks, leaving little reason to reach for a thread pool here.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Detailed Performance Insights
&lt;/h2&gt;

&lt;h3&gt;
  
  
  IO-Bound Task Comparisons
&lt;/h3&gt;

&lt;p&gt;The metrics highlighting the performance in IO-bound tasks reveal significant differences across execution methods. Here’s a quick overview of the benchmark results when executing 100 tasks:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Benchmark Type&lt;/th&gt;
&lt;th&gt;Duration (seconds)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Async&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;0.63&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;ThreadPool&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;1.54&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Multiprocessing&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;2.63&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Sequential&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;22.69&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The results clearly demonstrate the speed advantage of asynchronous programming for IO-bound tasks.&lt;/p&gt;

&lt;h3&gt;
  
  
  CPU-Bound Task Comparisons
&lt;/h3&gt;

&lt;p&gt;The performance of CPU-bound tasks also illustrates the effectiveness of multiprocessing over other methods. For 100 tasks, we observe:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Benchmark Type&lt;/th&gt;
&lt;th&gt;Duration (seconds)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Multiprocessing&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;0.68&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Sequential&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;4.11&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;ThreadPool&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;4.24&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Async&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;4.33&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Here, the dominance of multiprocessing is palpable, underscoring its significance in CPU-heavy applications.&lt;/p&gt;
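
&lt;p&gt;This, too, is straightforward to verify in miniature. The following sketch (my own, not the repository's code) times a naive prime-counting task sequentially and under &lt;code&gt;ProcessPoolExecutor&lt;/code&gt;; on a multi-core machine the multiprocessing run should finish several times faster:&lt;/p&gt;

```python
import time
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    # Deliberately naive CPU-bound work: count primes below `limit`.
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    tasks = [10_000] * 8

    start = time.perf_counter()
    sequential = [count_primes(t) for t in tasks]
    seq_time = time.perf_counter() - start

    start = time.perf_counter()
    with ProcessPoolExecutor() as executor:
        parallel = list(executor.map(count_primes, tasks))
    par_time = time.perf_counter() - start

    assert sequential == parallel
    print(f"sequential: {seq_time:.2f}s, multiprocessing: {par_time:.2f}s")
```

&lt;p&gt;Because each worker is a separate process with its own interpreter, the GIL never serializes the arithmetic.&lt;/p&gt;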

&lt;h2&gt;
  
  
  How to Use the Benchmark
&lt;/h2&gt;

&lt;p&gt;The repository is easy to set up and run. Follow these commands to install the necessary dependencies and execute the benchmarks:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;poetry &lt;span class="nb"&gt;install
&lt;/span&gt;poetry run start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After executing the above commands, you will find the results logged in &lt;strong&gt;.log files&lt;/strong&gt;, allowing you to analyze performance across various conditions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The findings in the &lt;code&gt;concurrency-parallel-benchmark&lt;/code&gt; repository not only highlight the power of choosing the right concurrency model in Python but also provide a straightforward, actionable guide for developers looking to enhance their applications’ performance. Whether you are dealing with IO-intensive workloads or CPU-bound operations, leveraging async programming and multiprocessing can yield tremendous performance gains.&lt;/p&gt;

&lt;p&gt;Be sure to visit the &lt;a href="https://github.com/vikyw89/concurrency-parallel-benchmark" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt; to explore the full range of features and metrics provided. Start optimizing your applications today! &lt;/p&gt;




&lt;p&gt;By leveraging these insights, you can make an informed decision on task handling within your Python applications, ensuring that you achieve the best possible performance for your workloads. Happy coding!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Elevate Your Developer Experience with LLMText: A Seamless Library for Language Models</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Sun, 04 Aug 2024 13:50:03 +0000</pubDate>
      <link>https://dev.to/vikyw89/elevate-your-developer-experience-with-llmtext-a-seamless-library-for-language-models-2fm6</link>
      <guid>https://dev.to/vikyw89/elevate-your-developer-experience-with-llmtext-a-seamless-library-for-language-models-2fm6</guid>
      <description>&lt;p&gt;In the fast-evolving world of artificial intelligence, the way we interface with technology is becoming increasingly sophisticated. Language models (LLMs) like GPT-3 and its successors are revolutionizing how we communicate with machines. However, tapping into this power can often come with complexities that can be daunting for developers. Enter &lt;strong&gt;LLMText&lt;/strong&gt;, a streamlined library designed to make interacting with language models not just easy, but enjoyable.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is LLMText?
&lt;/h2&gt;

&lt;p&gt;LLMText is a comprehensive codebase offering a suite of tools that simplifies asynchronous interactions with language models. It marries functionality with an intuitive design, making it easier for developers to generate text, extract structured data, and create more complex workflows—without the usual headaches.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features
&lt;/h3&gt;

&lt;p&gt;Here’s why LLMText is a game-changer for developers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;User-Friendly Asynchronous Text Generation&lt;/strong&gt;: Generate responses quickly and concurrently with minimal code. Simply craft your message and let LLMText handle the rest.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Seamless Streaming Text Generation&lt;/strong&gt;: Enjoy real-time responses, allowing for smoother user interactions and more dynamic applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Structured Extraction&lt;/strong&gt;: Extract structured data from free-text inputs effortlessly, even from LLMs that don’t support function calls. This built-in capability aids in retrieving relevant information without additional overhead.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Type Safety with Pydantic&lt;/strong&gt;: Enjoy peace of mind knowing that your data models are type-checked, thanks to Pydantic. Build reliable applications with fewer runtime errors, improving the overall developer experience.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Flexible Workflows&lt;/strong&gt;: Create complex interactions that involve multiple tools and messages with ease, enabling automation and expanded application capabilities without unnecessary complexity.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Getting Started: Easy Installation &amp;amp; Usage
&lt;/h2&gt;

&lt;p&gt;Getting started with LLMText is a breeze. Install the library with a single command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;llmtext
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And be sure to set up your environment variables by creating a simple &lt;code&gt;.env&lt;/code&gt; file in your project root directory.&lt;/p&gt;

&lt;h3&gt;
  
  
  Quick Example: Asynchronous Text Generation
&lt;/h3&gt;

&lt;p&gt;Here’s how effortlessly you can get started with LLMText for asynchronous text generation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;llmtext.messages_fns&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;agenerate&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;llmtext.data_types&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Message&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;agenerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nc"&gt;Message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;what&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s the weather today?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;
&lt;span class="n"&gt;asyncio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With just a few lines of code, you can generate a response from a language model in seconds!&lt;/p&gt;

&lt;h3&gt;
  
  
  Agentic Workflow Made Simple
&lt;/h3&gt;

&lt;p&gt;The workflow capabilities of LLMText empower you to build dynamic interactions without complicating your code. Check out this example that showcases the agentic workflow functionality:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;llmtext.data_types&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Message&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;llmtext.workflows_fns&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;astream_agentic_workflow&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;llmtext.data_types&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;RunnableTool&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Annotated&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pydantic&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Field&lt;/span&gt;

&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SearchInternetTool&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;RunnableTool&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;A simple tool to illustrate internet search functionality.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Annotated&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nc"&gt;Field&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;search query&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;

    &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;arun&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;there&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s no result for: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;stream&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;astream_agentic_workflow&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
            &lt;span class="nc"&gt;Message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;what&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s the weather today?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="nc"&gt;Message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;assistant&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;there&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s no result for: what&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s the weather today?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;SearchInternetTool&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;
&lt;span class="n"&gt;asyncio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This example highlights the ease with which you can integrate tools into workflows, making your applications more powerful and interactive without additional complexity.&lt;/p&gt;

&lt;h2&gt;
  
  
  Confidence in Code Quality
&lt;/h2&gt;

&lt;p&gt;LLMText comes with an extensive suite of tests to ensure everything works as intended. You can run the tests using:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pytest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With dedicated test files for various functionalities, maintaining code integrity has never been easier.&lt;/p&gt;

&lt;h2&gt;
  
  
  Join the LLMText Community
&lt;/h2&gt;

&lt;p&gt;LLMText is open for contributions, and we welcome developers of all backgrounds to join us! Whether you have a bug to report, a feature to suggest, or cool enhancements to add, your input can make a difference. Follow our straightforward Git workflow to contribute:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Fork the repository.&lt;/li&gt;
&lt;li&gt;Create a new branch for your feature or bug fix.&lt;/li&gt;
&lt;li&gt;Commit your changes.&lt;/li&gt;
&lt;li&gt;Push your branch to your fork.&lt;/li&gt;
&lt;li&gt;Submit a pull request.&lt;/li&gt;
&lt;/ol&gt;
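
&lt;p&gt;As a sketch, the steps above look roughly like this on the command line (the repository URL and branch name here are placeholders, not the project's actual fork URL):&lt;/p&gt;

```shell
# Placeholder URL and branch name; substitute your own fork and feature
git clone https://github.com/YOUR_USERNAME/llmtext.git
cd llmtext
git checkout -b my-feature

# ... make your changes ...

git add .
git commit -m "Describe your change"
git push origin my-feature
# then open a pull request on GitHub
```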

&lt;h2&gt;
  
  
  License and Contact
&lt;/h2&gt;

&lt;p&gt;LLMText is licensed under the MIT License, giving you the freedom to use, modify, and distribute the software as you see fit. For any questions or feedback, don’t hesitate to open an issue on GitHub!&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;As developers, we thrive on tools that make our lives easier and our applications better. &lt;strong&gt;LLMText&lt;/strong&gt; stands out as a solution that not only simplifies interactions with language models but also enhances the overall developer experience. Its asynchronous capabilities, structured data extraction, type safety through Pydantic, and user-friendly interface come together to create a remarkable toolkit for building intelligent applications. Dive in, explore, and unlock the potential of LLMs with LLMText!&lt;/p&gt;

</description>
      <category>openai</category>
      <category>opensource</category>
      <category>llm</category>
      <category>rag</category>
    </item>
    <item>
      <title>Boost Your Python Performance with Parallize: A Game-Changer for Parallel Processing</title>
      <dc:creator>viky</dc:creator>
      <pubDate>Sun, 04 Aug 2024 13:36:36 +0000</pubDate>
      <link>https://dev.to/vikyw89/boost-your-python-performance-with-parallize-a-game-changer-for-parallel-processing-4mp0</link>
      <guid>https://dev.to/vikyw89/boost-your-python-performance-with-parallize-a-game-changer-for-parallel-processing-4mp0</guid>
      <description>&lt;p&gt;In the fast-paced world of software development, performance is king. Whether you're crunching numbers, processing large datasets, or running complex simulations, the speed at which your code executes can make or break your project's success. Enter &lt;strong&gt;Parallize&lt;/strong&gt;, a powerful Python library designed to supercharge your applications by leveraging parallel processing.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Parallize?
&lt;/h2&gt;

&lt;p&gt;Parallize is a Python package that provides utilities to parallelize both synchronous and asynchronous functions using the &lt;code&gt;concurrent.futures.ProcessPoolExecutor&lt;/code&gt;. This means you can execute functions in separate processes, taking full advantage of multiple CPU cores to improve performance significantly.&lt;/p&gt;
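
&lt;p&gt;If you haven't used &lt;code&gt;ProcessPoolExecutor&lt;/code&gt; before, here is a minimal stdlib-only sketch of the idea Parallize builds on: dispatching each task to a separate worker process so the work spreads across CPU cores. (The function and variable names are illustrative, not Parallize's API.)&lt;/p&gt;

```python
import concurrent.futures

def square(x):
    # CPU-bound toy task; real workloads would do heavier number crunching
    return x * x

def run_parallel(values):
    # Each call to square() is dispatched to a separate worker process,
    # so the work runs across CPU cores instead of on a single one.
    with concurrent.futures.ProcessPoolExecutor(max_workers=2) as pool:
        return list(pool.map(square, values))

if __name__ == "__main__":
    print(run_parallel([1, 2, 3, 4]))  # [1, 4, 9, 16]
```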

&lt;h2&gt;
  
  
  Key Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Parallelize Synchronous Functions&lt;/strong&gt;: Seamlessly execute synchronous functions in parallel, freeing up the main thread and enhancing overall application responsiveness.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Parallelize Asynchronous Functions&lt;/strong&gt;: Extend the benefits of parallel processing to asynchronous functions, ensuring non-blocking execution and improved efficiency.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Customizable Worker Count&lt;/strong&gt;: Tailor the number of worker processes to your needs, or let Parallize default to the number of available CPU cores for optimal performance.&lt;/li&gt;
&lt;/ul&gt;
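
&lt;p&gt;The async side can be pictured the same way: run each coroutine to completion inside its own worker process. A stdlib-only sketch of that pattern, with illustrative names rather than Parallize's actual API:&lt;/p&gt;

```python
import asyncio
import concurrent.futures

async def double(x):
    # stand-in for real async work (I/O calls, etc.)
    await asyncio.sleep(0)
    return x * 2

def run_in_worker(x):
    # each worker process spins up its own event loop for the coroutine
    return asyncio.run(double(x))

def parallel_async(values):
    # coroutines execute concurrently across processes, keeping the
    # caller's event loop free
    with concurrent.futures.ProcessPoolExecutor() as pool:
        return list(pool.map(run_in_worker, values))

if __name__ == "__main__":
    print(parallel_async([1, 2, 3]))  # [2, 4, 6]
```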

&lt;h2&gt;
  
  
  Mini Benchmark: See the Difference
&lt;/h2&gt;

&lt;p&gt;To illustrate the power of Parallize, let's look at some benchmark results:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Test Case&lt;/th&gt;
&lt;th&gt;Concurrent Execution Time&lt;/th&gt;
&lt;th&gt;Parallel Execution Time&lt;/th&gt;
&lt;th&gt;Speedup&lt;/th&gt;
&lt;th&gt;Tasks Count&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;test_aparallize_fn&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;0:00:17.215937&lt;/td&gt;
&lt;td&gt;0:00:08.293026&lt;/td&gt;
&lt;td&gt;2.08x&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;test_aparallize_10&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;0:01:25.070893&lt;/td&gt;
&lt;td&gt;0:00:13.997451&lt;/td&gt;
&lt;td&gt;6.08x&lt;/td&gt;
&lt;td&gt;10&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;These results speak for themselves. By using Parallize, you can achieve significant speedups, making your applications faster and more efficient.&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started with Parallize
&lt;/h2&gt;

&lt;p&gt;Installing Parallize is a breeze with pip:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;parallize
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once installed, you can start using Parallize in your projects with just a few lines of code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;parallize&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;aparallize&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;my_function&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;

&lt;span class="c1"&gt;# Call the function as usual
&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;aparallize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;my_function&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;  &lt;span class="c1"&gt;# This will be executed in a separate process
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also customize the number of worker processes by passing the &lt;code&gt;max_workers&lt;/code&gt; argument (again awaited from inside an &lt;code&gt;async&lt;/code&gt; function):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;aparallize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;my_function&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;max_workers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Why Choose Parallize?
&lt;/h2&gt;

&lt;p&gt;In today's competitive landscape, every millisecond counts. Parallize empowers developers to harness the full potential of their hardware, delivering faster, more responsive applications. Whether you're a data scientist, a web developer, or working on any CPU-bound task, Parallize is your go-to solution for parallel processing.&lt;/p&gt;

&lt;h2&gt;
  
  
  Join the Parallize Community
&lt;/h2&gt;

&lt;p&gt;Parallize is an open-source project, and we welcome contributions from the community. If you have ideas for improvements or encounter any issues, please visit our &lt;a href="https://github.com/vikyw89/parallize" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt; and get involved.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Don't let performance bottlenecks hold you back. With Parallize, you can unlock the full power of parallel processing in your Python applications. Give it a try today and experience the difference for yourself.&lt;/p&gt;

&lt;p&gt;Happy coding!&lt;/p&gt;

</description>
      <category>programming</category>
      <category>python</category>
      <category>asyncprogramming</category>
      <category>fastapi</category>
    </item>
  </channel>
</rss>
