<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Davi Domingos de Oliveira</title>
    <description>The latest articles on DEV Community by Davi Domingos de Oliveira (@dom1ng0s).</description>
    <link>https://dev.to/dom1ng0s</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3854519%2F3821cfb4-659a-4623-9317-faa7b34eb4e6.jpeg</url>
      <title>DEV Community: Davi Domingos de Oliveira</title>
      <link>https://dev.to/dom1ng0s</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/dom1ng0s"/>
    <language>en</language>
    <item>
      <title>Zero-Cost Architecture: How I Turned Git into a Temporal Database with GitHub Actions</title>
      <dc:creator>Davi Domingos de Oliveira</dc:creator>
      <pubDate>Fri, 03 Apr 2026 02:49:00 +0000</pubDate>
      <link>https://dev.to/dom1ng0s/arquitetura-zero-cost-como-transformei-o-git-em-um-banco-de-dados-temporal-com-github-actions-316j</link>
      <guid>https://dev.to/dom1ng0s/arquitetura-zero-cost-como-transformei-o-git-em-um-banco-de-dados-temporal-com-github-actions-316j</guid>
      <description>&lt;h2&gt;
  
  
  &lt;strong&gt;1. The Overengineering Trap&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;When I started designing Gado-Scraper, I had a problem and a clear constraint: I needed to &lt;strong&gt;monitor livestock price quotes daily&lt;/strong&gt;, but, as a student, my budget for cloud infrastructure was zero. It is all too easy these days to pull out a credit card and provision a relational database and a serverless function on AWS to run a script for one minute a day. But I could not afford that luxury. That financial constraint forced me to think outside the box and avoid the classic &lt;em&gt;overengineering&lt;/em&gt; trap. That is how I discovered how to turn the GitHub repository itself into an automated temporal database, at absolutely zero cost.&lt;/p&gt;

&lt;p&gt;In this article, I will show you exactly how I built a data ETL pipeline that runs every day with no server, no traditional database, and a cost of exactly zero. You will see, in practice, how I turned the GitHub repository itself, together with GitHub Actions, into an automated temporal database.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;2. The Paradigm Shift: "Git-as-a-Database"&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;But what is &lt;em&gt;Git-as-a-Database&lt;/em&gt;? Using a GitHub repository as a database is a different premise from what Git was originally designed for, yet it fits the system's change tracking perfectly. When you stop to think about it, Git was created with a very well-defined purpose: to track the history of changes to text-based files in an immutable, chronological way. Traditional databases are great, but for storing the variation of just one or two price quotes once a day, they are technical overkill.&lt;/p&gt;

&lt;p&gt;To eliminate the need for a 24/7 server hosted in the cloud, I decided to use &lt;strong&gt;&lt;em&gt;GitHub Actions&lt;/em&gt;&lt;/strong&gt; for my capture bot. I set up an automated &lt;em&gt;pipeline&lt;/em&gt; that, every day during market hours, runs my &lt;em&gt;web scraping&lt;/em&gt; &lt;em&gt;scripts&lt;/em&gt; written in &lt;strong&gt;&lt;em&gt;Python&lt;/em&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The architectural trick happens right after extraction. As soon as the script collects the arroba prices, it overwrites two simple files at the project root: &lt;code&gt;cotacoes_boi_hoje.json&lt;/code&gt; and &lt;code&gt;cotacoes_novilha_hoje.json&lt;/code&gt;. This is where the magic happens: instead of opening a heavy connection to AWS and running an &lt;code&gt;INSERT&lt;/code&gt; against a cloud database, the pipeline simply runs a &lt;code&gt;git push&lt;/code&gt;. &lt;strong&gt;The repository itself starts working as the historical database&lt;/strong&gt;. The entire livestock price history is perfectly preserved in the &lt;code&gt;git log&lt;/code&gt;, where each daily &lt;em&gt;commit&lt;/em&gt; acts as an exact &lt;em&gt;snapshot&lt;/em&gt; of that day, ready to be consumed by time-series analyses.&lt;/p&gt;
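
&lt;p&gt;Reading the series back out is then just a walk over the commit history. A minimal sketch of that consumption side (my own assumption of how you might query it; it only requires &lt;code&gt;git&lt;/code&gt; on the PATH):&lt;/p&gt;

```python
import json
import subprocess

def price_history(repo_dir, filename="cotacoes_boi_hoje.json"):
    """Rebuild the daily time series by reading the file at every commit that touched it."""
    # Hashes of every commit that modified the snapshot file, oldest first
    log = subprocess.run(
        ["git", "-C", repo_dir, "log", "--reverse", "--format=%H", "--", filename],
        capture_output=True, text=True, check=True,
    )
    history = []
    for commit_hash in log.stdout.split():
        # git show HASH:PATH prints the file exactly as it existed in that commit
        blob = subprocess.run(
            ["git", "-C", repo_dir, "show", f"{commit_hash}:{filename}"],
            capture_output=True, text=True, check=True,
        )
        history.append(json.loads(blob.stdout))
    return history
```

&lt;p&gt;Each entry corresponds to one daily commit, so the resulting list is already an ordered time series.&lt;/p&gt;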

&lt;h2&gt;
  
  
  3. Dissecting the YAML Pipeline
&lt;/h2&gt;

&lt;p&gt;The entire "brain" of the capture bot lives in a single YAML configuration file. It orchestrates all of our ephemeral infrastructure.&lt;/p&gt;

&lt;p&gt;Right at the top of the file, I defined three triggers (&lt;code&gt;on&lt;/code&gt;): &lt;code&gt;schedule&lt;/code&gt;, with a &lt;em&gt;cron expression&lt;/em&gt; (&lt;code&gt;0 9 * * *&lt;/code&gt;) configured to run automatically every day at 9:00 UTC (6:00 a.m. Brasília time), ensuring the quotes are captured early in the day, before local markets open; &lt;code&gt;push&lt;/code&gt;, to guarantee the data is refreshed whenever I change the code; and &lt;code&gt;workflow_dispatch&lt;/code&gt;, which lets me trigger the bot manually with a button if the extraction fails due to some external instability.&lt;/p&gt;

&lt;p&gt;There was, however, a technical security hurdle. By default, GitHub Actions &lt;em&gt;runners&lt;/em&gt; (temporary servers) have read-only permission on the repository. Since the goal was for the Python script to overwrite the JSON files and &lt;em&gt;commit&lt;/em&gt; the new data, I needed to grant the machine write access, all without exposing my credentials or creating complex tokens.&lt;/p&gt;

&lt;p&gt;The elegant solution was to adjust the &lt;em&gt;job&lt;/em&gt;'s permissions with the &lt;code&gt;permissions: contents: write&lt;/code&gt; setting. Here is the actual structure of the pipeline:&lt;/p&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Atualização Diária (Boi e Novilha)&lt;/span&gt;

&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;             &lt;span class="c1"&gt;# Roda ao salvar&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt; &lt;span class="nv"&gt;main&lt;/span&gt; &lt;span class="pi"&gt;]&lt;/span&gt;
  &lt;span class="na"&gt;schedule&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;cron&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;0&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;9&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*'&lt;/span&gt;
  &lt;span class="na"&gt;workflow_dispatch&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="c1"&gt;# Permite disparo manual&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;raspagem-completa&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;

    &lt;span class="c1"&gt;# PERMISSÃO DE ESCRITA É OBRIGATÓRIA PARA SALVAR O JSON&lt;/span&gt;
    &lt;span class="na"&gt;permissions&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;contents&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;write&lt;/span&gt;

    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;1. Baixar código do repositório&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v3&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;2. Instalar Python&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-python@v4&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;python-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.9'&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;3. Instalar bibliotecas&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;pip install -r requirements.txt&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;4. Rodar Scraper do Boi&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;python scraper_boi.py&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;5. Rodar Scraper da Novilha&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;python scraper_novilha.py&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;6. Salvar arquivos JSON no GitHub&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;git config --global user.name 'Bot do Scraper'&lt;/span&gt;
          &lt;span class="s"&gt;git config --global user.email 'actions@github.com'&lt;/span&gt;

          &lt;span class="s"&gt;# Adiciona os dois arquivos gerados&lt;/span&gt;
          &lt;span class="s"&gt;git add cotacoes_boi_hoje.json cotacoes_novilha_hoje.json&lt;/span&gt;

          &lt;span class="s"&gt;# Só commita se tiver mudança&lt;/span&gt;
          &lt;span class="s"&gt;git diff --quiet &amp;amp;&amp;amp; git diff --staged --quiet || (git commit -m "Dados atualizados: $(date +'%Y-%m-%d')" &amp;amp;&amp;amp; git push)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By setting &lt;code&gt;contents: write&lt;/code&gt;, the pipeline gains the authority it needs to &lt;em&gt;auto-commit&lt;/em&gt; straight to the main &lt;em&gt;branch&lt;/em&gt; as soon as the Python script finishes the extraction. No additional configuration, no provisioned database, and no server to manage.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Execution Safety: Preventing Pipeline Failures
&lt;/h2&gt;

&lt;p&gt;If you paid attention to &lt;strong&gt;Step 6&lt;/strong&gt; of the code above, you may have noticed a rather long Bash command at &lt;em&gt;commit&lt;/em&gt; time. There is a critical reason for it to be there.&lt;/p&gt;

&lt;p&gt;A classic problem with CI/CD automations that &lt;em&gt;auto-commit&lt;/em&gt; is the &lt;em&gt;pipeline crash&lt;/em&gt; (&lt;em&gt;job&lt;/em&gt; failure) that happens when Git tries to &lt;em&gt;commit&lt;/em&gt; but there are no modified files to add to the &lt;em&gt;staging area&lt;/em&gt;. To guarantee continuous, safe, and resilient execution, I added the following logical guard:&lt;/p&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git diff &lt;span class="nt"&gt;--quiet&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; git diff &lt;span class="nt"&gt;--staged&lt;/span&gt; &lt;span class="nt"&gt;--quiet&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="o"&gt;(&lt;/span&gt;git commit &lt;span class="nt"&gt;-m&lt;/span&gt; &lt;span class="s2"&gt;"Dados atualizados: &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +&lt;span class="s1"&gt;'%Y-%m-%d'&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; git push&lt;span class="o"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;git diff --quiet&lt;/code&gt; command acts as the repository's safety inspector. Since my Python script automatically injects the current date into the JSON's &lt;code&gt;"data_coleta"&lt;/code&gt; key, the repository guarantees a daily historical &lt;em&gt;snapshot&lt;/em&gt;, producing a continuous time-series record. However, if I need to trigger the pipeline manually (&lt;code&gt;workflow_dispatch&lt;/code&gt;) more than once on the &lt;strong&gt;same day&lt;/strong&gt; for &lt;em&gt;debugging&lt;/em&gt; purposes, the &lt;em&gt;diff&lt;/em&gt; check comes back silent (since that day's date and prices were already recorded on the first run).&lt;/p&gt;

&lt;p&gt;When that happens, the workflow exits cleanly, without trying to force an empty &lt;em&gt;commit&lt;/em&gt; that would make the GitHub Action fail and raise an infrastructure error alert. The repository stays continuous, producing exactly the daily time series I need, while remaining fully resilient to &lt;em&gt;runtime&lt;/em&gt; failures on same-day re-runs.&lt;/p&gt;
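
&lt;p&gt;For context, the snapshot the scraper writes is just a date-stamped JSON file, which is exactly what makes a same-day re-run produce an empty diff. A minimal sketch of that write step (the &lt;code&gt;preco_arroba&lt;/code&gt; field name is illustrative; only &lt;code&gt;data_coleta&lt;/code&gt; comes from the real project):&lt;/p&gt;

```python
import json
from datetime import date

def save_snapshot(price_per_arroba, path="cotacoes_boi_hoje.json"):
    """Overwrite today's snapshot; the date key makes same-day re-runs produce identical files."""
    snapshot = {
        "data_coleta": date.today().isoformat(),  # same date on a re-run -&gt; identical file
        "preco_arroba": price_per_arroba,         # illustrative field name
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(snapshot, f, ensure_ascii=False, indent=2)
    return snapshot
```

&lt;p&gt;Because a second run in the same day writes byte-identical content, &lt;code&gt;git diff --quiet&lt;/code&gt; sees nothing staged and the guard shown above short-circuits.&lt;/p&gt;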

&lt;h2&gt;
  
  
  &lt;strong&gt;5. Data Distribution: The End of the REST API&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;So far, we have solved collection automation (with GitHub Actions) and historical storage (with &lt;em&gt;Git-as-a-Database&lt;/em&gt;). But the last piece of the puzzle was still missing: distribution.&lt;/p&gt;

&lt;p&gt;If I do not have a traditional database running in the cloud, how is my zootechnical ERP (the cattle management system) supposed to consume these updated quotes? The classic approach would dictate provisioning a server and standing up a REST API (using Flask or FastAPI) just to read the JSON files and return responses over HTTP. In other words, more infrastructure to monitor and maintain.&lt;/p&gt;

&lt;p&gt;That is where I applied the golden rule of minimalist architecture: use what is already there.&lt;/p&gt;

&lt;p&gt;GitHub has a global, built-in content delivery network (CDN) designed to serve raw files. With every &lt;em&gt;commit&lt;/em&gt; the bot makes updating &lt;code&gt;cotacoes_boi_hoje.json&lt;/code&gt; and &lt;code&gt;cotacoes_novilha_hoje.json&lt;/code&gt;, the data becomes immediately available through the &lt;code&gt;raw.githubusercontent.com&lt;/code&gt; domain.&lt;/p&gt;

&lt;p&gt;In practice, I turned the repository itself into a static API. My management system now makes a simple HTTP &lt;code&gt;GET&lt;/code&gt; request straight to the GitHub &lt;em&gt;Raw&lt;/em&gt; URL. It consumes the latest JSON instantly, served by highly available infrastructure backed by Microsoft, at &lt;strong&gt;absolutely zero cost&lt;/strong&gt; and &lt;strong&gt;without writing a single line of backend routing code&lt;/strong&gt;.&lt;/p&gt;
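
&lt;p&gt;Consuming it from any client is a plain HTTP &lt;code&gt;GET&lt;/code&gt;. A sketch using only the Python standard library (the &lt;code&gt;main&lt;/code&gt; branch name and file path are my assumptions based on the repository described above):&lt;/p&gt;

```python
import json
from urllib.request import urlopen

RAW_BASE = "https://raw.githubusercontent.com"

def raw_url(user, repo, branch, path):
    """Build the raw.githubusercontent.com URL for a file in a GitHub repository."""
    return f"{RAW_BASE}/{user}/{repo}/{branch}/{path}"

def fetch_quote(user="Dom1ng0s", repo="Gado-Scraper", path="cotacoes_boi_hoje.json"):
    # Plain HTTP GET against the CDN -- no backend, no API keys
    with urlopen(raw_url(user, repo, "main", path)) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    print(fetch_quote())  # latest snapshot, straight from the repository
```

&lt;p&gt;If the actual branch or path differs, only the URL changes; the consuming code stays a single request.&lt;/p&gt;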

&lt;h2&gt;
  
  
  &lt;strong&gt;6. Conclusion: The FinOps Mindset&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Every project I take on teaches me something. One of Gado-Scraper's lessons was this: &lt;strong&gt;the most complex solution is not always the most appropriate one.&lt;/strong&gt; In the end, the financial limitation mentioned at the beginning of this article made the project much more special than just a &lt;em&gt;web scraper&lt;/em&gt;; it became a whole journey of chasing the knowledge needed to reach this satisfying result.&lt;/p&gt;

&lt;p&gt;Want to see the YAML pipeline running, or use the repository as a base for your own automations?&lt;br&gt;&lt;br&gt;
💻 &lt;strong&gt;Check out the source code:&lt;/strong&gt; &lt;a href="https://github.com/Dom1ng0s/Gado-Scraper" rel="noopener noreferrer"&gt;Dom1ng0s/Gado-Scraper on GitHub&lt;/a&gt;&lt;/p&gt;

</description>
      <category>architecture</category>
      <category>automation</category>
      <category>github</category>
      <category>showdev</category>
    </item>
    <item>
      <title>Data Modeling for Agriculture: Combining Cash Flow and Average Daily Gain (ADG) in the Same Database</title>
      <dc:creator>Davi Domingos de Oliveira</dc:creator>
      <pubDate>Fri, 03 Apr 2026 01:51:48 +0000</pubDate>
      <link>https://dev.to/dom1ng0s/data-modeling-for-agriculture-combining-cash-flow-and-average-daily-gain-adg-in-the-same-database-445p</link>
      <guid>https://dev.to/dom1ng0s/data-modeling-for-agriculture-combining-cash-flow-and-average-daily-gain-adg-in-the-same-database-445p</guid>
      <description>&lt;h2&gt;
  
  
  1. Introduction: A Real Problem on the Ground
&lt;/h2&gt;

&lt;p&gt;When people talk about technology in agriculture, the conversation usually jumps straight to the flashy stuff — autonomous tractors, drone-mapped fields, and bleeding-edge IoT sensors. But that's the reality for a small minority. There's a quiet technological divide running through the agricultural sector that rarely makes headlines.&lt;/p&gt;

&lt;p&gt;Livestock alone accounts for roughly 6–7% of Brazil's entire GDP. Yet 77% of rural properties in the country belong to small and mid-sized producers. So how does the vast majority of these operators manage the complexity of running a farm — tracking supplementation costs, live-weight commodity price swings, and the Average Daily Gain (ADG) of dozens or hundreds of animals? With paper notebooks, whiteboards, and generic spreadsheets that don't talk to each other.&lt;/p&gt;

&lt;p&gt;As engineers, we have a name for this: &lt;strong&gt;Dark Data&lt;/strong&gt;. Thousands of data points generated every single day, lost or severely underutilized simply because they were never captured in a relational database. Profit margins in modern livestock production are razor-thin, and making decisions based on unstructured data is a bottleneck worth billions of dollars.&lt;/p&gt;

&lt;p&gt;In this post, I want to walk through the architecture of the &lt;strong&gt;Livestock Management System (LMS)&lt;/strong&gt;, a zootechnical ERP I built to bring structure to this long tail of agricultural data — with a particular focus on how I solved the project's most significant performance bottleneck.&lt;/p&gt;




&lt;h2&gt;
  
  
  2. The Technical Challenge: Python vs. the Database
&lt;/h2&gt;

&lt;h3&gt;
  
  
  2.1. Understanding the Bottleneck (The ADG Math)
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;ADG&lt;/strong&gt; (Average Daily Gain) is an estimate of an animal's daily weight gain, typically measured in kilograms. Given a table structured like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;pesagens&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt; &lt;span class="n"&gt;AUTO_INCREMENT&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="nb"&gt;DATE&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;peso&lt;/span&gt; &lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;deleted_at&lt;/span&gt; &lt;span class="nb"&gt;DATETIME&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;FOREIGN&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;REFERENCES&lt;/span&gt; &lt;span class="n"&gt;animais&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The ADG formula is straightforward: &lt;em&gt;(Latest Weight - First Weight) / Days between first and last weigh-in&lt;/em&gt;.&lt;/p&gt;
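
&lt;p&gt;As a quick worked example of that formula (the numbers are illustrative):&lt;/p&gt;

```python
from datetime import date

def average_daily_gain(first_weight, last_weight, first_date, last_date):
    """ADG in kg/day: total weight change divided by the days between weigh-ins."""
    days = (last_date - first_date).days
    if days > 0:
        return (last_weight - first_weight) / days
    raise ValueError("need weigh-ins on at least two different dates")

# 280 kg on Jan 1 and 334 kg on Apr 1 (90 days later): 54 kg / 90 days = 0.6 kg/day
adg = average_daily_gain(280.0, 334.0, date(2025, 1, 1), date(2025, 4, 1))
```

&lt;p&gt;This is the per-animal computation that the database will later perform set-wise for the whole herd.&lt;/p&gt;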

&lt;p&gt;On a working farm, this metric directly drives decision-making. It's the number that tells a rancher whether a nutrition protocol is working — or whether a particular animal is just burning through feed budget with nothing to show for it.&lt;/p&gt;

&lt;p&gt;The real problem starts when you try to turn that simple formula into scalable code.&lt;/p&gt;

&lt;h3&gt;
  
  
  2.2. The Backend Processing Trap (The Python Anti-Pattern)
&lt;/h3&gt;

&lt;p&gt;My first instinct was to use MySQL purely as a data store and let the Python backend handle all the computation — something roughly like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SELECT animal_id, peso, data_pesagem FROM pesagens WHERE deleted_at IS NULL&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;all_weigh_ins&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fetchall&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  &lt;span class="c1"&gt;# &amp;lt;--- THE BOTTLENECK STARTS HERE
&lt;/span&gt;
&lt;span class="n"&gt;herd_adg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;

&lt;span class="c1"&gt;# Python doing the database's job
&lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;record&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;all_weigh_ins&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;record&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;animal_id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;herd_adg&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;herd_adg&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;weights&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[],&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dates&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[]}&lt;/span&gt;

    &lt;span class="n"&gt;herd_adg&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;weights&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;record&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;peso&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="n"&gt;herd_adg&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dates&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;record&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;data_pesagem&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Calculating ADG per animal with CPU-heavy loops
&lt;/span&gt;&lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;data&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;herd_adg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;items&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;first_weight&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;min&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;weights&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="n"&gt;last_weight&lt;/span&gt;  &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;weights&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

    &lt;span class="n"&gt;first_date&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;min&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dates&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="n"&gt;last_date&lt;/span&gt;  &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dates&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

    &lt;span class="n"&gt;days&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;last_date&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;first_date&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="n"&gt;days&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;days&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;adg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;last_weight&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;first_weight&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;days&lt;/span&gt;
        &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;animal_id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;adg&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;adg&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After the first load-testing round, I knew this approach was a dead end. The backend was carrying the full weight of raw computation, which caused a dramatic spike in server RAM consumption. The direct consequence: the system would slow down and eventually lock up as it scaled to more farms and larger herds.&lt;/p&gt;




&lt;h2&gt;
  
  
  3. The Solution: Push Intelligence into the Database
&lt;/h2&gt;

&lt;p&gt;To fix the memory bottleneck, I flipped the logic entirely. Instead of pulling thousands of raw rows into Python and forcing the application layer to process them, why not do the math where the data already lives?&lt;/p&gt;

&lt;p&gt;That's how I designed the "Performance-First" architecture of the LMS — delegating the heavy lifting to MySQL.&lt;/p&gt;

&lt;h3&gt;
  
  
  3.1. Optimized SQL Views and Window Functions
&lt;/h3&gt;

&lt;p&gt;The centerpiece of this solution is a set of &lt;strong&gt;SQL Views&lt;/strong&gt;. A View acts as a virtual table with the formula baked in. Instead of computing ADG on the fly in the backend at request time, I teach the calculation to the database itself.&lt;/p&gt;

&lt;p&gt;To ensure the computation correctly reflects the chronological order of weigh-ins — even when an animal loses weight during a period — I used &lt;strong&gt;Common Table Expressions (CTEs)&lt;/strong&gt; and &lt;strong&gt;Window Functions&lt;/strong&gt; (&lt;code&gt;ROW_NUMBER&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Here's how I structured the &lt;code&gt;v_gmd_analitico&lt;/code&gt; view to solve this complex aggregation automatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="k"&gt;VIEW&lt;/span&gt; &lt;span class="n"&gt;v_gmd_analitico&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt;
&lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;PesagensOrdenadas&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
        &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;peso&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ROW_NUMBER&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="n"&gt;OVER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;PARTITION&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;ASC&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;rn_asc&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ROW_NUMBER&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="n"&gt;OVER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;PARTITION&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;rn_desc&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;pesagens&lt;/span&gt;
    &lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;deleted_at&lt;/span&gt; &lt;span class="k"&gt;IS&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
&lt;span class="p"&gt;),&lt;/span&gt;
&lt;span class="n"&gt;PrimeiraUltima&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt;
        &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_asc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_asc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;peso&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;peso_inicial&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_desc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_desc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;peso&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;peso_final&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;PesagensOrdenadas&lt;/span&gt;
    &lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt;
    &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;user_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;brinco&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_final&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;ganho_total&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;DATEDIFF&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;dias&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;CASE&lt;/span&gt;
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;DATEDIFF&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
        &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_final&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;DATEDIFF&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
    &lt;span class="k"&gt;END&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;gmd&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;PrimeiraUltima&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;
&lt;span class="k"&gt;JOIN&lt;/span&gt; &lt;span class="n"&gt;animais&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;
  &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;deleted_at&lt;/span&gt; &lt;span class="k"&gt;IS&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's where the architecture pays off: when Flask needs to render a producer's dashboard, it no longer runs unbounded loops or pulls raw data across the network. The main listing queries the base &lt;code&gt;animais&lt;/code&gt; table using server-side pagination, and the View is called surgically only when complex metrics are needed. MySQL handles all the grouping and guards against division-by-zero, delivering the zootechnical metric ready for aggregation — &lt;code&gt;SELECT AVG(gmd) FROM v_gmd_analitico&lt;/code&gt; — or for building out a detailed animal profile page instantly.&lt;/p&gt;
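&lt;p&gt;The end-to-end pattern can be sketched with SQLite, which also supports CTEs and window functions, so the View translates almost verbatim; the schema mirrors the article's, and the weigh-in data below is made up:&lt;/p&gt;

```python
import sqlite3

# In-memory database standing in for MySQL, for illustration only
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pesagens (
    animal_id INTEGER, data_pesagem TEXT, peso REAL, deleted_at TEXT
);
CREATE VIEW v_gmd_analitico AS
WITH PesagensOrdenadas AS (
    SELECT animal_id, data_pesagem, peso,
           ROW_NUMBER() OVER (PARTITION BY animal_id ORDER BY data_pesagem ASC)  AS rn_asc,
           ROW_NUMBER() OVER (PARTITION BY animal_id ORDER BY data_pesagem DESC) AS rn_desc
    FROM pesagens WHERE deleted_at IS NULL
),
PrimeiraUltima AS (
    SELECT animal_id,
           MAX(CASE WHEN rn_asc  = 1 THEN data_pesagem END) AS data_inicial,
           MAX(CASE WHEN rn_asc  = 1 THEN peso         END) AS peso_inicial,
           MAX(CASE WHEN rn_desc = 1 THEN data_pesagem END) AS data_final,
           MAX(CASE WHEN rn_desc = 1 THEN peso         END) AS peso_final
    FROM PesagensOrdenadas GROUP BY animal_id
)
SELECT animal_id,
       (peso_final - peso_inicial)
       / (julianday(data_final) - julianday(data_inicial)) AS gmd
FROM PrimeiraUltima
WHERE data_inicial != data_final;
""")

# Two animals: +30 kg over 30 days (gmd 1.0) and +10 kg over 20 days (gmd 0.5)
conn.executemany("INSERT INTO pesagens VALUES (?, ?, ?, NULL)", [
    (1, "2025-01-01", 300.0), (1, "2025-01-31", 330.0),
    (2, "2025-01-01", 400.0), (2, "2025-01-21", 410.0),
])

# The dashboard-level aggregate: one scalar crosses the wire, not raw rows
avg_gmd = conn.execute("SELECT AVG(gmd) FROM v_gmd_analitico").fetchone()[0]
print(avg_gmd)  # 0.75
```

&lt;p&gt;The application layer receives only the finished metric; none of the raw weigh-in rows ever leave the database.&lt;/p&gt;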

&lt;h3&gt;
  
  
  3.2. Composite Indexes for Instant Lookups
&lt;/h3&gt;

&lt;p&gt;Solving the ADG computation with a View was a major win, but what about when a producer wants to filter costs for a specific lot? Without proper indexing, MySQL would fall back to a &lt;strong&gt;Full Table Scan&lt;/strong&gt; on every click — scanning the entire table regardless of how targeted the query was.&lt;/p&gt;

&lt;p&gt;I addressed this with composite structural indexes, the two most critical being &lt;code&gt;idx_pesagens_otimizada&lt;/code&gt; and &lt;code&gt;idx_custos_busca&lt;/code&gt;. Even as a client registers thousands of animals over the years, every filtered lookup keeps returning in milliseconds: a B-tree index seek grows logarithmically with table size instead of linearly, so from the application layer's perspective query latency stays effectively flat.&lt;/p&gt;
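&lt;p&gt;A minimal sketch of the effect, using SQLite's &lt;code&gt;EXPLAIN QUERY PLAN&lt;/code&gt; in place of MySQL's &lt;code&gt;EXPLAIN&lt;/code&gt;; the index name matches the article, the query is hypothetical, and the exact plan wording varies by SQLite version:&lt;/p&gt;

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE pesagens "
    "(animal_id INTEGER, data_pesagem TEXT, peso REAL, deleted_at TEXT)"
)

# A typical filtered lookup: one animal, one date range
query = ("SELECT peso FROM pesagens "
         "WHERE animal_id = ? AND data_pesagem BETWEEN ? AND ?")
params = (1, "2025-01-01", "2025-12-31")

# Without an index the planner has no choice but a full table scan
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, params).fetchall()
print(plan_before[0][3])   # e.g. "SCAN pesagens"

# Composite index: the equality column first, the range column second
conn.execute(
    "CREATE INDEX idx_pesagens_otimizada ON pesagens (animal_id, data_pesagem)"
)
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, params).fetchall()
print(plan_after[0][3])    # e.g. "SEARCH pesagens USING INDEX idx_pesagens_otimizada ..."
```

&lt;p&gt;Column order is the design decision here: the equality predicate (&lt;code&gt;animal_id&lt;/code&gt;) leads so the range predicate on &lt;code&gt;data_pesagem&lt;/code&gt; can ride the same index.&lt;/p&gt;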

&lt;h3&gt;
  
  
  3.3. Server-Side Pagination and Connection Pooling
&lt;/h3&gt;

&lt;p&gt;Two final rules govern the communication layer between the API and the database:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Server-Side Pagination:&lt;/strong&gt; The main dashboard never loads the entire animal roster in a single response. I use &lt;code&gt;LIMIT&lt;/code&gt; and &lt;code&gt;OFFSET&lt;/code&gt; directly in MySQL queries, sending the browser only what fits on the current page.&lt;/p&gt;
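&lt;p&gt;A minimal pagination sketch, again with SQLite standing in for MySQL; the page size and ear-tag values are made up:&lt;/p&gt;

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE animais (id INTEGER PRIMARY KEY, brinco TEXT)")
# 95 hypothetical animals, identified by ear tag ("brinco")
conn.executemany("INSERT INTO animais (brinco) VALUES (?)",
                 [(f"BR-{i:04d}",) for i in range(1, 96)])

PAGE_SIZE = 20

def fetch_page(page):
    """Return one page of the roster; the browser never receives the full table."""
    offset = (page - 1) * PAGE_SIZE
    return conn.execute(
        "SELECT id, brinco FROM animais ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    ).fetchall()

page1 = fetch_page(1)   # rows 1 through 20
page5 = fetch_page(5)   # rows 81 through 95: the last, partial page
print(len(page1), len(page5))  # 20 15
```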

&lt;p&gt;&lt;strong&gt;Connection Pooling:&lt;/strong&gt; Using &lt;code&gt;mysql-connector-python&lt;/code&gt;, I configured a connection pool at database setup time. This eliminates the massive TCP overhead of opening and closing a new database connection on every incoming API request — the pool keeps connections warm and ready to reuse.&lt;/p&gt;




&lt;h2&gt;
  
  
  4. Results and Real-World Impact
&lt;/h2&gt;

&lt;p&gt;The best architecture isn't the one built with the most fashionable tools — it's the one that eliminates the user's actual bottleneck in the smartest way. Shifting to a database-first approach delivered immediate, measurable gains on two fronts.&lt;/p&gt;

&lt;h3&gt;
  
  
  Business Impact: The End of Dark Data
&lt;/h3&gt;

&lt;p&gt;A livestock producer who previously spent around &lt;strong&gt;2 hours&lt;/strong&gt; manually cross-referencing weigh-in notebooks with slow Excel files now has the complete farm picture in front of them. The dashboard delivers the full herd ADG analysis and consolidated cash flow in under &lt;strong&gt;3 minutes&lt;/strong&gt;. Dark data became actionable intelligence.&lt;/p&gt;

&lt;h3&gt;
  
  
  Technical Impact: A Server That Can Actually Scale
&lt;/h3&gt;

&lt;p&gt;The Flask server is no longer choking on in-memory computation. RAM consumption dropped sharply because Python stopped pulling thousands of rows over the wire to do arithmetic it was never meant to do. With the SQL View absorbing the heavy calculation, composite indexes accelerating every filtered query, and pagination controlling data transfer, the system is genuinely production-scalable. Double or triple the number of registered animals — the dashboard still opens in milliseconds.&lt;/p&gt;




&lt;h2&gt;
  
  
  5. Conclusion
&lt;/h2&gt;

&lt;p&gt;Generic software fails in agriculture because it treats a steer's live weight like SKU inventory in a retail system. Livestock is a living system, and its key performance indicators — like Average Daily Gain — demand an architecture designed around them, not bolted on as an afterthought.&lt;/p&gt;

&lt;p&gt;In the end, good software solves real problems. Going from a rancher's operational pain to a production-grade SQL View using CTEs and Window Functions is what separates purpose-built tooling from off-the-shelf software dressed up with a new logo.&lt;/p&gt;

&lt;p&gt;Want to dig into the code or see this architecture running live?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;💻 &lt;strong&gt;Source code:&lt;/strong&gt; &lt;a href="https://github.com/Dom1ng0s/sistema_gado" rel="noopener noreferrer"&gt;Dom1ng0s/sistema_gado on GitHub&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;🚀 &lt;strong&gt;Live demo:&lt;/strong&gt; Active deployment via Docker on Oracle Cloud: &lt;a href="http://163.176.171.60" rel="noopener noreferrer"&gt;http://163.176.171.60&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Appendix B: Brazilian Agribusiness Terms — A Translator's Note
&lt;/h2&gt;

&lt;p&gt;Two terms in the original Portuguese source require special handling for an international engineering audience:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"Arroba" (@ — the Brazilian livestock weight unit)&lt;/strong&gt;&lt;br&gt;
In Brazilian cattle markets, live weight is quoted per &lt;em&gt;arroba&lt;/em&gt; — a unit equivalent to 15 kg (approximately 33 lbs). When a rancher tracks the "price per arroba," they're tracking the live-weight commodity price used to calculate the market value of an animal at a given weight. For this article, I opted to avoid the term entirely and instead refer to "live-weight commodity price" and "market price per kilogram," which convey the same economic concept without requiring domain-specific context.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"Boi Gordo" (lit. "Fat Ox" / Finished Beef Steer)&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;Boi Gordo&lt;/em&gt; is the Brazilian futures market category for a grain-finished steer ready for slaughter — the rough equivalent of "fed cattle" in the CME Group's Live Cattle futures contract in the US. Price fluctuation in this category is what the article refers to when mentioning commodity price swings that affect profitability. I translated this simply as "live-weight commodity price" throughout to keep the focus on the economic signal rather than the Brazilian market nomenclature.&lt;/p&gt;

</description>
      <category>architecture</category>
      <category>data</category>
      <category>database</category>
      <category>dataengineering</category>
    </item>
    <item>
      <title>Modelagem de Dados no Agro: Unindo fluxo de caixa e Ganho Médio Diário (GMD) no mesmo Banco de Dados</title>
      <dc:creator>Davi Domingos de Oliveira</dc:creator>
      <pubDate>Wed, 01 Apr 2026 02:43:53 +0000</pubDate>
      <link>https://dev.to/dom1ng0s/modelagem-de-dados-no-agro-unindo-fluxo-de-caixa-e-ganho-medio-diario-gmd-no-mesmo-banco-de-dados-133i</link>
      <guid>https://dev.to/dom1ng0s/modelagem-de-dados-no-agro-unindo-fluxo-de-caixa-e-ganho-medio-diario-gmd-no-mesmo-banco-de-dados-133i</guid>
      <description>&lt;h2&gt;
  
  
  1. Introduction: The Producer's Real Problem
&lt;/h2&gt;

&lt;p&gt;When people talk about technology in agribusiness, we tend to jump straight to "Agro 4.0": autonomous tractors, drones scanning crops, and state-of-the-art IoT sensors. But that is the reality of an absolute minority. There is a silent technological chasm in the Brazilian countryside.&lt;/p&gt;

&lt;p&gt;Livestock alone accounts for roughly 6% to 7% of Brazil's entire GDP. Yet 77% of the country's rural properties belong to small and mid-sized producers. And how does the vast majority of those producers manage the complexity of a farm, calculating supplementation costs, live-weight commodity price swings, and the Average Daily Gain (GMD) of dozens or hundreds of animals? With paper notebooks, whiteboards, or generic spreadsheets that don't talk to each other.&lt;/p&gt;

&lt;p&gt;For us developers, the name for this is &lt;em&gt;Dark Data&lt;/em&gt;: thousands of data points generated every day that are lost or underused because they aren't structured in a relational database. Profit in modern livestock farming is squeezed into cents, and making decisions on unstructured data is a bottleneck worth billions of reais.&lt;/p&gt;

&lt;p&gt;In this article, I want to walk through the architecture of the &lt;strong&gt;Sistema de Gestão de Gado (SGG)&lt;/strong&gt;, a zootechnical ERP I built to structure this "long tail" of livestock farming, focusing on how I solved the project's biggest performance bottleneck.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. The Technical Challenge: Python vs. the Database
&lt;/h2&gt;

&lt;h3&gt;
  
  
  2.1. Understanding the Bottleneck (The Math Behind ADG)
&lt;/h3&gt;

&lt;p&gt;The &lt;strong&gt;ADG&lt;/strong&gt; (Average Daily Gain; &lt;em&gt;GMD&lt;/em&gt; in the Portuguese code identifiers) is the estimated daily weight gain of an animal, usually measured in kg.&lt;br&gt;
In a table structured like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;        &lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;IF&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;EXISTS&lt;/span&gt; &lt;span class="n"&gt;pesagens&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt; &lt;span class="n"&gt;AUTO_INCREMENT&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="nb"&gt;DATE&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;peso&lt;/span&gt; &lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;deleted_at&lt;/span&gt; &lt;span class="nb"&gt;DATETIME&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt; &lt;span class="k"&gt;DEFAULT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;FOREIGN&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;REFERENCES&lt;/span&gt; &lt;span class="n"&gt;animais&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;ADG is computed as &lt;em&gt;(Last Weight - First Weight) / days between the first and the last weigh-in&lt;/em&gt;.&lt;/p&gt;
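&lt;p&gt;Plugging hypothetical numbers into the formula:&lt;/p&gt;

```python
from datetime import date

# Hypothetical weigh-ins for a single animal
primeiro_peso, primeira_data = 300.0, date(2025, 1, 1)  # first recorded weight (kg)
ultimo_peso, ultima_data = 345.0, date(2025, 3, 2)      # last recorded weight (kg)

dias = (ultima_data - primeira_data).days   # 60 days between the two weigh-ins
gmd = (ultimo_peso - primeiro_peso) / dias  # kg gained per day
print(gmd)  # 0.75
```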

&lt;p&gt;On a farm, this statistic directly drives decision-making. It's the metric that tells the rancher whether the nutrition plan is working or whether the animal is just losing money.&lt;/p&gt;

&lt;p&gt;The real problem starts when we try to turn this simple formula into scalable code.&lt;/p&gt;

&lt;h3&gt;
  
  
  2.2. The Trap of Processing in the Backend (The Python Risk)
&lt;/h3&gt;

&lt;p&gt;My first idea was to use &lt;strong&gt;MySQL&lt;/strong&gt; only to store the data and let the Python backend handle the calculations, something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;execute&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SELECT animal_id, peso, data_pesagem FROM pesagens WHERE deleted_at IS NULL&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;todas_pesagens&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cursor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fetchall&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="c1"&gt;# &amp;lt;--- O GARGALO COMEÇA AQUI
&lt;/span&gt;
&lt;span class="n"&gt;rebanho_gmd&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;

&lt;span class="c1"&gt;# 2. O Python tentando fazer o trabalho do Banco de Dados
&lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;registro&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;todas_pesagens&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;registro&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;animal_id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;rebanho_gmd&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;rebanho_gmd&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;pesos&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[],&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;datas&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[]}&lt;/span&gt;

    &lt;span class="n"&gt;rebanho_gmd&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;pesos&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;registro&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;peso&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="n"&gt;rebanho_gmd&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;datas&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;registro&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;data_pesagem&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# 3. Calculando o GMD animal por animal com loops (Alto consumo de CPU)
&lt;/span&gt;&lt;span class="n"&gt;resultados&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dados&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;rebanho_gmd&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;items&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;peso_inicial&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;min&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dados&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;pesos&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="n"&gt;peso_final&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dados&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;pesos&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

    &lt;span class="n"&gt;data_inicial&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;min&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dados&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;datas&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
    &lt;span class="n"&gt;data_final&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dados&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;datas&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

    &lt;span class="n"&gt;dias&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="n"&gt;days&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;dias&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;gmd&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;peso_final&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;peso_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;dias&lt;/span&gt;
        &lt;span class="n"&gt;resultados&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;animal_id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;gmd&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;gmd&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After the first round of testing, however, I realized this choice was inefficient: the backend would be responsible for most of the raw computation, drastically increasing the server's RAM consumption. The direct consequence would be slowdowns and system lockups as the project scaled to more farms and larger herds.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. The Solution: Intelligence in the Database
&lt;/h2&gt;

&lt;p&gt;To solve the high RAM consumption shown in the previous code, I decided to invert the logic. Instead of pulling thousands of raw rows into Python and forcing the application to process them, why not do the math where the data is already stored?&lt;/p&gt;

&lt;p&gt;That's how I structured the "Performance-First" architecture of the Sistema de Gestão de Gado (SGG), delegating the heavy processing to MySQL.&lt;/p&gt;

&lt;h3&gt;
  
  
  3.1. Optimized SQL Views and Window Functions
&lt;/h3&gt;

&lt;p&gt;The centerpiece of this solution was the creation of &lt;strong&gt;SQL Views&lt;/strong&gt;. A View works as a "virtual table". Instead of computing the Average Daily Gain (GMD) in the backend at request time, I teach the math formula to the database itself.&lt;/p&gt;

&lt;p&gt;To guarantee that the calculation respected the exact chronological order of the weigh-ins (even if an animal lost weight over the period), I used &lt;em&gt;Common Table Expressions (CTEs)&lt;/em&gt; and &lt;em&gt;Window Functions&lt;/em&gt; (&lt;code&gt;ROW_NUMBER&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Here is how I structured the &lt;code&gt;v_gmd_analitico&lt;/code&gt; View in my database to solve this complex calculation automatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;OR&lt;/span&gt; &lt;span class="k"&gt;REPLACE&lt;/span&gt; &lt;span class="k"&gt;VIEW&lt;/span&gt; &lt;span class="n"&gt;v_gmd_analitico&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt;
&lt;span class="k"&gt;WITH&lt;/span&gt; &lt;span class="n"&gt;PesagensOrdenadas&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt; 
        &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;peso&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ROW_NUMBER&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="n"&gt;OVER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;PARTITION&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;ASC&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;rn_asc&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;ROW_NUMBER&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="n"&gt;OVER&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;PARTITION&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;DESC&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;rn_desc&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;pesagens&lt;/span&gt;
    &lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;deleted_at&lt;/span&gt; &lt;span class="k"&gt;IS&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
&lt;span class="p"&gt;),&lt;/span&gt;
&lt;span class="n"&gt;PrimeiraUltima&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt; 
        &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_asc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_asc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;peso&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;peso_inicial&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_desc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;data_pesagem&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;rn_desc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="n"&gt;peso&lt;/span&gt; &lt;span class="k"&gt;END&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;peso_final&lt;/span&gt;
    &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;PesagensOrdenadas&lt;/span&gt;
    &lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt; 
    &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;user_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;animal_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;brinco&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_final&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;ganho_total&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;DATEDIFF&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;dias&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;CASE&lt;/span&gt; 
        &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;DATEDIFF&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; 
        &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_final&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;peso_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;DATEDIFF&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; 
    &lt;span class="k"&gt;END&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;gmd&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;PrimeiraUltima&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;
&lt;span class="k"&gt;JOIN&lt;/span&gt; &lt;span class="n"&gt;animais&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt; &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;animal_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_inicial&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data_final&lt;/span&gt;
  &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;deleted_at&lt;/span&gt; &lt;span class="k"&gt;IS&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;"A mágica acontece aqui: Quando o Flask precisa processar os indicadores do pecuarista, ele não faz mais laços de repetição infinitos e não consome RAM desnecessária. A listagem principal consulta diretamente a tabela base de animais de forma paginada, enquanto a View é acionada cirurgicamente para as métricas complexas. O banco já resolve o agrupamento e previne divisões por zero, entregando o dado zootécnico pronto para extração de médias globais (ex: &lt;code&gt;SELECT AVG(gmd) FROM v_gmd_analitico&lt;/code&gt;) ou para a montagem instantânea da ficha detalhada de cada animal.&lt;/p&gt;

&lt;h3&gt;
  
  
  3.2. Composite Indexes for Instant Lookups
&lt;/h3&gt;

&lt;p&gt;The View solved the GMD calculation, but what about when the producer wants to look up the costs of a specific lot? To keep MySQL from doing a &lt;em&gt;full table scan&lt;/em&gt; (reading the entire table on every click), the data model itself had to be smart.&lt;/p&gt;

&lt;p&gt;I implemented composite indexes at the schema level, such as &lt;code&gt;idx_pesagens_otimizada&lt;/code&gt; and &lt;code&gt;idx_custos_busca&lt;/code&gt;. This guarantees that even if the client registers thousands of animals over the years, the search filters keep responding in milliseconds.&lt;/p&gt;
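&lt;p&gt;The article does not show the column lists of these indexes, so the sketch below is an assumption for illustration: equality column (&lt;code&gt;animal_id&lt;/code&gt;) first, range/sort column (&lt;code&gt;data_pesagem&lt;/code&gt;) second. SQLite stands in for MySQL, and &lt;code&gt;EXPLAIN QUERY PLAN&lt;/code&gt; confirms the index is used instead of a full scan:&lt;/p&gt;

```python
# Hedged sketch: the post names idx_pesagens_otimizada but not its exact
# column list, so (animal_id, data_pesagem) is an assumed ordering.
# EXPLAIN QUERY PLAN shows a SEARCH using the index rather than a SCAN.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pesagens (animal_id INTEGER, data_pesagem TEXT, peso REAL)")
# Composite index: equality column first, range/sort column second.
conn.execute("CREATE INDEX idx_pesagens_otimizada ON pesagens (animal_id, data_pesagem)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT peso FROM pesagens "
    "WHERE animal_id = 42 ORDER BY data_pesagem"
).fetchall()
print(plan)  # detail column mentions idx_pesagens_otimizada
```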

&lt;h3&gt;
  
  
  3.3. Server-Side Pagination and a Connection Pool
&lt;/h3&gt;

&lt;p&gt;To round out the high-performance "package", I applied two final rules to the communication between the API and the database:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Server-side pagination:&lt;/strong&gt; The main dashboard never loads every animal at once. I use &lt;code&gt;LIMIT&lt;/code&gt; and &lt;code&gt;OFFSET&lt;/code&gt; directly in the MySQL queries, sending the browser only what fits on the screen.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Connection pooling:&lt;/strong&gt; Using the &lt;code&gt;mysql-connector-python&lt;/code&gt; library, I configured a connection &lt;em&gt;pool&lt;/em&gt; during the database's initial setup. This avoids the huge overhead of opening and closing a new TCP connection on every user request.&lt;/li&gt;
&lt;/ol&gt;
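&lt;p&gt;The first rule can be sketched in a few lines (SQLite standing in for MySQL; the &lt;code&gt;get_page&lt;/code&gt; helper and the sample data are hypothetical):&lt;/p&gt;

```python
# Server-side pagination sketch: the browser only ever receives one page,
# never the full herd. SQLite stands in for MySQL; get_page is hypothetical.
import sqlite3

PAGE_SIZE = 10

def get_page(conn, page):
    """Return one page of animals; page numbers start at 1."""
    offset = (page - 1) * PAGE_SIZE
    return conn.execute(
        "SELECT id, brinco FROM animais ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE animais (id INTEGER PRIMARY KEY, brinco TEXT)")
conn.executemany(
    "INSERT INTO animais (brinco) VALUES (?)",
    [(f"BR-{n:04d}",) for n in range(1, 26)],  # 25 animals -> 3 pages
)

print(len(get_page(conn, 1)), len(get_page(conn, 3)))  # 10 5
```

&lt;p&gt;For the second rule, &lt;code&gt;mysql-connector-python&lt;/code&gt; ships &lt;code&gt;mysql.connector.pooling.MySQLConnectionPool&lt;/code&gt;: the pool is created once at startup and connections are borrowed and returned per request, instead of paying the TCP handshake every time.&lt;/p&gt;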

&lt;h2&gt;
  
  
  4. Results and Real-World Impact
&lt;/h2&gt;

&lt;p&gt;In software engineering, the best architecture is not the one that uses the most fashionable tools, but the one that removes the user's bottleneck intelligently. By adopting this database-centric approach, the impact was immediate on two fronts:&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;4.1. Business Impact (The End of Dark Data)&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The rancher who used to spend around &lt;strong&gt;2 hours&lt;/strong&gt; cross-referencing weighing data from notebooks into slow Excel spreadsheets now sees the complete picture of the farm. The dashboard delivers the herd's consolidated GMD analysis and cash flow in about &lt;strong&gt;3 minutes&lt;/strong&gt;. Dark data finally became decisions.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;4.2. Technical Impact (Server Stability)&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;My Flask server can now "breathe". The application's RAM usage plummeted, because Python stopped pulling thousands of rows over the network just to do basic arithmetic. With the SQL View handling the calculation, the indexes optimizing the filters, and pagination throttling the traffic, the system became genuinely scalable: the farm can double or triple its registered animals, and the dashboard will still open in milliseconds.&lt;/p&gt;

&lt;h2&gt;
  
  
  5. Conclusion
&lt;/h2&gt;

&lt;p&gt;Generic systems fail in agribusiness because they try to treat a bull's weight as if it were just "store inventory". Ranching is a living organism, and its indicators, such as Average Daily Gain, demand an architecture designed for them.&lt;/p&gt;

&lt;p&gt;At the end of the day, good systems solve real problems; everything else is an exercise. Starting from the rancher's pain and arriving at the technical solution of an advanced View is what separates off-the-shelf code from software with a purpose.&lt;/p&gt;

&lt;p&gt;Want to dig into the code or see this architecture running live?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  💻 &lt;strong&gt;Browse the source code:&lt;/strong&gt;  &lt;a href="//github.com/Dom1ng0s/sistema_gado"&gt;Dom1ng0s/sistema_gado on GitHub&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  🚀 &lt;strong&gt;Try the system:&lt;/strong&gt; Live deployment via Docker on Oracle Cloud at:  &lt;a href="http://163.176.171.60" rel="noopener noreferrer"&gt;http://163.176.171.60&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>python</category>
      <category>sql</category>
      <category>architecture</category>
      <category>database</category>
    </item>
  </channel>
</rss>
