<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: João Bosco</title>
    <description>The latest articles on DEV Community by João Bosco (@boscobecker).</description>
    <link>https://dev.to/boscobecker</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2234323%2Ff7598c00-38a1-43fe-8105-1c58f3ced18a.jpg</url>
      <title>DEV Community: João Bosco</title>
      <link>https://dev.to/boscobecker</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/boscobecker"/>
    <language>en</language>
    <item>
      <title>🚀 TinyLlama Fine-Tuning with LoRA (CPU-Friendly)</title>
      <dc:creator>João Bosco</dc:creator>
      <pubDate>Sat, 21 Jun 2025 19:33:48 +0000</pubDate>
      <link>https://dev.to/boscobecker/tinyllama-fine-tuning-with-lora-cpu-friendly-k3e</link>
      <guid>https://dev.to/boscobecker/tinyllama-fine-tuning-with-lora-cpu-friendly-k3e</guid>
      <description>&lt;h1&gt;
  
  
  TinyLlama Fine-Tuning with LoRA
&lt;/h1&gt;

&lt;p&gt;This project demonstrates how to fine-tune the &lt;a href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0" rel="noopener noreferrer"&gt;TinyLlama-1.1B-Chat-v1.0&lt;/a&gt;&lt;br&gt;
model using the &lt;a href="https://arxiv.org/abs/2106.09685" rel="noopener noreferrer"&gt;LoRA (Low-Rank Adaptation)&lt;/a&gt; technique for parameter-efficient training. The training is optimized for CPU environments with limited RAM (e.g., 16 GB).&lt;/p&gt;
&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;src/TrainTinyLlama.py&lt;/strong&gt;: Main script for fine-tuning TinyLlama with LoRA.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;dataset/dataset.json&lt;/strong&gt;: Training data in JSON format.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Dataset Format
&lt;/h2&gt;

&lt;p&gt;The dataset should be a JSON file containing a list of objects, each with &lt;code&gt;input&lt;/code&gt; and &lt;code&gt;output&lt;/code&gt; fields. Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"input"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Generate a form with a panel with color white"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"output"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TForm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"FrmMainForm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Caption"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Sample Form"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Width"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;800&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Height"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;600&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Children"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TPanel"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Panel1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Left"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Top"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Width"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Height"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                    &lt;/span&gt;&lt;span class="nl"&gt;"Color"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"#FFFFFF"&lt;/span&gt;&lt;span class="w"&gt;
                  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
              &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
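Before tokenization, each input/output record has to be flattened into a single training string. A minimal sketch in plain Python, assuming a generic instruction/response template (the actual template used by src/TrainTinyLlama.py may differ):

```python
import json

# Hypothetical helper: flatten {"input", "output"} records into training text.
# The prompt template below is an assumption, not copied from the script.
def build_examples(records):
    examples = []
    for rec in records:
        prompt = rec["input"]
        # "output" is a JSON object describing the UI, so serialize it back to text
        completion = json.dumps(rec["output"], ensure_ascii=False)
        examples.append(f"### Instruction:\n{prompt}\n### Response:\n{completion}")
    return examples

records = [
    {"input": "Generate a form with a panel with color white",
     "output": {"Type": "TForm", "Name": "FrmMainForm"}}
]
print(build_examples(records)[0])
```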



&lt;h2&gt;
  
  
  Fine-Tuning Details
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Model&lt;/strong&gt;: &lt;a href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0" rel="noopener noreferrer"&gt;TinyLlama-1.1B-Chat-v1.0&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Adapter&lt;/strong&gt;: LoRA (Low-Rank Adaptation)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Target Modules&lt;/strong&gt;: &lt;code&gt;q_proj&lt;/code&gt;, &lt;code&gt;v_proj&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;LoRA Config&lt;/strong&gt;: &lt;code&gt;r=8&lt;/code&gt;, &lt;code&gt;alpha=16&lt;/code&gt;, &lt;code&gt;dropout=0.05&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Batch Size&lt;/strong&gt;: 1 (adjustable)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Epochs&lt;/strong&gt;: 1 (increase for better results)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Device&lt;/strong&gt;: CPU only (&lt;code&gt;use_cpu=True&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;
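To see why this configuration is memory-friendly, note that LoRA trains only two small matrices per target module while the base weight stays frozen. A back-of-the-envelope calculation in plain Python (the 2048-wide projection below is illustrative, not read from the model config):

```python
# LoRA reparameterizes a frozen weight W (d_out x d_in) as W + B @ A,
# where A is (r x d_in) and B is (d_out x r); only A and B are trained.
def lora_params(d_in, d_out, r):
    return r * d_in + d_out * r

full = 2048 * 2048                 # parameters in one square 2048x2048 projection
lora = lora_params(2048, 2048, 8)  # adapter parameters at r=8
print(lora, lora / full)           # the adapter is well under 1% of the layer
```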

&lt;h2&gt;
  
  
  Training
&lt;/h2&gt;

&lt;p&gt;To start fine-tuning, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python src/TrainTinyLlama.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The script will:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Load and preprocess the dataset.&lt;/li&gt;
&lt;li&gt;Apply LoRA adapters to the model.&lt;/li&gt;
&lt;li&gt;Train using Hugging Face's &lt;code&gt;Trainer&lt;/code&gt; API.&lt;/li&gt;
&lt;li&gt;Save the fine-tuned model and tokenizer to the &lt;code&gt;TinyLlama-lora-out&lt;/code&gt; directory.&lt;/li&gt;
&lt;/ul&gt;
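A configuration sketch matching the steps above, using the `peft` and `transformers` APIs. Treat this as an illustration of the settings listed earlier, not a copy of TrainTinyLlama.py:

```python
# Sketch only: mirrors the bullets above (LoRA r=8 / alpha=16 / dropout=0.05
# on q_proj and v_proj, batch size 1, 1 epoch, CPU). The real script may differ.
from peft import LoraConfig, TaskType
from transformers import TrainingArguments

lora_cfg = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
)

train_args = TrainingArguments(
    output_dir="TinyLlama-lora-out",
    per_device_train_batch_size=1,
    num_train_epochs=1,
    use_cpu=True,          # force CPU training even if CUDA is visible
    logging_dir="logs",
)
```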

&lt;h2&gt;
  
  
  Output
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Fine-tuned Model&lt;/strong&gt;: Saved in &lt;code&gt;TinyLlama-lora-out/&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Logs&lt;/strong&gt;: Saved in &lt;code&gt;logs/&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Requirements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Python 3.8+&lt;/li&gt;
&lt;li&gt;&lt;a href="https://pypi.org/project/transformers/" rel="noopener noreferrer"&gt;transformers&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://pypi.org/project/datasets/" rel="noopener noreferrer"&gt;datasets&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://pypi.org/project/peft/" rel="noopener noreferrer"&gt;peft&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;torch&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Install dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; src/requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Merge LoRA weights into base model
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python src&lt;span class="se"&gt;\M&lt;/span&gt;erge_lora.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Convert to GGUF
&lt;/h2&gt;

&lt;p&gt;From inside the &lt;code&gt;llama.cpp&lt;/code&gt; checkout, run:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python convert_hf_to_gguf.py ../TinyLlama-merged --outfile ./tinyllama-custom.gguf
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Import to Ollama
&lt;/h2&gt;

&lt;p&gt;On Windows, Ollama stores models under &lt;code&gt;%USERPROFILE%\.ollama\models&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Create a &lt;code&gt;Modelfile&lt;/code&gt; next to the converted model:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM ./tinyllama-custom.gguf
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama create tinyllama-custom &lt;span class="nt"&gt;-f&lt;/span&gt; Modelfile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  llama.cpp
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;code&gt;.gguf&lt;/code&gt; format is compatible with &lt;a href="https://github.com/ggerganov/llama.cpp" rel="noopener noreferrer"&gt;llama.cpp&lt;/a&gt;, a C++ project for efficient execution of Llama models on CPU and GPU.&lt;/li&gt;
&lt;li&gt;To use your custom model with &lt;code&gt;llama.cpp&lt;/code&gt;, simply copy the &lt;code&gt;.gguf&lt;/code&gt; file to the models folder and follow the instructions in the repository.&lt;/li&gt;
&lt;li&gt;Documentation: &lt;a href="https://github.com/ggerganov/llama.cpp#readme" rel="noopener noreferrer"&gt;llama.cpp README&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Model conversion: &lt;a href="https://github.com/ggerganov/llama.cpp/blob/master/convert.py" rel="noopener noreferrer"&gt;convert.py&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Example usage:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;./main &lt;span class="nt"&gt;-m&lt;/span&gt; ./tinyllama-custom.gguf &lt;span class="nt"&gt;-p&lt;/span&gt; &lt;span class="s2"&gt;"Your prompt here"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Notes
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The script is optimized for CPU training. For GPU, set &lt;code&gt;use_cpu=False&lt;/code&gt; and enable &lt;code&gt;fp16&lt;/code&gt; if your hardware supports it.&lt;/li&gt;
&lt;li&gt;LoRA enables efficient fine-tuning with minimal memory usage.&lt;/li&gt;
&lt;li&gt;Adjust hyperparameters (epochs, batch size) based on your hardware and dataset size.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0" rel="noopener noreferrer"&gt;TinyLlama Model Card&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://arxiv.org/abs/2106.09685" rel="noopener noreferrer"&gt;LoRA: Low-Rank Adaptation of Large Language Models&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://huggingface.co/docs/transformers/index" rel="noopener noreferrer"&gt;Hugging Face Transformers Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/ggerganov/llama.cpp" rel="noopener noreferrer"&gt;llama.cpp (GitHub)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>delphi</category>
      <category>json</category>
      <category>buiderui</category>
      <category>lowcode</category>
    </item>
    <item>
      <title>🚀 BuilderUI: Create Delphi Interfaces Dynamically from JSON</title>
      <dc:creator>João Bosco</dc:creator>
      <pubDate>Thu, 19 Jun 2025 13:33:17 +0000</pubDate>
      <link>https://dev.to/boscobecker/builderui-create-delphi-interfaces-dynamically-from-json-463l</link>
      <guid>https://dev.to/boscobecker/builderui-create-delphi-interfaces-dynamically-from-json-463l</guid>
      <description>&lt;p&gt;Have you ever found yourself copying and pasting the same UI code across multiple Delphi forms?  I did — and that’s exactly why I created &lt;strong&gt;BuilderUI&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;BuilderUI is a &lt;strong&gt;dynamic UI generator for Delphi&lt;/strong&gt; that allows you to define components using JSON and render them at runtime. It’s perfect for scenarios where flexibility, rapid prototyping, or low-code customization is needed.&lt;/p&gt;

&lt;h2&gt;
  
  
  🔧 What is BuilderUI?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;BuilderUI&lt;/strong&gt; is a Delphi engine that reads a JSON file and builds a fully working VCL form with components like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;TEdit&lt;/code&gt;, &lt;code&gt;TLabel&lt;/code&gt;, &lt;code&gt;TButton&lt;/code&gt;, &lt;code&gt;TCheckBox&lt;/code&gt;, &lt;code&gt;TComboBox&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Containers like &lt;code&gt;TPanel&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Event support like &lt;code&gt;OnClick&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ Main Features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;JSON-driven UI generation
&lt;/li&gt;
&lt;li&gt;Component positioning (&lt;code&gt;Top&lt;/code&gt;, &lt;code&gt;Left&lt;/code&gt;, &lt;code&gt;Width&lt;/code&gt;, &lt;code&gt;Height&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;Runtime event handling
&lt;/li&gt;
&lt;li&gt;Clean and extensible architecture
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🧪 Example JSON Input
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"FrmLoginScreen"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TForm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Caption"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Login Screen"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Width"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;400&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Height"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;300&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Components"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TLabel"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"lblUser"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Caption"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Username"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Top"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Left"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TEdit"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"edtUser"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Top"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Left"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Width"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"TButton"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"btnLogin"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Caption"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Login"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Top"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;80&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Left"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  🧠 How It Works (Under the Hood)
&lt;/h2&gt;

&lt;p&gt;BuilderUI uses:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;🏗️ Builder + Factory patterns to create UI elements&lt;/li&gt;
&lt;li&gt;🔁 RTTI to dynamically instantiate components and set their properties&lt;/li&gt;
&lt;li&gt;🔎 &lt;code&gt;System.JSON&lt;/code&gt; for JSON parsing&lt;/li&gt;
&lt;li&gt;🧪 In-progress unit tests using DUnitX&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  💡 Why BuilderUI?
&lt;/h2&gt;

&lt;p&gt;Use cases:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Generate dynamic admin panels from metadata&lt;/li&gt;
&lt;li&gt;Let clients define UI layouts via config&lt;/li&gt;
&lt;li&gt;Create low-code solutions inside Delphi&lt;/li&gt;
&lt;li&gt;Replace boilerplate screen generation with flexible runtime forms&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  🧭 Roadmap
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Grouping and tabs with &lt;code&gt;TPageControl&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Data-aware components with binding&lt;/li&gt;
&lt;li&gt;Drag-and-drop form designer&lt;/li&gt;
&lt;li&gt;Integration with third-party components (DevExpress, TMS)&lt;/li&gt;
&lt;li&gt;JSON schema validation and autocomplete&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  ⚙️ How to Use
&lt;/h2&gt;

&lt;p&gt;Want a quick demo? Just clone the repo and run the example project.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Clone the repository: git clone &lt;a href="https://github.com/boscobecker/BuilderUI.git" rel="noopener noreferrer"&gt;https://github.com/boscobecker/BuilderUI.git&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Open in Delphi (tested on Delphi 10.4 and 11+)&lt;/li&gt;
&lt;li&gt;Load a JSON file with layout&lt;/li&gt;
&lt;li&gt;Run the project and see the form created at runtime&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  🙌 Contribute or Give Feedback
&lt;/h2&gt;

&lt;p&gt;This project is under active development.&lt;br&gt;
If you're passionate about runtime UI, low-code, or just love Delphi, I'd love your feedback, ideas, or contributions.&lt;br&gt;
⭐ Star the project on GitHub&lt;br&gt;
💬 Comment here or open an issue&lt;br&gt;
🤝 Connect with me on LinkedIn: &lt;a href="https://www.linkedin.com/in/boscobecker/" rel="noopener noreferrer"&gt;https://www.linkedin.com/in/boscobecker/&lt;/a&gt;&lt;br&gt;
🔗 GitHub Repo: &lt;a href="https://github.com/boscobecker/BuilderUI" rel="noopener noreferrer"&gt;https://github.com/boscobecker/BuilderUI&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🧠 Follow me for more Delphi &amp;amp; .NET content!&lt;/p&gt;

</description>
      <category>delphi</category>
      <category>json</category>
      <category>buiderui</category>
      <category>ui</category>
    </item>
  </channel>
</rss>
