<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Suraj Bera</title>
    <description>The latest articles on DEV Community by Suraj Bera (@suraj_bera).</description>
    <link>https://dev.to/suraj_bera</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3742312%2F50e1edc8-9951-4dd3-bdbe-e803686f646d.png</url>
      <title>DEV Community: Suraj Bera</title>
      <link>https://dev.to/suraj_bera</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/suraj_bera"/>
    <language>en</language>
    <item>
      <title>Part 2: Vector Embeddings in simplest terms</title>
      <dc:creator>Suraj Bera</dc:creator>
      <pubDate>Sun, 03 May 2026 05:03:53 +0000</pubDate>
      <link>https://dev.to/suraj_bera/part-2-vector-embeddings-in-simplest-terms-2j35</link>
      <guid>https://dev.to/suraj_bera/part-2-vector-embeddings-in-simplest-terms-2j35</guid>
      <description>&lt;p&gt;This is my Day 2 of learning AI fundamentals where I will be covering  the following concepts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Vector Embeddings&lt;/li&gt;
&lt;li&gt;How Tokenisation and Vector Embeddings relate to each other&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Vector embeddings&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Vector embedding is the process of turning each token ID (generated during tokenisation) into a high-dimensional vector, where semantic similarity translates into geometric closeness. Think of it like this: "dog" sits close to "puppy" and to "dog food", but far from "car" or "petrol".&lt;/li&gt;
&lt;/ul&gt;
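&lt;p&gt;To make "geometric closeness" concrete, here is a minimal sketch using cosine similarity on 3-dimensional vectors. The numbers are made up for illustration; real embeddings have far more dimensions and come from a trained model:&lt;/p&gt;

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the vector magnitudes:
    # 1.0 means "pointing the same way", values near 0 mean unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (invented values, for illustration only)
dog   = [0.9, 0.8, 0.1]
puppy = [0.85, 0.75, 0.2]
car   = [0.1, 0.2, 0.9]

print(cosine_similarity(dog, puppy))  # high: semantically close
print(cosine_similarity(dog, car))    # low: semantically distant
```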

&lt;h2&gt;When do we use embeddings?&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Recommendations: Suggest similar songs, videos, movies, and products&lt;/li&gt;
&lt;li&gt;Search: Return relevant results even when the exact keywords don't match&lt;/li&gt;
&lt;li&gt;Clustering: Group related concepts together&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;A beginner might be confused by terms like "vector" and "high-dimensional".&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This is an example of a vector: [0.9, 0.8, 0.1]. Array, list, and vector all mean the same thing here: "list" is plain English, "array" is the programming term, and "vector" is the maths/ML term.&lt;/li&gt;
&lt;li&gt;High-dimensional: "Multi-dimensional" just means more than one dimension, which could be 2D, 3D, 10D, ... too vague. "High-dimensional" specifically means &lt;strong&gt;hundreds to thousands of dimensions&lt;/strong&gt; (OpenAI's text-embedding-3-small has 1536 dimensions).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There are two types of search:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Lexical search: matches the exact words/characters&lt;/li&gt;
&lt;li&gt;Semantic search: matches the meaning/intent&lt;/li&gt;
&lt;/ol&gt;

&lt;blockquote&gt;
&lt;p&gt;Vector embeddings enable semantic search.&lt;/p&gt;
&lt;/blockquote&gt;
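&lt;p&gt;A minimal sketch of where lexical search falls short. The documents and query are made up for illustration; note that the query shares no keywords with the relevant document:&lt;/p&gt;

```python
docs = ["How to train a puppy", "Best cars of 2024", "Feeding your dog"]
query = "canine nutrition"

# Lexical search: keep documents that share at least one word with the query.
lexical_hits = [d for d in docs
                if any(w in d.lower() for w in query.lower().split())]
print(lexical_hits)  # [] - no shared keywords, so nothing is found

# Semantic search would instead embed the query and each document,
# then rank documents by cosine similarity between the vectors, so
# "Feeding your dog" would still surface despite zero keyword overlap.
```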

&lt;h2&gt;How are tokenisation and vector embeddings connected?&lt;/h2&gt;

&lt;p&gt;text --tokenisation--&amp;gt; token IDs --embedding lookup--&amp;gt; vectors --&amp;gt; transformer&lt;br&gt;
"hello" --&amp;gt; [221728] --&amp;gt; [0.21, -0.44, ...]&lt;/p&gt;
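&lt;p&gt;The pipeline above can be sketched end to end with a toy vocabulary and embedding table. All the values here are invented for illustration; real tokenisers split text into subwords, and real embedding tables have many thousands of rows:&lt;/p&gt;

```python
# Step 0: the two lookup tables (toy values, for illustration only)
vocab = {"hello": 0, "world": 1, "dog": 2}   # tokeniser's word-to-ID table
embedding_table = [                           # one row of weights per token ID
    [0.21, -0.44, 0.05],
    [0.10, 0.33, -0.20],
    [0.90, 0.80, 0.10],
]

def tokenise(text):
    # Step 1: text -> token IDs (real tokenisers split into subwords)
    return [vocab[word] for word in text.split()]

def embed(token_ids):
    # Step 2: token IDs -> vectors via a simple embedding-table lookup
    return [embedding_table[i] for i in token_ids]

ids = tokenise("hello dog")
vectors = embed(ids)   # these vectors are what the transformer consumes
print(ids)             # [0, 2]
print(vectors)
```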

&lt;p&gt;&lt;a href="https://projector.tensorflow.org/" rel="noopener noreferrer"&gt;Vector Embedding Visualizer&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is the link to &lt;a href="https://www.linkedin.com/posts/surajbera_tiktokenizer-activity-7456386292990304256-HgzV?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAACqk0GoBZnxif8bFvArYATVbMOZugoVL0Ms" rel="noopener noreferrer"&gt;part 1&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>beginners</category>
      <category>machinelearning</category>
      <category>nlp</category>
    </item>
  </channel>
</rss>
