<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Iulia Feroli</title>
    <description>The latest articles on DEV Community by Iulia Feroli (@iuliaferoli).</description>
    <link>https://dev.to/iuliaferoli</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1259251%2Fd8f419ed-4f00-4a9c-a6ae-851d9350c314.jpeg</url>
      <title>DEV Community: Iulia Feroli</title>
      <link>https://dev.to/iuliaferoli</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/iuliaferoli"/>
    <language>en</language>
    <item>
      <title>What are Agents: Combining LLMs, semantic search and RAG into conversational AI</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Sun, 26 Oct 2025 13:48:29 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/what-are-agents-combining-llms-semantic-search-and-rag-into-conversational-ai-2pef</link>
      <guid>https://dev.to/iuliaferoli/what-are-agents-combining-llms-semantic-search-and-rag-into-conversational-ai-2pef</guid>
      <description>&lt;p&gt;In the ever-evolving world of Large Language Models (LLMs), Retrieval Augmented Generation (RAG) has emerged as a technique for combining search and generation. Taking this further by adding some context, memory, and the power to call custom tools - and you get &lt;em&gt;agents&lt;/em&gt;.&lt;/p&gt;


&lt;h2&gt;
  
  
  Understanding RAG: Combining Search and Generation
&lt;/h2&gt;

&lt;p&gt;Quick recap of techniques that come into play here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Large language models - here in the form of specialized embedding models used to vectorize text, which enables...&lt;/li&gt;
&lt;li&gt;Semantic search - used to find relevant results within different data sets and retrieve the documents (or chunks) that will serve as context.&lt;/li&gt;
&lt;li&gt;Generative AI - a type of LLM used to generate text, built on top of statistical models that predict the next most-likely word.&lt;/li&gt;
&lt;li&gt;RAG - Retrieval Augmented Generation - putting the techniques above together to build retrieval-based answering tools. &lt;/li&gt;
&lt;/ul&gt;
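&lt;p&gt;The pieces above can be sketched in a few lines of Python. This is a toy illustration, not a real RAG stack: &lt;code&gt;embed&lt;/code&gt;, &lt;code&gt;vector_search&lt;/code&gt;, and &lt;code&gt;generate&lt;/code&gt; are hypothetical stand-ins for whichever embedding model, vector store, and LLM you actually use.&lt;/p&gt;

```python
# Minimal RAG sketch. embed(), vector_search(), and generate() are toy
# placeholders for a real embedding model, vector store, and LLM.
def embed(text):
    # stand-in "embedding": a tiny numeric vector so the sketch is runnable
    return [ord(c) % 7 for c in text[:8]]

def vector_search(query_vector, documents, top_k=2):
    # retrieval: return the documents whose toy embeddings lie closest to the query
    def distance(doc):
        doc_vector = embed(doc)
        return sum(abs(a - b) for a, b in zip(query_vector, doc_vector))
    return sorted(documents, key=distance)[:top_k]

def generate(prompt):
    # stand-in for a generative LLM call
    return "Answer based on: " + prompt

def rag_answer(question, documents):
    context = vector_search(embed(question), documents)  # retrieval step
    prompt = "Context: " + " | ".join(context) + " Question: " + question
    return generate(prompt)                              # augmented generation step
```

The shape is the whole point: retrieve context first, then hand it to the generative model inside the prompt.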

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F591u6mins1w9o7mmg93e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F591u6mins1w9o7mmg93e.png" alt=" " width="800" height="448"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why use RAG instead of simply asking an LLM a question?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Freshness - LLMs are frozen in time and domain by their latest training data; they don't know about anything outside it. Keeping an LLM current would mean re-training it all the time, which is very costly in infrastructure and time.&lt;/li&gt;
&lt;li&gt;Data privacy - some private data simply cannot be included in LLM training; instead, we want to reference this information within a trusted architecture.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Beyond RAG: Agentic Workflows and Their Potential
&lt;/h2&gt;

&lt;p&gt;Going further, even RAG will have its limitations. Namely:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Conversational flow and memory - as you continue to explore a solution you may need to carry over context from previous exchanges with the LLM, or your single prompt will grow past the model's token limit. &lt;/li&gt;
&lt;li&gt;Tools - rather than just searching, in order to answer our question the LLM may need to also perform computations or tasks, or independently collect other data.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Introducing Tools
&lt;/h2&gt;

&lt;p&gt;We can define tools as pieces of code that agents can use (for example: a search query, a database lookup, running plugins, executing small code snippets, making calculations, sending messages, or even generating graphics). We can build data access control or API best practices into these tools - also giving us more control over how the LLM will perform these tasks. &lt;br&gt;
At the same time, it is the LLM's job to choose which of these tools to run and in what order. So you don't have to hardcode the entire logic of the process; instead, we can rely on the LLM to understand what needs to be done by detecting the intent from the prompt.&lt;/p&gt;
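&lt;p&gt;A minimal sketch of the idea, with made-up tool names: the agent owns a registry of tools and routes each prompt to one of them. In a real agent the LLM itself makes this choice; crude keyword/digit matching stands in for that here.&lt;/p&gt;

```python
# Toy tool-use sketch. The tool names and the routing rule are illustrative;
# a real agent lets the LLM pick the tool and its arguments.
def search_tool(query):
    return f"search results for '{query}'"

def calculator_tool(expression):
    # toy only - never eval untrusted input in real code
    return eval(expression, {"__builtins__": {}})

TOOLS = {"search": search_tool, "calculate": calculator_tool}

def agent_step(prompt):
    # intent-detection stand-in: digits mean "do math", otherwise search
    if any(ch.isdigit() for ch in prompt):
        return TOOLS["calculate"](prompt)
    return TOOLS["search"](prompt)
```

Because the tools are plain functions we wrote, access control and rate limits live in our code, not in the model.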

&lt;p&gt;We also introduce memory or context - agents are able to continue conversations and remember user preferences or previous discussions. &lt;/p&gt;
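&lt;p&gt;Memory can be as simple as keeping recent turns and prepending them to each new prompt - a sketch, assuming a fixed-size history rather than any particular framework's memory API:&lt;/p&gt;

```python
# Sketch of conversational memory: store prior turns and rebuild the prompt
# from them, instead of cramming everything into one giant call.
class ConversationMemory:
    def __init__(self, max_turns=10):
        self.turns = []
        self.max_turns = max_turns  # cap history so the context stays bounded

    def add(self, role, text):
        self.turns.append((role, text))
        self.turns = self.turns[-self.max_turns:]  # keep only the most recent turns

    def build_prompt(self, new_question):
        history = " ".join(f"{role}: {text}" for role, text in self.turns)
        return f"{history} user: {new_question}".strip()

memory = ConversationMemory()
memory.add("user", "I prefer short answers")
memory.add("assistant", "Noted")
prompt = memory.build_prompt("What is RAG?")
```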

&lt;p&gt;To summarize, agents are the latest development within the NLP field, standing on the shoulders of other impressive techniques such as semantic search and RAG. They can change how we interact with LLMs, or even how we browse the internet and perform automatable tasks. &lt;/p&gt;

&lt;p&gt;In this new paradigm, we're seeing LLMs able to make API calls, search and retrieve data, perform aggregations or simple tasks, and become a new form of user interface between us and a lot of other tools and programs.&lt;/p&gt;

</description>
      <category>rag</category>
      <category>agents</category>
      <category>llm</category>
    </item>
    <item>
      <title>Can We Really Trust AI? Lies, Poison, and the Need for Responsible AI</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Fri, 24 Oct 2025 11:36:52 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/can-we-really-trust-ai-lies-poison-and-the-need-for-responsible-ai-4hjg</link>
      <guid>https://dev.to/iuliaferoli/can-we-really-trust-ai-lies-poison-and-the-need-for-responsible-ai-4hjg</guid>
      <description>&lt;p&gt;These days, it feels like Large Language Models (LLMs) and conversational agents are taking over the internet. We're asking them questions and expecting accurate, truthful answers. But is that expectation realistic? Recent research suggests that trusting LLMs implicitly can be risky. As someone who's been working in data science and AI for nearly a decade, specializing in NLP and agentic AI, I've been diving into the latest findings and want to share some key concerns.&lt;/p&gt;

&lt;p&gt;Check out my full video diving into the research papers: &lt;a href="https://www.youtube.com/watch?v=NUlNuPuHudY" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=NUlNuPuHudY&lt;/a&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  The Lie Detector: LLMs Can Intentionally Deceive
&lt;/h2&gt;

&lt;p&gt;One of the most unsettling discoveries is that LLMs aren't just prone to occasional "hallucinations" (unintentional errors); they can actually lie. A study by researchers at Carnegie Mellon University &lt;a href="https://arxiv.org/pdf/2509.03518" rel="noopener noreferrer"&gt;("Can LLMs Lie? Investigations Beyond Hallucination")&lt;/a&gt; demonstrates that LLMs can discern between truth and falsehood within their internal systems. They can be aware that something is a lie and still choose to provide incorrect information to achieve a specific goal.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Hallucinations vs. Lies: It's crucial to distinguish between unintentional errors (hallucinations) and intentional deception (lies). LLMs can internally recognize falsehoods and choose to present them anyway.&lt;/li&gt;
&lt;li&gt;The Goal-Oriented LLM: LLMs are trained to achieve specific goals, which might override the priority of absolute truthfulness. For example, an LLM trained to sell a product might omit drawbacks or even provide misleading information to close the deal.&lt;/li&gt;
&lt;li&gt;Controlling the Lies: Researchers have explored ways to limit the types and frequency of lies that LLMs can tell. However, the fact that LLMs can be deliberately programmed with acceptable levels of deception raises significant ethical questions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvo8ss5u7nhnb45e0t26w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvo8ss5u7nhnb45e0t26w.png" alt=" " width="800" height="1243"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Poison Pill: How Easily Can LLMs Be Corrupted?
&lt;/h2&gt;

&lt;p&gt;Beyond intentional design, LLMs are also vulnerable to external influence through "poisoning attacks." A paper by researchers from Anthropic, the Alan Turing Institute, and others &lt;a href="https://arxiv.org/pdf/2510.07192" rel="noopener noreferrer"&gt;("Poisoning Attacks on LLMs Require a Near-Constant Number of Poison Samples")&lt;/a&gt; reveals a surprising vulnerability: LLMs can be corrupted with a relatively small amount of malicious data.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The Ratio Myth: It was previously assumed that the sheer volume of data used to train LLMs would dilute the impact of malicious data. The research shows that this isn't necessarily true.&lt;/li&gt;
&lt;li&gt;Small Dose, Big Impact: As few as 250 carefully crafted documents can poison models with billions of parameters.&lt;/li&gt;
&lt;li&gt;Difficult to Detect: Poisoning attacks can introduce subtle changes in behavior that are difficult to detect through standard testing. For example, a trigger word could cause the LLM to output gibberish or switch to a different language.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  A Call for Responsible AI
&lt;/h2&gt;

&lt;p&gt;These findings highlight the need for a more cautious and responsible approach to AI adoption. It's crucial to be aware of the potential for LLMs to lie or be corrupted, and to take steps to mitigate these risks.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Don't Trust Blindly: Approach LLM outputs with skepticism. Verify information from multiple sources, especially when dealing with critical decisions.&lt;/li&gt;
&lt;li&gt;Demand Transparency: Advocate for transparency in how LLMs are trained and customized. Understand the ethical guidelines and potential biases that have been incorporated.&lt;/li&gt;
&lt;li&gt;Focus on Robust Engineering: Prioritize good software engineering practices, thorough testing, and careful selection of data sources when building AI applications. Avoid relying solely on "vibe coding" and untested LLM outputs.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Future of AI: A Balanced Perspective
&lt;/h2&gt;

&lt;p&gt;Generative AI is a powerful technology, but it's not a magic bullet. By acknowledging the risks and focusing on responsible development and deployment, we can harness the true potential of AI while safeguarding against its potential harms. On this channel, I want to highlight and showcase more of these responsibly exciting parts of the AI ecosystem that perhaps don't get as much attention - work that makes developers a bit happier, and gets people thinking in a more robust and stable way than I'm seeing in a lot of current AI solutions.&lt;/p&gt;

</description>
      <category>llm</category>
      <category>ai</category>
      <category>chatgpt</category>
      <category>security</category>
    </item>
    <item>
      <title>Analyzing my Oura sleep score - is it AI or just math?</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Wed, 29 Jan 2025 16:49:44 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/analyzing-my-oura-sleep-score-is-it-ai-or-just-math-59ja</link>
      <guid>https://dev.to/iuliaferoli/analyzing-my-oura-sleep-score-is-it-ai-or-just-math-59ja</guid>
      <description>&lt;p&gt;Today I dove into my activity tracker data; specifically, the sleep score that Oura calculates daily. It ended up being the perfect scenario to explore a much bigger question I am begging everyone to ask more often:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Does this problem need AI or a simple math formula?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Hear me out here!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqxfimbn9dyp2hfoyeic6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqxfimbn9dyp2hfoyeic6.jpeg" alt=" " width="500" height="833"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Activity tracking data
&lt;/h2&gt;

&lt;p&gt;Now, I'm a really big fan of tracking my data and using gamification to improve my health. I still use my fitbit and garmin depending on activity (i.e. gym or hiking), however, for day-to-day use, I love the Oura ring. It's more subtle and elegant and it doesn't annoy me like sleeping with a watch on does.  &lt;/p&gt;

&lt;p&gt;That being said, sleep tracking is an important element of why I use my Oura, and therefore worth looking into a bit more. &lt;br&gt;
If you're not familiar with this metric, you can read about the sleep score &lt;a href="https://ouraring.com/blog/sleep-score/?srsltid=AfmBOoqqZsOkmbrSxv9yZi8CSCAaowCUCrPks5MfSyS4L7zboK9vrcvz" rel="noopener noreferrer"&gt;from the Oura blog here.&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  The magical Sleep Score
&lt;/h2&gt;

&lt;p&gt;One of the big negatives about Oura is that a lot of the insights and breakdowns are hidden behind a subscription paywall - and the 'free' version only provides you with the sleep score itself without any further explanation. This is quite a departure from the fitbit and garmin models that provide you with a comprehensive dashboard once you've bought the device. &lt;/p&gt;

&lt;p&gt;So what is this magical sleep score insight we do get, then? And is it worth a monthly fee to peek behind the veil at the unique insights?&lt;/p&gt;
&lt;h2&gt;
  
  
  The Hypothesis
&lt;/h2&gt;

&lt;p&gt;As a skeptic and data scientist, I had to take it upon myself to look into this question. Rather than approach it as a fancy AI insight I can throw chatGPT at (which seems to be the go-to first step these days) I will be going the Occam's razor way. &lt;/p&gt;

&lt;p&gt;My intuition/assumption about it is pretty simple - &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;More deep sleep and lower heart rate correlate with better scores.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Can it really be that basic?&lt;br&gt;
Let's take a look in a quick data science experiment.&lt;/p&gt;
&lt;h3&gt;
  
  
  Oura Developer API
&lt;/h3&gt;

&lt;p&gt;First up I found &lt;a href="https://cloud.ouraring.com/v2/docs" rel="noopener noreferrer"&gt;the Oura developer API&lt;/a&gt; and got a quick dump of my activity/sleep data. &lt;/p&gt;

&lt;p&gt;As Oura tends to have the most restrictive native way to look at your data (vs fitbit or garmin) this is already a great find/idea if you want some more insights!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;type&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://api.ouraring.com/v2/usercollection/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nb"&gt;type&lt;/span&gt;
  &lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt; 
      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;start_date&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2021-11-01&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
      &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;end_date&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2025-01-01&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt; 
  &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="n"&gt;headers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; 
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Authorization&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Bearer &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;auth_token&lt;/span&gt; 
  &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;request&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;GET&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; 
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;data&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;get_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;sleep&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;oura_data_sleep.json&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;w&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;encoding&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;utf-8&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dump&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ensure_ascii&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;indent&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Elasticsearch Index
&lt;/h3&gt;

&lt;p&gt;Step 2: we need a few extra lines to add the new data into an Elasticsearch index so we can look through it easily. &lt;br&gt;
You can check out some more starter examples &lt;a href="https://elasticsearch-py.readthedocs.io/en/v8.17.1/interactive.html" rel="noopener noreferrer"&gt;for Elastic &amp;amp; Python here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This was also super easy - as it tends to be with json files - so we didn't need any additional mapping or data processing steps.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Elasticsearch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;cloud_id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;ELASTIC_CLOUD_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;# cloud id can be found under deployment management
&lt;/span&gt;    &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;ELASTIC_API_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;# your API key for connecting to Elastic, found under Deployments - Security
&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;index_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;oura-history-sleep&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;

&lt;span class="c1"&gt;# Create the Elasticsearch index with the specified name (delete if already existing)
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;indices&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;exists&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;indices&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;delete&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;indices&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;oura_data_sleep.json&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;r&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;json_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;load&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;doc&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;json_data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;doc&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;load&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;helpers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;bulk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now comes our tiny data science experiment for our hypothesis.&lt;br&gt;
And I do mean tiny. Remember - we're going for the simplest possible explanation first. &lt;/p&gt;

&lt;p&gt;I gathered the days with the highest sleep scores with a simple sort on the score field.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;index_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;oura-history-sleep&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;

&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sort&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;readiness.score:desc&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;hit&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hits&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hits&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Day: {day} and sleeping score: {score}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;day&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;hit&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;_source&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;day&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;hit&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;_source&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;readiness&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wphxzqyplv5udu6wdv1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wphxzqyplv5udu6wdv1.png" alt=" " width="650" height="358"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking a bit deeper at these days, and at the fields I already suspect had a pretty high influence on the scores, we notice some consistent values:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmabs1yxnrzczszo6xb5j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmabs1yxnrzczszo6xb5j.png" alt=" " width="800" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Finally, the experiment.
&lt;/h3&gt;

&lt;p&gt;Can it be as simple as just making a basic formula around sleeping times and heart rate? &lt;/p&gt;

&lt;p&gt;I built a query with Elasticsearch filters looking at deep sleep over 1.5 hours, heart rate under 60, and sorting the hits by highest REM time:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;range&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;deep_sleep_duration&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gte&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;1.5&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="mi"&gt;3600&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;range&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;average_heart_rate&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;lte&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sort&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;rem_sleep_duration:desc&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we can look at which days this query returns and check what the sleep scores for those days were - a validation step, if you will.&lt;/p&gt;
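&lt;p&gt;Reading the matching days back out of the response looks roughly like this (a minimal sketch - the &lt;code&gt;day&lt;/code&gt; and &lt;code&gt;score&lt;/code&gt; field names are assumptions here, so adjust them to your own index mapping):&lt;/p&gt;

```python
# Pull the date and sleep score out of each Elasticsearch hit.
# Field names ("day", "score") are assumptions - match your own mapping.
def extract_days(response):
    return [
        (hit["_source"]["day"], hit["_source"]["score"])
        for hit in response["hits"]["hits"]
    ]

# A fake response shaped like the Python client's return value, for illustration:
fake_response = {"hits": {"hits": [
    {"_source": {"day": "2025-10-12", "score": 88}},
    {"_source": {"day": "2025-10-19", "score": 85}},
]}}

print(extract_days(fake_response))  # [('2025-10-12', 88), ('2025-10-19', 85)]
```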

&lt;p&gt;Surprise, surprise. It's pretty much the same dates we got in the initial query. It's not perfect - just like our formula or filter selection is far from 100% accurate - but that's exactly the point. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faufr4ntcy6ah5kt1w7ek.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faufr4ntcy6ah5kt1w7ek.png" alt=" " width="660" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This confirms that the intuition/hypothesis was pretty much spot on and I can "predict" my sleep score with &lt;em&gt;very&lt;/em&gt; simple math.&lt;/p&gt;

&lt;p&gt;Here are a few &lt;a href="https://www.elastic.co/guide/en/kibana/current/lens.html" rel="noopener noreferrer"&gt;Kibana visualizations&lt;/a&gt; of a larger subset of my data to further illustrate this connection: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fctfcuk4hxchk8b1uc44k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fctfcuk4hxchk8b1uc44k.png" alt=" " width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Why does this matter?
&lt;/h3&gt;

&lt;p&gt;In a world of AI buzzwords and the race for ever more over-the-top models, it's easy to look at a simple "sleep score" and believe it's a fancy, groundbreaking AI insight worth a monthly fee to unlock. &lt;/p&gt;

&lt;p&gt;However, more often than not - it's either a reasonable &amp;amp; intuitive formula or maybe even a basic regression. &lt;/p&gt;
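&lt;p&gt;To make "a basic regression" concrete, here's a minimal sketch: ordinary least squares with a single feature, in plain Python (the numbers are invented for illustration):&lt;/p&gt;

```python
# A "basic regression" of the kind that could sit behind a sleep score:
# ordinary least squares with one feature, no libraries needed.
# The data below is invented for illustration.
deep_sleep_hours = [1.0, 1.5, 2.0, 2.5, 3.0]
sleep_scores = [70.0, 75.0, 80.0, 85.0, 90.0]

def fit_line(xs, ys):
    """Return (slope, intercept) minimising the squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

slope, intercept = fit_line(deep_sleep_hours, sleep_scores)
print(slope, intercept)  # 10.0 60.0 (the toy data is perfectly linear)
predicted_score = slope * 2.2 + intercept  # "predict" a score for 2.2h of deep sleep
```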

&lt;p&gt;Simpler. More accurate. Cheaper to compute.&lt;/p&gt;

&lt;p&gt;And that's something we need to keep reminding ourselves of. This is why I believe data scientist/analyst jobs will continue to be safe, and why learning the basics of intuitive modeling and machine learning is much more impactful than having unrestricted access to an LLM. &lt;/p&gt;

&lt;p&gt;Because as undoubtedly amazing as the advanced tech we have access to today is, it's even more important to understand when you don't need to use it at all.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/iuliaferoli/oura-elastic/blob/main/connect.ipynb" rel="noopener noreferrer"&gt;See full code notebook here&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>elasticsearch</category>
      <category>datascience</category>
      <category>python</category>
    </item>
    <item>
      <title>Am I tech enough? A Python Revamp Plan</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Thu, 14 Mar 2024 21:05:44 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/am-i-tech-enough-a-python-revamp-plan-4o7g</link>
      <guid>https://dev.to/iuliaferoli/am-i-tech-enough-a-python-revamp-plan-4o7g</guid>
      <description>&lt;h2&gt;
  
  
  For someone who has worked in tech for years, I still find myself asking "Am I technical enough?" alarmingly often.
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;I wrote this article before starting my DevRel job at Elastic over the summer. I finally feel "on track" enough about it to share.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Tech and Doubt go hand in hand
&lt;/h2&gt;

&lt;p&gt;In some ways this is a reasonable question - tech is an overwhelmingly large field, with constant developments and endless specialties and rabbit holes of knowledge, so it's quite fair to acknowledge there will always be gaps and unfamiliar territory.&lt;/p&gt;

&lt;p&gt;At the same time, don't fall down the slippery slope of minimizing the parts you &lt;em&gt;do&lt;/em&gt; know. Any skill seems less impressive once you've mastered it. Not to mention, as you advance in one topic, you tend to figure out 10 more related deep dives you need to get around to in order to &lt;em&gt;really&lt;/em&gt; master it. And it goes on, and on, and on... Just because you're still learning doesn't mean you're not already an expert.&lt;/p&gt;

&lt;p&gt;If this sounds familiar, you've probably also heard of "Impostor syndrome", which by now is a very common phenomenon in the field, especially for women. &lt;/p&gt;

&lt;h2&gt;
  
  
  So what do we do? 
&lt;/h2&gt;

&lt;p&gt;In my experience there are two ways to deal with these kinds of thoughts: facing them head-on, or the much comfier favorite - avoidance. After all, you can't suck at something you don't try, so that's a neat way out. &lt;/p&gt;

&lt;p&gt;Now - I've done my fair share of avoiding. Whether it's hiding in easier projects, branching out into different departments (hello marketing stint), or spending more time on non-technical techie-adjacent tasks (demos don't go into production!). Turns out you can avoid the scary tech climb in a multitude of ways. &lt;/p&gt;

&lt;p&gt;Well, it doesn't make the scary tech climb go away. In fact, the more time you "waste" without any progress, the scarier it keeps getting. And while not climbing at all is a perfectly reasonable choice as well, there's just something about that rush. &lt;/p&gt;

&lt;p&gt;Not a lot of other jobs are as exciting as solving technical puzzles and creating functionalities from scratch. &lt;/p&gt;

&lt;p&gt;We're left with one option. Just start climbing. &lt;/p&gt;

&lt;h2&gt;
  
  
  The solution?
&lt;/h2&gt;

&lt;p&gt;Accept that change won't happen overnight. You'll be confused, out of your depth, uncomfortable, overwhelmed, and doubtful. For a while. But you'll be all those things even if you don't start. So let's just take a leap of faith and trust that it will be worth it to make it to the other side. &lt;/p&gt;

&lt;p&gt;What does all that melodrama actually mean? &lt;/p&gt;

&lt;p&gt;For me, it means taking stock of where I am in my tech journey, where I want to be, and making a practical plan to try my best to get there. I feel like I had started to slowly distance myself from programming and development, in favor of creating more creative content or even... shudder... tech &lt;strong&gt;sales&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;And while I still love community engagement activities (here I am writing a blog about it); at the end of the day I want to have a stronger foundation to base my videos, articles, and talks on, and build more of my own solutions. &lt;/p&gt;

&lt;p&gt;So as an unapologetic type A person, I've put myself on a program to polish up the developer within. At the same time, as a community advocate and free-time tech YouTuber - I'll build content out of this whole journey to hopefully help out anyone else out there going through a similar crisis. &lt;/p&gt;

&lt;p&gt;Have you been wanting to get into programming and Python but don't know where to start? Maybe this kind of journey is the kick you need!&lt;br&gt;
 &lt;br&gt;
I'm gonna structure a 100-day transformation for myself to cover all the topics one should conquer to call themselves a (python) developer more comfortably. If you're completely new, or if you want to go back and give your foundation a nice pick-me-up - this is the guide for you!&lt;/p&gt;

&lt;p&gt;I'm mostly tailoring this for my own interest, but I'll try to make it as general as possible as well so it can be a great guide to follow along to for non-Iulia readers as well. &lt;/p&gt;

&lt;h2&gt;
  
  
  What's on the schedule?
&lt;/h2&gt;

&lt;p&gt;This will be a multi-part series. Rather than building all my own training materials, I'll curate a structured path leveraging all the awesome (free) internet resources out there. &lt;br&gt;
The series schedule and content are subject to change - every new topic could lead to new questions to explore, so we'll keep it flexible!&lt;/p&gt;

&lt;h1&gt;
  
  
  So what does it take to be a Python Developer?
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Part 1: Vanilla Basics
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Scope the journey - Making an action plan and deciding on your best way of working/learning + Resources overview - library of useful links&lt;/li&gt;
&lt;li&gt;Setup - Python installation &amp;amp; practical setup + Editors, IDEs&lt;/li&gt;
&lt;li&gt;Virtual Environments - Exploring packages, installs, requirements, scalability&lt;/li&gt;
&lt;li&gt;Python 101 - A refresher course of the Python programming language - concepts you should master before going ahead (syntax, data types, functions &amp;amp; classes, etc)&lt;/li&gt;
&lt;li&gt;Hands-on - Choosing a project to start building and improving upon throughout this series to put all the concepts into practice. → Building a simple tic-tac-toe game&lt;/li&gt;
&lt;li&gt;Version control &amp;amp; Github - Setting up version control basics, best practices&lt;/li&gt;
&lt;li&gt;Refactoring - Going over various iterations of our game - adding more and more layers of complexity to the program for more advanced functionalities as we explore more concepts&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Part 2: Data Science w/ Python
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Basics of Data Science &amp;amp; Machine Learning&lt;/li&gt;
&lt;li&gt;Essential Libraries (sci-kit learn, numpy, pandas, etc)&lt;/li&gt;
&lt;li&gt;Awesome Frameworks (HuggingFace, Langchain, etc)&lt;/li&gt;
&lt;li&gt;MLOps&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Part 3: Adjacent Growth
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Python + Web Development - exploring libraries Flask and Django, HTML, CSS&lt;/li&gt;
&lt;li&gt;Docker - containers!&lt;/li&gt;
&lt;li&gt;Cloud - scaling up &amp;amp; down!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Stay tuned for part 1 coming out soon.&lt;br&gt;
If you have any topics you think we should add to the exploration don't hesitate to comment!&lt;/p&gt;

</description>
      <category>python</category>
      <category>beginners</category>
      <category>learning</category>
      <category>programming</category>
    </item>
    <item>
      <title>What is RAG? A quick 101</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Tue, 12 Mar 2024 15:00:17 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/what-is-rag-a-quick-101-o5e</link>
      <guid>https://dev.to/iuliaferoli/what-is-rag-a-quick-101-o5e</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;How is RAG used?&lt;br&gt;
What's up with the &lt;strong&gt;context&lt;/strong&gt; ?&lt;br&gt;
Does RAG work out-of-the-box?&lt;br&gt;
What do you need to configure to make it work?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;RAG&lt;/strong&gt; = Retrieval Augmented Generation
&lt;/h2&gt;

&lt;p&gt;RAG, or Retrieval Augmented Generation, combines the concepts of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Information Retrieval - searching for relevant documents within your database/index based on a query (question).&lt;/li&gt;
&lt;li&gt;Augmenting - adding the extra information (context) to the query you are going to make. &lt;/li&gt;
&lt;li&gt;Generative AI - a model (usually an LLM) takes in a query/request and generates an answer for it (in natural language).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Putting those together, when a user asks a question, you first retrieve relevant information that might help answer that question; then you give this information (maybe data entries from your database, some recent articles or news, etc) as context to the LLM.&lt;/p&gt;

&lt;p&gt;So now you can generate an answer based not only on the initial question; but the additional information you retrieved as well - giving more accurate and relevant responses.&lt;/p&gt;
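&lt;p&gt;Put as a toy sketch, the flow looks like this (the naive keyword-overlap "retrieval" and the stubbed-out LLM call are stand-ins for illustration - a real pipeline would use a search engine and an actual model):&lt;/p&gt;

```python
# Toy RAG flow: retrieve context, augment the query, then generate.
# The keyword-overlap retrieval and the stubbed LLM call are stand-ins.
DOCUMENTS = [
    "Iulia Feroli is a developer advocate at Elastic.",
    "RAG combines retrieval with text generation.",
]

def retrieve(query, docs, top_k=1):
    # Rank documents by how many (lowercased) words they share with the query.
    words = set(query.lower().split())
    ranked = sorted(docs,
                    key=lambda d: len(words.intersection(d.lower().split())),
                    reverse=True)
    return ranked[:top_k]

def build_prompt(query, context):
    # Augmentation step: prepend the retrieved context to the question.
    return "Context: " + " ".join(context) + "\nQuestion: " + query + "\nAnswer:"

query = "What is Iulia Feroli's job?"
prompt = build_prompt(query, retrieve(query, DOCUMENTS))
# answer = call_llm(prompt)  # generation step: plug in the LLM of your choice
print(prompt.splitlines()[0])  # Context: Iulia Feroli is a developer advocate at Elastic.
```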

&lt;h2&gt;
  
  
  It's a simple question.
&lt;/h2&gt;

&lt;p&gt;A basic example:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Query = "What is Iulia Feroli's job?" &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Sending this to a general LLM would only give the correct answer if some information about me was part of the training dataset (which I very much doubt). If not, the response would either be an incorrect guess (which we would call a "hallucination") or an "I don't know". &lt;/p&gt;

&lt;p&gt;To test out the theory I asked ChatGPT (see below) and it basically told me I'm not famous enough 🙄. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flm9lt6eeqls3ggjspbxd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flm9lt6eeqls3ggjspbxd.png" alt=" " width="800" height="296"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  So we need a little more &lt;strong&gt;context&lt;/strong&gt;...
&lt;/h2&gt;

&lt;p&gt;Let's try the &lt;em&gt;Information Retrieval&lt;/em&gt; route. &lt;br&gt;
If I have a database or other source that might include some information about non-prominent, not-yet-recognizable figures; I can first run a search for Iulia Feroli over there. Let's pretend my database is LinkedIn. With a quick search I get:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxkgovtc6whsi8sas4sf9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxkgovtc6whsi8sas4sf9.png" alt=" " width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Wow, she sure looks like a public figure! &lt;/p&gt;

&lt;p&gt;In a real use case, your search experience for this step might be more complex. You could search for usernames in a database, through scans of PDFs, using misspellings (my name is not Lulia btw), or even semantic search if you look up "red haired speaker at that conference last week in Amsterdam?". &lt;/p&gt;

&lt;h2&gt;
  
  
  Context + Query --&amp;gt; Flattering Response
&lt;/h2&gt;

&lt;p&gt;Now, to perform RAG, instead of just sending the query to the LLM, I can send along the context I retrieved and ask it to take that into account when generating the answer. &lt;/p&gt;

&lt;p&gt;Then ideally we get back something like:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Iulia is an awesome developer advocate @elastic making content about NLP, search, AI, and python of course!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I say ideally because the results will still vary a lot depending on how you configure your RAG pipeline. &lt;/p&gt;

&lt;p&gt;There are loads of out-of-the-box components you can use, so you don't have to code it all from scratch, but you do need to make some informed choices.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;How will you do the information retrieval? &lt;br&gt;
Obviously, I'd use elasticsearch, but going deeper... do you use "classic" or semantic search? Which embedding model works best with your data? How about the chunking and window size?&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Which LLM to choose for the generative AI part? &lt;br&gt;
There are so many models available on HuggingFace and beyond to choose from, depending on the domain of your query; or even the extra training &amp;amp; customization you might want to apply. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;How to set it all up?&lt;br&gt;
There are loads of great resources such as LangChain that allow you to create building blocks you can plug in and out to create your "chains". &lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
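&lt;p&gt;As one small example of these choices, "chunking and window size" can be as simple as a sliding window over words (the sizes here are arbitrary - real setups tune them against the embedding model and the data):&lt;/p&gt;

```python
# Sliding-window chunking - a toy version of one retrieval-side choice.
# Sizes are arbitrary here; real pipelines tune them to the embedding model.
def chunk(text, size=5, overlap=2):
    """Split text into chunks of `size` words, overlapping by `overlap` words."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, len(words), step)
            if words[i:i + size]]

chunks = chunk("one two three four five six seven eight", size=5, overlap=2)
print(chunks)
# ['one two three four five', 'four five six seven eight', 'seven eight']
```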

&lt;p&gt;And this is only scratching the surface! &lt;br&gt;
I hope this gave you a good idea of what RAG is all about.&lt;/p&gt;

&lt;p&gt;Some resources to get you further:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.elastic.co/what-is/retrieval-augmented-generation" rel="noopener noreferrer"&gt;https://www.elastic.co/what-is/retrieval-augmented-generation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.elastic.co/search-labs/blog/articles/gen-ai-using-cohere-llm" rel="noopener noreferrer"&gt;https://www.elastic.co/search-labs/blog/articles/gen-ai-using-cohere-llm&lt;/a&gt; &lt;/li&gt;
&lt;li&gt;&lt;a href="https://huggingface.co/learn" rel="noopener noreferrer"&gt;https://huggingface.co/learn&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://python.langchain.com/docs/get_started/introduction" rel="noopener noreferrer"&gt;https://python.langchain.com/docs/get_started/introduction&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>rag</category>
      <category>llm</category>
      <category>elasticsearch</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Getting started with Elasticsearch + Python</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Wed, 07 Feb 2024 11:50:24 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/getting-started-with-elasticsearch-python-5adi</link>
      <guid>https://dev.to/iuliaferoli/getting-started-with-elasticsearch-python-5adi</guid>
      <description>&lt;p&gt;This blog will introduce you to some core concepts and building blocks of working with the official Elasticsearch Python client. We will cover connecting to the client, creating and populating an index, adding a custom mapping, and running some initial simple queries.&lt;/p&gt;

&lt;p&gt;But wait, what is Elasticsearch? Elastic is a platform offering search-powered solutions - from the Elasticsearch engine to observability and dashboarding tools like Kibana, security software, and much more. &lt;/p&gt;

&lt;p&gt;Elasticsearch is based on &lt;a href="https://lucene.apache.org/" rel="noopener noreferrer"&gt;Lucene&lt;/a&gt; and is used by various companies and developers across the world to build custom search solutions. &lt;/p&gt;

&lt;h2&gt;
  
  
  Getting connected
&lt;/h2&gt;

&lt;p&gt;You can deploy Elastic locally or in your cloud of choice. &lt;br&gt;
In this example, I'm connecting to Elastic Cloud with an API Key (the recommended security option), but there are a lot of &lt;a href="https://www.elastic.co/guide/en/elasticsearch/client/python-api/current/connecting.html" rel="noopener noreferrer"&gt;other authentication options you can see here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Elasticsearch can be accessed via API calls through various interfaces, from the Dev Console available in the cloud browser, to the official language clients, to tools like Postman that can send requests.&lt;/p&gt;

&lt;p&gt;The Python client is a high-level client that allows us to interact with Elasticsearch from our IDE and add functionalities like search to our projects. &lt;/p&gt;

&lt;p&gt;You can install the client in your environment: &lt;code&gt;! pip install elasticsearch&lt;/code&gt; and start coding. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://elasticsearch-py.readthedocs.io/en/v8.12.0/" rel="noopener noreferrer"&gt;The docs for the client&lt;/a&gt; are available here.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Using your credentials you can create an instance of the Elasticsearch client that will communicate with your (in this case cloud) deployment.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;getpass&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;getpass&lt;/span&gt;  &lt;span class="c1"&gt;# For securely getting user input
&lt;/span&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;elasticsearch&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Elasticsearch&lt;/span&gt;

&lt;span class="c1"&gt;# Prompt the user to enter their Elastic Cloud ID and API Key securely
&lt;/span&gt;&lt;span class="n"&gt;ELASTIC_CLOUD_ID&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;getpass&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Elastic Cloud ID: &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;ELASTIC_API_KEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;getpass&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Elastic API Key: &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Create an Elasticsearch client using the provided credentials
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Elasticsearch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;cloud_id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;ELASTIC_CLOUD_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;# cloud id can be found under deployment management
&lt;/span&gt;    &lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;ELASTIC_API_KEY&lt;/span&gt; 
&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# API keys can be generated under management / security
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Creating an Index
&lt;/h3&gt;

&lt;p&gt;Next up we want to add data.&lt;br&gt;
Elasticsearch indexes data in the form of documents (which look like Python dictionaries or JSON objects) - think of a document as a collection of properties and values. &lt;br&gt;
To make this example fun, I'll be using a &lt;a href="https://www.kaggle.com/datasets/gulsahdemiryurek/harry-potter-dataset" rel="noopener noreferrer"&gt;dataset of Harry Potter Characters from Kaggle. &lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then each entry in our dataset (or document in our index) will represent one character. A simplified version of such a document would look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;document&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; 
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Name&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Iulia&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Eye colour&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hazel&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;House&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Slytherin&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Loyalty&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Order of the Phoenix&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Patronus&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Basset Hound Puppy&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So let's create this index! We can first create a mapping - which tells Elasticsearch what kind of features it can expect from the incoming data we want to add, to ensure everything is stored in the right format. Read more about &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-types.html" rel="noopener noreferrer"&gt;the various field types here.&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Custom Mapping
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hp_characters&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="n"&gt;settings&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
&lt;span class="n"&gt;mappings&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;_meta&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;created_by&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Iulia Feroli&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;properties&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Name&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;keyword&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Eye colour&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;keyword&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;House&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;keyword&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Loyalty&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Patronus&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;keyword&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;indices&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;settings&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;settings&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;mappings&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;mappings&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Adding data
&lt;/h3&gt;

&lt;p&gt;Now, we can either add the documents to our index one at a time, like for our first example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;document&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;doc_test&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or we can do a bulk index, adding the entire dataset at once, leveraging the Elasticsearch &lt;code&gt;bulk&lt;/code&gt; &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/8.11/docs-bulk.html" rel="noopener noreferrer"&gt;API.&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;generate_operations&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="n"&gt;operations&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;document&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
      &lt;span class="n"&gt;operations&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;index&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;_index&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;_id&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;}})&lt;/span&gt;
      &lt;span class="n"&gt;operations&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;document&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;operations&lt;/span&gt;

&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;bulk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;operations&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nf"&gt;generate_operations&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;characters&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index_name&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;refresh&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
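&lt;p&gt;As a sanity check, the helper interleaves one action line and one source line per document - exactly the pairwise shape the &lt;code&gt;bulk&lt;/code&gt; API expects. A quick sketch with two stand-in documents:&lt;/p&gt;

```python
# What generate_operations produces for two stand-in documents:
# an action dict, then the document itself, repeated per document.
sample = [{"Name": "Harry Potter"}, {"Name": "Hermione Granger"}]

operations = []
for i, document in enumerate(sample):
    operations.append({"index": {"_index": "hp_characters", "_id": i}})
    operations.append(document)

print(operations[:2])
# [{'index': {'_index': 'hp_characters', '_id': 0}}, {'Name': 'Harry Potter'}]
```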



&lt;p&gt;Done! Now we have the index in Elastic - which means we can start searching.&lt;/p&gt;

&lt;h3&gt;
  
  
  You know, for search.
&lt;/h3&gt;

&lt;p&gt;We can create simple queries based on just one field and value, like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hp_characters&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;match&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Loyalty&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Dumbledores Army&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;We get back {total} results, here are the top ones:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;format&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;total&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hits&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;total&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;value&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;hit&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hits&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hits&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hit&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;_source&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You'll notice we also processed the result a little, since by default Elasticsearch returns a pretty comprehensive JSON with loads of metadata about our search. In this case, we'll get the name of every character associated with Dumbledore's Army (so pretty much all the Gryffindor 5th years and their allies). &lt;/p&gt;
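&lt;p&gt;To make that concrete, here's a trimmed-down mock of the response structure we're navigating (the values are illustrative; the real JSON carries far more metadata):&lt;/p&gt;

```python
# A trimmed-down stand-in for an Elasticsearch search response;
# the real one also reports shard info, timings, max scores, etc.
response = {
    "hits": {
        "total": {"value": 2, "relation": "eq"},
        "hits": [
            {"_score": 1.3, "_source": {"Name": "Harry Potter"}},
            {"_score": 1.1, "_source": {"Name": "Ginny Weasley"}},
        ],
    }
}

# The same unpacking as above: total count, then one name per hit.
names = [hit["_source"]["Name"] for hit in response["hits"]["hits"]]
print(response["hits"]["total"]["value"], names)
```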

&lt;p&gt;As an FYI, this is what a query and result would look like from the Dev Tools interface on Cloud:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5tr7sbgduyn6jl0rsdrh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5tr7sbgduyn6jl0rsdrh.png" alt=" " width="800" height="410"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Building up
&lt;/h3&gt;

&lt;p&gt;From here, we can start mixing and matching all kinds of &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/search-your-data.html" rel="noopener noreferrer"&gt;query types&lt;/a&gt; to create increasingly specific searches:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hp&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;bool&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;must&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;multi_match&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;query&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;quidditch chaser keeper beater seeker&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;fields&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Job&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Skills&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;]&lt;/span&gt; 
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;match&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;House&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Gryffindor&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;],&lt;/span&gt;
      &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;must_not&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;range&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Birth&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;lte&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;1980-01-01&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
              &lt;span class="p"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;filter&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;term&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Hair colour&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Red&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;

      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;  
  &lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Can you guess the answer to this one? It translates to: red-haired, quidditch-associated Gryffindors born after 1980. &lt;br&gt;
&lt;em&gt;(the answer is Ginny &amp;amp; Ron)&lt;/em&gt;&lt;/p&gt;
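&lt;p&gt;Clause by clause, the same query breaks down like this (a sketch as a plain Python dict, with the field names from the dataset):&lt;/p&gt;

```python
# The bool query from above, with one comment per clause.
query = {
    "bool": {
        # must: all clauses have to match, and they contribute to the score.
        "must": [
            {"multi_match": {
                "query": "quidditch chaser keeper beater seeker",
                "fields": ["Job", "Skills"],
            }},
            {"match": {"House": "Gryffindor"}},
        ],
        # must_not: exclude anyone born on or before 1980-01-01.
        "must_not": {"range": {"Birth": {"lte": "1980-01-01"}}},
        # filter: a hard yes/no requirement that does not affect scoring.
        "filter": {"term": {"Hair colour": "Red"}},
    }
}
```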

&lt;h3&gt;
  
  
  That's it, everyone!
&lt;/h3&gt;

&lt;p&gt;That's the basics of interacting with Elasticsearch via the Python client to run searches. &lt;/p&gt;

&lt;p&gt;There's loads more you can do in Elastic, from observability &amp;amp; visualizations to semantic / vector search, and way more. &lt;/p&gt;

&lt;p&gt;Stay tuned for the next articles :) &lt;/p&gt;

</description>
      <category>python</category>
      <category>elasticsearch</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Simple Flask Integration for an Elastic Semantic Search App</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Tue, 06 Feb 2024 20:57:09 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/simple-flask-integration-for-an-elastic-semantic-search-app-587i</link>
      <guid>https://dev.to/iuliaferoli/simple-flask-integration-for-an-elastic-semantic-search-app-587i</guid>
      <description>&lt;h2&gt;
  
  
  Have you been on any websites without a search function lately?
&lt;/h2&gt;

&lt;p&gt;How about ones that didn't allow for typos, synonyms, or "I forgot the exact word, but I'm looking for something to do with these themes"? Probably not... or if you did, it was likely a frustrating experience. &lt;/p&gt;

&lt;p&gt;Semantic (or vector) Search is becoming ubiquitous in the way we interact with the internet - we're always looking for something, and we want to find it no matter how bad we may be at formulating what it is!&lt;/p&gt;

&lt;h3&gt;
  
  
  Enter Elasticsearch.
&lt;/h3&gt;

&lt;p&gt;With the Lucene-based engine you can quickly build an &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/documents-indices.html" rel="noopener noreferrer"&gt;index&lt;/a&gt; of documents (be it numbers, words, images, or sound) and start searching through them based on various filters, aggregations, and fancy new-age models like &lt;a href="https://www.elastic.co/guide/en/machine-learning/current/ml-nlp-elser.html" rel="noopener noreferrer"&gt;ELSER&lt;/a&gt;. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You can check out the source code for this project + my full tutorial &lt;a href="https://github.com/iuliaferoli/harry-potter-search?tab=readme-ov-file#phase-3" rel="noopener noreferrer"&gt;for an end to end Elastic Semantic Search Application here&lt;/a&gt; &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In this blog, we're going to address the "on any website" part of a Search Solution. Or at least - propose a starting point for it. There are many great tutorials out there for a deep dive on Flask - one of the best &lt;a href="https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world" rel="noopener noreferrer"&gt;from my colleague Miguel&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;However, as someone getting reacquainted/jumping deeper into the dev realm after mostly shallow toe-dipping I found some of the guides a bit overwhelming. So this will be a very simple, stripped-down, essentials-only example of a basic Flask app. &lt;/p&gt;

&lt;h3&gt;
  
  
  So what's our minimally viable "Website with search"?
&lt;/h3&gt;

&lt;p&gt;We have a Search Engine we've built for our custom domain. (You can think of a retail online store, blogging website, cooking recipe inventory, news website, etc - what I'm using is the Harry Potter books, the content is mostly irrelevant.)&lt;/p&gt;

&lt;p&gt;On the Elastic Side, you can see how I built my Semantic &lt;a href="https://github.com/iuliaferoli/harry-potter-search?tab=readme-ov-file#harry-potter-movie-dialoogue-index--intro-to-elasticsearch-python-client" rel="noopener noreferrer"&gt;Search App here&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;What we want now is to abstract all that and only use a few simple functions to call the functionalities we need. Namely, connect to the client; run the actual user input as a semantic search on our index, and log the search to our history. See &lt;a href="https://github.com/iuliaferoli/harry-potter-search/blob/main/helper_functions.py" rel="noopener noreferrer"&gt;these functions here&lt;/a&gt;.&lt;/p&gt;
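&lt;p&gt;A rough sketch of what the semantic search helper might look like - the query shape and the &lt;code&gt;ml.tokens&lt;/code&gt; / &lt;code&gt;text&lt;/code&gt; field names are assumptions here, and the real helpers live in the repo linked above:&lt;/p&gt;

```python
def semantic_search(question, client, model_id, index):
    # Run the user's question through the deployed ELSER model via a
    # text_expansion query; "ml.tokens" and the "text" source field are
    # assumed names - adjust them to match your own index mapping.
    response = client.search(index=index, query={
        "text_expansion": {
            "ml.tokens": {"model_id": model_id, "model_text": question}
        }
    })
    # Unpack just the matching passages for the template to render.
    return [hit["_source"]["text"] for hit in response["hits"]["hits"]]
```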

&lt;h3&gt;
  
  
  Now let's Flask!
&lt;/h3&gt;

&lt;p&gt;I'm building three pages - so we will need an HTML template file, and a function and route mapping, for each.&lt;/p&gt;

&lt;p&gt;Our folder structure:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9e7qtka5zp8qeyen2b2v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9e7qtka5zp8qeyen2b2v.png" alt=" " width="502" height="334"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;See the full &lt;a href="https://github.com/iuliaferoli/harry-potter-search/blob/main/web_app.py" rel="noopener noreferrer"&gt;Python web_app&lt;/a&gt; file here&lt;/p&gt;

&lt;p&gt;See the full &lt;a href="https://github.com/iuliaferoli/harry-potter-search/tree/main/templates" rel="noopener noreferrer"&gt;HTML template files&lt;/a&gt; here&lt;/p&gt;

&lt;p&gt;The website pages will end up looking like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6gog9vt6jfva8alempfn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6gog9vt6jfva8alempfn.png" alt=" " width="800" height="1220"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  1. User input for our search.
&lt;/h3&gt;

&lt;p&gt;The premise is simple - one form and one button. &lt;/p&gt;

&lt;p&gt;Create the routes in the same file where we define the Flask app&lt;br&gt;
&lt;br&gt;
 &lt;code&gt;app = Flask(__name__)&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nd"&gt;@app.route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
   &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;render_template&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search.html&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And create a simple HTML template:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;form&lt;/span&gt; &lt;span class="na"&gt;action = &lt;/span&gt;&lt;span class="s"&gt;"http://127.0.0.1:5000/search"&lt;/span&gt; &lt;span class="na"&gt;method = &lt;/span&gt;&lt;span class="s"&gt;"POST"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;p&amp;gt;&lt;/span&gt;Your Query: &lt;span class="nt"&gt;&amp;lt;input&lt;/span&gt; &lt;span class="na"&gt;type = &lt;/span&gt;&lt;span class="s"&gt;"text"&lt;/span&gt; &lt;span class="na"&gt;name = &lt;/span&gt;&lt;span class="s"&gt;"question"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&amp;lt;/p&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;p&amp;gt;&amp;lt;input&lt;/span&gt; &lt;span class="na"&gt;type = &lt;/span&gt;&lt;span class="s"&gt;"submit"&lt;/span&gt; &lt;span class="na"&gt;value = &lt;/span&gt;&lt;span class="s"&gt;"submit"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&amp;lt;/p&amp;gt;&lt;/span&gt;
 &lt;span class="nt"&gt;&amp;lt;/form&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A user can now type in their query; behind the scenes it is run in Elasticsearch, and the user is routed to a page where they can see the results, namely: &lt;/p&gt;

&lt;h3&gt;
  
  
  2. View result of search
&lt;/h3&gt;

&lt;p&gt;This is the meat and potatoes of our project. For the user, it's another simple static page; but routing to this page runs the helper functions we mentioned earlier - interacting with our Elastic backend. &lt;br&gt;
On the surface, though, the user simply gets the top results for their search within a few seconds. Users want to find stuff fast!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nd"&gt;@app.route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;/search&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt; &lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;methods&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;POST&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;GET&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;show_search_term&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;method&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;POST&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="c1"&gt;# getting the query from the user
&lt;/span&gt;        &lt;span class="n"&gt;question&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;form&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;question&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
        &lt;span class="c1"&gt;# running the semantic search model and getting the results from Elasticsearch
&lt;/span&gt;        &lt;span class="n"&gt;answer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;semantic_search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;model_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Logging the search &amp;amp; response in a separate index
&lt;/span&gt;        &lt;span class="n"&gt;document&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Query&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Response&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;date&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;
        &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;historical_searches&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;document&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;document&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;#print(response)
&lt;/span&gt;
        &lt;span class="c1"&gt;# Returning the template for the user to view their results
&lt;/span&gt;        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;render_template&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search_result.html&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;question&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is rendered with a Jinja2 template (which lets the HTML use for statements to iterate through lists), allowing us to loop through the documents we get back from Elastic and show them on the page:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;body&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;h1&amp;gt;&lt;/span&gt;Your query was: {{ question }}&lt;span class="nt"&gt;&amp;lt;/h1&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;p&amp;gt;&lt;/span&gt;Search Results:&lt;span class="nt"&gt;&amp;lt;/p&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;ul&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"answer"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
{% for item in answer %}
&lt;span class="nt"&gt;&amp;lt;li&amp;gt;&lt;/span&gt; {{ item }} &lt;span class="nt"&gt;&amp;lt;/li&amp;gt;&lt;/span&gt;
{% endfor %}
&lt;span class="nt"&gt;&amp;lt;/ul&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/body&amp;gt;&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Lastly, if you want to see previous searches, you can go to:&lt;/p&gt;

&lt;h3&gt;
  
  
  3. History of Searches.
&lt;/h3&gt;

&lt;p&gt;This page triggers a separate call to Elastic, asking it to give us back the latest queries and answers users have run. If the previous page was the meat and potatoes, this is... gravy?&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nd"&gt;@app.route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;/history&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;show_history&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
   &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;historical_searches&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sort&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;date&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;order&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;desc&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}}])&lt;/span&gt;
   &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;render_template&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;history.html&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hits&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hits&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
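&lt;p&gt;The &lt;code&gt;sort&lt;/code&gt; argument is what gives us the newest-first ordering; locally, the equivalent is just sorting the logged documents by their &lt;code&gt;date&lt;/code&gt; field in descending order:&lt;/p&gt;

```python
from datetime import datetime

# Stand-ins for logged search documents, as indexed by the /search route.
logged = [
    {"Query": "who is the seeker", "date": datetime(2024, 2, 1, 10, 0)},
    {"Query": "expelliarmus", "date": datetime(2024, 2, 3, 9, 30)},
    {"Query": "polyjuice potion", "date": datetime(2024, 2, 2, 18, 45)},
]

# Mimic sort=[{"date": {"order": "desc"}}] on the client side.
newest_first = sorted(logged, key=lambda doc: doc["date"], reverse=True)
print([doc["Query"] for doc in newest_first])
# ['expelliarmus', 'polyjuice potion', 'who is the seeker']
```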



&lt;p&gt;The template is very similar to the previous one, just with one extra for loop to show multiple lists of answers:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;
&lt;span class="nt"&gt;&amp;lt;h1&amp;gt;&lt;/span&gt;These are the past searches ran:&lt;span class="nt"&gt;&amp;lt;/h1&amp;gt;&lt;/span&gt;
{% for result in response %}
&lt;span class="nt"&gt;&amp;lt;p&amp;gt;&lt;/span&gt;Your query was: {{ result._source.Query }}&lt;span class="nt"&gt;&amp;lt;/p&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;p&amp;gt;&lt;/span&gt;Search Results:&lt;span class="nt"&gt;&amp;lt;/p&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;ul&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"answer"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;

{% for item in result._source.Response %}
&lt;span class="nt"&gt;&amp;lt;li&amp;gt;&lt;/span&gt; {{ item }} &lt;span class="nt"&gt;&amp;lt;/li&amp;gt;&lt;/span&gt;
{% endfor %}
&lt;span class="nt"&gt;&amp;lt;/ul&amp;gt;&lt;/span&gt;
{% endfor %}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  There we go!
&lt;/h3&gt;

&lt;p&gt;With a few simple lines of code, you can turn a search engine into a "Website with a Search Function". Naturally, you will more likely include these capabilities in your existing web infrastructure - which is surely more sophisticated than three Flask pages. &lt;/p&gt;

&lt;p&gt;Hopefully, this blog mostly works to illustrate the user experience - and how close to your fingertips it is!&lt;/p&gt;

&lt;p&gt;Happy Searching! &lt;br&gt;
Next up - making these pages not look horrible and expanding on the search web app idea. Stay tuned :) &lt;/p&gt;

</description>
      <category>elasticsearch</category>
      <category>nlp</category>
      <category>python</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Hello, dev!</title>
      <dc:creator>Iulia Feroli</dc:creator>
      <pubDate>Thu, 25 Jan 2024 13:39:51 +0000</pubDate>
      <link>https://dev.to/iuliaferoli/hello-dev-3eh4</link>
      <guid>https://dev.to/iuliaferoli/hello-dev-3eh4</guid>
      <description>&lt;p&gt;This is an intro post to say Hi!&lt;/p&gt;

&lt;p&gt;I'm Iulia (iulia, not Lulia) and I work in tech, more specifically as a Developer Advocate (aka devrel / tech evangelism / other buzzwords). Currently, I'm @ Elastic.&lt;/p&gt;

&lt;p&gt;Basically, I build content &amp;amp; demos with Python, mostly about ML, NLP, LLMs, and semantic search/embeddings/vectors. And you know, search. &lt;/p&gt;

&lt;p&gt;You can find me at tech conferences across EMEA (mostly Northern Europe), either at one of our Elastic booths or speaking about Harry Potter for some reason. &lt;br&gt;
I also host the meetups at our Elastic Amsterdam office (one every month, we have pizza!)&lt;/p&gt;

&lt;p&gt;Links:&lt;br&gt;
&lt;a href="https://www.linkedin.com/in/iuliaferoli/" rel="noopener noreferrer"&gt;https://www.linkedin.com/in/iuliaferoli/&lt;/a&gt;&lt;br&gt;
&lt;a href="https://mastodon.social/@iulia_" rel="noopener noreferrer"&gt;https://mastodon.social/@iulia_&lt;/a&gt;&lt;br&gt;
&lt;a href="https://youtube.com/iuliaferoli" rel="noopener noreferrer"&gt;https://youtube.com/iuliaferoli&lt;/a&gt;&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>python</category>
      <category>elasticsearch</category>
      <category>nlp</category>
    </item>
  </channel>
</rss>
