<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Dan Stanhope</title>
    <description>The latest articles on DEV Community by Dan Stanhope (@danstanhope).</description>
    <link>https://dev.to/danstanhope</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F49944%2Fcb1c28af-3e49-4da6-bfcf-a348c9ac3c62.jpeg</url>
      <title>DEV Community: Dan Stanhope</title>
      <link>https://dev.to/danstanhope</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/danstanhope"/>
    <language>en</language>
    <item>
      <title>Generate a PDF &amp; upload to S3 using AWS Lambda &amp; Puppeteer.</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Thu, 09 Jun 2022 18:09:13 +0000</pubDate>
      <link>https://dev.to/danstanhope/create-and-upload-pdf-to-s3-using-aws-lambda-puppeteer-219o</link>
      <guid>https://dev.to/danstanhope/create-and-upload-pdf-to-s3-using-aws-lambda-puppeteer-219o</guid>
      <description>&lt;p&gt;I've put together a working project that'll take all of your dev.to posts, create a .pdf and upload it to S3. Ba-boom.&lt;/p&gt;

&lt;p&gt;Head over to GitHub to grab the &lt;a href="https://github.com/danstanhope/aws-lambda-puppeteer" rel="noopener noreferrer"&gt;code&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;First, a little bit about the project. It's run using the SAM CLI and CloudFormation (make sure you've got the SAM and AWS CLIs installed and configured properly. Here's how to do &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html" rel="noopener noreferrer"&gt;that&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;Modules and custom code will be compiled into a Lambda Layer. This is where all the requisite node_modules will live, as well as some custom code and our HTML templates. A cool thing about layers is that once you've built one, you can share it amongst your other functions. We're also going to include another layer (&lt;code&gt;chrome-aws-lambda&lt;/code&gt;), which will be used to run Puppeteer inside the Lambda environment.&lt;/p&gt;

&lt;p&gt;Here's what the .pdf looks like with my data (I know, not great. It's a .pdf and I'm no designer. Let's move on).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqgl2af5lhntowoksyplv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqgl2af5lhntowoksyplv.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Show me the code!
&lt;/h3&gt;

&lt;p&gt;I briefly mentioned layers before; they're a pretty cool feature of AWS Lambda. If you've got a bunch of functions running and you want to share node_modules or custom code (i.e. adapters, helper functions, templates, etc.), you can create one or more layers and attach them to your functions.&lt;/p&gt;

&lt;p&gt;When importing anything other than a &lt;code&gt;node_module&lt;/code&gt;, you need to reference a different path (&lt;code&gt;/opt/&lt;/code&gt;). Here you can see the modules being included vs. the custom code.&lt;/p&gt;
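In code, the split looks roughly like this (the custom module paths here are hypothetical placeholders, not necessarily the repo's):

```javascript
// Packages installed inside the layer's nodejs/node_modules folder resolve
// by name: Lambda adds /opt/nodejs/node_modules to Node's module search path.
const chromium = require('chrome-aws-lambda');

// Custom code shipped in a layer is mounted under /opt, so it has to be
// required by absolute path (these module names are hypothetical):
const template = require('/opt/nodejs/templates/posts');
const helpers = require('/opt/nodejs/lib/helpers');
```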

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvts9dg0wlnfsiau17zca.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvts9dg0wlnfsiau17zca.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The function is quite simple: we start off by ingesting our HTML template from our layer and initializing a few variables (note: change the bucket name to something other than my name). Next, we pull down some dev.to posts and compile our template.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs90new88vhash5tyq4vg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs90new88vhash5tyq4vg.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we initialize puppeteer and pass in our template string. We'll create a buffer and give this to our S3 upload method.&lt;/p&gt;
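Sketched out, that step looks something like this. It's a minimal sketch assuming the &lt;code&gt;chrome-aws-lambda&lt;/code&gt; layer and the AWS SDK v2 that ships in the Node.js Lambda runtime; the bucket and key arguments are placeholders, not the repo's exact code:

```javascript
const chromium = require('chrome-aws-lambda'); // provided by the chrome layer
const AWS = require('aws-sdk');                // bundled in the Lambda runtime

const s3 = new AWS.S3();

async function htmlToPdfInS3(html, bucket, key) {
  // Launch the Lambda-compatible Chromium that Puppeteer will drive.
  const browser = await chromium.puppeteer.launch({
    args: chromium.args,
    defaultViewport: chromium.defaultViewport,
    executablePath: await chromium.executablePath,
    headless: chromium.headless,
  });

  const page = await browser.newPage();
  await page.setContent(html, { waitUntil: 'networkidle0' });

  // Render the page to a PDF buffer...
  const buffer = await page.pdf({ format: 'A4', printBackground: true });
  await browser.close();

  // ...and hand the buffer straight to the S3 upload method.
  return s3.upload({
    Bucket: bucket,
    Key: key,
    Body: buffer,
    ContentType: 'application/pdf',
  }).promise();
}
```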

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh99keu35uasi5fh6vyal.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh99keu35uasi5fh6vyal.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Guys. That's it. It's done. We made a .pdf.&lt;/p&gt;

&lt;h3&gt;
  
  
  How do I run this?
&lt;/h3&gt;

&lt;p&gt;Once you've cloned the repo, head into &lt;code&gt;/layers/shared/nodejs&lt;/code&gt; and run &lt;code&gt;yarn&lt;/code&gt;. This will install all the packages we need (you can add this to a build step at some point, too). When creating a layer, it's important to note that you need to include the function's runtime as part of the folder structure in order for Lambda to recognize it (in this case, &lt;code&gt;nodejs&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Jump back into the root of the project and run:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sam local invoke PuppeteerFunction --no-event
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If you've configured everything properly, you should have a nice .pdf created &amp;amp; waiting for you in S3.&lt;/p&gt;

&lt;p&gt;One thing to note: there's a strange bug with the aws chrome package and it's not working with Node.js 14. That's why all the runtimes are set to Node.js 12. If you can get it working with a more up-to-date runtime, please let me know and I'll update the repo.&lt;/p&gt;

&lt;p&gt;Thanks so much for reading! Hope this helps someone.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>node</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>What's Automatic Batching? React 18 feature explained</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Thu, 31 Mar 2022 14:57:49 +0000</pubDate>
      <link>https://dev.to/danstanhope/whats-automatic-batching-react-18-feature-explained-k33</link>
      <guid>https://dev.to/danstanhope/whats-automatic-batching-react-18-feature-explained-k33</guid>
      <description>&lt;p&gt;If you've ever built a component in React, chances are you've used state variables. If you've ever built a kinda-complex component in React, chances are you've got multiple state variables in use.&lt;/p&gt;

&lt;p&gt;So, what happens when we update those variables? The component re-renders, right? Make changes to a bunch of state variables and a bunch of re-rendering happens. All this rendering could have performance implications for your app.&lt;/p&gt;

&lt;p&gt;Introducing automatic batching. Now, batching has been around for a while in React, but React only batched state changes automatically for you if they were made inside a hook or browser event. Like, say, a click event:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fizhu7idp3rd2lzxz13vf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fizhu7idp3rd2lzxz13vf.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Console Output:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F21ucjjt64f2xjt9lwphe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F21ucjjt64f2xjt9lwphe.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is automatic batching. React takes multiple state changes and groups them together so they don't happen independently -- fantastic stuff. &lt;/p&gt;
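To picture what batching buys you, here's a plain-JavaScript sketch of the idea (not React's actual implementation): updates made in the same tick are coalesced so the render callback fires once.

```javascript
// Conceptual sketch of batching (not React internals): several state
// updates in the same tick trigger a single render.
function createBatchedStore(render) {
  const state = {};
  let scheduled = false;
  return function setState(patch) {
    Object.assign(state, patch);
    if (!scheduled) {
      scheduled = true;
      // Flush once, after all of the current tick's updates have been applied.
      queueMicrotask(() => {
        scheduled = false;
        render({ ...state });
      });
    }
  };
}
```

Calling the returned setState three times in a row produces a single render, which is what React guarantees inside event handlers (and, as of React 18, everywhere).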

&lt;p&gt;So where's the improvement?&lt;/p&gt;

&lt;p&gt;There are other places you may want to change state in your component (promises, timeouts). Let's say we have a component that fetches some data after a button click. We have two state variables: an array of users and a page counter. We want to update these inside a promise once the data is returned. In React 17, this will cause the component to re-render twice.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fodcpwsa8klkmd59z1j69.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fodcpwsa8klkmd59z1j69.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Console Output React 17:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fitp92lu79jbkhxmnw2tb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fitp92lu79jbkhxmnw2tb.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Console Output React 18:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3x4f65j34qux0b28zioz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3x4f65j34qux0b28zioz.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is great! You can make changes to a couple state variables and React will apply them all at the same time, automatically, for you. Awesome!&lt;/p&gt;

&lt;p&gt;If you weren't aware of how batching worked in previous versions of React, hopefully you know the limitations now. And if you've got some components out there changing state variables inside promises, might be time to upgrade :)&lt;/p&gt;

&lt;p&gt;Thanks!&lt;/p&gt;

</description>
      <category>react</category>
    </item>
    <item>
      <title>Building a Compound Interest Calculator using Chart.js + Next.js</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Thu, 03 Mar 2022 19:25:21 +0000</pubDate>
      <link>https://dev.to/danstanhope/compound-interest-calculator-using-chartjs-react-1lac</link>
      <guid>https://dev.to/danstanhope/compound-interest-calculator-using-chartjs-react-1lac</guid>
      <description>&lt;h3&gt;
  
  
  Overview
&lt;/h3&gt;

&lt;p&gt;So, just for fun I decided to create a compound interest calculator using Next.js, TypeScript and Chart.js. &lt;/p&gt;

&lt;p&gt;I wanted to kick the tires on Vercel, since I hadn't deployed anything to their platform prior to this. It's so awesome! The build times are so fast and it's really easy to get up-and-running.&lt;/p&gt;

&lt;p&gt;You can view the calculator &lt;a href="https://www.compoundinterest.cloud/" rel="noopener noreferrer"&gt;here&lt;/a&gt;(excuse the domain name, it was the cheapest one I could find that still made some sense haha.). Also, I've got all the code in a public repo &lt;a href="https://github.com/danstanhope/react-compound-interest-calculator" rel="noopener noreferrer"&gt;here&lt;/a&gt;, if you want to take a closer look.&lt;/p&gt;

&lt;p&gt;In terms of the calculations, I worked off of the formulas found &lt;a href="https://www.vertex42.com/Calculators/compound-interest-calculator.html#rate-per-period" rel="noopener noreferrer"&gt;here&lt;/a&gt;. I did my best to verify my results against some other sites out there and, as far as I can tell, it's working well -- famous last words.&lt;/p&gt;

&lt;h3&gt;
  
  
  Let's talk code
&lt;/h3&gt;

&lt;p&gt;This is the first React project in some time where I haven't used Redux, opting instead for &lt;code&gt;useContext&lt;/code&gt; and &lt;code&gt;useReducer&lt;/code&gt;. Once I got it running, I thought it was great!&lt;/p&gt;

&lt;p&gt;I had a few components that needed access to the input field values in order to generate the results, plot the graph, etc. &lt;code&gt;useContext&lt;/code&gt; made sharing state between all the components that needed it really straightforward.&lt;/p&gt;

&lt;p&gt;It'd be pretty long-winded to go through the entire project and explain every aspect, so I figured it'd be best to showcase a couple of pieces that I found interesting.&lt;/p&gt;

&lt;h3&gt;
  
  
  Setting up useContext
&lt;/h3&gt;

&lt;p&gt;In order to set up Context and share it amongst your components, there are only a few things you need to do.&lt;/p&gt;

&lt;p&gt;First, you need to create your context. Make sure you create this as a module, as you'll need it again in your components.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2gwy1ydvretnnsu1sa2j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2gwy1ydvretnnsu1sa2j.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then you need to wrap the components that will be sharing the state in a provider component. It's useful to note that you can share multiple contexts by simply nesting the provider components. As you can probably guess from the screenshot, the &lt;code&gt;Form&lt;/code&gt;, &lt;code&gt;Graph&lt;/code&gt; and &lt;code&gt;Table&lt;/code&gt; components will have access to both contexts.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwvcfdo8eyj8c14aqpbiu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwvcfdo8eyj8c14aqpbiu.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At the component level, you'll just need to import the context definition we created in the first step and initialize it.&lt;/p&gt;
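Put together, the three steps look roughly like this (the identifiers are made up for illustration, and createElement stands in for the JSX you'd normally write):

```javascript
import { createContext, createElement, useContext, useReducer } from 'react';

// 1) Create the context in its own module so every file imports the same instance.
export const FormContext = createContext(null);

// 2) Wrap the components that share state in a provider. This is equivalent to
//    returning a FormContext.Provider JSX element with value={{ state, dispatch }}.
export function FormProvider({ reducer, initialState, children }) {
  const [state, dispatch] = useReducer(reducer, initialState);
  return createElement(FormContext.Provider, { value: { state, dispatch } }, children);
}

// 3) In any descendant component, read and update the shared state.
export function useForm() {
  return useContext(FormContext); // { state, dispatch }
}
```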

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstpyvh43oag4bnvddfnb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstpyvh43oag4bnvddfnb.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you have this running, you'll be able to access your state across components. Sa-weet!&lt;/p&gt;

&lt;h3&gt;
  
  
  Crunching numbers
&lt;/h3&gt;

&lt;p&gt;Calculating compound interest is pretty straightforward when your compounding frequency and payment frequency are the same -- say, if you make an annual contribution and your interest also compounds annually.&lt;/p&gt;

&lt;p&gt;Most of the calculators out there allow the user to mix things up a bit. I mean, for real-world savers the payment &amp;amp; compound frequency seldom match. What if I want to make a monthly contribution but my interest compounds annually? Or semi-annually? Well, you've got to alter the formula a little. This was the part that took me a while to figure out, as most of the tutorials out there never went into it, while the calculators I was checking my results against did handle it -- my numbers never matched and my brain hurt.&lt;/p&gt;

&lt;p&gt;In order to allow for varying payment and compounding frequency, you'll need to calculate the &lt;code&gt;rate&lt;/code&gt; and the &lt;code&gt;total number of payment periods&lt;/code&gt; variables slightly differently.&lt;/p&gt;

&lt;p&gt;Here's the entire, working formula used in the codebase.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;F = P*(1+rate)^nper + A*( ((1+rate)^nper - 1)/rate )&lt;/code&gt;&lt;br&gt;
&lt;code&gt;rate = ((1+r/n)^(n/p))-1&lt;/code&gt;&lt;br&gt;
&lt;code&gt;nper = p*t&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;*Head &lt;a href="https://www.vertex42.com/Calculators/compound-interest-calculator.html#rate-per-period" rel="noopener noreferrer"&gt;here&lt;/a&gt; for a more detailed explanation.&lt;/p&gt;

&lt;p&gt;Once I had that working, it was just a matter of iterating for each year and adding the results to an array.&lt;/p&gt;
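The formulas above translate directly into a small function -- a sketch of those formulas, not the repo's exact code:

```javascript
// Future value with differing compounding (n) and payment (p) frequencies.
// P: principal, A: payment per period, r: annual rate (e.g. 0.05),
// n: compounds per year, p: payments per year, t: years.
function futureValue(P, A, r, n, p, t) {
  const rate = Math.pow(1 + r / n, n / p) - 1; // rate per payment period
  const nper = p * t;                          // total number of payment periods
  if (rate === 0) return P + A * nper;         // zero-rate edge case
  const growth = Math.pow(1 + rate, nper);
  return P * growth + A * ((growth - 1) / rate);
}
```

Iterating the year count from 1 to t and collecting each year's result gives you the array the graph and table are built from.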

&lt;p&gt;Anyway, if you're interested in seeing how this works clone the repo and let me know if you have any questions.&lt;/p&gt;

&lt;p&gt;Apologies that the write-up wasn't super-involved. You'll need to get into the code anyway to figure this all out.&lt;/p&gt;

&lt;p&gt;Thanks for reading!&lt;/p&gt;

</description>
      <category>react</category>
      <category>javascript</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Twitter Authentication using Supabase + React + Redux + Typescript</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Thu, 30 Dec 2021 16:22:43 +0000</pubDate>
      <link>https://dev.to/danstanhope/twitter-authentication-using-supabase-react-redux-typescript-4c87</link>
      <guid>https://dev.to/danstanhope/twitter-authentication-using-supabase-react-redux-typescript-4c87</guid>
      <description>&lt;p&gt;As the title of this post suggests, we're going to be building a small React + Redux app that'll allow your users to authenticate using their twitter credentials and access &lt;strong&gt;auth-only&lt;/strong&gt; parts of the app.&lt;/p&gt;

&lt;p&gt;The code for this tutorial can be found &lt;a href="https://github.com/danstanhope/supabase-twitter-authentication" rel="noopener noreferrer"&gt;here&lt;/a&gt;. Go ahead and clone that and get it running using &lt;strong&gt;yarn update&lt;/strong&gt; and &lt;strong&gt;yarn start&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Unfortunately, there's a bit of boring setup &amp;amp; config stuff we'll have to walk through in order to get this working. So bear with me and we'll get this going, I promise (if I missed something, let me know and I'll help you work through it).&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Twitter + Supabase Setup
&lt;/h3&gt;

&lt;p&gt;Head over to Supabase and grab your project's API URL (&lt;strong&gt;settings -&amp;gt; api -&amp;gt; config -&amp;gt; url&lt;/strong&gt;) and append &lt;strong&gt;/auth/v1/callback&lt;/strong&gt; to it. This forms the callback URL we provide to Twitter; it'll look something like this: &lt;strong&gt;https://.supabase.co/auth/v1/callback&lt;/strong&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Twitter
&lt;/h4&gt;

&lt;p&gt;Make your way over to &lt;a href="https://developer.twitter.com/" rel="noopener noreferrer"&gt;https://developer.twitter.com/&lt;/a&gt; and create a new project. Once you've filled everything out, make sure to keep your API key &amp;amp; secret(used as client_id and client_key in Supabase). &lt;/p&gt;

&lt;p&gt;Next, you'll need to create an app. When you've done that, you'll need to enable 3rd party authentication -- this is where we'll add our callback url.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn33xw683v9k3a69fq26p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn33xw683v9k3a69fq26p.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft1bufsgiiv6xnlu07npu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft1bufsgiiv6xnlu07npu.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Be sure to add the localhost url to the callback section. For any of the remaining, required url fields just put your app url in for now.&lt;/p&gt;

&lt;h4&gt;
  
  
  Supabase
&lt;/h4&gt;

&lt;p&gt;Under settings on the authentication tab enter the site url as well as our localhost callback url. A picture is worth a thousand words...&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbuzwb86plpf3d6l4jsl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbuzwb86plpf3d6l4jsl.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, enable Twitter authentication and enter your Twitter creds.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgtnhoeatti7dxefp511a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgtnhoeatti7dxefp511a.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Geez, hopefully that wasn't too painful and I haven't lost too many people. If we did this correctly, the boring config stuff is over. Woot woot!&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Code walkthrough
&lt;/h3&gt;

&lt;p&gt;First things first: you'll need to make a couple of changes to the &lt;strong&gt;env.development&lt;/strong&gt; file. Most importantly, adding your Supabase anon key (&lt;strong&gt;settings -&amp;gt; api -&amp;gt; project api keys -&amp;gt; anon public&lt;/strong&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fots5fk9l8selqcaopbve.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fots5fk9l8selqcaopbve.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The project is a pretty standard React + Redux app(styled with Tailwind). The most important part to take note of is how we're protecting the pages of our app that require the user to be authenticated. Take a look at &lt;strong&gt;src-&amp;gt;pages-&amp;gt;PrivateRoute.tsx &amp;amp; src-&amp;gt;App.tsx&lt;/strong&gt; and you'll be able to see how the PrivateRoute component ensures that only authenticated users can access certain pages.&lt;/p&gt;

&lt;h4&gt;
  
  
  PrivateRoute.tsx
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbllh7rcmmj8d7za3hw6h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbllh7rcmmj8d7za3hw6h.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  App.tsx
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb87v9fpbel325g4oghx4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb87v9fpbel325g4oghx4.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After running our yarn commands, you should have a page open that looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1o82lle47ysxui5xy8g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1o82lle47ysxui5xy8g.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If everything is set up properly, after clicking the sign-in button you'll be sent to Twitter to enter your credentials and then back to our dashboard page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh5j2aivp3rmkda2sfiix.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh5j2aivp3rmkda2sfiix.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hope this helps someone get started with Supabase authentication. As an aside, you could easily port this project over to Firebase or Amplify authentication with few code changes. There would be a bunch more super-fun setup and config for you to do though :)&lt;/p&gt;

&lt;p&gt;If you get stuck or have any feedback, let me know!&lt;/p&gt;

&lt;p&gt;Thanks for reading.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>react</category>
      <category>typescript</category>
      <category>redux</category>
    </item>
    <item>
      <title>Making sense of Next.js rendering</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Tue, 21 Dec 2021 19:49:21 +0000</pubDate>
      <link>https://dev.to/danstanhope/making-sense-of-nextjs-rendering-252c</link>
      <guid>https://dev.to/danstanhope/making-sense-of-nextjs-rendering-252c</guid>
      <description>&lt;p&gt;Gone are the days when you'd fetch data from your database, pass it to your view and render it. Well, these days aren't gone. But we can definitely do better.&lt;/p&gt;

&lt;p&gt;Next.js has three different rendering modes: Server-Side Rendering (SSR), Static Site Generation (SSG) and the newly minted Incremental Static Regeneration (ISR). Each method serves a purpose and comes with its own pros and cons -- let's get into those!&lt;/p&gt;

&lt;p&gt;To illustrate how the three modes function, I've included a small amount of code you could use in a Next.js project.&lt;/p&gt;

&lt;p&gt;The component makes a request for a list of public repos and, to introduce some variability, randomly selects the sort direction before rendering.&lt;/p&gt;

&lt;p&gt;If you'd like to follow along, create a Next.js project(&lt;strong&gt;yarn create next-app&lt;/strong&gt;) and replace the code found in pages/index.js.&lt;/p&gt;

&lt;h3&gt;
  
  
  SSG(Static Site Generation)
&lt;/h3&gt;

&lt;p&gt;In the early days of the web, this is kinda how websites worked. HTML files were just stored on disc on a webserver and served up with each request. No trips to the database or rendering to be done; it's all ready to go and sitting there. Changing the page content is a bit of a pain, but because everything is pre-built and you're reading from disc, it's blazing-fast.&lt;/p&gt;

&lt;p&gt;With SSG your pages are generated during Next.js' build step, and once these pages are created, that's it. They don't change until you run the build step again. This step is probably done as part of a CI/CD deployment process, so updating the site means running through that again before heading to production.&lt;/p&gt;

&lt;p&gt;To try this out, &lt;strong&gt;yarn build&lt;/strong&gt; and &lt;strong&gt;yarn start&lt;/strong&gt; your project. Once your site is ready, hit refresh a few times and you'll notice the order of the list never changes. That's because only one request for data was made, and that happened at build time.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export default function Repos({ repos }) {
  return (
    &amp;lt;&amp;gt;
      &amp;lt;h3&amp;gt;Repos&amp;lt;/h3&amp;gt;
      &amp;lt;dl&amp;gt;
        {repos.map((repo, index) =&amp;gt; (
          &amp;lt;div key={index}&amp;gt;
            &amp;lt;dt&amp;gt;{repo.name}&amp;lt;/dt&amp;gt;
            &amp;lt;dd&amp;gt;{repo.description}&amp;lt;/dd&amp;gt;
          &amp;lt;/div&amp;gt;
        ))}
      &amp;lt;/dl&amp;gt;
    &amp;lt;/&amp;gt;
  );
};

export async function getStaticProps() {
  const direction = Math.floor(Math.random() * 2) ? 'asc' : 'desc';
  const res = await fetch(`https://api.github.com/users/vercel/repos?sort=created&amp;amp;direction=${direction}`, {
    headers: {
      'Accept': 'application/vnd.github.v3+json'
    }
  });
  const repos = await res.json();

  return {
    props: {
      repos
    },
  };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  SSR(Server Side Rendering)
&lt;/h3&gt;

&lt;p&gt;Pages are generated and returned on each page view. Think about how a traditional WordPress site works (also think about how slow WordPress can be). The slowness (lag) is introduced by network requests for data (DB &amp;amp; API calls, etc.). The benefit is that as the content creator edits the page data in the CMS, those changes are reflected to the user right away.&lt;/p&gt;

&lt;p&gt;Definitely a convenient way to manage your website, but your pages are going to be kinda slow to load.&lt;/p&gt;

&lt;p&gt;Try building &amp;amp; running your project once again and notice how the sort order flips between ascending and descending when you refresh(do it a few times). That's because each refresh makes a new request to the API for data.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export default function Repos({ repos }) {
  return (
    &amp;lt;&amp;gt;
      &amp;lt;h3&amp;gt;Repos&amp;lt;/h3&amp;gt;
      &amp;lt;dl&amp;gt;
        {repos.map((repo, index) =&amp;gt; (
          &amp;lt;div key={index}&amp;gt;
            &amp;lt;dt&amp;gt;{repo.name}&amp;lt;/dt&amp;gt;
            &amp;lt;dd&amp;gt;{repo.description}&amp;lt;/dd&amp;gt;
          &amp;lt;/div&amp;gt;
        ))}
      &amp;lt;/dl&amp;gt;
    &amp;lt;/&amp;gt;
  );
};

export async function getServerSideProps() {
  const direction = Math.random() &amp;lt; 0.5 ? 'asc' : 'desc';
  const res = await fetch(`https://api.github.com/users/vercel/repos?sort=created&amp;amp;direction=${direction}`, {
    headers: {
      'Accept': 'application/vnd.github.v3+json'
    }
  });
  const repos = await res.json();

  return {
    props: {
      repos
    },
  };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  ISR(Incremental Static Regeneration)
&lt;/h3&gt;

&lt;p&gt;For most, this is kind of the best of both worlds. ISR will create each page using SSR initially and store the response the same way SSG does. Subsequent requests for the same page will now be read from disk and be, you know, super-fast.&lt;/p&gt;

&lt;p&gt;Content creators can make changes to their pages using a CMS and after the expiration period has lapsed, view their changes. What's really cool here is that Next.js is basically creating a caching layer for you. When it's time to regenerate the page, it'll do it behind the scenes and, once complete, replace the SSG page for future reads. Amazing stuff, really.&lt;/p&gt;

&lt;p&gt;Because this version of the code is only going to request new data every 10 seconds(&lt;strong&gt;revalidate: 10&lt;/strong&gt;), I added a timestamp to the component. So once you've built and run it, refresh a few times and you'll notice that the timestamp doesn't change after the initial render. Refresh 10 seconds or so later and you'll see the timestamp change(and hopefully the sort direction).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export default function Repos({ repos, time }) {
  return (
    &amp;lt;&amp;gt;
      &amp;lt;h3&amp;gt;Repos&amp;lt;/h3&amp;gt;
      &amp;lt;em&amp;gt;{time}&amp;lt;/em&amp;gt;
      &amp;lt;dl&amp;gt;
        {repos.map((repo, index) =&amp;gt; (
          &amp;lt;div key={index}&amp;gt;
            &amp;lt;dt&amp;gt;{repo.name}&amp;lt;/dt&amp;gt;
            &amp;lt;dd&amp;gt;{repo.description}&amp;lt;/dd&amp;gt;
          &amp;lt;/div&amp;gt;
        ))}
      &amp;lt;/dl&amp;gt;
    &amp;lt;/&amp;gt;
  );
};


export async function getStaticProps() {
  const direction = Math.random() &amp;lt; 0.5 ? 'asc' : 'desc';
  const res = await fetch(`https://api.github.com/users/vercel/repos?sort=created&amp;amp;direction=${direction}`, {
    headers: {
      'Accept': 'application/vnd.github.v3+json'
    }
  });
  const repos = await res.json();
  const time = new Date().toISOString();

  return {
    props: {
      repos,
      time
    },
    revalidate: 10
  };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It's pretty easy to see why Next.js is so popular!&lt;/p&gt;

&lt;p&gt;And if you’re having a hard time selecting the best rendering method, go for ISR :)&lt;/p&gt;

&lt;p&gt;Thanks for reading.&lt;/p&gt;

</description>
      <category>nextjs</category>
    </item>
    <item>
      <title>Update your mapping &amp; reindex ElasticSearch easily...well, pretty easily</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Mon, 07 Jun 2021 19:05:26 +0000</pubDate>
      <link>https://dev.to/danstanhope/update-your-mapping-reindex-elasticsearch-easily-well-pretty-easily-29ih</link>
      <guid>https://dev.to/danstanhope/update-your-mapping-reindex-elasticsearch-easily-well-pretty-easily-29ih</guid>
      <description>&lt;p&gt;If you've ever worked with ElasticSearch you've probably had to alter your original mapping and reindex your cluster's data. Unless you're super-awesome at predicting the search functionality you need from ElasticSearch and got your mapping perfect the first time. If this is you, no need to finish reading this post.&lt;/p&gt;

&lt;p&gt;Tinkering with data in production isn't the most fun job in the world and I've never really found step-by-step instructions on how to properly/safely perform a reindex. So that's what this post is all about. Hopefully it'll help someone!&lt;/p&gt;

&lt;p&gt;As always, test this out on your local, staging or any non-production ElasticSearch cluster before assuming it works. How many records you're reindexing will be one of the determining factors in how long it takes to process.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create your new index + mapping
&lt;/h3&gt;

&lt;p&gt;The first step is to create an entirely new index with your new mapping.&lt;/p&gt;

&lt;p&gt;Let's say our ElasticSearch is running on &lt;a href="http://127.0.0.1:9201"&gt;http://127.0.0.1:9201&lt;/a&gt;, our old index is called &lt;code&gt;testv1&lt;/code&gt; and our new index &lt;code&gt;testv2&lt;/code&gt; .&lt;/p&gt;

&lt;p&gt;For brevity, the mapping is empty.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -XPUT --header 'Content-Type: application/json' http://127.0.0.1:9201/testv2 -d '
{
  "settings": {},
  "mappings": {
    "_doc": {
      "properties": {}
    }
  }
}
'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Reindex your data
&lt;/h3&gt;

&lt;p&gt;Here's where we actually begin running the data from your old index into your new one. One thing to note is the &lt;code&gt;wait_for_completion&lt;/code&gt; flag. When this is set to false, a task will be created, allowing you to issue subsequent requests to get progress updates. Without it, the command will just hang and you'll likely receive a &lt;code&gt;curl&lt;/code&gt; timeout with no way of knowing when the reindexing is done.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -XPOST --header 'Content-Type: application/json' http://127.0.0.1:9201/_reindex?wait_for_completion=false -d  '
{
  "source" : {
    "index" : "testv1"
  },
  "dest" : {
    "index" : "testv2",
    "version_type" : "external"
  }
}
'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This call will return a task ID that can be used to get status updates, like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -XGET --header 'Content-Type: application/json' http://127.0.0.1:9201/_tasks/&amp;lt;TASK_ID&amp;gt;?pretty=true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once your reindexing task is complete, you can verify the document counts in your old index match the new one.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -XGET http://127.0.0.1:9201/testv1/_count
curl -XGET http://127.0.0.1:9201/testv2/_count
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you're satisfied that everything is looking good, move on to the last step: creating an alias and telling people how awesome you are at ElasticSearch.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create an alias pointing to your new index and remove the old one
&lt;/h3&gt;

&lt;p&gt;You likely don't want to have to alter all your queries to use the new index name. Fortunately, ElasticSearch allows you to create aliases. The idea is that we create an alias with the same name as the original index, but point it to the new one. This way there are no changes to be made to our queries.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -XPOST --header 'Content-Type: application/json' http://127.0.0.1:9201/_aliases -d  '
{
  "actions" : [
    {
      "add" : {
        "index" : "testv2",
        "alias" : "testv1"
      }
    },
    {
      "remove_index" : {
        "index" : "testv1"
      }
    }
  ]
}
'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Voilà! You're done.&lt;/p&gt;

&lt;p&gt;As an aside, there are things you can do to improve reindexing performance, like changing the &lt;code&gt;refresh_interval&lt;/code&gt; and reducing the number of replicas. Whether or not you need this will largely depend on the size of your cluster and the underlying hardware.&lt;/p&gt;

&lt;p&gt;Here's a sample of how to do this, should you need to. Make sure you run this prior to starting the reindex command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -XPUT --header 'Content-Type: application/json' http://127.0.0.1:9201/testv2/_settings -d  '
{
  "index" : {
    "refresh_interval" : "-1",
    "number_of_replicas": "0"
  }
}
'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Make sure to set these values back to whatever you feel is best once the new index is fully populated.&lt;/p&gt;

&lt;p&gt;Thanks!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>elasticsearch</category>
      <category>devops</category>
    </item>
    <item>
      <title>Migrating DynamoDB data using Lambda + Streams</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Fri, 19 Mar 2021 19:44:11 +0000</pubDate>
      <link>https://dev.to/danstanhope/migrating-dynamodb-data-using-lamba-streams-2e3m</link>
      <guid>https://dev.to/danstanhope/migrating-dynamodb-data-using-lamba-streams-2e3m</guid>
      <description>&lt;h3&gt;
  
  
  The scenario
&lt;/h3&gt;

&lt;p&gt;You've got an existing DynamoDB table and you'd like to migrate the data to another table. Or, you've got some data that pre-dates whenever you enabled streams and lined up that Lambda event listener. What's the move?&lt;/p&gt;

&lt;h3&gt;
  
  
  First, what are Streams?
&lt;/h3&gt;

&lt;p&gt;When records are added or updated in your DynamoDB table &lt;em&gt;change data&lt;/em&gt; is created and added to an event stream. This stream is super-easy to monitor &amp;amp; consume with a Lambda function. Basically, as records change data is added to a stream and you can capture that data with a Lambda function in near-realtime. Sweet.&lt;/p&gt;
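Consuming that stream from Lambda looks roughly like this(a minimal sketch -- the event shape matches what DynamoDB Streams delivers to a Lambda, but the handler name and forwarding step are illustrative):

```javascript
// Minimal sketch of a stream consumer. event.Records is the batch of
// change records DynamoDB Streams hands to the Lambda invocation.
async function handler(event) {
  let processed = 0;
  for (const record of event.Records) {
    if (record.eventName === 'INSERT' || record.eventName === 'MODIFY') {
      const item = record.dynamodb.NewImage; // attribute-typed new state of the record
      // forward `item` to ElasticSearch, SQS, etc. here
      processed += 1;
    }
  }
  return { processed };
}
```

Each invocation receives a batch of records, so one Lambda call can forward many changes at once.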

&lt;p&gt;One thing to note, event stream data is only stored for 24hrs, after which it's gone. Or is it? It is.&lt;/p&gt;

&lt;p&gt;A common pattern, utilizing streams, is to write to a table, process the change data using Lambda and write to another location(i.e ElasticSearch, SQS). Maybe the data gets transformed a little bit along the way, too.&lt;/p&gt;

&lt;p&gt;Let's say this is something you're doing -- you've got a nice pipeline running that sends data from dynamodb -&amp;gt; lambda -&amp;gt; elasticsearch, but some old data in the table arrived before the stream was enabled. You can write a script that scans/queries the table(selecting by a date range, perhaps?) and updates each entry with a flag(&lt;em&gt;pre_existing_processed in our case, but change it to whatever you like&lt;/em&gt;). Updating an existing record creates new change data and writes it to the event stream. Pretty cool!&lt;/p&gt;
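A single record's update can be sketched like this(the helper, table and key names are mine, not from the repo -- it just builds the UpdateItem parameters that stamp the flag, which in turn pushes fresh change data onto the stream):

```javascript
// Hypothetical helper: build UpdateItem parameters that set the
// pre_existing_processed flag on one record. Key shape is illustrative.
function buildFlagUpdate(tableName, key) {
  return {
    TableName: tableName,
    Key: key,
    UpdateExpression: 'SET pre_existing_processed = :flag',
    ExpressionAttributeValues: { ':flag': { BOOL: true } },
  };
}
```

The resulting object is what you'd pass to the DynamoDB client's update call for each record.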

&lt;h3&gt;
  
  
  The Code
&lt;/h3&gt;

&lt;p&gt;I've created a small project that runs a paginated(&lt;em&gt;DynamoDB will return up to 1MB worth of data per page&lt;/em&gt;) query and performs a bulk update(&lt;em&gt;AWS allows a max. of 25 records per bulk update&lt;/em&gt;).&lt;/p&gt;
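That 25-record ceiling means the script has to chunk records before each bulk write. A minimal sketch of the chunking(helper name is mine, not from the repo):

```javascript
// Hypothetical helper: split records into batches of at most `size`,
// defaulting to 25 -- the per-call maximum for a DynamoDB bulk update.
function chunkRecords(records, size) {
  const batchSize = size || 25;
  const batches = [];
  let rest = records.slice();
  while (rest.length) {
    batches.push(rest.slice(0, batchSize));
    rest = rest.slice(batchSize);
  }
  return batches;
}
```

Each batch then becomes one bulk-update request.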

&lt;p&gt;Clone the GitHub repo &lt;a href="https://github.com/danstanhope/migrate-data-dynamodb-streams-lambda"&gt;here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure you update ./aws_keys.json with AWS credentials that have access to DynamoDB before starting.&lt;/p&gt;

&lt;p&gt;It's important to note that you'll likely need to increase your table's read/write capacity -- which comes at a cost.&lt;/p&gt;

&lt;p&gt;Start by adding the requisite packages:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;yarn&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;Run the script, passing your table name and batch size:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;node migrate.js -t &amp;lt;YOUR_TABLE&amp;gt; -b &amp;lt;BATCH_SIZE&amp;gt;&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;There's also a batch limit parameter, in case you only want to run a set number of batches. Remember that depending on how much data you've got, it could take a long time to run. I recommend testing with a small batch size first to ensure everything runs how you expect it to.&lt;/p&gt;

&lt;p&gt;This approach can be used to process millions of legacy/pre-existing records...but it'll take some time 😊&lt;/p&gt;

&lt;p&gt;As always, be careful running this code and make sure you know the cost implications etc.&lt;/p&gt;

&lt;p&gt;Hope this helps!&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>node</category>
      <category>aws</category>
      <category>dynamodb</category>
    </item>
    <item>
      <title>React file upload using S3 pre-signed urls</title>
      <dc:creator>Dan Stanhope</dc:creator>
      <pubDate>Tue, 16 Feb 2021 22:05:46 +0000</pubDate>
      <link>https://dev.to/danstanhope/react-file-upload-using-s3-pre-signed-urls-1a6d</link>
      <guid>https://dev.to/danstanhope/react-file-upload-using-s3-pre-signed-urls-1a6d</guid>
      <description>&lt;p&gt;&lt;em&gt;Don't forget to like!&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  What are we building?
&lt;/h3&gt;

&lt;p&gt;We're going to create a lambda function that generates a pre-signed url as well as a react front-end utilizing a really cool component library!&lt;/p&gt;

&lt;p&gt;Traditionally, uploading files could be a bit of a pain to implement &amp;amp; manage. Fortunately, AWS allows you to upload objects directly to an S3 bucket using pre-signed urls. Pre-signed urls come with an expiration date, so you need to start your upload before it expires, otherwise access is denied.&lt;/p&gt;
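Assuming SigV4 signing, a pre-signed url carries its lifetime right in the X-Amz-Date and X-Amz-Expires query parameters, so you can compute the cutoff yourself. A small sketch(not part of the tutorial code):

```javascript
// Hypothetical helper: given a SigV4-style pre-signed url, work out the
// moment it expires. X-Amz-Date is the signing time (e.g. 20210216T220546Z)
// and X-Amz-Expires is the lifetime in seconds.
function presignedUrlExpiry(url) {
  const params = new URL(url).searchParams;
  const stamp = params.get('X-Amz-Date');
  const ttl = Number(params.get('X-Amz-Expires'));
  // Re-punctuate the compact timestamp into ISO 8601 so Date.parse accepts it.
  const iso = stamp.replace(
    /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
    '$1-$2-$3T$4:$5:$6Z'
  );
  return new Date(Date.parse(iso) + ttl * 1000);
}
```

Handy for deciding client-side whether to request a fresh url before starting an upload.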

&lt;h3&gt;
  
  
  Walk through time.
&lt;/h3&gt;

&lt;p&gt;The project is divided into two sections: the front-end and the back-end.&lt;/p&gt;

&lt;p&gt;Head over to github to grab the &lt;a href="https://github.com/danstanhope/react-upload-pre-signed-s3-urls" rel="noopener noreferrer"&gt;code&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Back-end
&lt;/h3&gt;

&lt;p&gt;We're going to be using CloudFormation and AWS SAM to create and deploy our Lambda function as well as create our S3 bucket. This function, when called, is going to generate our pre-signed url for us. You could just as easily host this code within your own API, too.&lt;/p&gt;

&lt;p&gt;Firstly, make sure you've got aws-cli and aws-sam-cli installed and configured(setting up your keys &amp;amp; region etc.). Here's how to do &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html" rel="noopener noreferrer"&gt;this&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Once you're all set up and ready to go, all you need to do is run &lt;code&gt;sam build&lt;/code&gt; followed by &lt;code&gt;sam deploy --guided&lt;/code&gt; from inside the lambda function's root folder. SAM cli will guide you through the deployment and, once successful, you'll have a newly created S3 bucket and lambda function.&lt;/p&gt;

&lt;p&gt;Make sure you copy your lambda function's api gateway url, as you'll need to make one small change in the &lt;code&gt;Upload.js&lt;/code&gt; component. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskwuehzo0kd2jf8u1re1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskwuehzo0kd2jf8u1re1.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Front-end
&lt;/h3&gt;

&lt;p&gt;Update the &lt;code&gt;Upload.js&lt;/code&gt; component with your API endpoint.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const SignedUploadDragAndDrop = () =&amp;gt; {
  useRequestPreSend(async ({ items, options }) =&amp;gt; {
    const files = items.length &amp;gt; 0 ? items[0] : {};

    let { file } = files;
    let { name, type } = file;
    let gateway = '&amp;lt;YOUR APIGATEWAY ENDPOINT URL&amp;gt;';

    const response = await axios(
      `${gateway}?` +
      new URLSearchParams({
        name,
        type,
      })
    );

   ....
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After this, just run &lt;code&gt;yarn&lt;/code&gt; and &lt;code&gt;yarn start&lt;/code&gt; from inside the frontend folder and you should end up with a page that looks like the one in this post's hero image.&lt;/p&gt;

&lt;p&gt;I've used a seriously awesome component library called React-uploady for this tutorial. Specifically, I've combined its upload button, drag-and-drop and progress components. But there are several others you can add on. &lt;a href="https://github.com/rpldy/react-uploady" rel="noopener noreferrer"&gt;Check it out!&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you select a file to upload, a request is made to retrieve the pre-signed url and, once returned, the upload begins. Pretty sweet.&lt;/p&gt;

&lt;p&gt;Hope this helps!&lt;/p&gt;

</description>
      <category>react</category>
      <category>serverless</category>
      <category>node</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
