<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Rachael Tatman</title>
    <description>The latest articles on DEV Community by Rachael Tatman (@rctatman).</description>
    <link>https://dev.to/rctatman</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F267949%2Fad871443-95c3-4d0d-947c-00c411c68e2e.jpg</url>
      <title>DEV Community: Rachael Tatman</title>
      <link>https://dev.to/rctatman</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/rctatman"/>
    <language>en</language>
    <item>
      <title>The same five Docker commands I have to look up every dang time (and one I always try to use that never works) </title>
      <dc:creator>Rachael Tatman</dc:creator>
      <pubDate>Wed, 09 Jun 2021 21:56:37 +0000</pubDate>
      <link>https://dev.to/rctatman/the-same-five-docker-commands-i-have-to-look-up-every-dang-time-and-one-i-always-try-to-use-that-never-works-24ah</link>
      <guid>https://dev.to/rctatman/the-same-five-docker-commands-i-have-to-look-up-every-dang-time-and-one-i-always-try-to-use-that-never-works-24ah</guid>
      <description>&lt;p&gt;I really don't know what it is about Docker (maybe that I need to mess with containers fairly infrequently, maybe that it's just different enough from git that I get confused between the two, absolutely that I don't remember which commands are for images and which are for containers) but for some reason I find myself looking up the same five commands every dang time. &lt;/p&gt;

&lt;p&gt;This post is primarily for me, so that I can stop digging through the Docker docs, but I figured some other folks might have the same problem as well. &lt;/p&gt;

&lt;h2&gt;Looking at stuff&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;docker ps&lt;/code&gt; - lists running &lt;em&gt;containers&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;docker image ls&lt;/code&gt; - lists images (not containers) stored locally&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;docker logs [container_name]&lt;/code&gt; - the container name will be at the end of the line of output of &lt;code&gt;docker ps&lt;/code&gt; (images don't have logs; they're like classes)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Creating &amp;amp; uploading a new image&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;docker build . -t username/image:tag&lt;/code&gt; - builds a new image from the Dockerfile and build context in the current directory&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;docker commit&lt;/code&gt; - this commits a &lt;em&gt;container&lt;/em&gt; (as a new image), not an image; stop using it to try to push images to Docker Hub, Rachael, it 100% does not work&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;docker push username/image:tag&lt;/code&gt; - pushes a built image to Docker Hub (you do not need to commit images, stop trying to commit images)&lt;/li&gt;
&lt;/ul&gt;
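Put together, the whole publish workflow is just build then push, with no commit step. Here's a minimal sketch (`username/myapp:latest` is a hypothetical image name, and the script only runs the Docker commands when Docker and a Dockerfile are actually present):

```shell
#!/usr/bin/env sh
# Hypothetical image name -- substitute your own Docker Hub username/repo/tag.
IMAGE="username/myapp:latest"

# Build from the Dockerfile in the current directory, then push.
# No 'docker commit' anywhere: that's for snapshotting running containers.
if command -v docker >/dev/null 2>&1 && [ -f Dockerfile ]; then
  docker build . -t "$IMAGE"
  docker push "$IMAGE"
else
  echo "would run: docker build . -t $IMAGE && docker push $IMAGE"
fi
```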

&lt;h2&gt;Bonus: Docker compose&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;docker-compose down&lt;/code&gt; - use this instead of &lt;code&gt;docker-compose kill&lt;/code&gt;; otherwise, when you restart the container it will still use the old image&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;docker-compose up -d&lt;/code&gt; - use this to get new containers set up (&lt;code&gt;restart&lt;/code&gt; will just wake up your containers with the same images); the detached flag keeps you from getting log spam&lt;/li&gt;
&lt;/ul&gt;
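For reference, these commands assume a `docker-compose.yml` along these lines. This is a hypothetical minimal sketch; the service name, image, and port mapping are made up:

```yaml
# Minimal hypothetical docker-compose.yml for the commands above.
version: "3"
services:
  web:
    image: username/image:tag   # the image built and pushed earlier
    ports:
      - "8080:8080"
```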

</description>
      <category>docker</category>
    </item>
    <item>
      <title>So you work in tech and want to support Black Lives Matter</title>
      <dc:creator>Rachael Tatman</dc:creator>
      <pubDate>Wed, 03 Jun 2020 20:24:44 +0000</pubDate>
      <link>https://dev.to/rctatman/so-you-work-in-tech-and-want-to-support-black-lives-matter-410m</link>
      <guid>https://dev.to/rctatman/so-you-work-in-tech-and-want-to-support-black-lives-matter-410m</guid>
      <description>&lt;p&gt;Hello my friends. Especially if you're in the United States, it's a pretty scary time right now. Protests against police brutality and lack of accountability have led to a violent escalation by the police across the country. It's easy to feel overwhelmed and helpless, especially if police brutality isn't an issue that's been on your radar. I've written this guide for &lt;strong&gt;my fellow non-Black folks in tech&lt;/strong&gt; to help you figure out for yourself what specific actions you can take to help, now and into the future.&lt;/p&gt;

&lt;h2&gt;1. Handle your immediate physical stress.&lt;/h2&gt;

&lt;p&gt;It's hard to stay focused and take action when your body is busy freaking out. When you have time in the future, I really recommend the book &lt;a href="https://www.burnoutbook.net/"&gt;Burnout: The Secret to Unlocking the Stress Cycle&lt;/a&gt; for a deeper discussion into handling short and long term stress, but for now that can wait.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set aside a specific time to catch up on the news and social media and when that time's up, turn it off and take action instead. Staying informed is great! Causing yourself needless anxiety by seeking out images of violence is not.&lt;/li&gt;
&lt;li&gt;Spend at least 5-10 minutes doing a physical activity that gets your heart rate up. I personally really like jumping rope, but even something like vacuuming or a quick (spatially distant) walk will help you handle your physical stress response.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;2. Pick a specific area to focus your efforts on&lt;/h2&gt;

&lt;p&gt;I'm a big believer in starting where you are right now. "Fighting racism" is a great idea but it's not exactly a concrete action plan. What can you do right now, right where you are? To avoid feeling overloaded, I would recommend picking a specific topic for your advocacy that you are both interested in and knowledgeable about. For me, that's racial bias in machine learning. &lt;/p&gt;

&lt;h2&gt;3. Educate yourself about existing efforts&lt;/h2&gt;

&lt;p&gt;Especially if you're new to the specific issue you're interested in, seek out organizations that are already working on it and read through their materials. You don't want to repeat work that's already been done and you also want to avoid unintentionally hindering existing efforts or speaking over folks that have been doing this work for a while. &lt;/p&gt;

&lt;p&gt;For a general introduction, &lt;a href="https://airtable.com/shr4E7n2GLi9qYp3I/tbl2cvGXcgx6LXtAP"&gt;this list&lt;/a&gt; put together by &lt;a href="https://www.blacktechforblacklives.com/"&gt;Black Tech for Black Lives&lt;/a&gt; has curated educational resources and specific calls to action.&lt;/p&gt;

&lt;p&gt;Here are some relevant, more specific organizations that I'm familiar with and the causes they support. Feel free to comment with others.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If you want to help support Black researchers in AI: &lt;a href="https://blackinai.github.io/"&gt;Black in AI&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;If you want to assist with data collection and analysis (on a large number of topics): &lt;a href="http://d4bl.org/"&gt;Data4BlackLives&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;If you want to support legislation to ensure equitable and accountable AI: &lt;a href="https://www.ajlunited.org/"&gt;Algorithmic Justice League&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;If you want to support research into inequality and bias in machine learning and AI: &lt;a href="https://ainowinstitute.org/"&gt;AI Now Institute&lt;/a&gt; or &lt;a href="https://datasociety.net/"&gt;Data &amp;amp; Society&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;If you want to support a data driven approach to police reform: &lt;a href="https://www.joincampaignzero.org/"&gt;Campaign Zero&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The reason I say educate &lt;em&gt;yourself&lt;/em&gt; here is that I'd recommend against reaching out to specific individuals for information right now. Your Black friends and colleagues in particular are dealing with a lot. It's never fair to ask them to do the unpaid labor of educating you, and it's even less fair right now.&lt;/p&gt;

&lt;h2&gt;4. Commit to a consistent action&lt;/h2&gt;

&lt;p&gt;Now that you've educated yourself about specific changes you want to see, how can you go about making sure they happen? For example, I want Seattle to institute a ban on the use of facial recognition by police or city agencies. How can I work towards this, not just today and next week but until it happens?&lt;/p&gt;

&lt;p&gt;Your biggest enemy here is fatigue. There's a lot going on in the world and it's important to set up a future plan for yourself right now, while you have the time and energy, to ensure you have the biggest impact. Some ideas for specific action items:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Advocate within your company.&lt;/strong&gt; Commit to contributing to an existing internal organization or creating one if none exists. As tech workers we have the ability to stop unethical products from coming on the market at all rather than just mitigate the harm they do once they're launched. &lt;a href="https://portside.org/2020-01-27/how-organize-your-workplace-without-getting-caught"&gt;This guide&lt;/a&gt; has some tips on how to organize securely.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Donate on a schedule.&lt;/strong&gt; A lot of relevant charities are flooded with donations right now. Which is great... but what about six months from now? What about next year? Set up scheduled donations so that they have consistent income for sustained action. Make sure to take advantage of matching if your employer offers it. (And if they don't, ask if they will.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Advocate for legislation.&lt;/strong&gt; Once you've identified specific pieces of legislation you want to advocate for, reach out to your representatives directly. It's important to make sure you consider every level of representation here. Federal measures are important, but so are state and municipal ones.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Volunteer.&lt;/strong&gt; Many of the organizations I linked above are actively looking for volunteers. If you're an AI researcher, you can review papers for Black in AI. If you're a data scientist, you can sign up to volunteer for Data 4 Black Lives. Setting time aside to lend your time and skills to existing efforts can be an excellent way to have an impact.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Set aside time to learn.&lt;/strong&gt; You will mess up. You will be ignorant. Set aside designated time to learn more, to read relevant research and writing. Follow Black technologists already working on the problems you're interested in. Make space to listen to them. We have all been raised in a society built on a foundation of white supremacy; it will take time to understand all the ways that's affected both you and the world around you.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Prioritize spending with Black-owned businesses.&lt;/strong&gt; In the United States, racial inequality is inexorably linked to wealth inequality. (If this is news to you, &lt;a href="https://www.forbes.com/sites/brianthompson1/2018/02/18/the-racial-wealth-gap-addressing-americas-most-pressing-epidemic/#10e67d1b7a48"&gt;this piece in Forbes is a good overview&lt;/a&gt;.) Make it a personal priority to support Black-owned businesses where possible.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;An example action plan&lt;/h2&gt;

&lt;p&gt;For my current specific item--a city-level facial recognition ban--I've taken the following actions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;contacted my city council representatives by phone and email (and set myself a calendar reminder to check back in every two weeks)&lt;/li&gt;
&lt;li&gt;started the process to schedule an in-person (well, in-video-chat) meeting with my district's city council representative&lt;/li&gt;
&lt;li&gt;set up recurring donations to Algorithmic Justice League and Fight for the Future, two organizations working towards the goal&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're interested in supporting this particular cause, Fight for the Future &lt;a href="https://docs.google.com/document/d/1_lm1ON8Kwxx1PZAKUiX-L8XDv6uza7uCRVKrQAktcOI/edit?usp=sharing"&gt;has a handy checklist of things you can do here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Working to create sustainable, systematic change will take a lot of us working on lots of different aspects of the problem. The best time for you to start your piece of the work is right now. 🖤&lt;/p&gt;

</description>
    </item>
    <item>
      <title>My livecoding setup (Software + Hardware)</title>
      <dc:creator>Rachael Tatman</dc:creator>
      <pubDate>Mon, 13 Apr 2020 19:40:29 +0000</pubDate>
      <link>https://dev.to/rctatman/my-livecoding-setup-software-hardware-58kf</link>
      <guid>https://dev.to/rctatman/my-livecoding-setup-software-hardware-58kf</guid>
      <description>&lt;p&gt;Hello friends! Since &lt;a href="https://dev.to/rctatman/my-top-5-live-coding-tips-3ckm"&gt;my last post on livecoding tips&lt;/a&gt; I've gotten a lot of requests for more information on my streaming setup. This post is a quick overview of the software and hardware I use. Note that I'm not specifically endorsing any of these product or saying you &lt;em&gt;have&lt;/em&gt; to do what I do: but I remember how difficult it was to find information specifically about live coding setups when I started so I figured it would be helpful.&lt;/p&gt;

&lt;p&gt;If you want to join my streams, I stream every Wednesday and Friday at 9:00 AM pacific on &lt;a href="https://www.twitch.tv/rctatman/"&gt;my Twitch&lt;/a&gt; or &lt;a href="https://www.youtube.com/channel/UCJ0V6493mLvqdiVwOKWBODQ/live"&gt;the Rasa YouTube channel&lt;/a&gt;. &lt;/p&gt;

&lt;h2&gt;Software&lt;/h2&gt;

&lt;p&gt;First I should note that I stream on a Windows box (Windows 10 specifically), so this may not be as relevant for Mac/Linux folks. I did consider streaming from Linux when I first started, but I've had so many AV problems in the past that I finally decided it wasn't worth the time investment.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Streaming - &lt;a href="https://streamlabs.com/"&gt;Streamlabs OBS&lt;/a&gt; - Free and Open Source&lt;/li&gt;
&lt;li&gt;Streaming to two platforms at once - &lt;a href="https://restream.io/"&gt;Restream.io&lt;/a&gt; - Free version, but if you're streaming to a lot of platforms you might want to upgrade&lt;/li&gt;
&lt;li&gt;Coding - &lt;a href="https://code.visualstudio.com/"&gt;VS Code&lt;/a&gt; and &lt;a href="https://rstudio.com/"&gt;RStudio&lt;/a&gt; - both free&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For my livecoding I have a scene with three sources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Display capture for my entire monitor (right now &lt;a href="https://www.reddit.com/r/obs/comments/6nmfnm/obs_studio_visual_studio_code/"&gt;that's the easiest way to stream VS Code specifically&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Video capture for my webcam, &lt;a href="https://www.streamscheme.com/how-to-set-up-your-green-screen-in-streamlabs-obs-slobs/"&gt;with a filter for chromakeying to my green screen&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Audio input capture for my microphone&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I keep Streamlabs and the chats open on a second monitor, as well as a Chrome window to look things up in. I drag the relevant window over VS Code when I find something I want to share. This avoids accidentally sharing anything I don't want to, like a site that seemed helpful in the search results but looks spammy when I open it up.&lt;/p&gt;

&lt;h2&gt;Hardware&lt;/h2&gt;

&lt;p&gt;This is the hardware I use, which I've added piece by piece over the last couple years of live-coding. You don't need to use exactly what I use, but I've gotten enough requests for specifics that I figured it was worthwhile sharing.&lt;/p&gt;

&lt;p&gt;I've ranked these in order of how important they are.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Microphone - &lt;a href="https://www.bluedesigns.com/products/yeti"&gt;Yeti Blue, aka the same mic as pretty much every other internet content creator&lt;/a&gt; - $129.99&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You definitely don't need a camera to livecode: for the first year or so I only did voice over &amp;amp; it was completely fine. If you do decide you want a face cam, I'd do what I did at the start and just find a neutral background. (I used this pop up chair back thing for probably six months--I lived in a pretty small apartment and didn't have space for a dedicated "YouTube wall" to sit in front of.) &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Camera - &lt;a href="https://www.logitech.com/en-us/product/hd-pro-webcam-c920"&gt;C920 HD Pro webcam&lt;/a&gt; - $79.99&lt;/li&gt;
&lt;li&gt;Non-green screen background - &lt;a href="https://thewebaround.com/product/the-fan-favorite/"&gt;Webaround&lt;/a&gt; - $60.00

&lt;ul&gt;
&lt;li&gt;This is a little pop up circle you put on the back of your chair to hide the background. You can crop your video fairly close or use a circular effect filter so only you and the background are in frame.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you decide you really want a green screen and you have the extra RAM to do real-time chromakeying in addition to whatever you need for your coding, then I'd strongly recommend getting lighting at the same time. Flat, soft lighting is really the only way to make the outline of your silhouette look crisp. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lighting - &lt;a href="https://neewer.com/collections/lighting-controls-modifiers/products/lighting-studio-10093255"&gt;Neewer Bi-color Dimmable LED Softbox Lighting Kit with 20"x27" Softboxes&lt;/a&gt; - $119.99&lt;/li&gt;
&lt;li&gt;Green screen - &lt;a href="https://streamvalera.com/collections/explorer/products/greenscreen-explorer90"&gt;Explorer 90 Professional Green Screen&lt;/a&gt; - $139.99&lt;/li&gt;
&lt;li&gt;Translucent face powder - &lt;a href="https://www.sephora.com/product/translucent-loose-setting-powder-P109908?icid2=products%20grid:p109908&amp;amp;skuId=2250520"&gt;LAURA MERCIER Translucent Loose Setting Powder&lt;/a&gt; - $23.00

&lt;ul&gt;
&lt;li&gt;Using a loose translucent powder appropriate for your skin tone will help you stop looking shiny on camera. I'd recommend using some even if you don't otherwise wear makeup. It's easiest to apply using &lt;a href="https://www.ulta.com/powder-brush?productId=xlsImpprod3220077&amp;amp;sku=2229996"&gt;a big fluffy powder brush&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So, for your total budget (assuming you have a computer already):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Without face camera: $130&lt;/li&gt;
&lt;li&gt;Face camera without greenscreen: $270&lt;/li&gt;
&lt;li&gt;Face camera with greenscreen: $500&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I understand this is a lot for most people to spend on starting up a new project, so I'd like to remind you again: you don't &lt;em&gt;need&lt;/em&gt; any fancy equipment to get started beyond your computer and a microphone. And even the microphone doesn't need to be that fancy. &lt;/p&gt;

</description>
      <category>livecoding</category>
      <category>streaming</category>
      <category>windows</category>
      <category>devrel</category>
    </item>
    <item>
      <title>My top 5 live-coding tips</title>
      <dc:creator>Rachael Tatman</dc:creator>
      <pubDate>Tue, 31 Mar 2020 15:30:02 +0000</pubDate>
      <link>https://dev.to/rctatman/my-top-5-live-coding-tips-3ckm</link>
      <guid>https://dev.to/rctatman/my-top-5-live-coding-tips-3ckm</guid>
      <description>&lt;p&gt;Hello friends! As some of you may know, I've been livecoding at least once a week for several years now. (&lt;a href="https://www.twitch.tv/rctatman/"&gt;My Twitch is here&lt;/a&gt; if you want to tune in. Right now I'm working on a open source conversational assistant that will quiz you on your dialect &amp;amp; guess where you're from. 👀) &lt;/p&gt;

&lt;p&gt;In the past couple weeks I've had a number of interested or new livecoders reach out to me for tips. Because I started to repeat myself, I figured I'd follow the &lt;a href="https://twitter.com/drob/status/928447584712253440?s=20"&gt;David Robinson school of advice giving&lt;/a&gt; and write a blog post. 😁&lt;/p&gt;




&lt;h2&gt;🌿 Figure out what's sustainable &amp;amp; stick with it&lt;/h2&gt;

&lt;p&gt;Livecoding is different from a lot of other types of content production, like blogging. &lt;strong&gt;Think of it like a radio show: people want to know what time to tune in and, if they're big fans, they may even schedule things around it.&lt;/strong&gt; (I'll sometimes move my lunch around a bit so I can watch streams I especially enjoy while I eat.)&lt;/p&gt;

&lt;p&gt;With that in mind, try to find a frequency and duration that's sustainable for &lt;em&gt;you&lt;/em&gt; and stick with it. If you do change your schedule, make sure to communicate that to your viewers early and often. &lt;/p&gt;

&lt;h2&gt;😌 Make peace with making mistakes&lt;/h2&gt;

&lt;p&gt;You will make silly mistakes on stream. That's ok. No one writes bug-free code and that's especially true when your attention is divided by checking in with the chat and talking through what you're doing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What's more important than avoiding mistakes is showing how to correct them and talking through your process as you do this&lt;/strong&gt;. One of the most challenging parts of livecoding, I've found, is clearly communicating the different possibilities you're considering when you're trying to figure out what went wrong. &lt;/p&gt;

&lt;p&gt;If it helps, think about it like a coding interview. People aren't necessarily tuning in to see you do things perfectly the first time. They want to see how you work and share in your struggles &amp;amp; successes. (I've also found that folks who tune in to my streams are very helpful with suggesting different approaches!) &lt;/p&gt;

&lt;h2&gt;⏱️ Pick a task for your stream that you think will be quick&lt;/h2&gt;

&lt;p&gt;It generally takes me 3 to 5 times longer to livecode something than to code it on my own. &lt;strong&gt;For a one hour stream I'll try to pick something I think I could do in 20 minutes or so on my own.&lt;/strong&gt; Ideally, I want to start out with a concrete goal (like "I want to export a set of label encoders") that I have a good chance of achieving in the given time. That way I'm more likely to end my stream with a satisfying conclusion. &lt;/p&gt;

&lt;p&gt;On that same theme, I recommend having stream titles like "My Project: Updating Foo to version 16.0.2" instead of "My Project Part 150" to give folks a better idea of what you'll be doing. &lt;/p&gt;

&lt;h2&gt;🔂 Repeat yourself&lt;/h2&gt;

&lt;p&gt;Because it's live, you're likely to have people dropping in and out. You'll need to situate new viewers fairly often, so I end up intentionally repeating what I'm doing and the problem I'm currently working on much more often than I would otherwise. &lt;/p&gt;

&lt;p&gt;A good rule of thumb is to &lt;strong&gt;include a quick summary every ten to fifteen minutes or whenever you're waiting for something to load, train, compile or install&lt;/strong&gt;. Some folks instead prefer to have a card on the screen that summarizes the current project but I find I don't want to sacrifice any precious, precious screen real estate. &lt;/p&gt;

&lt;h2&gt;🚰 Hydrate&lt;/h2&gt;

&lt;p&gt;Streaming is rough on the voice and staying hydrated is very important for maintaining vocal health. &lt;strong&gt;Ideally you want to drink enough that you avoid ever feeling thirsty while streaming,&lt;/strong&gt; since by the time you feel thirsty you're already dehydrated. In a one hour stream I usually go through a 12 oz glass of water with electrolytes &amp;amp; a 12 oz cup of coffee or tea--but I do try not to chug them right at the beginning of the hour. (Just in case: it's a good idea to have a "BRB" scene set up you can switch to if you need to take a quick break.)&lt;/p&gt;




&lt;p&gt;This isn't an exhaustive list, but these are some of the tips I wish I'd known when I started live coding. Hopefully you find them equally helpful &amp;amp; I'll see you on stream! 😎&lt;/p&gt;

&lt;p&gt;And for reading to the end, have a &lt;strong&gt;bonus tip&lt;/strong&gt;: you could always zoom in a little more. &lt;a href="https://muchneeded.com/twitch-statistics/"&gt;More than a third of Twitch viewers are on mobile&lt;/a&gt; and I guarantee there's some text they're having a hard time seeing somewhere on your screen.&lt;/p&gt;

</description>
      <category>devrel</category>
      <category>learning</category>
      <category>twitch</category>
      <category>livestreaming</category>
    </item>
    <item>
      <title>Are BERT and other large language models conscious?</title>
      <dc:creator>Rachael Tatman</dc:creator>
      <pubDate>Wed, 13 Nov 2019 20:01:13 +0000</pubDate>
      <link>https://dev.to/rctatman/are-bert-and-other-large-language-models-conscious-4adf</link>
      <guid>https://dev.to/rctatman/are-bert-and-other-large-language-models-conscious-4adf</guid>
      <description>&lt;p&gt;NLP models that produce fluent-sounding text are coming into vogue again. (I say again because systems like &lt;a href="http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm" rel="noopener noreferrer"&gt;Eliza&lt;/a&gt; and &lt;a href="https://www.cs.princeton.edu/courses/archive/spring05/cos126/assignments/markov.html" rel="noopener noreferrer"&gt;Markov chain text generators&lt;/a&gt; have been around for decades.) A set of new systems trained using the &lt;a href="https://papers.nips.cc/paper/7181-attention-is-all-you-need" rel="noopener noreferrer"&gt;transformer deep learning architecture&lt;/a&gt; including BERT and GPT-2 have been setting new high water marks across various NLP leaderboards. It's an exciting time!&lt;/p&gt;

&lt;p&gt;The problem is that, along with that excitement, we see an increasing desire to assign human-like cognition to text generated by NLP systems. Take this tweet for example:&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1193933587017420800-644" src="https://platform.twitter.com/embed/Tweet.html?id=1193933587017420800"&gt;&lt;/iframe&gt;&lt;/p&gt;

&lt;p&gt;First I want to make it very clear that &lt;em&gt;I'm not trying to dunk on Tyler here&lt;/em&gt;. I've seen similar questions asked by lots of very smart folks and I think it's a perfectly reasonable thing to wonder about.&lt;/p&gt;

&lt;p&gt;I genuinely understand the desire to ascribe consciousness to ML systems. After all, folks have been hollering about AGI and the singularity for years. And humans have a &lt;a href="https://www.frontiersin.org/research-topics/6772/the-cognitive-underpinnings-of-anthropomorphism" rel="noopener noreferrer"&gt;deep seated desire to see human qualities&lt;/a&gt; in non-human things. &lt;/p&gt;

&lt;p&gt;That said, this very natural tendency, compounded by the fever-pitch hype cycle and cherry-picked examples, could lead a casual observer of the field to start wondering: are these systems genuinely showing patterns of humanlike thought? &lt;/p&gt;

&lt;p&gt;Short answer: no.&lt;/p&gt;

&lt;p&gt;Systems like BERT and GPT-2 do not have consciousness. They don't understand language in &lt;a href="https://ehudreiter.com/2018/09/13/language-grounding/" rel="noopener noreferrer"&gt;a grounded way&lt;/a&gt;. They don't keep track of information between different generated utterances. They don't "know" that down is the opposite of up or that three is more than two or that a child is a kind of human.&lt;/p&gt;

&lt;p&gt;What they do have is highly, highly optimized models of (usually English) words that humans tend to use together in specific orders. In other words, they're very good statistical approximations of patterns of language use. &lt;a href="https://www.aclweb.org/anthology/P19-1459/" rel="noopener noreferrer"&gt;This ACL paper&lt;/a&gt; has some good experimental results that provide evidence for this as well as some of the accompanying drawbacks. &lt;/p&gt;
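To make "statistical approximations of patterns of language use" concrete, here's a toy bigram generator in the spirit of the Markov-chain text generators mentioned earlier. It's a deliberately tiny sketch (the corpus is made up) and nothing like a transformer in scale, but the underlying move of "predict the next word from co-occurrence counts, with no grounding or world knowledge" is the same kind of thing:

```python
import random
from collections import defaultdict

# A toy bigram "language model": nothing but co-occurrence statistics
# over a tiny, made-up corpus. No grounding, no world knowledge.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Record which words were observed following each word.
followers = defaultdict(list)
for w1, w2 in zip(corpus, corpus[1:]):
    followers[w1].append(w2)

def generate(start, length, seed=0):
    """Generate text by repeatedly sampling a word seen after the previous one."""
    rng = random.Random(seed)
    words = [start]
    while len(words) < length:
        options = followers.get(words[-1])
        if not options:  # dead end: the last word never appeared mid-corpus
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(generate("the", 8))
```

Every "sentence" this produces is locally fluent (each word pair really did occur together) while meaning nothing at all, which is the point being made about much larger models above.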

&lt;p&gt;Why is this important?&lt;/p&gt;

&lt;p&gt;On the one hand, it's not! BERT, GPT-2 et al &lt;strong&gt;aren't designed to be grounded language models or include knowledge about relationships between entities&lt;/strong&gt;. There's absolutely nothing in the algorithm design or training data to ensure that text generated by these models is factual. This isn't a drawback of the models: it's just not in scope. &lt;/p&gt;

&lt;p&gt;On the other hand, it's very important that users of these models understand that this is the case. These are language models and, like all language models, they're designed to be components in larger NLP systems rather than an entire system in themselves.&lt;/p&gt;

&lt;p&gt;So, while it's definitely fun to play around with text generated by these models, it's akin to interacting with a parrot that's been taught to mimic your ringtone. It may sound like a phone, but it has none of the other features that make it one.&lt;/p&gt;

</description>
      <category>machinelearning</category>
      <category>nlp</category>
      <category>transformers</category>
      <category>deeplearning</category>
    </item>
  </channel>
</rss>
