<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Rafal Hofman</title>
    <description>The latest articles on DEV Community by Rafal Hofman (@rafhofman).</description>
    <link>https://dev.to/rafhofman</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F615026%2F71e98b48-b310-4201-b536-02d3c04da9a6.jpeg</url>
      <title>DEV Community: Rafal Hofman</title>
      <link>https://dev.to/rafhofman</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/rafhofman"/>
    <language>en</language>
    <item>
      <title>Sekurak MSHP CTF Summary - Part 1</title>
      <dc:creator>Rafal Hofman</dc:creator>
      <pubDate>Mon, 17 Oct 2022 15:25:29 +0000</pubDate>
      <link>https://dev.to/rafhofman/sekurak-mshp-ctf-summary-part-1-hh</link>
      <guid>https://dev.to/rafhofman/sekurak-mshp-ctf-summary-part-1-hh</guid>
      <description>&lt;p&gt;Recently (15.10-16.10) I took part in the Sekurak Mega Hacking Party CTF contest. For those who did not hear of it, CTF is kind of a security hackathon with pre-prepared tasks in which you have to find a flag within known vulnerabilities. This was the first time I have taken part in such a contest. It was quite interesting! Below you will find the first post in series describing the tasks which I solved or tried to solve ;).&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;strong&gt;deobf&lt;/strong&gt;
&lt;/h1&gt;

&lt;p&gt;So the first task was as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// deobfuscate the code, or call appropriate function after executing it, to get the flag
var _0x553b6f=_0x4c5c;(function(_0x1e3834,_0x3f47f5){var _0x5dc057=_0x4c5c,_0x3162e1=_0x1e3834();while(!![]){try{var _0x4d3ec8=parseInt(_0x5dc057(0xc5,'q)cg'))/0x1+parseInt(_0x5dc057(0xc9,'rLxo'))/0x2*(parseInt(_0x5dc057(0xc8,'Khqd'))/0x3)+-parseInt(_0x5dc057(0xb8,'ucN2'))/0x4*(parseInt(_0x5dc057(0xb7,'g0t9'))/0x5)+-parseInt(_0x5dc057(0xb6,'rW2u'))/0x6+parseInt(_0x5dc057(0xbe,'X0LD'))/0x7+parseInt(_0x5dc057(0xba,'KPPr'))/0x8*(-parseInt(_0x5dc057(0xbf,'9ewY'))/0x9)+parseInt(_0x5dc057(0xbb,'H%x$'))/0xa*(parseInt(_0x5dc057(0xcc,'rNIa'))/0xb);if(_0x4d3ec8===_0x3f47f5)break;else _0x3162e1['push'](_0x3162e1['shift']());}catch(_0x1ec551){_0x3162e1['push'](_0x3162e1['shift']());}}}(_0x4ade,0xade96),[][_0x553b6f(0xbc,'Kmu$')][_0x553b6f(0xc0,'De1O')]=()=&amp;gt;window[_0x553b6f(0xca,'xKir')](_0x553b6f(0xc2,'rW2u')));function _0x4c5c(_0x17c2b0,_0x231ba2){var _0x4adec6=_0x4ade();return _0x4c5c=function(_0x4c5c7a,_0x22dce2){_0x4c5c7a=_0x4c5c7a-0xb6;var _0x3f97df=_0x4adec6[_0x4c5c7a];if(_0x4c5c['KZZRud']===undefined){var _0x39ebb7=function(_0x507494){var _0x54c208='abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789+/=';var _0x2eca4b='',_0x180c44='';for(var _0x31cf09=0x0,_0x153548,_0x11dd12,_0x3e2f84=0x0;_0x11dd12=_0x507494['charAt'](_0x3e2f84++);~_0x11dd12&amp;amp;&amp;amp;(_0x153548=_0x31cf09%0x4?_0x153548*0x40+_0x11dd12:_0x11dd12,_0x31cf09++%0x4)?_0x2eca4b+=String['fromCharCode'](0xff&amp;amp;_0x153548&amp;gt;&amp;gt;(-0x2*_0x31cf09&amp;amp;0x6)):0x0){_0x11dd12=_0x54c208['indexOf'](_0x11dd12);}for(var _0x4d8e2b=0x0,_0x4afade=_0x2eca4b['length'];_0x4d8e2b&amp;lt;_0x4afade;_0x4d8e2b++){_0x180c44+='%'+('00'+_0x2eca4b['charCodeAt'](_0x4d8e2b)['toString'](0x10))['slice'](-0x2);}return decodeURIComponent(_0x180c44);};var _0x40e39c=function(_0x14c145,_0x38c081){var _0x181656=[],_0x3d9ee9=0x0,_0x3afb58,_0x3dd4ab='';_0x14c145=_0x39ebb7(_0x14c145);var 
_0x31f48b;for(_0x31f48b=0x0;_0x31f48b&amp;lt;0x100;_0x31f48b++){_0x181656[_0x31f48b]=_0x31f48b;}for(_0x31f48b=0x0;_0x31f48b&amp;lt;0x100;_0x31f48b++){_0x3d9ee9=(_0x3d9ee9+_0x181656[_0x31f48b]+_0x38c081['charCodeAt'](_0x31f48b%_0x38c081['length']))%0x100,_0x3afb58=_0x181656[_0x31f48b],_0x181656[_0x31f48b]=_0x181656[_0x3d9ee9],_0x181656[_0x3d9ee9]=_0x3afb58;}_0x31f48b=0x0,_0x3d9ee9=0x0;for(var _0x3b0565=0x0;_0x3b0565&amp;lt;_0x14c145['length'];_0x3b0565++){_0x31f48b=(_0x31f48b+0x1)%0x100,_0x3d9ee9=(_0x3d9ee9+_0x181656[_0x31f48b])%0x100,_0x3afb58=_0x181656[_0x31f48b],_0x181656[_0x31f48b]=_0x181656[_0x3d9ee9],_0x181656[_0x3d9ee9]=_0x3afb58,_0x3dd4ab+=String['fromCharCode'](_0x14c145['charCodeAt'](_0x3b0565)^_0x181656[(_0x181656[_0x31f48b]+_0x181656[_0x3d9ee9])%0x100]);}return _0x3dd4ab;};_0x4c5c['cCKsUi']=_0x40e39c,_0x17c2b0=arguments,_0x4c5c['KZZRud']=!![];}var _0x277549=_0x4adec6[0x0],_0x534e81=_0x4c5c7a+_0x277549,_0x4a09ac=_0x17c2b0[_0x534e81];return!_0x4a09ac?(_0x4c5c['nXMkoz']===undefined&amp;amp;&amp;amp;(_0x4c5c['nXMkoz']=!![]),_0x3f97df=_0x4c5c['cCKsUi'](_0x3f97df,_0x22dce2),_0x17c2b0[_0x534e81]=_0x3f97df):_0x3f97df=_0x4a09ac,_0x3f97df;},_0x4c5c(_0x17c2b0,_0x231ba2);}function _0x4ade(){var _0x2f84c2=['W6FcR8kFaCoHWOv8','x14FoxX0WQ3cVG','WQ4IWQRdIc8UW6CthW','cSoCW4i7t14EWPeKWQKBW6dcMW','vmoVja7cUSo+vb7dGhfRWRK','W6qpbriGWPCga8k9WRBcJrmz','dwvRW4xcHW','DSokWO3dNrekW4/cRa','WRDDWPSZc8oOW6ldV8kJrN1beComsXbiimosW53cHmoEe8kMea','W5/dImo/WPxcUCoxjmo9ehD6Bmou','W6FcU8obWPNdNCkiW7RcKCokE0xcRG','W5ldGSkbAZy/WQvyqrOGW4S','WOJcPaHrW7m/WPRcJIxdUr3dSa','gqrMsCkWWR3cNKyN','W6hdMxxcQ14qW5Pl','jxxcLCk+grNcGsmTW4PlDa','tq/cNWJcTq','cmk+W5hcL8o2WPxcOKtdSLZcSbi','WRDrW4VcImkwbmo2ySolq0ym','W4a7W65uuCkMWRNdGCkguwz6hW','fmo+q8kSW7dcLxdcKea','DmogW5NcUWGEW6/cTZi1','i8kxzmormmkyWRD1'];_0x4ade=function(){return _0x2f84c2;};return _0x4ade();}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Instead of full deobfuscation, which might have been too cumbersome, I formatted the code and looked into it.&lt;/p&gt;

&lt;p&gt;After some investigation, it looked like this function could be executed (which was also indicated in a hint for the task).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---F4bXP3b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qr0hrk2f1brig5fjxc2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---F4bXP3b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qr0hrk2f1brig5fjxc2w.png" alt="Image description" width="880" height="440"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I opened Chrome DevTools and executed the code in the console, calling the function. The result? &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2QeXJzlp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fa07gdtc85f7m4p0dl2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2QeXJzlp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fa07gdtc85f7m4p0dl2d.png" alt="Image description" width="880" height="301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I have the flag!&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;strong&gt;traversal&lt;/strong&gt;
&lt;/h1&gt;

&lt;p&gt;So in the next task, we got a page like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0H--m3Lv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dkpgjbn0anwkmzum484h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0H--m3Lv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dkpgjbn0anwkmzum484h.png" alt="Image description" width="880" height="895"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see, this is a web app written in .NET. After clicking on one of the files, I got the following view: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ySAIHYWh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ib3u5rl12g76oqqfd365.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ySAIHYWh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ib3u5rl12g76oqqfd365.png" alt="Image description" width="880" height="317"&gt;&lt;/a&gt;&lt;br&gt;
So it looks like files were referenced by the &lt;code&gt;filename&lt;/code&gt; query param.&lt;/p&gt;

&lt;p&gt;After looking at the code, it turned out that a path containing &lt;code&gt;..&lt;/code&gt; would throw a Bad Request error. I guess this was protection against other contestants reusing known path traversal payloads :D. A simple change from the CV file to the flag file gave the expected result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Xt2WAm-V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/t7g2pe08uhwp2oanjjy8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Xt2WAm-V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/t7g2pe08uhwp2oanjjy8.png" alt="Image description" width="880" height="150"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Bingo! &lt;/p&gt;

&lt;p&gt;In the next post in the series I will present other tasks which I tried to solve - even if I was not successful, I learned something valuable :)&lt;/p&gt;

</description>
      <category>security</category>
      <category>ctf</category>
      <category>hackathon</category>
      <category>vulnerability</category>
    </item>
    <item>
      <title>SQL Performance - pagination scalability</title>
      <dc:creator>Rafal Hofman</dc:creator>
      <pubDate>Thu, 26 Aug 2021 09:46:14 +0000</pubDate>
      <link>https://dev.to/rafhofman/sql-performance-pagination-scalability-46an</link>
      <guid>https://dev.to/rafhofman/sql-performance-pagination-scalability-46an</guid>
      <description>&lt;p&gt;Recently I have been reading &lt;a href="https://www.goodreads.com/book/show/17225810-sql-performance-explained"&gt;SQL Performance Explained by Markus Winand&lt;/a&gt; and I wanted to share with you what I have learned regarding SQL pagination scalability. &lt;/p&gt;

&lt;p&gt;If you were given the task of implementing pagination, you would probably do it with &lt;code&gt;LIMIT&lt;/code&gt; and &lt;code&gt;OFFSET&lt;/code&gt; in a query. This is completely fine, but it is good to know its limitations. &lt;/p&gt;
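&lt;p&gt;As a minimal sketch (assuming a hypothetical &lt;em&gt;articles&lt;/em&gt; table sorted by creation date), fetching the third page of 10 rows could look like:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;SELECT * FROM "articles" ORDER BY "created_at" DESC LIMIT 10 OFFSET 20;&lt;/code&gt;&lt;/p&gt;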

&lt;p&gt;The further you browse back in history (the larger the &lt;code&gt;OFFSET&lt;/code&gt;), the more the &lt;strong&gt;response time increases.&lt;/strong&gt; This is because the DB has to scan over all the preceding rows until it reaches the requested page. &lt;/p&gt;

&lt;p&gt;An answer to that is to include a &lt;code&gt;WHERE&lt;/code&gt; clause in the query together with &lt;code&gt;FETCH FIRST X ROWS ONLY&lt;/code&gt;, so the previous results are not scanned at all. Each "page" is delimited by a different &lt;code&gt;WHERE&lt;/code&gt; clause. This approach also has its own limitations (pagination is harder to implement, and it is harder to browse backward or fetch arbitrary pages), but at the cost of simplicity you get a performance increase. &lt;/p&gt;
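&lt;p&gt;A sketch of that approach (again a hypothetical &lt;em&gt;articles&lt;/em&gt; table; the date literal stands for the &lt;em&gt;created_at&lt;/em&gt; value of the last row on the previous page):&lt;/p&gt;

&lt;p&gt;&lt;code&gt;SELECT * FROM "articles" WHERE "created_at" &amp;lt; '2021-08-01' ORDER BY "created_at" DESC FETCH FIRST 10 ROWS ONLY;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;With an index on &lt;em&gt;created_at&lt;/em&gt;, the DB can seek directly to that value instead of counting all the preceding rows.&lt;/p&gt;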

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Cc_LG0oC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wnxywsgiuwuxai5x5jnz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Cc_LG0oC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wnxywsgiuwuxai5x5jnz.png" alt="Markus Winand SQL Performance Explained pagination scalability" title="Pagination Scalability from SQL Performance Explained by Markus Winand"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Which option should we choose? As always in computer science - it depends :)&lt;/p&gt;

&lt;p&gt;Let me know how you use pagination and whether you have ever had any performance issues related to it.&lt;/p&gt;

&lt;p&gt;P.S. Remember that pagination needs a deterministic order, so do include &lt;code&gt;ORDER BY&lt;/code&gt; in your queries when needed ;)&lt;/p&gt;

</description>
      <category>sql</category>
      <category>performance</category>
      <category>pagination</category>
    </item>
    <item>
      <title>SQL Performance - composite indexes</title>
      <dc:creator>Rafal Hofman</dc:creator>
      <pubDate>Fri, 23 Jul 2021 13:05:45 +0000</pubDate>
      <link>https://dev.to/rafhofman/sql-performance-composite-indexes-4he8</link>
      <guid>https://dev.to/rafhofman/sql-performance-composite-indexes-4he8</guid>
      <description>&lt;p&gt;Recently I have been reading &lt;a href="https://www.goodreads.com/book/show/17225810-sql-performance-explained"&gt;SQL Performance Explained by Markus Winand&lt;/a&gt; and I wanted to share with you what I have learned about the composite index. &lt;/p&gt;

&lt;p&gt;You are probably familiar with the concept of a database index. In very simple terms, you can imagine that it is similar to a telephone directory (a B-tree structure in the DB) - instead of scanning through the whole book (the table), you use the directory (the index) to find entries faster. For more about indexes, you can read &lt;a href="https://use-the-index-luke.com/sql/anatomy"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Imagine you have a table &lt;em&gt;employees&lt;/em&gt; with columns &lt;em&gt;id, department, name.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;As you have some SELECT queries accessing those fields, you have created an index for &lt;em&gt;department&lt;/em&gt; and an index for &lt;em&gt;name:&lt;/em&gt; &lt;/p&gt;

&lt;p&gt;&lt;code&gt;CREATE INDEX department_index ON employees(department);&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;CREATE INDEX name_index ON employees(name);&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;It works well for queries like: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;SELECT * FROM "employees" WHERE "department" = 'IT';&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;or &lt;/p&gt;

&lt;p&gt;&lt;code&gt;SELECT * FROM "employees" WHERE "name" = 'Rafal';&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now, imagine you have a query that selects by &lt;strong&gt;both&lt;/strong&gt; department and name: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;SELECT * FROM "employees" WHERE "department" = 'IT' AND "name" = 'Rafal';&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You already have an index on those fields, right? So what is the problem? &lt;/p&gt;

&lt;p&gt;The DB engine will use only one of the indexes you have created. Going back to the telephone directory example - you have created two separate directories; they are two separate structures, and you look in one or the other. When both fields are queried, one of them is used to choose the index structure, and its matching entries are located in that index. The second field is then checked by traversing those results. With the &lt;em&gt;department&lt;/em&gt; index: the &lt;em&gt;department&lt;/em&gt; (IT) entries will be located in the index, and then the &lt;em&gt;name&lt;/em&gt; (Rafal) entry will be searched for in the leaf nodes of the index. So if the name column consists of "Agata, Tomek, Rafal, Zenek, (...)", it will traverse several entries to find the correct one (Rafal). &lt;/p&gt;

&lt;p&gt;A solution for that is to use a composite index. You can create it like: &lt;br&gt;
&lt;code&gt;CREATE INDEX composite_index_name ON employees(department, name);&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;It will then create a single index over the two columns. In the telephone directory example - you now have one directory that combines both pieces of information and points to the right place. So the query: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;SELECT * FROM "employees" WHERE "department" = 'IT' AND "name" = 'Rafal';&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;will hit exactly one entry in the index with one leaf node.&lt;/p&gt;

&lt;p&gt;This might be useful for heavy queries at hot spots in the system, where performance is crucial. &lt;/p&gt;

&lt;p&gt;DISCLAIMER: Be aware that indexes also use storage, and every INSERT/UPDATE needs to update the index structure as well :)&lt;/p&gt;

</description>
      <category>sql</category>
      <category>backend</category>
      <category>performance</category>
      <category>index</category>
    </item>
    <item>
      <title>XSS - are you sure you are protected?</title>
      <dc:creator>Rafal Hofman</dc:creator>
      <pubDate>Mon, 05 Jul 2021 06:15:23 +0000</pubDate>
      <link>https://dev.to/rafhofman/xss-are-you-sure-you-are-protected-5ego</link>
      <guid>https://dev.to/rafhofman/xss-are-you-sure-you-are-protected-5ego</guid>
      <description>&lt;p&gt;As a developer, you probably have heard what &lt;a href="https://owasp.org/www-community/attacks/xss/"&gt;XSS&lt;/a&gt; is and how to defend against it by escaping user input. You also probably might have heard that modern frontend frameworks like React or Angular are XSS safe (due to escaping). Still, though there are some XSS caveats worth remembering: &lt;/p&gt;

&lt;p&gt;Imagine you have a form where the user adds a link to their page/Facebook/Instagram, etc. You might have HTML code like:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;&amp;lt;a href="https://brightinventions.pl/"&amp;gt;User page&amp;lt;/a&amp;gt;&lt;br&gt;
&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;When taking input from the user that will later end up in an &lt;code&gt;href&lt;/code&gt; attribute (or any other clickable link), it is important to validate the protocol of the URL. The user can simply submit their page URL with the javascript: protocol and execute XSS.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;&amp;lt;a href="javascript:alert('XSS!');"&amp;gt;User page&amp;lt;/a&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;To conclude: to defend against XSS, besides escaping user input, do validate the protocol of user-provided URLs. Let me know if you have any other interesting thoughts when it comes to XSS!&lt;/p&gt;

</description>
      <category>security</category>
      <category>xss</category>
      <category>javascript</category>
      <category>owasp</category>
    </item>
    <item>
      <title>CloudWatch Insights - how to find the context of multiple requests?</title>
      <dc:creator>Rafal Hofman</dc:creator>
      <pubDate>Thu, 06 May 2021 08:34:33 +0000</pubDate>
      <link>https://dev.to/rafhofman/cloudwatch-insights-how-to-find-the-context-of-multiple-requests-30d4</link>
      <guid>https://dev.to/rafhofman/cloudwatch-insights-how-to-find-the-context-of-multiple-requests-30d4</guid>
      <description>&lt;p&gt;Recently I was searching through our application logs. The task was to extract extra context for a group of requests (ex. errors in the external provider system with the original request). For our app, we are using &lt;a href="https://aws.amazon.com/cloudwatch/"&gt;CloudWatch&lt;/a&gt; to store the logs. I have used &lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/AnalyzingLogData.html"&gt;CloudWatch Insights&lt;/a&gt; as out of the box tool to analyze them. &lt;/p&gt;

&lt;p&gt;Our logs have a format like below, with each console output on a separate line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2021-02-06T13:38:31.730Z info [some request id 1; some user id 1] Some external provider error message
2021-02-06T14:21:00.000Z info [some request id 2; some user id 2] Some external provider error message
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We can use CloudWatch Insights to extract all the information related to those requests:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;filter @message like "Some context to error message log"
| parse @message "* * [* *] *" as timestamp,type,requestId, user, textMessage
| filter requestId in ["some request id 1;", "some request id 2"]
| sort @ingestionTime desc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If the field you are searching for is a JSON array, you can search it like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;filter @message like "Some context to error message log {
    "someInfo": [
        some1,
        some2
    ]
}"
| parse @message "* * [* *] *" as timestamp,type,requestId, user, textMessage
| parse textMessage '"someInfo":[*]' as someInfo
| filter requestId in ["some request id 1;", "some request id 2"]
| sort @ingestionTime desc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can then export the data that you need or build some stats around it. &lt;/p&gt;

&lt;p&gt;Let me know in the comments if you found CloudWatch Insights useful too and how you are using them. &lt;/p&gt;

</description>
      <category>aws</category>
      <category>logs</category>
      <category>cloudwatch</category>
      <category>insights</category>
    </item>
    <item>
      <title>Mocha.js - how to enable multiple test runners on CI/CD?</title>
      <dc:creator>Rafal Hofman</dc:creator>
      <pubDate>Thu, 15 Apr 2021 12:40:37 +0000</pubDate>
      <link>https://dev.to/rafhofman/mocha-js-how-to-enable-multiple-test-runners-on-ci-cd-3bda</link>
      <guid>https://dev.to/rafhofman/mocha-js-how-to-enable-multiple-test-runners-on-ci-cd-3bda</guid>
      <description>&lt;p&gt;One of our projects is running automated tests on CI/CD AzurePipelines. &lt;/p&gt;

&lt;p&gt;For test results, Azure Pipelines &lt;a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/test/publish-test-results?view=azure-devops&amp;amp;tabs=trx%2Cyaml"&gt;supports several&lt;/a&gt; result formats, but not the default Mocha spec one. &lt;/p&gt;

&lt;p&gt;This is why the tests run with the Mocha JUnit reporter, which produces JUnit XML results. &lt;/p&gt;

&lt;p&gt;As there was a difference between running the tests locally and on the CI/CD environment, I wanted to debug the logs of the job. That was not possible, as the Mocha JUnit reporter does not collect console output/errors. &lt;/p&gt;

&lt;p&gt;Mocha.js does not support multiple reporters right now. The solution was to introduce our own reporter, which combines the Mocha JUnit reporter and the default Mocha spec reporter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;use strict&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Mocha&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;mocha&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;JUnit&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;mocha-junit-reporter&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Spec&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;Mocha&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;reporters&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;Spec&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Base&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;Mocha&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;reporters&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;Base&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// This is combination of spec (mocha normal) + junit reporter so both is displayed on azure&lt;/span&gt;
&lt;span class="kd"&gt;class&lt;/span&gt; &lt;span class="nx"&gt;AzurePipelinesReporter&lt;/span&gt; &lt;span class="kd"&gt;extends&lt;/span&gt; &lt;span class="nx"&gt;Base&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;constructor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;runner&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;super&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;runner&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;_junitReporter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;JUnit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;runner&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;_specReporter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;Spec&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;runner&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;options&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="nx"&gt;module&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;exports&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;AzurePipelinesReporter&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, the custom reporter can be used by specifying its file name, and reporter options can be passed to both reporters if needed: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;mocha --reporter azurePipelinesReporter.js --reporter-options mochaFile=some_path_to_results&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Let me know if you had a similar problem, stay tuned for the next tips &amp;amp; tricks!&lt;/p&gt;

&lt;p&gt;This blog post was originally posted on &lt;a href="https://brightinventions.pl/blog/mocha-js-how-to-enable-multiple-test-runners-on-ci-cd"&gt;https://brightinventions.pl/blog/mocha-js-how-to-enable-multiple-test-runners-on-ci-cd&lt;/a&gt;&lt;/p&gt;

</description>
      <category>mocha</category>
      <category>ci</category>
      <category>cd</category>
      <category>runner</category>
    </item>
  </channel>
</rss>
