<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: harsha-dasari</title>
    <description>The latest articles on DEV Community by harsha-dasari (@harsha1377).</description>
    <link>https://dev.to/harsha1377</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F627620%2F7903d4ac-5b6f-41d8-b9d4-e460e2c0491a.png</url>
      <title>DEV Community: harsha-dasari</title>
      <link>https://dev.to/harsha1377</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/harsha1377"/>
    <language>en</language>
    <item>
      <title>swift language</title>
      <dc:creator>harsha-dasari</dc:creator>
      <pubDate>Mon, 25 Oct 2021 16:22:01 +0000</pubDate>
      <link>https://dev.to/harsha1377/swift-language-3ghe</link>
      <guid>https://dev.to/harsha1377/swift-language-3ghe</guid>
      <description>&lt;p&gt;ABOUT:&lt;/p&gt;

&lt;p&gt;Swift is a general-purpose programming language built by Apple using a modern approach to safety, performance, and software design patterns.&lt;/p&gt;

&lt;p&gt;The goal of the Swift project is to create the best available language for uses ranging from systems programming to mobile and desktop apps, scaling up to cloud services. Most importantly, Swift is designed to make writing and maintaining correct programs easier for the developer. To achieve this goal, we believe that the most obvious way to write Swift code must also be:&lt;/p&gt;

&lt;p&gt;Safe. The most obvious way to write code should also behave in a safe manner. Undefined behavior is the enemy of safety, and developer mistakes should be caught before software is in production. Opting for safety sometimes means Swift will feel strict, but we believe that clarity saves time in the long run.&lt;/p&gt;

&lt;p&gt;Fast. Swift is intended as a replacement for C-based languages. As such, Swift must be comparable to those languages in performance for most tasks. Performance must also be predictable and consistent, not just fast in short bursts that require clean-up later. There are lots of languages with novel features — being fast is rare.&lt;/p&gt;

&lt;p&gt;Expressive. Swift benefits from decades of advancement in computer science to offer syntax that is a joy to use, with modern features developers expect. But Swift is never done. We will monitor language advancements and embrace what works, continually evolving to make Swift even better.&lt;/p&gt;

&lt;p&gt;FEATURES:&lt;/p&gt;

&lt;p&gt;Swift includes features that make code easier to read and write, while giving the developer the control needed in a true systems programming language. Swift supports inferred types to make code cleaner and less prone to mistakes, and modules eliminate headers and provide namespaces. Memory is managed automatically, and you don’t even need to type semicolons. Swift also borrows from other languages; for instance, named parameters brought forward from Objective-C are expressed in a clean syntax that makes APIs in Swift easy to read and maintain.&lt;/p&gt;
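&lt;p&gt;A minimal sketch of inferred types and named parameters; the function name repeatText and all values here are made up for illustration:&lt;/p&gt;

```swift
// Inferred types: no annotations or semicolons needed.
let greeting = "Hello"   // type String is inferred
var count = 3            // type Int is inferred

// Named parameters make the call site read like prose.
func repeatText(_ text: String, times: Int) -> String {
    return String(repeating: text, count: times)
}

let cheer = repeatText("hip hooray ", times: 2)
```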

&lt;p&gt;The features of Swift are designed to work together to create a language that is powerful, yet fun to use. Some additional features of Swift include:&lt;/p&gt;

&lt;p&gt;Closures unified with function pointers&lt;br&gt;
Tuples and multiple return values&lt;br&gt;
Generics&lt;br&gt;
Fast and concise iteration over a range or collection&lt;br&gt;
Structs that support methods, extensions, and protocols&lt;br&gt;
Functional programming patterns, e.g., map and filter&lt;br&gt;
Powerful error handling built-in&lt;br&gt;
Advanced control flow with do, guard, defer, and repeat keywords&lt;/p&gt;
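&lt;p&gt;A few of the listed features in one hypothetical sketch (tuples as multiple return values, an early exit with guard, and closures with map and filter); the minMax function is invented for illustration:&lt;/p&gt;

```swift
// Tuples and multiple return values, with guard for an early exit.
func minMax(of values: [Int]) -> (min: Int, max: Int)? {
    // guard exits immediately when the array is empty.
    guard let first = values.first else { return nil }
    var lo = first
    var hi = first
    for v in values.dropFirst() {
        if v > hi { hi = v }
        if lo > v { lo = v }
    }
    return (lo, hi)
}

// Functional patterns: map and filter take closures.
let squares = [1, 2, 3, 4].map { $0 * $0 }
let evens = squares.filter { $0 % 2 == 0 }
```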

&lt;p&gt;SAFETY:&lt;/p&gt;

&lt;p&gt;Swift was designed from the outset to be safer than C-based languages, and eliminates entire classes of unsafe code. Variables are always initialized before use, arrays and integers are checked for overflow, and memory is managed automatically. Syntax is tuned to make it easy to define your intent — for example, simple three-character keywords define a variable (var) or constant (let).&lt;/p&gt;

&lt;p&gt;Another safety feature is that by default Swift objects can never be nil, and trying to make or use a nil object results in a compile-time error. This makes writing code much cleaner and safer, and prevents a common cause of runtime crashes. However, there are cases where nil is appropriate, and for these situations Swift has an innovative feature known as optionals. An optional may contain nil, but Swift syntax forces you to deal with it safely, using ? to indicate to the compiler that you understand the behavior and will handle it.&lt;/p&gt;
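&lt;p&gt;A short sketch of let/var and optionals; the names here are illustrative, not from any real API:&lt;/p&gt;

```swift
let maximumAttempts = 3   // constant: reassigning it is a compile-time error
var attempts = 0          // variable: free to change

// A plain String can never be nil; a String? (optional) can.
var nickname: String? = nil
nickname = "harsha"

// Optional binding forces the nil case to be handled explicitly.
if let name = nickname {
    attempts += 1
    print("Hello, \(name)")
} else {
    print("No nickname set")
}

// The ?? operator supplies a default when the optional is nil.
let display = nickname ?? "anonymous"
```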

</description>
    </item>
    <item>
      <title>Emojify – Create your own emoji with Deep Learning</title>
      <dc:creator>harsha-dasari</dc:creator>
      <pubDate>Sat, 08 May 2021 07:06:09 +0000</pubDate>
      <link>https://dev.to/harsha1377/emojify-create-your-own-emoji-with-deep-learning-2ab6</link>
      <guid>https://dev.to/harsha1377/emojify-create-your-own-emoji-with-deep-learning-2ab6</guid>
      <description>&lt;p&gt;Emojis are a way to convey nonverbal cues. These cues have become an essential part of online chatting, product reviews, brand emotion, and much more. They have also led to a growing body of data science research dedicated to emoji-driven storytelling.&lt;/p&gt;

&lt;p&gt;With advancements in computer vision and deep learning, it is now possible to detect human emotions from images. In this deep learning project, we will classify human facial expressions to filter and map corresponding emojis or avatars.&lt;br&gt;
About the Dataset:&lt;br&gt;
The facial expression recognition dataset consists of 48×48-pixel grayscale face images. The images are centered and occupy an equal amount of space. The dataset covers the following emotion categories:&lt;/p&gt;

&lt;p&gt;0:angry&lt;br&gt;
1:disgust&lt;br&gt;
2:fear&lt;br&gt;
3:happy&lt;br&gt;
4:sad&lt;br&gt;
5:surprise&lt;br&gt;
6:neutral&lt;br&gt;
Facial Emotion Recognition using CNN:&lt;br&gt;
In the steps below, we will build a convolutional neural network architecture and train the model on the FER2013 dataset to recognize emotions from images.&lt;br&gt;
Make a file train.py and follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Imports:
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PDUuDziJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ovyvtye8g1xo7cm5j97v.png" alt="Alt Text"&gt;
&lt;/li&gt;
&lt;li&gt;Initialize the training and validation generators:
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TGMHHT9K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ig3vafvu9c83f4tveoo2.png" alt="Alt Text"&gt;
&lt;/li&gt;
&lt;li&gt;Build the convolutional network architecture:
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XtWnvL_i--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4lsek40p9vn3rnm040k7.png" alt="Alt Text"&gt;
&lt;/li&gt;
&lt;li&gt;Compile and train the model:
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--D--4FnpL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d425urktbr35hh6kkipn.png" alt="Alt Text"&gt;
&lt;/li&gt;
&lt;li&gt;Save the model weights:
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--g0hw9Ggy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q439ekopyvfc8di6m0s8.png" alt="Alt Text"&gt;
&lt;/li&gt;
&lt;li&gt;Using the OpenCV Haar cascade XML, detect the bounding boxes of faces in the webcam feed and predict the emotions:
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--H_bpFTBv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ld93lis7unvjmjj7b6c3.png" alt="Alt Text"&gt;
Code for GUI and mapping with emojis:
Create a folder named emojis and save the emojis corresponding to each of the seven emotions in the dataset.&lt;/li&gt;
&lt;/ol&gt;
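&lt;p&gt;The training steps above can be sketched roughly as follows; the layer sizes, folder names, and hyperparameters are assumptions rather than the exact code from the screenshots:&lt;/p&gt;

```python
# Rough sketch of train.py. Layer sizes, directory names, and
# hyperparameters are assumptions, not the exact screenshot code.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Dense, Dropout, Flatten, MaxPooling2D
from tensorflow.keras.preprocessing.image import ImageDataGenerator


def build_model():
    """Stacked Conv2D/pooling blocks, then dense layers,
    ending in a 7-way softmax (one output per emotion)."""
    return Sequential([
        Conv2D(32, (3, 3), activation="relu", input_shape=(48, 48, 1)),
        Conv2D(64, (3, 3), activation="relu"),
        MaxPooling2D((2, 2)),
        Dropout(0.25),
        Conv2D(128, (3, 3), activation="relu"),
        MaxPooling2D((2, 2)),
        Dropout(0.25),
        Flatten(),
        Dense(1024, activation="relu"),
        Dropout(0.5),
        Dense(7, activation="softmax"),
    ])


if __name__ == "__main__":
    # 1-2. Generators over an assumed data/train, data/test folder layout.
    train_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
        "data/train", target_size=(48, 48), color_mode="grayscale",
        batch_size=64, class_mode="categorical")
    val_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
        "data/test", target_size=(48, 48), color_mode="grayscale",
        batch_size=64, class_mode="categorical")

    # 3-4. Build, compile, and train.
    model = build_model()
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_gen, epochs=50, validation_data=val_gen)

    # 5. Save the trained weights for gui.py to load.
    model.save_weights("model.h5")
```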

&lt;p&gt;Paste the code below into gui.py and run the file.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VB340e_m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/odi8xos2pcq8kkzw1zvk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VB340e_m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/odi8xos2pcq8kkzw1zvk.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--T68v8Dvp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5cmqza74e2o81xqnfw74.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--T68v8Dvp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5cmqza74e2o81xqnfw74.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3ul-Y7z4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r2atpmgw3q7hmcjdligp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3ul-Y7z4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r2atpmgw3q7hmcjdligp.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
Summary:&lt;br&gt;
In this deep learning project for beginners, we have built a convolutional neural network to recognize facial emotions, trained it on the FER2013 dataset, and mapped the predicted emotions to the corresponding emojis or avatars.&lt;/p&gt;

&lt;p&gt;Using OpenCV’s Haar cascade XML, we get the bounding boxes of the faces in the webcam feed, then feed the cropped faces to the trained model for classification.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
