<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Abhishek Shrivastava</title>
    <description>The latest articles on DEV Community by Abhishek Shrivastava (@abhishe89636035).</description>
    <link>https://dev.to/abhishe89636035</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F809534%2F8a0c5bcf-e1f0-4bbc-9a70-7f1fbd50f9c1.jpg</url>
      <title>DEV Community: Abhishek Shrivastava</title>
      <link>https://dev.to/abhishe89636035</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/abhishe89636035"/>
    <language>en</language>
    <item>
      <title>Google Cloud Machine Learning</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Fri, 17 May 2024 18:10:23 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/google-cloud-machine-learning-3ki5</link>
      <guid>https://dev.to/abhishe89636035/google-cloud-machine-learning-3ki5</guid>
      <description>&lt;p&gt;Happy to announce another feather added in #MyCloudDiary :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt; journey 🎉🎉..&lt;/p&gt;

&lt;p&gt;I’m happy to share that I’ve obtained a new certification: Use Machine Learning APIs on Google Cloud Skill Badge from Google!&lt;/p&gt;

&lt;p&gt;Life hack: Celebrate more, and more to celebrate! Happy Weekend LinkedIn!&lt;/p&gt;

&lt;p&gt;All thanks to the GCAP training provided by #tcs 😊&lt;/p&gt;

&lt;p&gt;If you cannot do great things, do small things in a great way.&lt;/p&gt;


&lt;p&gt;If unable to connect, please follow &amp;gt;&amp;gt; &lt;a href="https://lnkd.in/dNb7dqve"&gt;https://lnkd.in/dNb7dqve&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We cannot seek achievement for ourselves and forget about progress and prosperity for our community... Our ambitions must be broad enough to include the aspirations and needs of others, for their sakes and for our own.&lt;/p&gt;

&lt;p&gt;Thank you once again for your commitment and invaluable contributions toward helping me reach my full potential. 🎓&lt;/p&gt;

&lt;p&gt;Thanks to 🛡Mark Johnston, Thu Ya Kyaw, Magda Jary, and Priyanka Vergadia for always encouraging me with their words of support and recognition.&lt;/p&gt;

&lt;p&gt;Special thanks to Yujun Liang ⎈☁️🌎Kasam Shaikh - Microsoft [Azure] AI MVP ☁ 🇮🇳 (since way before ChatGPT 😀) Dan Rey Walter Lee for always motivating me in my Google Cloud learning.&lt;/p&gt;

&lt;p&gt;CC: Adding my leaders and colleagues to share this achievement:&lt;/p&gt;

&lt;p&gt;Mahesh Bhosale Swapneel Doshi Girish Chhabra Luis Méndez Osvaldo Cantu Jason Chance Anji Reddy Venumula Susan Cutinha Murli Reddy Sohil Shah Anuj Jain PgMP®,PMP®, Prince2 Agile, CSA, ITIL EUR ING Ioannis Kolaxis MSc Sabahat S Amit Batra, PMP® Rajeev Choudhary Mukul Sharma Shantinath Teradale Samantak Biswas Dr. Ritu Anand Farhad Sayeed Pawan Kumar Chadha Pritish Kumar Anand Sitaraman K Krithivasan Amit Bajaj Suyash Tiwari&lt;/p&gt;

&lt;p&gt;Adding community leaders: Dahnesh Dilkhush Nikkia Carter Victor Morales&lt;/p&gt;

&lt;p&gt;Please like, share, and subscribe to the #channelpartners below:&lt;/p&gt;

&lt;p&gt;#youtube :: &lt;a href="https://lnkd.in/dDaZPGR5"&gt;https://lnkd.in/dDaZPGR5&lt;/a&gt;&lt;br&gt;
#github :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt;&lt;br&gt;
#hashnode :: &lt;a href="https://lnkd.in/d_gtGxuS"&gt;https://lnkd.in/d_gtGxuS&lt;/a&gt;&lt;br&gt;
#twitter :: &lt;a href="https://lnkd.in/e5ZY5j-x"&gt;https://lnkd.in/e5ZY5j-x&lt;/a&gt;&lt;br&gt;
#DevCodeCommunity :: &lt;a href="https://lnkd.in/duMEcSnc"&gt;https://lnkd.in/duMEcSnc&lt;/a&gt;&lt;br&gt;
#Tealfeed :: &lt;a href="https://lnkd.in/eTyp-Xe4"&gt;https://lnkd.in/eTyp-Xe4&lt;/a&gt;&lt;br&gt;
#medium :: &lt;a href="https://lnkd.in/dpEzM8GU"&gt;https://lnkd.in/dpEzM8GU&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;#cloud #CloudSkillsChallenge #ignite #identity #gdg #gdgcloud #GDGPUNE #Compliance #mipexperience #TCS #opportunity #share&lt;/p&gt;

&lt;p&gt;#cloudskillschallenge #education #community #like #github #leaders #agile #design #msbuild #microsoft #azure #infrastructure #business #data #tcsion #generativeai #llms #llama #googleai #bard #bert #ai #ml #neuralnetworks #deeplearning #google #googlecloud #googlecloudplatform #googlebardai #chatgpt #encoder #machinelearning #training #language #learning #bardai 😊#IAmRemarkable Bard by Google’s AI #TCSMEA #employeerecognition #employeeengagement #peoplemanagement #terraform #lifeatgoogle #googlecloudconsulting&lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>ai</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>Google Cloud Certified Professional Cloud Security Engineer</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Mon, 06 May 2024 11:40:53 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/google-cloud-certified-professional-cloud-security-engineer-3lfi</link>
      <guid>https://dev.to/abhishe89636035/google-cloud-certified-professional-cloud-security-engineer-3lfi</guid>
      <description>&lt;p&gt;Big milestone reached! I’m officially hashtag#GoogleCloudCertified! Ready to put my skills to use as a Cloud Security Engineer! hashtag#GoogleCloudLearning&lt;/p&gt;

&lt;p&gt;Happy to announce another feather added to my #MyCloudDiary :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt; journey 🎉🎉!&lt;/p&gt;

&lt;p&gt;Life hack: Celebrate more, and more to celebrate! Happy Monday LinkedIn!&lt;/p&gt;

&lt;p&gt;Delighted to share that I am now a Google Cloud Certified Professional Cloud Security Engineer !!!&lt;/p&gt;

&lt;p&gt;All thanks to the GCAP training provided by #tcs 😊&lt;br&gt;
If you cannot do great things, do small things in a great way.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If unable to connect, please follow &amp;gt;&amp;gt; &lt;a href="https://lnkd.in/dNb7dqve"&gt;https://lnkd.in/dNb7dqve&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There is still plenty achievable lined up every year, and I am sure that we will keep up our excellence.&lt;/p&gt;

&lt;p&gt;Thank you once again for your commitment and invaluable contributions toward helping me reach my full potential. 🎓&lt;/p&gt;

&lt;p&gt;Thanks to Mahesh Bhosale and Baskar Natarajan for always encouraging me with their words of support and recognition. A teacher affects eternity: he can never tell where his influence stops. 🙏🙏&lt;/p&gt;

&lt;p&gt;Special thanks to Yujun Liang ⎈☁️🌎Kasam Shaikh - Microsoft [Azure] AI MVP ☁ 🇮🇳 (since way before ChatGPT 😀) Dan Rey for always motivating me in my Google Cloud learning.&lt;/p&gt;

&lt;p&gt;CC: Adding my leaders and colleagues to share this achievement:&lt;br&gt;
Swapneel Doshi Girish Chhabra Luis Méndez Osvaldo Cantu Jason Chance Anji Reddy Venumula Susan Cutinha Murli Reddy Sohil Shah Anuj Jain PgMP®,PMP®, Prince2 Agile, CSA, ITIL EUR ING Ioannis Kolaxis MSc Sabahat Siddiqui Amit Batra, PMP® Rajeev Choudhary Mukul Sharma Shantinath Teradale Samantak Biswas Dr. Ritu Anand Farhad Sayeed Pawan Kumar Chadha Pritish Kumar Anand Sitaraman K Krithivasan Amit Bajaj Suyash Tiwari&lt;/p&gt;

&lt;p&gt;Adding community leaders: Walter Lee Dahnesh Dilkhush Nikkia Carter Victor Morales Thu Ya Kyaw 🛡Mark Johnston Magda Jary&lt;/p&gt;

&lt;p&gt;Please like, share, and subscribe to the #channelpartners below:&lt;br&gt;
#youtube :: &lt;a href="https://lnkd.in/dDaZPGR5"&gt;https://lnkd.in/dDaZPGR5&lt;/a&gt;&lt;br&gt;
#github :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt;&lt;br&gt;
#hashnode :: &lt;a href="https://lnkd.in/d_gtGxuS"&gt;https://lnkd.in/d_gtGxuS&lt;/a&gt;&lt;br&gt;
#twitter :: &lt;a href="https://lnkd.in/e5ZY5j-x"&gt;https://lnkd.in/e5ZY5j-x&lt;/a&gt;&lt;br&gt;
#DevCodeCommunity :: &lt;a href="https://lnkd.in/duMEcSnc"&gt;https://lnkd.in/duMEcSnc&lt;/a&gt;&lt;br&gt;
#Tealfeed :: &lt;a href="https://lnkd.in/eTyp-Xe4"&gt;https://lnkd.in/eTyp-Xe4&lt;/a&gt;&lt;br&gt;
#medium :: &lt;a href="https://lnkd.in/dpEzM8GU"&gt;https://lnkd.in/dpEzM8GU&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;#cloud #CloudSkillsChallenge #ignite #identity #gdg #gdgcloud #GDGPUNE #Compliance #mipexperience #TCS #opportunity #share&lt;br&gt;
#cloudskillschallenge #education #community #like #github #leaders #agile #design #msbuild #microsoft #azure #infrastructure #business #data #tcsion #generativeai #llms #llama #googleai #bard #bert #ai #ml #neuralnetworks #deeplearning #google #googlecloud #googlecloudplatform #googlebardai #chatgpt #encoder #machinelearning #training #language #learning #bardai 😊#IAmRemarkable Bard by Google’s AI #TCSMEA #employeerecognition #employeeengagement #peoplemanagement #terraform DevOps / Azure / GCP / AWS / Terraform / Docker / Kubernetes #google #googlecloud #lifeatgoogle #googlecloudconsulting 😊&lt;/p&gt;

</description>
      <category>gcp</category>
      <category>pcse</category>
      <category>googlecloud</category>
    </item>
    <item>
      <title>Google Cloud Digital Leader Certification</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Mon, 29 Apr 2024 11:29:46 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/google-cloud-digital-leader-certification-3chi</link>
      <guid>https://dev.to/abhishe89636035/google-cloud-digital-leader-certification-3chi</guid>
      <description>&lt;p&gt;Big milestone reached! I’m officially #GoogleCloudCertified! Ready to put my skills to use as a Cloud Digital Leader! #GoogleCloudLearning&lt;/p&gt;

&lt;p&gt;Happy to announce another feather added to my #MyCloudDiary :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt; journey 🎉🎉!&lt;/p&gt;

&lt;p&gt;Life hack: Celebrate more, and more to celebrate! Happy Weekend LinkedIn!&lt;/p&gt;

&lt;p&gt;Delighted to share that I am now a Google Cloud Certified Cloud Digital Leader!&lt;/p&gt;

&lt;p&gt;All thanks to the GCAP training provided by #tcs 😊&lt;/p&gt;

&lt;p&gt;If you cannot do great things, do small things in a great way.&lt;/p&gt;


&lt;p&gt;If unable to connect, please follow &amp;gt;&amp;gt; &lt;a href="https://lnkd.in/dNb7dqve"&gt;https://lnkd.in/dNb7dqve&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is still plenty achievable lined up every year, and I am sure that we will keep up our excellence.&lt;/p&gt;

&lt;p&gt;Thank you once again for your commitment and invaluable contributions toward helping me reach my full potential. 🎓&lt;/p&gt;

&lt;p&gt;Thanks to Mahesh Bhosale and Baskar Natarajan for always encouraging me with their words of support and recognition. A teacher affects eternity: he can never tell where his influence stops. 🙏🙏&lt;/p&gt;

&lt;p&gt;Special thanks to Yujun Liang ⎈☁️🌎Kasam Shaikh - Microsoft [Azure] AI MVP ☁ 🇮🇳 (since way before ChatGPT 😀) Dan Rey for always motivating me in my Google Cloud learning.&lt;/p&gt;

&lt;p&gt;CC: Adding my leaders and colleagues to share this achievement:&lt;/p&gt;

&lt;p&gt;Swapneel Doshi Girish Chhabra Luis Méndez Osvaldo Cantu Jason Chance Anji Reddy Venumula Susan Cutinha Murli Reddy Sohil Shah Anuj Jain PgMP®,PMP®, Prince2 Agile, CSA, ITIL EUR ING Ioannis Kolaxis MSc Sabahat Siddiqui Amit Batra, PMP® Rajeev Choudhary Mukul Sharma Shantinath Teradale Samantak Biswas Dr. Ritu Anand Farhad Sayeed Pawan Kumar Chadha Pritish Kumar Anand Sitaraman K Krithivasan Amit Bajaj Suyash Tiwari&lt;/p&gt;

&lt;p&gt;Adding community leaders: Walter Lee Dahnesh Dilkhush Nikkia Carter Victor Morales Thu Ya Kyaw 🛡Mark Johnston Magda Jary&lt;/p&gt;

&lt;p&gt;Please like, share, and subscribe to the #channelpartners below:&lt;/p&gt;

&lt;p&gt;#youtube :: &lt;a href="https://lnkd.in/dDaZPGR5"&gt;https://lnkd.in/dDaZPGR5&lt;/a&gt;&lt;br&gt;
#github :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt;&lt;br&gt;
#hashnode :: &lt;a href="https://lnkd.in/d_gtGxuS"&gt;https://lnkd.in/d_gtGxuS&lt;/a&gt;&lt;br&gt;
#twitter :: &lt;a href="https://lnkd.in/e5ZY5j-x"&gt;https://lnkd.in/e5ZY5j-x&lt;/a&gt;&lt;br&gt;
#DevCodeCommunity :: &lt;a href="https://lnkd.in/duMEcSnc"&gt;https://lnkd.in/duMEcSnc&lt;/a&gt;&lt;br&gt;
#Tealfeed :: &lt;a href="https://lnkd.in/eTyp-Xe4"&gt;https://lnkd.in/eTyp-Xe4&lt;/a&gt;&lt;br&gt;
#medium :: &lt;a href="https://lnkd.in/dpEzM8GU"&gt;https://lnkd.in/dpEzM8GU&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;#cloud #CloudSkillsChallenge #ignite #identity #gdg #gdgcloud #GDGPUNE #Compliance #mipexperience #TCS #opportunity #share&lt;/p&gt;

&lt;p&gt;#cloudskillschallenge #education #community #like #github #leaders #agile #design #msbuild #microsoft #azure #infrastructure #business #data #tcsion #generativeai #llms #llama #googleai #bard #bert #ai #ml #neuralnetworks #deeplearning #google #googlecloud #googlecloudplatform #googlebardai #chatgpt #encoder #machinelearning #training #language #learning #bardai 😊#IAmRemarkable Bard by Google AI #TCSMEA #employeerecognition #employeeengagement #peoplemanagement #terraform DevOps / Azure / GCP / AWS / Terraform / Docker / Kubernetes #google #googlecloud #lifeatgoogle #googlecloudconsulting&lt;/p&gt;

</description>
      <category>gcp</category>
      <category>googlecloud</category>
      <category>ai</category>
    </item>
    <item>
      <title>Contextual Mastery in TCS</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Mon, 29 Apr 2024 11:28:29 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/contextual-mastery-in-tcs-1g52</link>
      <guid>https://dev.to/abhishe89636035/contextual-mastery-in-tcs-1g52</guid>
      <description>&lt;p&gt;Happy to share small achievement from Tata Consultancy Services.&lt;/p&gt;

&lt;p&gt;TCS provides the best platform for learners to learn anything they want to learn. Thanks, TCS, for providing such a wonderful platform.&lt;/p&gt;

&lt;p&gt;Solutions can't go wrong if we just get the first step right: define the problem in the right "Context" and see it from the customer's perspective. In the IT world, most solutions are standard and known; it's the context of the problem that makes it unique, and that context is often unknown to the technologists. That is why at TCS we have a tribe of technologists who live and breathe the customer's business long enough before proposing any solution.&lt;/p&gt;

&lt;p&gt;Extremely delighted to be a part of this elite group in TCS called "Contextual Masters" !!!!&lt;/p&gt;

&lt;p&gt;“Stay away from those people who try to disparage your ambitions. Small minds will always do that, but great minds will give you a feeling that you can become great too.”&lt;/p&gt;

&lt;p&gt;Please like, share, and subscribe to the #channelpartners below:&lt;/p&gt;

&lt;p&gt;YouTube :: lnkd.in/dDaZPGR5&lt;br&gt;
GitHub :: lnkd.in/dHAMiTcH&lt;br&gt;
hashnode :: lnkd.in/d_gtGxuS&lt;br&gt;
Twitter :: lnkd.in/e5ZY5j-x&lt;br&gt;
DevCodeCommunity :: lnkd.in/duMEcSnc&lt;br&gt;
Tealfeed :: lnkd.in/eTyp-Xe4&lt;br&gt;
Medium :: lnkd.in/dpEzM8GU&lt;/p&gt;

&lt;p&gt;Special thanks to Milind Lakkad Sukanya Ghosh Tanvi Tirthankar Mousumi Chatterjee Kottai A 😍&lt;/p&gt;

&lt;p&gt;CC- Adding my leaders and colleagues to share achievement:&lt;/p&gt;

&lt;p&gt;Amit Batra, PMP® Baskar Natarajan Samantak Biswas Shantinath Teradale Amit Bajaj Dr. Ritu Anand Anand Sitaraman K Krithivasan Mukul Sharma Sudeshna Biswas Nigar Sayed Abhishek Saxena Kumar Amitesh Sriram Venkatesan Seethalakshmi Sridhar Keith Mary Rantas-Drew Kapil Gandhi Dr Usha Narasimhan Balasubramaniam Sambandam Maya Sankaranarayanan Dibyendu Naha Ranjith Jayaraman Paresh Sheth Tapan Das Jimmy Mukkalath Selvaraj Shanmugam Minitha Saxena Ramanand Garimella Preetha Kalaivanan Prudvi Raj Amit Vadhera Harshita Singh Chauhan Sapna Acharya&lt;/p&gt;

&lt;p&gt;#kubernetes #k8s #devops #sitereliabilityengineering #costoptimization #gke #google #developercommunity #tcs #gems #developer #leaders #like #cloud #pune #india #github #community #share #education #learning #GENAI #TCSgems #Xcelerateachievement #TCSian #iamremarkable&lt;/p&gt;

</description>
      <category>tcs</category>
    </item>
    <item>
      <title>Microsoft Azure AI Fundamentals: Get started with artificial intelligence</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Wed, 20 Mar 2024 15:18:31 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/microsoft-azure-ai-fundamentals-get-started-with-artificial-intelligence-4p71</link>
      <guid>https://dev.to/abhishe89636035/microsoft-azure-ai-fundamentals-get-started-with-artificial-intelligence-4p71</guid>
      <description>&lt;p&gt;Introduction&lt;br&gt;
Artificial Intelligence (AI) is changing our world and there’s hardly an industry that hasn't been affected. From better healthcare to online safety, AI is helping us to tackle some of society’s biggest issues.&lt;/p&gt;

&lt;p&gt;Azure AI services are a portfolio of AI capabilities that unlock automation for workloads in language, vision, intelligent search, content generation, and much more. They are straightforward to implement and don’t require specialist AI knowledge.&lt;/p&gt;

&lt;p&gt;Organizations are using Azure AI services in innovative ways, such as within robots to provide life-like companionship to older people by expressing happiness, concern, and even laughter. In other use cases, scientists are using AI to protect endangered species by identifying hard-to-find animals in images. This was previously time-consuming and error-prone work, which the Azure AI Vision service can complete quickly and with a high degree of accuracy, freeing scientists to do other work.&lt;/p&gt;

&lt;p&gt;In this module you will learn what Azure AI services are, and how you can use them in your own applications.&lt;/p&gt;

&lt;p&gt;AI services on the Azure platform&lt;br&gt;
Azure AI services are AI capabilities that can be built into web or mobile applications, in a way that's straightforward to implement. These AI services include image recognition, natural language processing, speech, AI-powered search, and more. There are over a dozen different services that can be used separately or together to add AI power to applications.&lt;/p&gt;

&lt;p&gt;Let's take a look at some examples of what can be done with Azure AI services. The Azure AI Content Safety service can be used to detect harmful content within text or images, including violent or hateful content, and report on its severity. The Azure AI Language service can be used to summarize text, classify information, or extract key phrases. The Azure AI Speech service provides powerful speech to text and text to speech capabilities, allowing speech to be accurately transcribed into text, or text to natural sounding voice audio.&lt;/p&gt;

&lt;p&gt;Azure AI services are based on three principles that dramatically improve speed-to-market:&lt;/p&gt;

&lt;p&gt;Prebuilt and ready to use&lt;/p&gt;

&lt;p&gt;Accessed through APIs&lt;/p&gt;

&lt;p&gt;Available on Azure&lt;/p&gt;

&lt;p&gt;Azure AI services are prebuilt and ready to use&lt;br&gt;
AI has been prohibitive for all but the largest technology companies because of several factors, including the large amounts of data required to train models, the massive amount of computing power needed, and the budget to hire specialist programmers. Azure AI services make AI accessible to businesses of all sizes by using pre-trained machine learning models to deliver AI as a service. Azure AI services use high-performance Azure computing to deploy advanced AI models as resources, making decades of research available to developers of all skill levels.&lt;/p&gt;

&lt;p&gt;Azure AI services are a portfolio of services, with capabilities suitable for use cases across sectors and industries.&lt;/p&gt;

&lt;p&gt;For example, in education, Immersive Reader is being used to support students by adapting to their requirements. Learners can have varying needs, such as wanting to read more slowly, get words or text translated into another language, or see pictures to aid their understanding. Immersive Reader helps students with different needs learn at their own pace, and in their own way.&lt;/p&gt;

&lt;p&gt;While Azure AI services can be used without any modification, some AI services can be customized to better fit specific requirements. Customization capabilities in Azure AI Vision, Azure AI Speech, and Azure OpenAI all allow you to add data to existing models.&lt;/p&gt;

&lt;p&gt;For example, in sport, athletes, and coaches are customizing Azure AI Vision to improve performance and reduce injury. One application allows surfers to upload a video and receive AI-generated insights and analysis. These insights can then be used by coaches, medics, judges, and event broadcasters.&lt;/p&gt;

&lt;p&gt;Azure AI services are accessed through APIs&lt;br&gt;
Azure AI services are designed to be used in different development environments, with minimal coding. Developers can access AI services through REST APIs or client libraries, or integrate them with tools such as Logic Apps and Power Automate. APIs are application programming interfaces that define the information that is required for one component to use the services of the other. APIs enable software components to communicate, so one side can be updated without stopping the other from working. Find out more about development options for Azure AI services in the documentation.&lt;/p&gt;
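&lt;p&gt;To make the REST API pattern above concrete, here is a minimal Python sketch that builds (but does not send) a key-authenticated request for the Azure AI Language key-phrase extraction task. The endpoint, key, and api-version values are placeholders and assumptions; copy the real ones from your resource's Keys and Endpoint page in the Azure portal.&lt;/p&gt;

```python
import json

def build_language_request(endpoint, key, text):
    """Assemble the URL, headers, and JSON body for a key-phrase
    extraction call. The api-version value is an assumption."""
    url = endpoint.rstrip("/") + "/language/:analyze-text?api-version=2023-04-01"
    headers = {
        # Azure AI services authenticate requests with the resource key
        # passed in the Ocp-Apim-Subscription-Key header.
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "kind": "KeyPhraseExtraction",
        "analysisInput": {
            "documents": [{"id": "1", "language": "en", "text": text}]
        },
    })
    return {"url": url, "headers": headers, "body": body}

# Hypothetical resource name and key for illustration only.
req = build_language_request(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com",
    "YOUR-KEY",
    "Azure AI services are accessed through APIs.",
)
print(req["url"])
```

&lt;p&gt;Sending the request is then a single HTTP POST with any client library; the same key-and-endpoint shape applies across the other Azure AI services.&lt;/p&gt;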

&lt;p&gt;Azure AI services are available on Azure&lt;br&gt;
AI services are cloud-based and accessed through an Azure resource. This means that they're managed in the same way as other Azure services, such as platform as a service (PaaS), infrastructure as a service (IaaS), or a managed database service. The Azure platform and Resource Manager provide a consistent framework for all your Azure services, from creating or deleting resources, to availability and billing.&lt;/p&gt;

&lt;p&gt;Create Azure AI service resources&lt;br&gt;
Azure AI services are cloud-based, and like all Azure services you need to create a resource to use them. There are two types of AI service resources: multi-service or single-service. Your development requirements and how you want costs to be billed determine the types of resources you need.&lt;/p&gt;

&lt;p&gt;Multi-service resource: a resource created in the Azure portal that provides access to multiple Azure AI services with a single key and endpoint. Use the Azure AI services resource when you need several AI services or are exploring AI capabilities. When you use an Azure AI services resource, all your AI services are billed together.&lt;/p&gt;

&lt;p&gt;Single-service resources: a resource created in the Azure portal that provides access to a single Azure AI service, such as Speech, Vision, Language, etc. Each Azure AI service has a unique key and endpoint. These resources might be used when you only require one AI service or want to see cost information separately.&lt;/p&gt;

&lt;p&gt;You can create a resource in several ways, such as in the Azure portal.&lt;/p&gt;

&lt;p&gt;How to use the Azure portal to create an Azure AI services resource&lt;br&gt;
To create an Azure AI services resource, sign in to the Azure portal with Contributor access and select Create a resource. To create a multi-service resource, search for Azure AI services in the marketplace.&lt;/p&gt;


&lt;p&gt;To create a single-service resource, search for the specific Azure AI service such as Face, Language, or Content Safety, and so on. Most AI services have a free price tier to allow you to explore their capabilities. After clicking Create for the resource you require, you will be prompted to complete details of your subscription, the resource group to contain the resource, the region, a unique name, and the price tier.&lt;/p&gt;

&lt;p&gt;Use Azure AI services&lt;br&gt;
Once you create an Azure AI service resource, you can build applications using the REST API, software development kits (SDKs), or studio interfaces.&lt;/p&gt;


&lt;p&gt;Using service studio interfaces&lt;br&gt;
Studio interfaces provide a friendly user interface to explore Azure AI services. There are different studios for different Azure AI services, such as Vision Studio, Language Studio, Speech Studio, and the Content Safety Studio. You can test out Azure AI services using the samples provided, or experiment with your own content. A studio-based approach allows you to explore, demo, and evaluate Azure AI services regardless of your experience with AI or coding.&lt;/p&gt;

&lt;p&gt;Associate the AI service resource&lt;br&gt;
Before you can use an AI service resource, you must associate it with the studio you want to use on the Settings page. Select the resource, and then select Use Resource. You are then ready to explore the Azure AI service within the studio.&lt;/p&gt;


&lt;p&gt;As an example, let's look at the Azure AI Content Safety service, which identifies harmful text or images. To explore what the Content Safety service does, let's use the Content Safety Studio. First create either a multi-service Azure AI services resource, or a single-service Content Safety resource. Then, on the Content Safety Studio Settings page, select the resource, and select Use Resource. The AI service you created is now associated with the Content Safety Studio, and ready to be used.&lt;/p&gt;
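&lt;p&gt;Beyond the studio, the same Content Safety resource can be called directly over REST. Here is a small Python sketch that assembles (but does not send) a text-analysis request; the endpoint, key, and api-version values are placeholders and assumptions following the key-and-endpoint pattern described earlier, not values from this article.&lt;/p&gt;

```python
import json

def build_content_safety_request(endpoint, key, text):
    """Assemble a text:analyze request for Azure AI Content Safety.
    The api-version value is an assumption."""
    url = endpoint.rstrip("/") + "/contentsafety/text:analyze?api-version=2023-10-01"
    headers = {
        # Either a multi-service or a single-service resource key works here.
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    # The service analyzes the text and reports severity per harm category.
    body = json.dumps({"text": text})
    return {"url": url, "headers": headers, "body": body}

# Hypothetical resource name and key for illustration only.
req = build_content_safety_request(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com",
    "YOUR-KEY",
    "Some text to screen for harmful content.",
)
print(req["url"])
```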


&lt;p&gt;Summary&lt;br&gt;
Azure AI services are easy-to-use AI capabilities made available as resources on the Azure platform. Azure AI service capabilities include Language, Speech, Vision, Decision, Search, and Azure OpenAI.&lt;/p&gt;

&lt;p&gt;In this module we’ve used several different terms relating to AI services. Here's a recap:&lt;/p&gt;

&lt;p&gt;API – application programming interfaces (APIs) enable software components to communicate, so one side can be updated without stopping the other from working.&lt;/p&gt;

&lt;p&gt;Artificial Intelligence (AI) – computer programs that respond in ways that are normally associated with human reasoning, learning, and thought.&lt;/p&gt;

&lt;p&gt;Azure AI services – a portfolio of AI services that can be incorporated into applications quickly and easily without specialist knowledge. Azure AI services is also the name for the multi-service resource created in the Azure portal that provides access to several different Azure AI services with a single key and endpoint.&lt;/p&gt;

&lt;p&gt;Endpoint – the location of a resource, such as an Azure AI service.&lt;/p&gt;

&lt;p&gt;Key – a private string that is used to authenticate a request.&lt;/p&gt;

&lt;p&gt;Machine learning – the ability for computer programs to learn from large amounts of data, in a process known as "training".&lt;/p&gt;

&lt;p&gt;Multi-service resource – the AI service resource created in the Azure portal that provides access to a bundle of AI services.&lt;/p&gt;

&lt;p&gt;Single-service resource – a resource created in the Azure portal that provides access to a single Azure AI service, such as Speech, Vision, Language, etc. Each Azure AI service has a unique key and endpoint.&lt;/p&gt;

&lt;p&gt;RESTful API – a scalable web application programming interface used to access Azure AI services.&lt;/p&gt;

&lt;p&gt;You can find out more about Azure AI services in the service documentation.&lt;/p&gt;

&lt;p&gt;Organizations are using Azure AI services in innovative ways, such as within robots to provide life-like companionship to older people by expressing happiness, concern, and even laughter. In other use cases, scientists are using AI to protect endangered species by identifying hard-to-find animals in images. This was previously time-consuming and error-prone work, which the Azure AI Vision service can complete quickly and with a high degree of accuracy, freeing scientists to do other work.&lt;/p&gt;

&lt;p&gt;In this module you will learn what Azure AI services are, and how you can use them in your own applications.&lt;/p&gt;

&lt;p&gt;AI services on the Azure platform&lt;br&gt;
Azure AI services are AI capabilities that can be built into web or mobile applications, in a way that's straightforward to implement. These AI services include image recognition, natural language processing, speech, AI-powered search, and more. There are over a dozen different services that can be used separately or together to add AI power to applications.&lt;/p&gt;

&lt;p&gt;Let's take a look at some examples of what can be done with Azure AI services. The Azure AI Content Safety service can be used to detect harmful content within text or images, including violent or hateful content, and report on its severity. The Azure AI Language service can be used to summarize text, classify information, or extract key phrases. The Azure AI Speech service provides powerful speech-to-text and text-to-speech capabilities, allowing speech to be accurately transcribed into text, or text to be converted into natural-sounding voice audio.&lt;/p&gt;

&lt;p&gt;Azure AI services are based on three principles that dramatically improve speed-to-market:&lt;/p&gt;

&lt;p&gt;Prebuilt and ready to use&lt;/p&gt;

&lt;p&gt;Accessed through APIs&lt;/p&gt;

&lt;p&gt;Available on Azure&lt;/p&gt;

&lt;p&gt;Azure AI services are prebuilt and ready to use&lt;br&gt;
AI has historically been out of reach for all but the largest technology companies because of several factors: the large amounts of data required to train models, the massive computing power needed, and the budget to hire specialist programmers. Azure AI services make AI accessible to businesses of all sizes by using pre-trained machine learning models to deliver AI as a service. Azure AI services use high-performance Azure computing to deploy advanced AI models as resources, making decades of research available to developers of all skill levels.&lt;/p&gt;

&lt;p&gt;Azure AI services are a portfolio of services, with capabilities suitable for use cases across sectors and industries.&lt;/p&gt;

&lt;p&gt;For example, in education, Immersive Reader is being used to support students by adapting to their requirements. Learners can have varying needs, such as wanting to read more slowly, get words or text translated into another language, or see pictures to aid their understanding. Immersive Reader helps students with different needs learn at their own pace, and in their own way.&lt;/p&gt;

&lt;p&gt;While Azure AI services can be used without any modification, some AI services can be customized to better fit specific requirements. Customization capabilities in Azure AI Vision, Azure AI Speech, and Azure OpenAI all allow you to add data to existing models.&lt;/p&gt;

&lt;p&gt;For example, in sport, athletes and coaches are customizing Azure AI Vision to improve performance and reduce injury. One application allows surfers to upload a video and receive AI-generated insights and analysis. These insights can then be used by coaches, medics, judges, and event broadcasters.&lt;/p&gt;

&lt;p&gt;Azure AI services are accessed through APIs&lt;br&gt;
Azure AI services are designed to be used in different development environments, with minimal coding. Developers can access AI services through REST APIs, client libraries, or integrate them with tools such as Logic Apps and Power Automate. APIs are application programming interfaces that define the information that is required for one component to use the services of the other. APIs enable software components to communicate, so one side can be updated without stopping the other from working. Find out more about development options for Azure AI services here.&lt;/p&gt;
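&lt;p&gt;As a concrete, illustrative sketch of the REST pattern, the snippet below builds a key-phrase extraction request for the Azure AI Language REST API without sending it; the resource name and key shown are placeholders, and the API version is one of the documented GA versions.&lt;/p&gt;

```python
import json
import urllib.request

def build_key_phrase_request(endpoint, key, text):
    """Build, but do not send, a POST to the Language key-phrase API."""
    url = endpoint + "/language/:analyze-text?api-version=2023-04-01"
    body = {
        "kind": "KeyPhraseExtraction",
        "analysisInput": {
            "documents": [{"id": "1", "language": "en", "text": text}]
        },
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            # The resource key authenticates the call; the endpoint locates it.
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder endpoint and key; a real call would use your resource's values.
req = build_key_phrase_request(
    "https://my-resource.cognitiveservices.azure.com",
    "YOUR_KEY",
    "Azure AI services are accessed through APIs.",
)
```

&lt;p&gt;Passing the request to urllib.request.urlopen would return JSON containing the extracted key phrases; the client libraries (SDKs) wrap this same HTTP exchange.&lt;/p&gt;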


&lt;p&gt;Azure AI services are available on Azure&lt;br&gt;
AI services are cloud-based and accessed through an Azure resource. This means that they're managed in the same way as other Azure services, such as platform as a service (PaaS), infrastructure as a service (IaaS), or a managed database service. The Azure platform and Resource Manager provide a consistent framework for all your Azure services, from creating or deleting resources, to availability and billing.&lt;/p&gt;

&lt;p&gt;Create Azure AI service resources&lt;/p&gt;

&lt;p&gt;Azure AI services are cloud-based, and like all Azure services you need to create a resource to use them. There are two types of AI service resources: multi-service or single-service. Your development requirements and how you want costs to be billed determine the types of resources you need.&lt;/p&gt;

&lt;p&gt;Multi-service resource: a resource created in the Azure portal that provides access to multiple Azure AI services with a single key and endpoint. Use the multi-service Azure AI services resource when you need several AI services or are exploring AI capabilities. When you use an Azure AI services resource, all your AI services are billed together.&lt;/p&gt;

&lt;p&gt;Single-service resources: a resource created in the Azure portal that provides access to a single Azure AI service, such as Speech, Vision, Language, etc. Each Azure AI service has a unique key and endpoint. These resources might be used when you only require one AI service or want to see cost information separately.&lt;/p&gt;

&lt;p&gt;You can create a resource several ways, such as in the Azure portal.&lt;/p&gt;

&lt;p&gt;How to use the Azure portal to create an Azure AI services resource&lt;br&gt;
To create an Azure AI services resource, sign in to the Azure portal with Contributor access and select Create a resource. To create a multi-service resource, search for Azure AI services in the Marketplace.&lt;/p&gt;


&lt;p&gt;To create a single-service resource, search for the specific Azure AI service such as Face, Language, or Content Safety, and so on. Most AI services have a free price tier to allow you to explore their capabilities. After clicking Create for the resource you require, you will be prompted to complete details of your subscription, the resource group to contain the resource, the region, a unique name, and the price tier.&lt;/p&gt;

&lt;p&gt;Use Azure AI services&lt;br&gt;
Once you create an Azure AI service resource, you can build applications using the REST API, software development kits (SDKs), or visual studio interfaces.&lt;/p&gt;


&lt;p&gt;Using service studio interfaces&lt;br&gt;
Studio interfaces provide a friendly user interface to explore Azure AI services. There are different studios for different Azure AI services, such as Vision Studio, Language Studio, Speech Studio, and the Content Safety Studio. You can test out Azure AI services using the samples provided, or experiment with your own content. A studio-based approach allows you to explore, demo, and evaluate Azure AI services regardless of your experience with AI or coding.&lt;/p&gt;

&lt;p&gt;Associate the AI service resource&lt;br&gt;
Before you can use an AI service resource, you must associate it with the studio you want to use on the Settings page. Select the resource, and then select Use Resource. You are then ready to explore the Azure AI service within the studio.&lt;/p&gt;


&lt;p&gt;As an example, let's look at the Azure AI Content Safety service, which identifies harmful text or images. To explore what the Content Safety service does, let's use the Content Safety Studio. First create either a multi-service Azure AI services resource, or a single-service Content Safety resource. Then, on the Content Safety Studio Settings page, select the resource, and select Use Resource. The AI service you created is now associated with the Content Safety Studio, and ready to be used.&lt;/p&gt;
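&lt;p&gt;Outside the studio, the same Content Safety resource can be called programmatically. Below is a minimal, illustrative sketch that assembles a text-analysis request (without sending it) and reads harm severities from a trimmed example of the response shape; the endpoint, key, and API version are placeholder assumptions.&lt;/p&gt;

```python
import json
import urllib.request

def build_text_analyze_request(endpoint, key, text):
    # Assemble, without sending, a Content Safety text-analysis call.
    url = endpoint + "/contentsafety/text:analyze?api-version=2023-10-01"
    return urllib.request.Request(
        url,
        data=json.dumps({"text": text}).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": key,  # same key/endpoint pair the studio uses
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Trimmed example of the response shape: one severity score per harm category.
sample_response = {
    "categoriesAnalysis": [
        {"category": "Hate", "severity": 0},
        {"category": "SelfHarm", "severity": 0},
        {"category": "Violence", "severity": 2},
    ]
}
flagged = [c["category"] for c in sample_response["categoriesAnalysis"]
           if c["severity"] > 0]
print(flagged)  # categories whose severity is above zero
```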

&lt;p&gt;Summary&lt;br&gt;
Azure AI services are easy-to-use AI capabilities made available as resources on the Azure platform. Azure AI service capabilities include Language, Speech, Vision, Decision, Search, and Azure OpenAI.&lt;/p&gt;

&lt;p&gt;In this module we’ve used several different terms relating to AI services. Here's a recap:&lt;/p&gt;

&lt;p&gt;API – application programming interfaces (APIs) enable software components to communicate, so one side can be updated without stopping the other from working.&lt;/p&gt;

&lt;p&gt;Artificial Intelligence (AI) – computer programs that respond in ways that are normally associated with human reasoning, learning, and thought.&lt;/p&gt;

&lt;p&gt;Azure AI services – a portfolio of AI services that can be incorporated into applications quickly and easily without specialist knowledge. Azure AI services is also the name for the multi-service resource created in the Azure portal that provides access to several different Azure AI services with a single key and endpoint.&lt;/p&gt;

&lt;p&gt;Endpoint – the location of a resource, such as an Azure AI service.&lt;/p&gt;

&lt;p&gt;Key – a private string that is used to authenticate a request.&lt;/p&gt;

&lt;p&gt;Machine learning – the ability for computer programs to learn from large amounts of data, in a process known as "training".&lt;/p&gt;

&lt;p&gt;Multi-service resource – the AI service resource created in the Azure portal that provides access to a bundle of AI services.&lt;/p&gt;

&lt;p&gt;Single-service resource – a resource created in the Azure portal that provides access to a single Azure AI service, such as Speech, Vision, Language, etc. Each Azure AI service has a unique key and endpoint.&lt;/p&gt;

&lt;p&gt;RESTful API – a scalable web application programming interface used to access Azure AI services.&lt;/p&gt;

&lt;p&gt;You can find out more about Azure AI services in the service documentation.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>azurefunctions</category>
      <category>ai</category>
      <category>eventgrid</category>
    </item>
    <item>
      <title>Oracle Cloud Infrastructure 2023 AI Foundations Associate (1Z0-1122-23)</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Sat, 04 Nov 2023 14:28:07 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/oracle-cloud-infrastructure-2023-ai-foundations-associate-1z0-1122-23-2j27</link>
      <guid>https://dev.to/abhishe89636035/oracle-cloud-infrastructure-2023-ai-foundations-associate-1z0-1122-23-2j27</guid>
      <description>&lt;p&gt;New feather added in #MyCloudDiary :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt; ..&lt;/p&gt;

&lt;p&gt;I have some exciting news to share! I have just uploaded a comprehensive practice test for the 1Z0-1122-23 Oracle Cloud Infrastructure 2023 AI Foundations Associate certification, on Udemy.&lt;/p&gt;

&lt;p&gt;You can search for "1Z0-1122-23 OCI 2023 AI Foundation Trial" on Udemy.&lt;/p&gt;

&lt;p&gt;This practice test is designed to help aspiring candidates like us prepare for the exam with confidence. It covers all the essential topics and provides realistic questions to simulate the actual exam experience. Whether you're aiming to become certified or simply want to strengthen your understanding of AI within the OCI environment, this practice test can be a valuable resource. Feel free to check it out and let me know your thoughts.&lt;/p&gt;

&lt;p&gt;Certification Name - Oracle Cloud Infrastructure 2023 AI Foundations Associate (1Z0-1122-23)&lt;/p&gt;

&lt;p&gt;Certification Link :: &lt;a href="https://brm-certification.oracle.com/apex/f?p=1111:6:14885428904220:::::#xd_co_f=YTY0MjMyMGItM2ZjZC00NDQyLTg0MTktZTJlMzFhZDAzOThh%7E"&gt;https://brm-certification.oracle.com/apex/f?p=1111:6:14885428904220:::::#xd_co_f=YTY0MjMyMGItM2ZjZC00NDQyLTg0MTktZTJlMzFhZDAzOThh~&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Please like, share, and subscribe to the #channelpartners below:&lt;/p&gt;

&lt;h1&gt;
  
  
  YouTube :: &lt;a href="https://lnkd.in/d2zATThK"&gt;https://lnkd.in/d2zATThK&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  GitHub :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Hashnode :: &lt;a href="https://lnkd.in/d_gtGxuS"&gt;https://lnkd.in/d_gtGxuS&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Twitter :: &lt;a href="https://lnkd.in/e5ZY5j-x"&gt;https://lnkd.in/e5ZY5j-x&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  DevCodeCommunityTLV :: &lt;a href="https://lnkd.in/duMEcSnc"&gt;https://lnkd.in/duMEcSnc&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Tealfeed :: &lt;a href="https://lnkd.in/eTyp-Xe4"&gt;https://lnkd.in/eTyp-Xe4&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Medium :: &lt;a href="https://lnkd.in/dpEzM8GU"&gt;https://lnkd.in/dpEzM8GU&lt;/a&gt;
&lt;/h1&gt;

&lt;p&gt;CC- Adding my leaders and colleagues to share achievement::&lt;/p&gt;

&lt;p&gt;Mahesh Bhosale Swapneel Doshi Girish Chhabra Luis Méndez Osvaldo Cantu Jason Chance Anji Reddy Venumula Susan Cutinha Murli Reddy EUR ING Ioannis Kolaxis MSc Sabahat Siddiqui Amit Batra, PMP® Rajeev Choudhary Mukul Sharma Dr. Ritu Anand Farhad Sayeed Pritish Kumar Anand Sitaraman Abhishek Saxena Kumar Amitesh Pawan Kumar Chadha Vijay Bijjargi&lt;/p&gt;

&lt;p&gt;Education for Cloud Leaders :: Dan Rey Denise Reed Lamoreaux Sjoukje Zaal Deepak Rajendran Kasam Shaikh - Microsoft [Azure] AI MVP ☁ 🇮🇳 Yujun Liang ⎈☁️🌎 Yongkang ⎈ ☁️ HE Walter Lee Assoc. Prof. Dr. Eleni Meletiadou, SFHEA, UTF, MCIPD, CMBE,MIE Expert Dahnesh Dilkhush Nikkia Carter&lt;/p&gt;

&lt;h1&gt;
  
  
  mvpbuzz #mct #devops #cloud #software #community #oraclecloud #oci #oraclecertified  #awscommunitybuilders #dataanalytics #share #microsoft #like #github #dataanalysis #kubernetes #kubernetesservices #kubernetescluster #azure #aws #complexity #amazon #google #leadershipbyexample #IAmRemarkable #architecturelovers #architect #education #k8s Tata Consultancy Services #cybersecSecurity #security #power #leaders #cybersecurity #mvpbuzz #mvp #mvps #mct #powerbiexperience #IAmRemarkable #ArtificialIntelligence Embassy of the Republic of Latvia to the United Kingdom of Great Britain and Northern Ireland
&lt;/h1&gt;

&lt;h1&gt;
  
  
  oracle #oraclecertified #oraclecloudinfrastructure #oraclecloudinfrastructureai
&lt;/h1&gt;

&lt;h1&gt;
  
  
  FirstAIcertification
&lt;/h1&gt;

&lt;h1&gt;
  
  
  ai
&lt;/h1&gt;

&lt;h1&gt;
  
  
  aicertification
&lt;/h1&gt;

&lt;h1&gt;
  
  
  aicertified
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--M97X7vPt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4gbh9qoe9nk3p9l96ayj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--M97X7vPt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4gbh9qoe9nk3p9l96ayj.png" alt="Image description" width="800" height="546"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>oracle</category>
      <category>oraclecloud</category>
      <category>oci</category>
      <category>ai</category>
    </item>
    <item>
      <title>Kubeflow Pipelines v2: Making ML pipelines easier, faster, and more scalable</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Thu, 02 Nov 2023 13:29:42 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/kubeflow-pipelines-v2-making-ml-pipelines-easier-faster-and-more-scalable-4fdn</link>
      <guid>https://dev.to/abhishe89636035/kubeflow-pipelines-v2-making-ml-pipelines-easier-faster-and-more-scalable-4fdn</guid>
      <description>&lt;p&gt;Machine learning is increasingly essential for businesses of all sizes. However, building, deploying, and continuously training ML models can be complex and time-consuming. That's where Kubeflow Pipelines (KFP) comes in. Over the years, we have seen increasing adoption of KFP. KFP provides an ecosystem to compose, deploy and manage reusable end-to-end machine learning workflows, making it a no-lock-in hybrid solution from prototyping to production. So we are excited to announce the release of KFP v2. This blog post will take you through what's new in KFP v2.&lt;br&gt;
Wait, what’s Kubeflow?&lt;br&gt;
The Kubeflow ecosystem was initially open-sourced by Google and partners in 2018 to extend Kubernetes for machine learning. In 2019 Kubeflow Pipelines was introduced as a standalone component of that ecosystem for defining and orchestrating MLOps workflows to continuously train models via the execution of a directed acyclic graph (DAG) of container images. KFP provides a Python SDK and domain-specific language (DSL) for defining a pipeline, and backend and frontend services for running and scheduling pipelines on your Kubernetes cluster of choice. Since its launch, KFP has accrued a rich community of orchestration options (e.g., Vertex AI Pipelines) and pre-built components (e.g., Google Cloud Pipeline Components).&lt;br&gt;
What’s new in KFP v2?&lt;br&gt;
KFP v2 has several major improvements. This section will highlight some of them.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;An improved Python-based authoring experience for components and pipelines
The new @dsl.component and @dsl.container_component decorators simplify and standardize the component authoring experience while improving readability. Additionally, Python docstrings are automatically propagated to the pipeline specification, improving the understandability and reuse of the pipelines you author.&lt;/li&gt;
&lt;li&gt;A new Argo-independent pipeline definition that enables compile and run anywhere: executable components, nested pipelines, and potentially, new orchestrator options
KFP v2’s updated intermediate representation (IR) includes additional details that make the pipeline executable by any backend (not just Argo). Another big benefit is that you can now compile and run individual components (not just pipelines), and nest pipelines as components of a larger pipeline.&lt;/li&gt;
&lt;li&gt;An uplifted Workflow GUI
KFP v2 introduces several improvements to help visualize ML workflows. KFP now surfaces input and output artifacts (e.g., datasets, models, and metrics) as first-class nodes in the DAG visualization. This enables users to view how artifacts are used and produced in the workflow, along with the metadata describing them. Nested pipelines are now supported and represented as a sub-DAG. Users can now zoom in on the workflow canvas, which greatly improves usability for large pipelines. And there’s more: new run comparison, run cloning, and listing Artifacts and Executions features.&lt;/li&gt;
&lt;li&gt;First class support for ML metadata (MLMD) artifacts and lineage
Previously MLMD was an optional integration. In KFP v2, MLMD is a required dependency. This enables KFP to provide (1) rich lineage tracking and visualization out-of-the-box and eventually (2) custom artifact schemas for strict type checking of component interfaces and improved visualization of custom artifacts.&lt;/li&gt;
&lt;li&gt;Increased security due to upgrading upstream dependencies to their latest major versions (e.g., Argo, MinIO, MySQL, Envoy Proxy, and MLMD)
We have taken the opportunity afforded to us by this major version bump to upgrade many upstream dependencies to their latest major versions. The KFP v2 backend is completely backwards-compatible with v1 (v1 APIs still exist), so we recommend that all KFP backend users upgrade to KFP v2. These changes were discussed in greater depth during the 2022 Kubeflow Summit.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;What’s next?&lt;br&gt;
Have a look at this notebook to learn more about KFP v2 and how to use it with Vertex AI Pipelines. SDK users, including Vertex AI Pipelines users, can find migration guidance at: Migrate from KFP SDK v1. Open source backend and frontend users can find upgrade instructions for their KFP clusters at: Installation.&lt;/p&gt;

&lt;p&gt;KFP has come a long way since 2019, and has proven itself at scale in powering complex MLOps workflow orchestration. We recommend upgrading to KFP v2 without delay to enjoy all the benefits it provides. We greatly appreciate your feedback via GitHub, support for, and contributions to KFP. You can find KFP’s contribution guide at CONTRIBUTING.md.&lt;/p&gt;

&lt;p&gt;Credit goes to :: Chase Lyall Product Manager&lt;br&gt;
Erwin Huizenga Developer Advocate Machine Learning&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aip</category>
      <category>kubernetes</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Azure Open AI - TCS</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Thu, 26 Oct 2023 14:33:43 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/azure-open-ai-tcs-43o</link>
      <guid>https://dev.to/abhishe89636035/azure-open-ai-tcs-43o</guid>
      <description>&lt;p&gt;Happy to share small achievement from Tata Consultancy Services.&lt;/p&gt;

&lt;h1&gt;
  
  
  TCS provides a great platform for learners to learn anything they want. Thanks #TCS for providing this wonderful platform.
&lt;/h1&gt;

&lt;p&gt;“Stay away from those people who try to disparage your ambitions. Small minds will always do that, but great minds will give you a feeling that you can become great too.”&lt;/p&gt;

&lt;p&gt;Please like, share, and subscribe to the #channelpartners below:&lt;/p&gt;

&lt;h1&gt;
  
  
  YouTube  :: &lt;a href="https://lnkd.in/dDaZPGR5"&gt;https://lnkd.in/dDaZPGR5&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  GitHub  :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Hashnode  :: &lt;a href="https://lnkd.in/d_gtGxuS"&gt;https://lnkd.in/d_gtGxuS&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Twitter  :: &lt;a href="https://lnkd.in/e5ZY5j-x"&gt;https://lnkd.in/e5ZY5j-x&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  DevCodeCommunity :: &lt;a href="https://lnkd.in/duMEcSnc"&gt;https://lnkd.in/duMEcSnc&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Tealfeed :: &lt;a href="https://lnkd.in/eTyp-Xe4"&gt;https://lnkd.in/eTyp-Xe4&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Medium  :: &lt;a href="https://lnkd.in/dpEzM8GU"&gt;https://lnkd.in/dpEzM8GU&lt;/a&gt;
&lt;/h1&gt;

&lt;p&gt;Special thanks to Girish Phadke Namrata Gaikwad😍 &lt;/p&gt;

&lt;p&gt;CC- Adding my leaders and colleagues to share achievement:&lt;/p&gt;

&lt;p&gt;Amit Batra, PMP® Baskar Natarajan Samantak Biswas Shantinath Teradale Amit Bajaj Dr. Ritu Anand Anand Sitaraman K Krithivasan Mukul Sharma Sudeshna Biswas Nigar Sayed Abhishek Saxena Kumar Amitesh Sriram Venkatesan Seethalakshmi Sridhar Keith Mary Rantas-Drew Kapil Gandhi Dr Usha Narasimhan Balasubramaniam Sambandam Maya Sankaranarayanan Dibyendu Naha Ranjith Jayaraman Paresh Sheth Tapan Das Jimmy Mukkalath Selvaraj Shanmugam Minitha Saxena Ramanand Garimella Preetha Kalaivanan Prudvi Raj Amit Vadhera&lt;/p&gt;

&lt;h1&gt;
  
  
  kubernetes #k8s #devops #sitereliabilityengineering #costoptimization #gke #google #developercommunity #tcs #gems #developer #leaders #like #cloud #pune #india #github #community #share #education #learning  #GENAI #TCSgems #Xcelerateachievement #TCSian #IAmRemarkable
&lt;/h1&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>ai</category>
      <category>tcs</category>
    </item>
    <item>
      <title>Introduction to Azure OpenAI Service</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Sat, 29 Jul 2023 09:50:11 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/introduction-to-azure-openai-service-63n</link>
      <guid>https://dev.to/abhishe89636035/introduction-to-azure-openai-service-63n</guid>
      <description>&lt;p&gt;Introduction&lt;br&gt;
Suppose you want to help your team understand the latest artificial intelligence (AI) innovations in the news. Your team would like to evaluate the opportunities these innovations support and understand what is done to keep AI advancements ethical.&lt;/p&gt;

&lt;p&gt;You share with your team that today, stable AI models are regularly put into production and used commercially around the world. For example, Microsoft's existing Azure AI services have been handling the needs of businesses for many years to date. In 2022, OpenAI, an AI research company, created a chatbot known as ChatGPT and an image generation application known as DALL-E. These technologies were built with AI models which can take natural language input from a user and return a machine-created human-like response.&lt;/p&gt;

&lt;p&gt;You share with your team that Azure OpenAI Service enables users to build enterprise-grade solutions with OpenAI models. With Azure OpenAI, users can summarize text, get code suggestions, generate images for a web site, and much more. This module dives into these capabilities.&lt;/p&gt;

&lt;p&gt;Capabilities of OpenAI AI models&lt;br&gt;
There are several categories of capabilities found in OpenAI AI models; three of these are generating natural language, generating code, and generating images.&lt;/p&gt;

&lt;p&gt;What is generative AI&lt;br&gt;
OpenAI makes its AI models available to developers to build powerful software applications, such as ChatGPT. There are tons of other examples of OpenAI applications on the OpenAI site, ranging from practical, such as generating text from code, to purely entertaining, such as making up scary stories.&lt;/p&gt;

&lt;p&gt;Let's identify where OpenAI models fit into the AI landscape.&lt;/p&gt;

&lt;p&gt;Artificial Intelligence imitates human behavior by relying on machines to learn and execute tasks without explicit directions on what to output.&lt;br&gt;
Machine learning algorithms take in data like weather conditions and fit models to the data, to make predictions like how much money a store might make in a given day.&lt;br&gt;
Deep learning models use layers of algorithms in the form of artificial neural networks to return results for more complex use cases. Many Azure AI services are built on deep learning models. You can check out this article to learn more about the difference between machine learning and deep learning.&lt;br&gt;
Generative AI models can produce new content based on what is described in the input. The OpenAI models are a collection of generative AI models that can produce language, code, and images.&lt;br&gt;
Next you'll learn how Azure OpenAI gives users the ability to combine Azure's enterprise-grade solutions with many of OpenAI's same generative AI models.&lt;/p&gt;

&lt;p&gt;Describe Azure OpenAI&lt;br&gt;
Microsoft has partnered with OpenAI to deliver on three main goals:&lt;/p&gt;

&lt;p&gt;To utilize Azure's infrastructure, including security, compliance, and regional availability, to help users build enterprise-grade applications.&lt;br&gt;
To deploy OpenAI AI model capabilities across Microsoft products, including and beyond Azure AI products.&lt;br&gt;
To use Azure to power all of OpenAI's workloads.&lt;br&gt;
Introduction to Azure OpenAI Service&lt;br&gt;
Azure OpenAI Service is a result of the partnership between Microsoft and OpenAI. The service combines Azure's enterprise-grade capabilities with OpenAI's generative AI model capabilities.&lt;/p&gt;

&lt;p&gt;Azure OpenAI is available for Azure users and consists of four components:&lt;/p&gt;

&lt;p&gt;Pre-trained generative AI models&lt;br&gt;
Customization capabilities; the ability to fine-tune AI models with your own data&lt;br&gt;
Built-in tools to detect and mitigate harmful use cases so users can implement AI responsibly&lt;br&gt;
Enterprise-grade security with role-based access control (RBAC) and private networks&lt;br&gt;
Using Azure OpenAI allows you to transition between your work with Azure services and OpenAI, while utilizing Azure's private networking, regional availability, and responsible AI content filtering.&lt;/p&gt;

&lt;p&gt;Understand Azure OpenAI workloads&lt;br&gt;
Azure OpenAI supports many common AI workloads and solves for some new ones.&lt;/p&gt;

&lt;p&gt;Common AI workloads include machine learning, computer vision, natural language processing, conversational AI, anomaly detection, and knowledge mining.&lt;/p&gt;

&lt;p&gt;Other AI workloads Azure OpenAI supports can be categorized by tasks they support:&lt;/p&gt;

&lt;p&gt;Generating Natural Language&lt;br&gt;
Text completion: generate and edit text&lt;br&gt;
Embeddings: search, classify, and compare text&lt;br&gt;
Generating Code: generate, edit, and explain code&lt;br&gt;
Generating Images: generate and edit images&lt;br&gt;
Azure OpenAI's relationship to Azure AI services&lt;/p&gt;

&lt;p&gt;Azure's AI services are tools for solving AI workloads and can be categorized into three groupings: Azure's Machine Learning platform, Cognitive Services, and Applied AI Services.&lt;/p&gt;

&lt;p&gt;Azure AI Services has five pillars: vision, speech, language, decision, and the Azure OpenAI Service. The services you choose to use depend on what you need to accomplish. In particular, there are several overlapping capabilities between the Cognitive Services Language service and the Azure OpenAI Service, such as translation, sentiment analysis, and keyword extraction.&lt;/p&gt;

&lt;p&gt;While there's no strict guidance on when to use a particular service, Azure's existing Language service can be used for widely known use-cases that require minimal tuning (the process of optimizing a model's performance). Azure OpenAI's service may be more beneficial for use-cases that require highly customized generative models, or for exploratory research.&lt;/p&gt;

&lt;p&gt;When making business decisions about what type of model to use, it's important to understand how time and compute needs factor into machine learning training. To produce an effective machine learning model, the model needs to be trained with a substantial amount of cleaned data. The 'learning' portion of training requires a computer to identify an algorithm that best fits the data. The complexity of the task the model needs to solve for and the desired level of model performance both factor into the time required to run through possible solutions for a best-fit algorithm.&lt;/p&gt;

&lt;p&gt;How to use Azure OpenAI&lt;br&gt;
Currently you need to apply for access to Azure OpenAI. Once granted access, you can use the service by creating an Azure OpenAI resource, like you would for other Azure services. Once the resource is created, you can use the service through REST APIs, Python SDK, or the web-based interface in the Azure OpenAI Studio.&lt;/p&gt;
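&lt;p&gt;As a sketch of what the REST route looks like, the snippet below builds a completion request URL and JSON payload. The resource name, deployment name, and API version here are illustrative assumptions, not values from this article; substitute your own:&lt;/p&gt;

```python
import json

# Hypothetical resource and deployment names -- replace with your own.
endpoint = "https://my-resource.openai.azure.com"
deployment = "my-gpt35-deployment"
api_version = "2023-05-15"  # assumed API version string

# Azure OpenAI REST calls target a deployment-specific route.
url = f"{endpoint}/openai/deployments/{deployment}/completions?api-version={api_version}"

# The request body is plain JSON; your resource's api-key goes in a request header.
payload = json.dumps({"prompt": "Write a tagline for an ice cream shop.",
                      "max_tokens": 50})

print(url)
```

&lt;p&gt;Sending this with any HTTP client (plus the api-key header) returns a JSON completion; the Python SDK and the Azure OpenAI Studio wrap the same underlying API.&lt;/p&gt;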

&lt;p&gt;Azure OpenAI Studio&lt;br&gt;
In the Azure OpenAI Studio, you can build AI models and deploy them for public consumption in software applications. Azure OpenAI's capabilities are made possible by specific generative AI models. Different models are optimized for different tasks; some models excel at summarization and providing general unstructured responses, and others are built to generate code or unique images from text input.&lt;/p&gt;

&lt;p&gt;These Azure OpenAI models include:&lt;/p&gt;

&lt;p&gt;GPT-4 models that represent the latest generative models for natural language and code.&lt;br&gt;
GPT-3.5 models that can generate natural language and code responses based on prompts.&lt;br&gt;
Embeddings models that convert text to numeric vectors for analysis - for example, comparing sources of text for similarity.&lt;br&gt;
DALL-E models that generate images based on natural language descriptions.&lt;/p&gt;

&lt;p&gt;Azure OpenAI's AI models can all be trained and customized with fine-tuning. We won't go into custom models here, but you can learn more in the Azure documentation on fine-tuning your model.&lt;/p&gt;
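&lt;p&gt;To make the embeddings idea concrete, here is a minimal sketch of how two embedding vectors can be compared for similarity using cosine similarity. The vectors below are made-up toy values, not real model output:&lt;/p&gt;

```python
import math

def cosine_similarity(a, b):
    # Similarity is the cosine of the angle between the two vectors:
    # 1.0 means same direction, 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings"; real models return much longer vectors.
print(cosine_similarity([0.1, 0.3, 0.5], [0.1, 0.3, 0.5]))  # identical texts score ~1.0
```

&lt;p&gt;Search and classification over embeddings boil down to ranking texts by scores like this one.&lt;/p&gt;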

&lt;p&gt;Playgrounds&lt;br&gt;
In the Azure OpenAI Studio, you can experiment with OpenAI models in playgrounds. In the Completions playground, you can type in prompts, configure parameters, and see responses without having to code.&lt;/p&gt;

&lt;p&gt;In the Chat playground, you can use the assistant setup to instruct the model about how it should behave. The assistant will try to mimic the tone, rules, and format you've defined in your system message.&lt;/p&gt;
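&lt;p&gt;Under the hood, the assistant setup corresponds to a system message at the start of the chat history. A minimal sketch of that message structure (the content strings here are invented for illustration):&lt;/p&gt;

```python
# The system message defines tone, rules, and format; user turns follow it.
messages = [
    {"role": "system",
     "content": "You are a cheerful assistant. Answer in one short sentence."},
    {"role": "user", "content": "What is Azure OpenAI?"},
]

# Each model reply is appended as an "assistant" message to keep context.
roles = [m["role"] for m in messages]
print(roles)
```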

&lt;p&gt;Understand OpenAI's natural language capabilities&lt;br&gt;
Azure OpenAI's natural language models are able to take in natural language and generate responses.&lt;/p&gt;

&lt;p&gt;Natural language learning models are trained on words or chunks of characters known as tokens. For example, the word "hamburger" gets broken up into the tokens ham, bur, and ger, while a short and common word like "pear" is a single token. These tokens are mapped into vectors for a machine learning model to use for training. When a trained natural language model takes in a user's input, it also breaks down the input into tokens.&lt;/p&gt;
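&lt;p&gt;As a toy illustration of the idea (real models use learned byte-pair encodings, not a hand-written vocabulary), a greedy longest-match splitter reproduces the "hamburger" example above:&lt;/p&gt;

```python
def tokenize(text, vocab):
    # Greedy longest-match split: repeatedly take the longest known piece.
    tokens = []
    while text:
        for end in range(len(text), 0, -1):
            piece = text[:end]
            if piece in vocab or end == 1:
                tokens.append(piece)
                text = text[end:]
                break
    return tokens

vocab = {"ham", "bur", "ger", "pear"}
print(tokenize("hamburger", vocab))  # common word split into sub-word tokens
print(tokenize("pear", vocab))      # short, common word stays a single token
```

&lt;p&gt;A trained model never sees raw strings: both the training text and each user prompt pass through a step like this before being mapped to vectors.&lt;/p&gt;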

&lt;p&gt;Understanding GPT models for natural language generation&lt;br&gt;
Generative pre-trained transformer (GPT) models are excellent at both understanding and creating natural language. If you've seen recent news around AI answering questions or writing a paragraph based on a prompt, it likely could have been generated by a GPT model such as GPT-35-Turbo or GPT-4. To use GPT-4 models in Azure OpenAI, you must apply for access.&lt;/p&gt;

&lt;p&gt;What does a response from a GPT model look like?&lt;br&gt;
A key aspect of OpenAI's generative AI is that it takes an input, or prompt, to return a natural language, visual, or code response. GPT tries to infer, or guess, the context of the user's question based on the prompt.&lt;/p&gt;

&lt;p&gt;GPT models are great at completing several natural language tasks, some of which include:&lt;/p&gt;

&lt;p&gt;For example, given a prompt where the user types in text asking for a cooking recipe:&lt;/p&gt;

&lt;p&gt;Delicious - maybe! It's important to understand that the generated responses are best guesses from a machine. In this case, the generated text may be useful for cooking something that tastes good in real life, or not.&lt;/p&gt;

&lt;p&gt;How models are applied to new use cases&lt;br&gt;
You may have tried out ChatGPT's predictive capabilities in a chat portal, where you can type prompts and receive automated responses. The portal consists of the front-end user interface (UI) users see, and a back-end that includes a generative AI model. The combination of the front and back end can be described as a chatbot. The model provided on the back end is what is available as a building block with both the OpenAI API and Azure OpenAI API. You can utilize ChatGPT's capabilities on Azure OpenAI via the GPT-35-turbo model. When you see generative AI capabilities in other applications, developers have taken the building blocks, customized them to a use case, and built them into the back end of new front-end user interfaces.&lt;/p&gt;

&lt;p&gt;Understand OpenAI code generation capabilities&lt;/p&gt;

&lt;p&gt;GPT models are able to take natural language or code snippets and translate them into code. The OpenAI GPT models are proficient in over a dozen languages, such as C#, JavaScript, Perl, and PHP, and are most capable in Python.&lt;/p&gt;

&lt;p&gt;GPT models have been trained on both natural language and billions of lines of code from public repositories. The models are able to generate code from natural language instructions such as code comments, and can suggest ways to complete code functions.&lt;/p&gt;

&lt;p&gt;For example, given the prompt "Write a for loop counting from 1 to 10 in Python," the following answer is provided:&lt;/p&gt;

&lt;p&gt;Python&lt;/p&gt;


&lt;p&gt;for i in range(1,11):&lt;br&gt;
    print(i)&lt;/p&gt;

&lt;p&gt;GPT models can help developers code faster, understand new coding languages, and focus on solving bigger problems in their applications. Developers can break down their goal into simpler tasks and use GPT to help build those tasks out using known patterns.&lt;/p&gt;

&lt;p&gt;Examples of code generation&lt;br&gt;
Part of the training data for GPT-3 included programming languages, so it's no surprise that GPT models can answer programming questions if asked. What's unique about the Codex model family is that it's more capable across more languages than GPT models.&lt;/p&gt;

&lt;p&gt;Code generation goes beyond just writing code from natural language prompts. Given the following code, it can generate unit tests:&lt;/p&gt;

&lt;p&gt;Python&lt;/p&gt;

&lt;p&gt;# Python 3&lt;br&gt;
def mult_numbers(a, b):&lt;br&gt;
  return a * b&lt;/p&gt;

&lt;p&gt;# Unit test&lt;br&gt;
def&lt;/p&gt;

&lt;p&gt;GPT builds out unit tests for our function:&lt;/p&gt;

&lt;p&gt;Python&lt;/p&gt;

&lt;p&gt;# Python 3&lt;br&gt;
def mult_numbers(a, b):&lt;br&gt;
  return a * b&lt;/p&gt;

&lt;p&gt;# Unit test&lt;br&gt;
def test_mult_numbers():&lt;br&gt;
  assert mult_numbers(3, 4) == 12&lt;br&gt;
  assert mult_numbers(0, 10) == 0&lt;br&gt;
  assert mult_numbers(4, 0) == 0&lt;/p&gt;

&lt;p&gt;# Unit test&lt;br&gt;
def test_mult_numbers_negative():&lt;br&gt;
  assert mult_numbers(-1, 10) == -10&lt;br&gt;
  assert mult_numbers(10, -1) == -10&lt;/p&gt;

&lt;p&gt;GPT can also summarize functions that are already written, explain SQL queries or tables, and convert a function from one programming language into another.&lt;/p&gt;

&lt;p&gt;When interacting with GPT models, you can specify libraries or language-specific tags to make it clear to Codex what you want. For example, you can provide this prompt formatted as an HTML comment: &amp;lt;!-- build a page titled "Let's Learn about AI" --&amp;gt;, and get this as a result:&lt;/p&gt;

&lt;p&gt;HTML&lt;/p&gt;



&lt;p&gt;&amp;lt;html&amp;gt;&lt;br&gt;
&amp;lt;head&amp;gt;&lt;br&gt;
  &amp;lt;title&amp;gt;Let's Learn about AI&amp;lt;/title&amp;gt;&lt;br&gt;
&amp;lt;/head&amp;gt;&lt;br&gt;
&amp;lt;body&amp;gt;&lt;br&gt;
  &amp;lt;h1&amp;gt;Let's Learn about AI&amp;lt;/h1&amp;gt;&lt;br&gt;
  &amp;lt;ul&amp;gt;&lt;br&gt;
    &amp;lt;li&amp;gt;&amp;lt;a href="index.php"&amp;gt;Home&amp;lt;/a&amp;gt;&amp;lt;/li&amp;gt;&lt;br&gt;
    &amp;lt;li&amp;gt;&amp;lt;a href="about.php"&amp;gt;About&amp;lt;/a&amp;gt;&amp;lt;/li&amp;gt;&lt;br&gt;
    &amp;lt;li&amp;gt;&amp;lt;a href="resources.php"&amp;gt;Resources&amp;lt;/a&amp;gt;&amp;lt;/li&amp;gt;&lt;br&gt;
    &amp;lt;li&amp;gt;&amp;lt;a href="contact.php"&amp;gt;Contact&amp;lt;/a&amp;gt;&amp;lt;/li&amp;gt;&lt;br&gt;
  &amp;lt;/ul&amp;gt;&lt;br&gt;
  &amp;lt;h2&amp;gt;Contact&amp;lt;/h2&amp;gt;&lt;br&gt;
  &amp;lt;form&amp;gt;&lt;br&gt;
    &amp;lt;p&amp;gt;Name:&amp;lt;/p&amp;gt;&lt;br&gt;
    &amp;lt;p&amp;gt;Email:&amp;lt;/p&amp;gt;&lt;br&gt;
    &amp;lt;p&amp;gt;Subject:&amp;lt;/p&amp;gt;&lt;br&gt;
    &amp;lt;p&amp;gt;Message:&amp;lt;/p&amp;gt;&lt;br&gt;
  &amp;lt;/form&amp;gt;&lt;br&gt;
&amp;lt;/body&amp;gt;&lt;br&gt;
&amp;lt;/html&amp;gt;&lt;/p&gt;
&lt;p&gt;GitHub Copilot&lt;br&gt;
OpenAI partnered with GitHub to create GitHub Copilot, which they call an AI pair programmer. GitHub Copilot integrates the power of OpenAI Codex into a plugin for developer environments like Visual Studio Code.&lt;/p&gt;

&lt;p&gt;Once the plugin is installed and enabled, you can start writing your code, and GitHub Copilot starts automatically suggesting the remainder of the function based on code comments or the function name. For example, we have only a function name in the file, and the gray text is automatically suggested to complete it.&lt;/p&gt;


&lt;p&gt;GitHub Copilot offers multiple suggestions for code completion, which you can tab through using keyboard shortcuts. When given informative code comments, it can even suggest a function name along with the complete function code.&lt;/p&gt;

&lt;p&gt;Understand OpenAI's image generation capabilities&lt;br&gt;
Image generation models can take a prompt, a base image, or both, and create something new. These generative AI models can create both realistic and artistic images, change the layout or style of an image, and create variations on a provided image.&lt;/p&gt;

&lt;p&gt;DALL-E&lt;br&gt;
In addition to natural language capabilities, generative AI models can edit and create images. The model that works with images is called DALL-E. Much like GPT models, subsequent versions of DALL-E are appended onto the name, such as DALL-E 2. Image capabilities generally fall into the three categories of image creation, editing an image, and creating variations of an image.&lt;/p&gt;

&lt;p&gt;Image generation&lt;br&gt;
Original images can be generated by providing a text prompt of what you would like the image to be of. The more detailed the prompt, the more likely the model will provide a desired result.&lt;/p&gt;

&lt;p&gt;With DALL-E, you can even request an image in a particular style, such as "a dog in the style of Vincent van Gogh". Styles can be used for edits and variations as well.&lt;/p&gt;

&lt;p&gt;For example, given the prompt "an elephant standing with a burger on top, style digital art", the model generates digital art images depicting exactly what is asked for.&lt;/p&gt;

&lt;p&gt;When asked for something more generic like "a pink fox", the images generated are more varied and simpler while still fulfilling what is asked for.&lt;/p&gt;

&lt;p&gt;However, when we make the prompt more specific, such as "a pink fox running through a field, in the style of Monet", the model creates much more consistent, detailed images.&lt;/p&gt;

&lt;p&gt;Editing an image&lt;br&gt;
When provided an image, DALL-E can edit the image as requested by changing its style, adding or removing items, or generating new content to add. Edits are made by uploading the original image and specifying a transparent mask that indicates what area of the image to edit. Along with the image and mask, a prompt indicating what is to be edited instructs the model to then generate the appropriate content to fill the area.&lt;/p&gt;

&lt;p&gt;When given one of the above images of a pink fox, a mask covering the fox, and the prompt of "blue gorilla reading a book in a field", the model creates edits of the image based on the provided input.&lt;/p&gt;

&lt;p&gt;Image variations&lt;br&gt;
Image variations can be created by providing an image and specifying how many variations of the image you would like. The general content of the image stays the same, but aspects such as where subjects are located or looking, the background scene, and the colors may change.&lt;/p&gt;

&lt;p&gt;For example, if I upload one of the images of the elephant wearing a burger as a hat, I get variations of the same subject.&lt;/p&gt;

&lt;p&gt;Describe Azure OpenAI's access and responsible AI policies&lt;br&gt;
It's important to consider the ethical implications of working with AI systems. Azure OpenAI provides powerful natural language models capable of completing various tasks and operating in several different use cases, each with their own considerations for safe and fair use. Teams or individuals tasked with developing and deploying AI systems should work to identify, measure, and mitigate harm.&lt;/p&gt;

&lt;p&gt;Usage of Azure OpenAI should follow the six Microsoft AI principles:&lt;/p&gt;

&lt;p&gt;Fairness: AI systems shouldn't make decisions that discriminate against or support bias of a group or individual.&lt;br&gt;
Reliability and Safety: AI systems should respond safely to new situations and potential manipulation.&lt;br&gt;
Privacy and Security: AI systems should be secure and respect data privacy.&lt;br&gt;
Inclusiveness: AI systems should empower everyone and engage people.&lt;br&gt;
Accountability: People must be accountable for how AI systems operate.&lt;br&gt;
Transparency: AI systems should have explanations so users can understand how they're built and used.&lt;/p&gt;

&lt;p&gt;Responsible AI principles guide Microsoft's Transparency Notes on Azure OpenAI, as well as explanations of other products. Transparency Notes are intended to help you understand how Microsoft's AI technology works, the choices system owners can make that influence system performance and behavior, and the importance of thinking about the whole system, including the technology, the people, and the environment.&lt;/p&gt;

&lt;p&gt;If you haven't completed the Get started with AI on Azure module, it's worth reviewing its unit on responsible AI.&lt;/p&gt;

&lt;p&gt;Limited access to Azure OpenAI&lt;br&gt;
As part of Microsoft's commitment to using AI responsibly, access to Azure OpenAI is currently limited. Customers who wish to use Azure OpenAI must submit a registration form for initial experimentation access, and again for approval for use in production.&lt;/p&gt;

&lt;p&gt;Additional registration is required for customers who want to modify content filters or modify abuse monitoring settings.&lt;/p&gt;

&lt;p&gt;To apply for access and learn more about the limited access policy, see the Azure OpenAI limited access documentation.&lt;/p&gt;

&lt;p&gt;Summary&lt;br&gt;
This module introduced you to the concept of generative AI and how Azure OpenAI Service provides access to generative AI models.&lt;/p&gt;

&lt;p&gt;In this module, you also learned how to:&lt;/p&gt;

&lt;p&gt;Describe Azure OpenAI workloads and how to access the Azure OpenAI Service&lt;br&gt;
Understand generative AI models&lt;br&gt;
Understand Azure OpenAI's language, code, and image capabilities&lt;br&gt;
Understand Azure OpenAI's Responsible AI practices and Limited Access Policy&lt;/p&gt;

&lt;p&gt;To continue learning about Azure OpenAI and find resources for implementation, you can check out the documentation on Azure OpenAI and the Develop AI solutions with Azure OpenAI Learning Path.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>azureai</category>
      <category>microsoft</category>
      <category>microsoftai</category>
    </item>
    <item>
      <title>AWS - Professional Services</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Sun, 02 Jul 2023 07:00:19 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/aws-professional-services-151o</link>
      <guid>https://dev.to/abhishe89636035/aws-professional-services-151o</guid>
      <description>&lt;p&gt;What Is AWS?&lt;/p&gt;

&lt;p&gt;Cloud computing is the on-demand delivery of IT resources with primarily pay-as-you-go pricing.&lt;/p&gt;

&lt;p&gt;Cloud computing deployment models&lt;/p&gt;

&lt;p&gt;Cloud computing provides developers and IT departments with the ability to focus on what matters most by avoiding work like procurement, maintenance, and capacity planning. As cloud computing has grown in popularity, several deployment strategies have emerged to help meet specific needs of different users. Each type of deployment method provides you with different levels of control, flexibility, and management. Understanding the differences between these deployment strategies can help you decide what set of services is right for your needs. &lt;/p&gt;

&lt;p&gt;On-premises&lt;/p&gt;

&lt;p&gt;Before the cloud, companies and organizations hosted and maintained hardware such as compute, storage, and networking equipment in their own data centers. They often allocated entire infrastructure departments to take care of their data centers, which resulted in costly operations that made some workloads and experimentation impossible. &lt;/p&gt;

&lt;p&gt;As internet use became more widespread, the demand for compute, storage, and networking equipment increased. For some companies and organizations, the cost of maintaining a large physical presence was unsustainable. To solve this problem, cloud computing emerged.&lt;/p&gt;

&lt;p&gt;Cloud&lt;/p&gt;

&lt;p&gt;Cloud computing is the on-demand delivery of IT resources over the internet with primarily pay-as-you-go pricing. With cloud computing, companies do not have to manage and maintain their own hardware and data centers. Instead, companies like Amazon Web Services (AWS) own and maintain data centers and provide virtual data center technologies and services to companies and users over the internet.&lt;/p&gt;

&lt;p&gt;Hybrid&lt;/p&gt;

&lt;p&gt;A third option is a hybrid deployment. This type of deployment is a way to connect infrastructure and applications between cloud-based resources and existing resources that are not located in the cloud. The most common method of hybrid deployment between the cloud and existing on-premises infrastructure connects cloud resources to internal systems to extend and grow an organization's infrastructure into the cloud.&lt;/p&gt;

&lt;p&gt;To help differentiate between running workloads on premises and in the cloud, consider a scenario in which developers must deploy a new application feature. Before they deploy, the team wants to test the feature in a separate quality assurance (QA) environment that has the same configurations as production. In an on-premises solution, an additional environment requires you to buy and install hardware, connect the necessary cabling, provision power, install operating systems, and more. These tasks can be time consuming and expensive. Meanwhile, the new feature’s time-to-market increases while the developers wait for the QA environment. &lt;/p&gt;

&lt;p&gt;In contrast, by running your application in the cloud, you can replicate an entire production environment in a matter of minutes or even seconds. Instead of physically installing hardware and connecting cabling, the solution is managed over the internet.&lt;/p&gt;

&lt;p&gt;Using cloud computing saves time during setup and removes redundant and unnecessary tasks. If you look at any application, you will see that some of its aspects are very important to your business, like the code. However, other aspects are no different than any other application that you might make, for example, the compute that the code runs on. Some repetitive common tasks don’t differentiate your business, like installing virtual machines (VMs) or storing backups. By removing these tasks, you can focus on what is strategically unique to your business and let AWS handle the time-consuming tasks that don’t separate you from your competitors. We refer to this as removing "undifferentiated heavy lifting", and that's where AWS fits into all of this.&lt;/p&gt;

&lt;p&gt;AWS provides cloud computing services. The IT resources mentioned in the cloud computing definition are AWS services. For this course's corporate directory application, you will use AWS services to architect a scalable, highly available, and cost-effective infrastructure to host the corporate directory application. That way, you can get the application out into the world quickly, without managing heavy-duty physical hardware.&lt;/p&gt;

&lt;p&gt;Six advantages of cloud computing&lt;/p&gt;

&lt;p&gt;The six advantages are:&lt;/p&gt;

&lt;p&gt;Pay as you go&lt;/p&gt;

&lt;p&gt;Benefit from massive economies of scale&lt;/p&gt;

&lt;p&gt;Stop guessing capacity&lt;/p&gt;

&lt;p&gt;Increase speed and agility&lt;/p&gt;

&lt;p&gt;Realize cost savings&lt;/p&gt;

&lt;p&gt;Go global in minutes&lt;/p&gt;

&lt;p&gt;AWS Global Infrastructure&lt;/p&gt;

&lt;p&gt;Infrastructure exists as the foundation of every cloud application.&lt;/p&gt;


&lt;p&gt;Infrastructure, like data centers and networking connectivity, still exists as the foundation of every cloud application. In AWS, this physical infrastructure makes up the AWS Global Infrastructure, in the form of Regions and Availability Zones.&lt;/p&gt;

&lt;p&gt;Regions&lt;/p&gt;

&lt;p&gt;Regions are geographic locations worldwide where AWS hosts its data centers. AWS Regions are named after the location where they reside. For example, in the United States, the Region in Northern Virginia is called the Northern Virginia Region, and the Region in Oregon is called the Oregon Region. AWS has Regions in Asia Pacific, China, Europe, the Middle East, North America, and South America. And we continue to expand to meet our customers' needs.&lt;/p&gt;

&lt;p&gt;Each AWS Region is associated with a geographical name and a Region code.&lt;/p&gt;

&lt;p&gt;Here are examples of Region codes:&lt;/p&gt;

&lt;p&gt;us-east-1 is the first Region created in the eastern US area. The geographical name for this Region is N. Virginia.&lt;/p&gt;

&lt;p&gt;ap-northeast-1 is the first Region created in the northeast Asia Pacific area. The geographical name for this Region is Tokyo.&lt;/p&gt;

&lt;p&gt;Choosing the right AWS Region&lt;/p&gt;

&lt;p&gt;AWS Regions are independent from one another. Without explicit customer consent and authorization, data is not replicated from one Region to another. When you decide which AWS Region to host your applications and workloads, consider four main aspects: latency, price, service availability, and compliance.&lt;/p&gt;


&lt;p&gt;If your application is sensitive to latency (the delay between a request for data and the response), choose a Region that is close to your user base. This helps prevent long wait times for your customers. Synchronous applications such as gaming, telephony, WebSockets, and Internet of Things (IoT) are significantly affected by high latency. Asynchronous workloads, such as ecommerce applications, can also suffer from user connectivity delays.&lt;/p&gt;

&lt;p&gt;Availability Zones&lt;/p&gt;

&lt;p&gt;Inside every Region is a cluster of Availability Zones. An Availability Zone consists of one or more data centers with redundant power, networking, and connectivity. These data centers operate in discrete facilities in undisclosed locations. They are connected using redundant high-speed and low-latency links.&lt;/p&gt;

&lt;p&gt;Availability Zones also have code names. Because they are located inside Regions, they can be addressed by appending a letter to the end of the Region code name. Here are examples of Availability Zone codes:&lt;/p&gt;

&lt;p&gt;us-east-1a is an Availability Zone in us-east-1 (N. Virginia Region).&lt;/p&gt;

&lt;p&gt;sa-east-1b is an Availability Zone in sa-east-1 (São Paulo Region).&lt;/p&gt;

&lt;p&gt;Therefore, if you see that a resource exists in us-east-1c, you can infer that the resource is located in Availability Zone c of the us-east-1 Region.&lt;/p&gt;
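&lt;p&gt;That naming convention is mechanical enough to parse in a script. A small sketch (the helper name here is made up for illustration):&lt;/p&gt;

```python
def split_az_code(az_code):
    # An Availability Zone code is its Region code plus a trailing letter,
    # e.g. "us-east-1c" is zone "c" of Region "us-east-1".
    return az_code[:-1], az_code[-1]

region, zone = split_az_code("us-east-1c")
print(region, zone)
```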

&lt;p&gt;Scope of AWS services&lt;/p&gt;

&lt;p&gt;Depending on the AWS service that you use, your resources are either deployed at the Availability Zone, Region, or Global level. Each service is different, so you must understand how the scope of a service might affect your application architecture.&lt;/p&gt;

&lt;p&gt;When you operate a Region-scoped service, you only need to select the Region that you want to use. If you are not asked to specify an individual Availability Zone to deploy the service in, this is an indicator that the service operates on a Region-scope level. For Region-scoped services, AWS automatically performs actions to increase data durability and availability.&lt;/p&gt;

&lt;p&gt;On the other hand, some services ask you to specify an Availability Zone. With these services, you are often responsible for increasing the data durability and high availability of these resources.&lt;/p&gt;

&lt;p&gt;Maintaining resiliency&lt;/p&gt;

&lt;p&gt;To keep your application available, you must maintain high availability and resiliency. A well-known best practice for cloud architecture is to use Region-scoped, managed services. These services come with availability and resiliency built in. When that is not possible, make sure your workload is replicated across multiple Availability Zones. At a minimum, you should use two Availability Zones. That way, if an Availability Zone fails, your application will have infrastructure up and running in a second Availability Zone to take over the traffic.&lt;/p&gt;

&lt;p&gt;Edge locations&lt;/p&gt;

&lt;p&gt;Edge locations are global locations where content is cached. For example, if your media content is in London and you want to share video files with your customers in Sydney, you could have the videos cached in an edge location closest to Sydney. This would make it possible for your customers to access the cached videos more quickly than accessing them from London. Currently, there are more than 400 edge locations globally.&lt;/p&gt;

&lt;p&gt;Amazon CloudFront delivers your content through a worldwide network of edge locations. When a user requests content that is being served with CloudFront, the request is routed to the location that provides the lowest latency, so that content is delivered with the best possible performance. CloudFront speeds up the distribution of your content by routing each user request through the AWS backbone network to the edge location that can best serve your content.&lt;/p&gt;

&lt;p&gt;Interacting with AWS&lt;/p&gt;

&lt;p&gt;Every action that you make in AWS is an API call that is authenticated and authorized.&lt;/p&gt;


&lt;p&gt;Every action that you make in AWS is an API call that is authenticated and authorized. In AWS, you can make API calls to services and resources through the AWS Management Console, AWS Command Line Interface (AWS CLI), or AWS SDKs.&lt;/p&gt;

&lt;p&gt;AWS Management Console&lt;/p&gt;

&lt;p&gt;One way to manage cloud resources is through the web-based console, where you log in and choose the desired service. This can be the easiest way to create and manage resources when you first begin working with the cloud. The following is a screenshot that shows the landing page when you first log in to the console. &lt;/p&gt;

&lt;p&gt;In the upper-left corner, you can choose Services to view AWS services grouped by categories, such as Compute, Storage, Database, and Analytics.&lt;/p&gt;

&lt;p&gt;In the upper-right corner is the Region selector. If you choose it and change the Region, you will make requests to the services in the chosen Region. The URL changes, too. Changing the Region setting directs your browser to make requests to a different AWS Region, represented by a different subdomain.&lt;/p&gt;

&lt;p&gt;AWS CLI&lt;/p&gt;

&lt;p&gt;Consider the scenario where you run many servers on AWS for your application’s frontend. You want to run a report to collect data from all the servers. You need to do this programmatically every day because the server details might change. Instead of manually logging in to the console and then copying and pasting information, you can schedule an AWS CLI script with an API call to pull this data for you.&lt;/p&gt;

&lt;p&gt;The AWS CLI is a unified tool that you can use to manage AWS services. You can download and configure one tool that you can use to control multiple AWS services from the command line, and automate them with scripts. The AWS CLI is open source, and installers are available for Windows, Linux, and macOS.&lt;/p&gt;

&lt;p&gt;For example, you run the following API call against a service, using the AWS CLI:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;aws s3api list-buckets&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;You will get a response similar to the following one, listing the buckets in your AWS account:&lt;/p&gt;
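&lt;p&gt;For illustration, a trimmed response might look like the following. The bucket names, dates, and owner fields are placeholders, not real account data:&lt;/p&gt;

```json
{
  "Buckets": [
    { "Name": "example-app-logs", "CreationDate": "2024-01-15T10:00:00+00:00" },
    { "Name": "example-photo-uploads", "CreationDate": "2024-03-02T08:30:00+00:00" }
  ],
  "Owner": { "DisplayName": "example-owner", "ID": "EXAMPLEID" }
}
```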

&lt;p&gt;AWS SDKs&lt;/p&gt;

&lt;p&gt;API calls to AWS can also be performed by running code with programming languages. You can do this by using AWS SDKs. SDKs are open source and maintained by AWS for the most popular programming languages, such as C++, Go, Java, JavaScript, .NET, Node.js, PHP, Python, Ruby, Rust, and Swift.&lt;/p&gt;

&lt;p&gt;Developers commonly use AWS SDKs to integrate their application source code with AWS services. For example, consider an application with a frontend that runs in Python. Every time the application receives a photo, it uploads the file to a storage service. This action can be achieved in the source code by using the AWS SDK for Python (Boto3). Here is an example of code that you can implement to work with AWS resources using the SDK for Python.&lt;/p&gt;
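&lt;p&gt;As a minimal sketch of that photo-upload step: the &lt;code&gt;store_photo&lt;/code&gt; function, the &lt;code&gt;photos/&lt;/code&gt; key prefix, and the bucket name are hypothetical, but &lt;code&gt;upload_file&lt;/code&gt; is the Boto3 S3 client method for uploading a local file:&lt;/p&gt;

```python
def store_photo(s3_client, bucket: str, filename: str) -> str:
    """Upload a local photo to S3 and return the object key (illustrative sketch)."""
    key = f"photos/{filename}"  # hypothetical key layout for this example
    # Boto3's S3 client exposes upload_file(Filename, Bucket, Key)
    s3_client.upload_file(filename, bucket, key)
    return key

# In a real application you would pass a real client:
#   import boto3
#   store_photo(boto3.client("s3"), "my-photo-bucket", "cat.jpg")
```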

&lt;p&gt;Security and the AWS Shared Responsibility Model&lt;/p&gt;

&lt;p&gt;Security and compliance are a shared responsibility between AWS and you.&lt;/p&gt;

&lt;p&gt;When you work with the AWS Cloud, managing security and compliance is a shared responsibility between AWS and you. To depict this shared responsibility, AWS created the shared responsibility model. The distinction of responsibility is commonly referred to as security of the cloud as compared to security in the cloud. &lt;/p&gt;

&lt;p&gt;AWS responsibility&lt;/p&gt;

&lt;p&gt;AWS is responsible for security of the cloud. This means that AWS protects and secures the infrastructure that runs the services offered in the AWS Cloud. AWS is responsible for the following:&lt;/p&gt;

&lt;p&gt;Protecting and securing AWS Regions, Availability Zones, and data centers, down to the physical security of the buildings&lt;/p&gt;

&lt;p&gt;Managing the hardware, software, and networking components that run AWS services, such as the physical servers, host operating systems, virtualization layers, and AWS networking components&lt;/p&gt;

&lt;p&gt;The level of responsibility that AWS has depends on the service. AWS classifies services into two categories. The following table provides information about each, including the AWS responsibility.&lt;/p&gt;

&lt;p&gt;Customer responsibility&lt;/p&gt;

&lt;p&gt;Customers are responsible for security in the cloud. When using any AWS service, the customer is responsible for properly configuring the service and their applications, in addition to ensuring that their data is secure.&lt;/p&gt;

&lt;p&gt;The customers' level of responsibility depends on the AWS service. Some services require the customer to perform all the necessary security configuration and management tasks. Other more abstracted services require customers to only manage the data and control access to their resources. Using the two categories of AWS services, customers can determine their level of responsibility for each AWS service that they use.&lt;/p&gt;

&lt;p&gt;Due to the varying levels of effort, customers must consider which AWS services they use and review the level of responsibility required to secure each service. They must also review how the AWS shared responsibility model aligns with the security standards in their IT environment in addition to any applicable laws and regulations.&lt;/p&gt;

&lt;p&gt;A key concept is that customers maintain complete control of their data and are responsible for managing the security related to their content. For example, you are responsible for the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choosing a Region for AWS resources in accordance with data sovereignty regulations&lt;/li&gt;
&lt;li&gt;Implementing data-protection mechanisms, such as encryption and scheduled backups&lt;/li&gt;
&lt;li&gt;Using access control to limit who can access your data and AWS resources&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Protecting the AWS Root User&lt;/p&gt;

&lt;p&gt;When you first access AWS, you begin with a single sign-in identity known as the root user.&lt;/p&gt;

&lt;p&gt;To watch the video on protecting the AWS root user, choose the play button.&lt;/p&gt;

&lt;p&gt;AWS root user&lt;/p&gt;

&lt;p&gt;When you first create an AWS account, you begin with a single sign-in identity that has complete access to all AWS services and resources in the account. This identity is called the AWS root user and is accessed by signing in with the email address and password that were used to create the account. &lt;/p&gt;

&lt;p&gt;AWS root user credentials&lt;/p&gt;

&lt;p&gt;The AWS root user has two sets of credentials associated with it. One set of credentials is the email address and password that were used to create the account. This allows you to access the AWS Management Console. The second set of credentials is called access keys, which allow you to make programmatic requests from the AWS Command Line Interface (AWS CLI) or AWS API.&lt;/p&gt;

&lt;p&gt;Access keys consist of two parts:&lt;/p&gt;

&lt;p&gt;Access key ID: for example, A2lAl5EXAMPLE&lt;/p&gt;

&lt;p&gt;Secret access key: for example, wJalrFE/KbEKxE&lt;/p&gt;

&lt;p&gt;Similar to a user name and password combination, you need both the access key ID and secret access key to authenticate your requests through the AWS CLI or AWS API. Access keys should be managed with the same security as an email address and password.&lt;/p&gt;
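&lt;p&gt;For example, the AWS CLI typically reads both parts of an access key from a shared credentials file. The values below are the placeholder keys from above, and &lt;code&gt;[default]&lt;/code&gt; is the CLI's standard profile name:&lt;/p&gt;

```ini
# ~/.aws/credentials — where the AWS CLI looks up access keys
[default]
aws_access_key_id = A2lAl5EXAMPLE
aws_secret_access_key = wJalrFE/KbEKxE
```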

&lt;p&gt;Delete your access keys to stay safe!&lt;/p&gt;

&lt;p&gt;If you don't have an access key for your AWS account root user, don't create one unless you absolutely need to. If you have an access key for your AWS account root user and want to delete the key, follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;In the AWS Management Console, navigate to your user name in the upper-right section of the navigation bar. From the dropdown menu, go to the My Security Credentials page, and sign in with the root user’s email address and password.&lt;/li&gt;
&lt;li&gt;Open the Access keys section.&lt;/li&gt;
&lt;li&gt;Under Actions, choose Delete.&lt;/li&gt;
&lt;li&gt;Choose Yes.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;AWS root user best practices&lt;/p&gt;

&lt;p&gt;The root user has complete access to all AWS services and resources in your account, including your billing and personal information. Therefore, you should securely lock away the credentials associated with the root user and not use the root user for everyday tasks. Visit the links at the end of this lesson to learn more about when to use the AWS root user.&lt;/p&gt;

&lt;p&gt;To ensure the safety of the root user, follow these best practices:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose a strong password for the root user.&lt;/li&gt;
&lt;li&gt;Enable multi-factor authentication (MFA) for the root user.&lt;/li&gt;
&lt;li&gt;Never share your root user password or access keys with anyone.&lt;/li&gt;
&lt;li&gt;Disable or delete the access keys associated with the root user.&lt;/li&gt;
&lt;li&gt;Create an Identity and Access Management (IAM) user for administrative tasks or everyday tasks.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Multi-factor authentication&lt;/p&gt;

&lt;p&gt;When you create an AWS account and first log in to the account, you use single-factor authentication. Single-factor authentication is the simplest and most common form of authentication. It only requires one authentication method. In this case, you use a user name and password to authenticate as the AWS root user. Other forms of single-factor authentication include a security pin or a security token.&lt;/p&gt;

&lt;p&gt;However, sometimes a user’s password is easy to guess. For example, your coworker Bob’s password, IloveCats222, might be easy for someone who knows Bob personally to guess, because it’s a combination of information that is easy to remember and includes certain facts about Bob (Bob loves cats, and his birthday is February 22). If a bad actor guessed or cracked Bob’s password through social engineering, bots, or scripts, Bob might lose control of his account. Unfortunately, this is a common scenario that users of any website often face. This is why using multi-factor authentication (MFA) is important in preventing unwanted account access.&lt;/p&gt;

&lt;p&gt;MFA requires two or more authentication methods to verify an identity. MFA draws from three categories of information:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Something you know, such as a user name and password or PIN&lt;/li&gt;
&lt;li&gt;Something you have, such as a one-time passcode from a hardware device or mobile app&lt;/li&gt;
&lt;li&gt;Something you are, such as a fingerprint or face scanning technology&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With a combination of this information, systems can provide a layered approach to account access. So even if the first method of authentication, like Bob’s password, is cracked by a malicious actor, the second method of authentication, such as a fingerprint, provides another level of security. This extra layer of security can help protect your most important accounts, which is why you should activate MFA on your AWS root user.&lt;/p&gt;

&lt;p&gt;MFA on AWS&lt;/p&gt;

&lt;p&gt;If you activate MFA on your root user, you must present a piece of identifying information from both the something you know category and the something you have category. The first piece of identifying information the user enters is an email and password combination. The second piece of information is a temporary numeric code provided by an MFA device.&lt;/p&gt;

&lt;p&gt;Using MFA adds an additional layer of security because it requires users to use a supported MFA mechanism in addition to their regular sign-in credentials. Activating MFA on the AWS root user account is an AWS best practice.&lt;/p&gt;

&lt;p&gt;Supported MFA devices&lt;/p&gt;

&lt;p&gt;AWS supports a variety of MFA mechanisms, such as virtual MFA devices, hardware time-based one-time password (TOTP) tokens, and FIDO security keys. To learn more, take a look at the table below. For instructions on how to set up each method, see the Resources section.&lt;/p&gt;

&lt;table&gt;
&lt;tr&gt;&lt;th&gt;Device&lt;/th&gt;&lt;th&gt;Description&lt;/th&gt;&lt;th&gt;Supported devices&lt;/th&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Virtual MFA&lt;/td&gt;&lt;td&gt;A software app that runs on a phone or other device and provides a one-time passcode. These applications can run on unsecured mobile devices, and because of that, they might not provide the same level of security as hardware or FIDO security keys.&lt;/td&gt;&lt;td&gt;Twilio Authy Authenticator, Duo Mobile, LastPass Authenticator, Microsoft Authenticator, Google Authenticator, Symantec VIP&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Hardware TOTP token&lt;/td&gt;&lt;td&gt;A hardware device, generally a key fob or display card, that generates a one-time, six-digit numeric code based on the time-based one-time password (TOTP) algorithm.&lt;/td&gt;&lt;td&gt;Key fob, display card&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;FIDO security key&lt;/td&gt;&lt;td&gt;FIDO-certified hardware security keys are provided by third-party providers such as Yubico. You can plug your FIDO security key into a USB port on your computer and enable it.&lt;/td&gt;&lt;td&gt;FIDO Certified products&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;
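&lt;p&gt;The six-digit codes that virtual MFA apps and hardware TOTP tokens display come from the TOTP algorithm standardized in RFC 6238. As an illustration only (this is not AWS code), a minimal implementation looks like this:&lt;/p&gt;

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at_time=None, digits=6, step=30):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA-1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of time steps since the Unix epoch
    counter = int((time.time() if at_time is None else at_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset from the last nibble
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

&lt;p&gt;Running this against the RFC 6238 test secret at T=59 seconds reproduces the published 8-digit vector 94287082, which is a quick way to check an implementation.&lt;/p&gt;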

&lt;p&gt;AWS Identity and Access Management&lt;/p&gt;

&lt;p&gt;Authentication answers the question, "Are you who you say you are?" Authorization answers the question, "What actions can you perform?"&lt;/p&gt;

&lt;p&gt;To watch the video on identity and access management, choose the play button.&lt;/p&gt;

&lt;p&gt;Authentication and authorization&lt;/p&gt;

&lt;p&gt;When you configure access to any account, two terms come up frequently: authentication and authorization. Although these terms might seem basic, you must fully understand them to properly configure access management on AWS.&lt;/p&gt;

&lt;p&gt;Authentication&lt;/p&gt;

&lt;p&gt;When you create your AWS account, you use the combination of an email address and a password to verify your identity. If a user types in the correct email address and password, the system assumes the user is allowed to enter and grants them access. This is the process of authentication.&lt;/p&gt;

&lt;p&gt;Authentication ensures that the user is who they say they are. User names and passwords are the most common types of authentication. But you might also work with other forms, such as token-based authentication or biometric data, like a fingerprint. Authentication simply answers the question, “Are you who you say you are?”&lt;/p&gt;

&lt;p&gt;Authorization&lt;/p&gt;

&lt;p&gt;After you’re authenticated and in your AWS account, you might be curious about what actions you can take. This is where authorization comes in. Authorization is the process of giving users permission to access AWS resources and services. Authorization determines whether a user can perform certain actions, such as read, edit, delete, or create resources. Authorization answers the question, “What actions can you perform?” &lt;/p&gt;

&lt;p&gt;What is IAM?&lt;/p&gt;

&lt;p&gt;AWS Identity and Access Management (IAM) is an AWS service that helps you manage access to your AWS account and resources. It also provides a centralized view of who and what are allowed inside your AWS account (authentication), and who and what have permissions to use and work with your AWS resources (authorization).&lt;/p&gt;

&lt;p&gt;With IAM, you can share access to an AWS account and resources without sharing your set of access keys or password. You can also provide granular access to those working in your account, so people and services only have permissions to the resources that they need. For example, to provide a user of your AWS account with read-only access to a particular AWS service, you can granularly select which actions and which resources in that service that they can access.&lt;/p&gt;

&lt;p&gt;IAM features&lt;/p&gt;

&lt;p&gt;To help control access and manage identities in your AWS account, IAM offers many features to ensure security.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Global&lt;/li&gt;
&lt;li&gt;Integrated with AWS services&lt;/li&gt;
&lt;li&gt;Shared access&lt;/li&gt;
&lt;li&gt;Multi-factor authentication&lt;/li&gt;
&lt;li&gt;Identity federation&lt;/li&gt;
&lt;li&gt;Free to use&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;IAM user&lt;/p&gt;

&lt;p&gt;An IAM user represents a person or service that interacts with AWS. You define the user in your AWS account. Any activity done by that user is billed to your account. When you create a user, that user can sign in to gain access to the AWS resources inside your account.&lt;/p&gt;

&lt;p&gt;You can also add more users to your account as needed. For example, for your cat photo application, you could create individual users in your AWS account that correspond to the people who are working on your application. Each person should have their own login credentials to prevent sharing credentials between users.&lt;/p&gt;

&lt;p&gt;IAM user credentials&lt;/p&gt;

&lt;p&gt;An IAM user consists of a name and a set of credentials. When you create a user, you can provide them with the following types of access:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access to the AWS Management Console&lt;/li&gt;
&lt;li&gt;Programmatic access to the AWS CLI and AWS API&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To access the console, provide the user with a user name and password. For programmatic access, AWS generates a set of access keys that can be used with the AWS CLI and AWS API. IAM user credentials are considered permanent, which means that they stay with the user until there’s a forced rotation by admins.&lt;/p&gt;

&lt;p&gt;When you create an IAM user, you can grant permissions directly at the user level. This can seem like a good idea if you have only one or a few users. However, as the number of users increases, keeping up with permissions can become more complicated. For example, if you have 3,000 users in your AWS account, administering access and getting a top-level view of who can perform what actions on which resources can be challenging.&lt;/p&gt;

&lt;p&gt;Fortunately, you can group IAM users and attach permissions at the group level.&lt;/p&gt;

&lt;p&gt;IAM groups&lt;/p&gt;

&lt;p&gt;An IAM group is a collection of users. All users in the group inherit the permissions assigned to the group. This makes it possible to give permissions to multiple users at once. It’s a more convenient and scalable way of managing permissions for users in your AWS account. This is why using IAM groups is a best practice.&lt;/p&gt;

&lt;p&gt;If you have an application that you’re trying to build and you have multiple users in one account working on the application, you might organize the users by job function. For example, you might organize your IAM groups by developers, security, and admins. You could then place all your IAM users into their respective groups.&lt;/p&gt;

&lt;p&gt;This provides a way to see who has what permissions in your organization. It also helps you scale when new people join, leave, and change roles in your organization.&lt;/p&gt;

&lt;p&gt;Consider the following examples:&lt;/p&gt;

&lt;p&gt;A new developer joins your AWS account to help with your application. You create a new user and add them to the developer group, without thinking about which permissions they need.&lt;/p&gt;

&lt;p&gt;A developer changes jobs and becomes a security engineer. Instead of editing the user’s permissions directly, you remove them from the old group and add them to the new group that already has the correct level of access.&lt;/p&gt;

&lt;p&gt;Keep in mind the following features of groups:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Groups can have many users.&lt;/li&gt;
&lt;li&gt;Users can belong to many groups.&lt;/li&gt;
&lt;li&gt;Groups cannot belong to groups.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The root user can perform all actions on all resources inside an AWS account by default. In contrast, newly created IAM users, groups, and roles start with no permissions. To allow an IAM identity to perform specific actions in AWS, such as creating resources, you must grant it the necessary permissions.&lt;/p&gt;

&lt;p&gt;The way you grant permissions in IAM is by using IAM policies.&lt;/p&gt;

&lt;p&gt;IAM policies&lt;/p&gt;

&lt;p&gt;To manage access and provide permissions to AWS services and resources, you create IAM policies and attach them to an IAM identity. Whenever an IAM identity makes a request, AWS evaluates the policies associated with them. For example, if you have a developer inside the developers group who makes a request to an AWS service, AWS evaluates any policies attached to the developers group and any policies attached to the developer user to determine if the request should be allowed or denied.&lt;/p&gt;

&lt;p&gt;IAM policy examples&lt;/p&gt;

&lt;p&gt;Most policies are stored in AWS as JSON documents with several policy elements. The following example provides admin access through an IAM identity-based policy.&lt;/p&gt;
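&lt;p&gt;Reconstructed from the element-by-element description below, that admin-access policy looks like this:&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "*",
      "Resource": "*"
    }
  ]
}
```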

&lt;p&gt;This policy has four major JSON elements: Version, Effect, Action, and Resource.&lt;/p&gt;

&lt;p&gt;The Version element defines the version of the policy language. It specifies the language syntax rules that are needed by AWS to process a policy. To use all the available policy features, include "Version": "2012-10-17" before the "Statement" element in your policies.&lt;/p&gt;

&lt;p&gt;The Effect element specifies whether the policy will allow or deny access. In this policy, the Effect is "Allow", which means you’re providing access to a particular resource.&lt;/p&gt;

&lt;p&gt;The Action element describes the type of action that should be allowed or denied. In the example policy, the action is "*". This is called a wildcard, and it is used to symbolize every action inside your AWS account.&lt;/p&gt;

&lt;p&gt;The Resource element specifies the object or objects that the policy statement covers. In the policy example, the resource is the wildcard "*". This represents all resources inside your AWS console.&lt;/p&gt;

&lt;p&gt;Putting this information together, you have a policy that allows you to perform all actions on all resources in your AWS account. This is what we refer to as an administrator policy.&lt;/p&gt;

&lt;p&gt;The next example shows a more granular IAM policy.&lt;/p&gt;

&lt;p&gt;This policy uses a Deny effect to block access to Amazon S3 actions, unless the Amazon S3 resource that's being accessed is in account 222222222222. This ensures that any Amazon S3 principals are accessing only the resources that are inside of a trusted AWS account.&lt;/p&gt;
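&lt;p&gt;A sketch of that policy follows. The &lt;code&gt;s3:ResourceAccount&lt;/code&gt; condition key is one way to express the account restriction, and the &lt;code&gt;Sid&lt;/code&gt; is an illustrative label:&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyS3OutsideTrustedAccount",
      "Effect": "Deny",
      "Action": "s3:*",
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {
          "s3:ResourceAccount": ["222222222222"]
        }
      }
    }
  ]
}
```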

&lt;p&gt;IAM roles&lt;/p&gt;

&lt;p&gt;IAM best practices&lt;/p&gt;

&lt;p&gt;Throughout this lesson, you have learned about some IAM best practices. This section summarizes some of the most important IAM best practices that you must be familiar with before building solutions in AWS. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lock down the AWS root user.&lt;/li&gt;
&lt;li&gt;Follow the principle of least privilege.&lt;/li&gt;
&lt;li&gt;Use IAM appropriately.&lt;/li&gt;
&lt;li&gt;Use IAM roles when possible.&lt;/li&gt;
&lt;li&gt;Consider using an identity provider.&lt;/li&gt;
&lt;li&gt;Regularly review and remove unused users, roles, and other credentials.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
    </item>
    <item>
      <title>GENERATIVE AI</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Fri, 16 Jun 2023 08:04:55 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/generative-ai-1mo1</link>
      <guid>https://dev.to/abhishe89636035/generative-ai-1mo1</guid>
      <description>&lt;p&gt;I’m glad to share that I’ve obtained all these badges from Google Cloud for completing all courses of Generative AI Learning Path.&lt;/p&gt;

&lt;p&gt;Since Google released this Free Learning Path for Generative AI, it has been the talk of the town. But it’s time to practically get our hands dirty &amp;amp; dig into these informative courses.&lt;/p&gt;

&lt;p&gt;Check out my Google Cloud Skills Boost credentials: &lt;a href="https://lnkd.in/dFem2yRk"&gt;https://lnkd.in/dFem2yRk&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thanks to Tata Consultancy Services for providing the opportunity: &lt;a href="https://lnkd.in/dDEGdXFV"&gt;https://lnkd.in/dDEGdXFV&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This learning path contains the following courses:&lt;/p&gt;

&lt;p&gt;✅Intro to Large Language Models&lt;br&gt;
✅Intro to Responsible AI&lt;br&gt;
✅Intro to Image Generation&lt;br&gt;
✅Generative AI fundamentals&lt;br&gt;
✅Encoder-Decoder&lt;br&gt;
✅Attention Mechanism&lt;br&gt;
✅Transformers and BERT Models&lt;br&gt;
✅Create Image Captioning Models&lt;br&gt;
✅Intro to Gen AI Studio&lt;br&gt;
✅Generative AI Explorer — Vertex AI&lt;/p&gt;

&lt;p&gt;It has 11 activities with 10 free courses &amp;amp; 4 credit labs. All of these are microlearning courses with short, concise, but information-packed content &amp;amp; tricky quizzes.&lt;br&gt;
Thanks to Qwiklabs for the quests on how to use Generative AI on Google Cloud. Through the labs, we get to learn about how to use the models in the Vertex AI PaLM API family, including text-bison, chat-bison, and textembedding-gecko. We also get to learn about prompt design best practices, and how it can be used for ideation, text classification, text extraction, text summarization, and more. We also learn how to tune a foundation model by training it via Vertex AI custom training and deploy it to a Vertex AI endpoint.&lt;/p&gt;

&lt;p&gt;Please like, share, and subscribe to #channelpartners below on:&lt;/p&gt;

&lt;h1&gt;
  
  
  youtube :: &lt;a href="https://lnkd.in/dDaZPGR5"&gt;https://lnkd.in/dDaZPGR5&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  github :: &lt;a href="https://lnkd.in/dHAMiTcH"&gt;https://lnkd.in/dHAMiTcH&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  hashnode :: &lt;a href="https://lnkd.in/d_gtGxuS"&gt;https://lnkd.in/d_gtGxuS&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  twitter :: &lt;a href="https://lnkd.in/e5ZY5j-x"&gt;https://lnkd.in/e5ZY5j-x&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  DevCodeCommunity :: &lt;a href="https://lnkd.in/duMEcSnc"&gt;https://lnkd.in/duMEcSnc&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  Tealfeed :: &lt;a href="https://lnkd.in/eTyp-Xe4"&gt;https://lnkd.in/eTyp-Xe4&lt;/a&gt;
&lt;/h1&gt;

&lt;h1&gt;
  
  
  medium :: &lt;a href="https://lnkd.in/dpEzM8GU"&gt;https://lnkd.in/dpEzM8GU&lt;/a&gt;
&lt;/h1&gt;

&lt;p&gt;generativeai #llms #llama #googleai #bard #bert #ai #ml #neuralnetworks #deeplearning #google #generativeai #googlecloud #googlecloudplatform #googlebardai #chatgpt #encoder #machinelearning #cloud #opportunity #training #share #design #language #learning #bardai #generativeai #like #agile #leaders GDG Cloud Pune GDG India #pune #india 😊&lt;/p&gt;

</description>
      <category>ai</category>
      <category>gcp</category>
      <category>generativeai</category>
    </item>
    <item>
      <title>Bringing Google Cloud generative AI to popular workplace apps</title>
      <dc:creator>Abhishek Shrivastava</dc:creator>
      <pubDate>Sun, 04 Jun 2023 14:31:35 +0000</pubDate>
      <link>https://dev.to/abhishe89636035/bringing-google-cloud-generative-ai-to-popular-workplace-apps-4l4k</link>
      <guid>https://dev.to/abhishe89636035/bringing-google-cloud-generative-ai-to-popular-workplace-apps-4l4k</guid>
      <description>&lt;p&gt;Generative AI is poised to transform and streamline many of the most common processes and tasks each of us performs every day at work — like how we communicate, how we analyze data, how we interface with apps, or even how we summarize content and conversations. &lt;/p&gt;

&lt;p&gt;At Google Cloud, our open approach to generative AI development has led to businesses around the world using Google’s large language models (LLMs) through Vertex AI. These include some of the most popular workplace applications that people log into every day. &lt;br&gt;
Today, we’re announcing a broader set of popular enterprise companies that are bringing Google Cloud generative AI capabilities to their applications this year. Each of these companies will address promising enterprise use cases for generative AI, including improving content management, bringing new capabilities for people to collaborate virtually, streamlining sales and customer service, delivering entirely new marketing content-creation abilities, and supporting marketers by embedding generative AI in the Salesforce customer data platform.&lt;br&gt;
Box and Google Cloud will build on our strategic partnership to create new ways for joint customers to work smarter and more productively with generative AI. Box and Google Cloud are working to integrate Google's advanced AI models into Box AI, its new intelligence capability, to power entirely new ways for users to interact with their content, like automatically classifying and extracting metadata or quickly finding answers and insights from documents.&lt;br&gt;
Canva is adding new generative AI features into its popular visual communication platform, including new automatic translation capabilities, and new generative AI video creation tools powered by Google Cloud’s Vertex AI that can enable teams to more quickly build custom content in Canva.&lt;br&gt;
Dialpad is building new tools to help sales and service teams better track and support prospects and customers with Google Cloud generative AI. For example, Dialpad has already launched new AI Recaps with a subset of its users, which harness the power of LLMs to generate concise summaries of conversations and highlight essential information and next-best action items for salespeople. The team at Dialpad is also working with Google Cloud’s models through Vertex AI to create new AI-powered playbooks that will provide real-time, context-sensitive suggestions and prompts to sales reps and support agents, and new AI-driven scorecards to help automate the call review process for sales managers and contact center supervisors. &lt;br&gt;
Jasper will work with Google Cloud to help marketers further automate content creation, with more sophisticated capabilities to target content for particular segments and voices. Jasper will bring Google Cloud foundation models to its AI Engine, and Google Cloud LLMs will give users more capabilities to create unique content matching their brand voice in the Jasper app. Users can also quickly access Jasper’s capabilities with the Jasper Extension for Chrome, and with new add-ons for Google Workspace which are in development.&lt;br&gt;
Salesforce is exploring how Google Cloud’s models and generative AI capabilities could bring new, trusted, and secure functionality to Salesforce customers through Salesforce Data Cloud, which unifies a company’s customer data across channels and interactions into single, real-time customer profiles, and Einstein, its AI-infused CRM assistant.&lt;br&gt;
UKG will bring Google Cloud’s generative AI capabilities into its widely adopted human capital management (HCM) apps to help transform employee experiences. UKG is already using Google Cloud’s LLMs through Vertex AI to create conversational AI interactions with its HCM solutions, augment employee requests with more relevant business insights, and support people managers with a deeper understanding of how business decisions can impact employee engagement. &lt;br&gt;
We’re partnering closely with teams at each of these companies to deploy generative AI responsibly and to enable businesses, developers, marketers, and millions of users who can benefit from Google Cloud’s LLMs in the applications they use every day.&lt;br&gt;
These partnerships build on our open ecosystem for generative AI development. We’re committed to this open future of AI, and we’re bringing the best of Google’s infrastructure, generative AI tooling, and foundation models to partners at every layer of the AI stack - like AI21 Labs, Aible, Anthropic, Anyscale, Bending Spoons, Cohere, Faraday, Glean, Gretel, Labelbox, Midjourney, Osmo, Replit, Snorkel AI, Tabnine, Weights &amp;amp; Biases and many more.&lt;br&gt;
In the coming weeks and months, you’ll hear more examples of how we’re partnering with leading enterprise platforms and applications to bring generative AI to more users. You’ll also hear more about Google Cloud’s approach to applied generative AI tomorrow at Google I/O, and you can learn more about our Trusted Tester program for Vertex AI here.&lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>google</category>
      <category>ai</category>
      <category>bard</category>
    </item>
  </channel>
</rss>
