AZURE DEVELOPER ASSOCIATE, MICROSOFT (AZ-204), FULL COURSE
Estimated read time: 1:20
Summary
In this full course, Innovated Store guides you through the AZ-204 Microsoft Azure Developer Associate certification. The course comprises 13 hands-on modules, beginning with Azure compute solutions and delving into Azure cloud services like App Services, Azure Functions, and Cosmos DB. It builds a comprehensive understanding of Azure through abundant practical coding exercises and global application deployment. Participants are advised to set up a free Azure subscription for hands-on practice and certification readiness.
Highlights
The course covers 13 detailed modules focusing on hands-on practical knowledge. 📚
Understand Azure's app services and deploy a wide range of applications. 🚀
Benefit from Visual Studio's integration with Azure's SDK for development. 🎨
Utilize Azure's global regions to strategically deploy resources. 🌐
Explore Azure's scaling capabilities and maximize application performance. 🔧
Learn about different Azure service offerings like app services, functions, and Cosmos DB. ☁️
Master the serverless architecture with Azure Functions and create efficient workflows. ⚙️
Understand the comprehensive monitoring tools provided by Azure, such as Application Insights. 🛠
Get insights into managing data using Azure's Cosmos DB and its various models. 📈
Key Takeaways
Master Azure app services for deploying web, mobile, and API apps effortlessly. 🌐
Leverage Azure's global infrastructure with 58 regions for strategic resource deployment. 🌍
Utilize Azure's free $200 credit for one month to prepare for and pass the certification exam. 💸
Engage with real-world coding using Visual Studio and deploy applications seamlessly. 👨💻
Scale applications horizontally and vertically using Azure's app services and service plans. ↕️
Explore serverless computing with Azure Functions, integrating effortlessly with other Azure services. ☁️
Dive deep into Cosmos DB, exploring its multi-model, distributed database capabilities. 📊
Enhance application security and monitoring using Azure's robust tools like Application Insights. 🔒
Overview
In this immersive AZ-204 Microsoft Azure Developer Associate course, learners are taken through an in-depth journey into Azure's diverse ecosystem of services and solutions. Starting with an overview of Azure compute solutions, the course promises to equip you with a robust understanding of cloud-based applications and global deployment strategies.
Each module of the course is strategically crafted to ensure developers gain practical, hands-on experience. From creating free Azure subscriptions for trial to understanding the nuances of deploying applications globally, every aspect of Azure development is covered. With detailed exploration of app services, Azure Functions, and Cosmos DB, you become adept at utilizing Azure's full potential.
The course places emphasis on utilizing Visual Studio for development and deployment, ensuring your ability to build, scale, and monitor applications is second to none. By the end of the course, you'll be proficient in strategizing the deployment of resources across Azure's 58 regions and have a thorough knowledge of leveraging Azure's tools for effective development and application management.
Chapters
00:00 - 03:00: Introduction to the Course and Modules Overview The chapter provides an overview of the AZ-204 exam, which comprises five primary skill areas, the first of which is developing Azure compute solutions. The course follows the latest guidelines and industry standards, and its full agenda spans 13 modules, each aiming to deliver comprehensive theoretical and practical knowledge through hands-on learning experiences.
03:00 - 06:00: Azure Global Footprint and Service Registration This chapter provides an overview of Microsoft's Azure global presence, emphasizing its extensive reach with 58 global regions. It highlights the flexibility developers have in selecting regions to host their applications, storage, and databases, ensuring optimal performance and proximity to users. The chapter combines both conceptual understanding and technical execution, guiding through the step-by-step processes involved in leveraging Azure's capacities.
06:00 - 09:00: Visual Studio and Azure SDK Setup The chapter emphasizes the importance of setting up resources before starting the course. It suggests registering for a free Azure subscription, which offers $200 worth of free credits for one month. This subscription is deemed sufficient for preparing and passing the certification exam. Additionally, it's recommended to have Visual Studio installed to enhance the learning experience.
09:00 - 15:00: App Services and Service Plans in Azure This chapter covers the use of Visual Studio 2019 and the Azure Software Development Kit (SDK) for developing applications, specifically focusing on deploying these applications on Microsoft's Azure platform. It mentions support for older versions like Visual Studio 2017 and alternatives like Visual Studio Code, but emphasizes the need for Azure SDK installation in Visual Studio 2019 for course exercises.
15:00 - 21:00: Azure PowerShell and Resource Deployment The first module of the course focuses on Azure App Services, which are a key feature of Azure, providing Platform as a Service (PaaS) capabilities. Azure App Services can handle web apps, mobile apps, and API apps, all considered as part of Azure's App Services. The chapter covers how these applications are integrated and managed under the Azure App Services umbrella, offering a streamlined approach to deploying and managing applications on Azure.
21:00 - 30:00: Application Deployment and Monitoring The chapter 'Application Deployment and Monitoring' discusses how to deploy applications on Azure's hosting service using a service plan.
30:00 - 36:00: Azure Functions Overview and Creation Azure Functions Overview and Creation: The chapter discusses the integration of applications with tools such as Visual Studio and Git, enabling the use of DevOps pipelines. This integration allows for leveraging cloud capabilities to scale applications based on customer load. Scaling can be done both vertically and horizontally. For projects with multiple development teams, separate staging deployments can be provided using multi-slot deployments.
36:00 - 45:00: Azure Functions Practical Implementation The chapter delves into the practical implementation of Azure Functions, emphasizing high availability across regions enabled by Azure Front Door. It discusses integration with existing SaaS platforms and on-premise servers, highlighting security aspects like authentication and authorization. The chapter further explores the flexibility of adopting either a pure platform-as-a-service model or transitioning to a serverless architecture, while also focusing on monitoring functionalities.
45:00 - 58:00: Azure Storage Accounts and Blob Storage This chapter discusses the logging of data into App Services and the availability of real-time data through Application Insights. It highlights the association of an internal local cache and the role of WebJobs in running background processes. A walkthrough of these features is provided in the Azure portal for practical understanding.
58:00 - 73:00: Azure Cosmos DB Overview and Setup The chapter provides an overview of Azure Cosmos DB, focusing on the setup process. It mentions the Azure portal where an app service is associated with an app service plan. The example given is an app service named 'chat with maruti' hosted in the Central US region, associated with a standard S1 plan and accessible via a URL.
73:00 - 83:00: Cosmos DB API and Data Interaction The chapter introduces an application using Cosmos DB API and highlights the interactions available through its interface. Although the application itself isn't the main focus, the chapter emphasizes configuring settings related to monitoring, authentication, and authorization, including options for customizing domain purchases.
83:00 - 96:00: Graph DB in Cosmos DB The chapter 'Graph DB in Cosmos DB' explains how to associate a custom domain with the default URL provided by Azure. It describes the integration of TLS or SSL certificates and explores scaling options for web applications, such as horizontal or vertical scaling, through the app service plan. Additionally, it touches upon web jobs and the functionalities available in the app service plan section.
96:00 - 99:00: Course Conclusion and Further Learning Opportunities The chapter wraps up the course by revisiting the concept of an App Service Plan, emphasizing its capability to host multiple web apps. It invites students to explore further by examining the applications linked to a particular plan and monitoring various usage metrics like CPU, memory, and data input through the Azure portal. The chapter highlights the comprehensive overview Azure provides, while encouraging those unfamiliar with Azure basics to explore foundational courses to enhance their understanding before diving deeper into advanced topics.
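The one-plan-many-apps relationship described in these chapters (one App Service Plan hosting multiple web apps) can also be inspected from PowerShell. A minimal sketch, assuming the Az module is installed, you are signed in via Connect-AzAccount, and the resource group and plan names below are placeholders:

```powershell
# Sketch: list every web app hosted on a single App Service Plan.
# Assumes: Az PowerShell module installed and Connect-AzAccount already run.
# "trainingrg1" and "myserviceplan" are placeholder names.

$plan = Get-AzAppServicePlan -ResourceGroupName "trainingrg1" -Name "myserviceplan"

# One plan can host many apps; this returns all web apps on that plan
Get-AzWebApp -AppServicePlan $plan |
    Select-Object Name, Location, State
```

This mirrors what the course demonstrates by clicking into the plan from the Azure portal.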
AZURE DEVELOPER ASSOCIATE, MICROSOFT (AZ-204), FULL COURSE Transcription
00:00 - 00:30 is having a total of five modules starting from developing Azure compute solutions and the weightage for these five modules is exactly like this to know more details you can go to the official site of AZ-204 and I assure you that this course is following all the updated guidelines the agenda of the course has a total of 13 modules and in each module we are going to learn each and every concept with a practical hands-on and we are
going to see how we understand each and everything not only conceptually but also with the technical code doing those things step by step with that Microsoft Azure cloud has a global footprint and that's the reason when you're developing for this cloud you have plenty of options to choose from across the globe we have 58 regions available and while developing your applications storage databases and all you can choose wherever you want to host
your resources that's the reason I strongly recommend that before joining this course make sure you have successfully registered for your free Azure subscription in which you will get 200 dollars of free Azure credit for one month and those 30 days will be enough for you to prepare for the certification as well as to clear the certification along with the Azure subscription I also recommend that you should have Visual Studio
2019 you can still use the older versions like Visual Studio 2017 or maybe the free versions of Visual Studio which are available if you're already comfortable with Visual Studio you can also use Visual Studio Code for this but for this particular course I am going to use Visual Studio 2019 and make sure that the Azure SDK is installed inside your Visual Studio because we are going to develop lots of applications and then we are going to deploy them on Azure
the first module of this particular course is focusing on app services and app services are one of the coolest features of Azure because they provide you pure platform as a service features on Azure when you're dealing with web apps mobile apps or any API apps on Azure all of these are counted as app services because they come under the umbrella of app services like a normal application when you're
02:30 - 03:00 deploying on any hosting provider we have a server or a hosting plan but in this case in Azure we have something called service plan and your one service plan is going to have multiple applications hosted in that we are going to see in this module how we can have one or more web apps associated with one service plan and there are a couple of cool features which you are getting because of service plan and app services in azure
you can integrate these applications with tools like Visual Studio or git or maybe you're going to develop a DevOps pipeline with that you can take the advantage of cloud and scale your application based on the customer load on that and you can scale it vertically or horizontally if you have multiple teams who are developing this project and you want to give a separate staging deployment for them then we have multi-slot deployments also available this application can have
multiple instances running at the same time and you can get high availability across different regions using a feature like Azure Front Door you can connect with existing SaaS platforms or maybe the data which is running on your on-premise servers you can also take care of security with respect to authentication and authorization you can take care of this with either a pure platform as a service model or if you want to convert this thing into a serverless architecture you can do that also the monitoring of each and
every data which is logged into this app service as well as the real-time live data is also available with Application Insights we have an internal local cache associated with app services and also it has WebJobs to run things in the background we are going to see all of this in action step by step practically in this module to get an idea of an app service I'm just taking you to my Azure portal and as you can see on this home page of this
Azure portal we have an app service which is associated with an app service plan I have a couple of resources hosted in the subscription I'm just clicking on one of them which is my app service with the name chat with maruti I want you to notice that this app service is hosted on the Central US region of my Azure it is associated with this app service plan which is on the standard S1 plan and then it has a URL which is some domain
name dot azurewebsites.net if I click on that URL I'll get a very basic application on that and the output of this application is not important right now what I want to focus on is the left side section we have settings in which I can configure my Application Insights which comes under monitoring authentication and authorization of this application maybe I have a purchased custom domain with me and I can
associate this thing with that something.com domain instead of this default URL which is given by Azure I can associate TLS or SSL certificate settings in this or if I want to do scaling as I said horizontal or vertical scaling I can associate that with my app service plan the WebJobs are also there and if I scroll down a little bit more you can see we have a separate section for the app service plan
as we already discussed one app service plan is going to have multiple web apps hosted on that if I click on this app service plan it will take me to that plan and I can see which applications are associated with that and also I can monitor the CPU memory and data input usage of that Azure portal is giving you all of this in one shot and I hope you are familiar with the basics of Azure fundamentals if you're not I request you to go through my Azure Fundamentals course first and then you can go through
this particular course because this is focusing on AZ-204 and this is purely created for developers who are going to work on the cloud computing model with Azure okay so it's time to create our first resource on the Azure portal and obviously we are going to deploy our web app which is nothing but an app service
so as we discussed we have an app service plan on which the web app service is going to be hosted now on this Azure portal I can click on the create resource button and I can choose web app and I can simply create a web app with an app service plan but before I do that I just want you to see how we can deploy something on the Azure portal through PowerShell and that's the reason we are going to type shell.azure.com remember if you have configured PowerShell in your subscription then
you'll directly get the PowerShell prompt like this if not then you'll get a prompt to create an Azure storage account which gets associated with your subscription if it is asking for that please proceed with that and then you can run PowerShell scripts or commands once you're getting this PS prompt I have a PowerShell script open in my local machine which I have prepared for this lab and now you can check here that this is my local PowerShell ISE I will not execute the script here I'll execute
this thing on the Azure Cloud shell only you can get the Cloud shell through the portal directly by clicking on this button also but when you click on this it's going to give you a bottom bar where you have to execute the script in a small window I prefer to do this in a full screen window like this and that's the reason I'm opening shell.azure.com if we quickly check the PowerShell script we have some variables at the top which are created with some GitHub
repository URL which is a sample given by Microsoft the web app name is going to be unique so I have mywebapp and with that I'm appending a random number generated by a command called Get-Random the resource group I'm keeping as training rg1 and then for the location we are choosing West Europe once we are done with this these variables will be used at many places in this PowerShell script and technically we are actually firing
only four commands New-AzResourceGroup which will create a new resource group in the specified location New-AzAppServicePlan which will create an app service plan for me in the free tier and then New-AzWebApp which is actually going to create a new application in the specified service plan and resource group and then finally we are trying to just put all these properties inside a variable called $PropertiesObject which is later on going to be used in my
final command which is Set-AzResource which is going to set the resources for this project with respect to that GitHub repository and the branch which we have selected if you want to use the script the script is available on this URL which is paste.org 104467 and the same script I have shared for you so that you do not need to type all those things you can directly copy the script from that and then you can paste it in this Cloud shell it's showing me that the resource group is created with the name of the resource group and the provisioning status succeeded now it is trying to create an app service plan for me within a few seconds that is also done and it is created inside this resource group in this location that's fine and finally now it's creating a web app which is showing me a progress bar associated with that
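The deployment script just described might look roughly like this. It is a sketch, not the exact script from the video: the sample repository URL, branch, and resource names are placeholder assumptions, and it assumes an authenticated Az session (Cloud Shell is already signed in):

```powershell
# Placeholder values; the video uses a Microsoft-provided sample repo
$gitrepo    = "https://github.com/Azure-Samples/app-service-web-dotnet-get-started"  # assumed sample URL
$webappname = "mywebapp$(Get-Random)"   # random suffix keeps the name globally unique
$rgname     = "trainingrg1"
$location   = "West Europe"

# The four commands described in the video:
New-AzResourceGroup -Name $rgname -Location $location
New-AzAppServicePlan -Name "$webappname-plan" -ResourceGroupName $rgname -Location $location -Tier Free
New-AzWebApp -Name $webappname -ResourceGroupName $rgname -Location $location -AppServicePlan "$webappname-plan"

# Wire the web app to the external Git repository and branch
$PropertiesObject = @{
    repoUrl             = $gitrepo
    branch              = "master"
    isManualIntegration = $true
}
Set-AzResource -PropertyObject $PropertiesObject `
    -ResourceGroupName $rgname `
    -ResourceType Microsoft.Web/sites/sourcecontrols `
    -ResourceName "$webappname/web" `
    -ApiVersion 2015-08-01 -Force
```

Running this in Cloud Shell produces the same resource group, plan, web app, and source-control binding walked through above.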
it will take a few moments and once the progress is done the new web app is also going to be created when you do this kind of copy and paste of a script it will always wait for you when you execute the last command the last command in my PowerShell script is actually that Set-AzResource and you can see it's waiting for me to hit enter I'm going to hit enter and then it's going to take this final command and it's going to set up all the
things based on the properties object which you have given while this is doing its thing we can go to our portal in parallel click on resource groups and hopefully you'll find a new resource group called training rg1 if I go inside that I am going to have my app service plan and my app service inside that you can see this is my app service plan and this is my app service inside that if I go inside my app service
yeah as you can see this app service is showing me that it's hosted on this URL which is the name of the app service dot azurewebsites.net I can click on this and it's going to open this site in a new window in parallel we can see that it's associated with this free service plan and we can scale it anytime by clicking on scale up the location for this app is the West
Europe and if I click on this new tab now you can see that my application with the three different tabs is available this is a very basic dotnet based application and we took this directly from GitHub so this is not code which we have written but yeah we have a basic application ready for this what I want you to focus on is if I click on deployment Center this app deployment actually happened through GitHub and you can see it's showing me here that we have a source which is actually an external git repository which is located on this particular URL if you want to disconnect from this or if you want to deal with some other options you can directly do it here if I click on configuration this is showing me my app settings and general settings available in this normally these settings you will find in your application code in your web config file or in the app config file
it's showing me the different application settings I can click on new application setting or new connection string and I can configure all those things same way it's showing me the runtime stack and the version associated with that and which kind of platform I have configured for this all this configuration your developers can customize whenever you want same way we have path mapping we can add a new handler for that we can add new virtual applications or directories for this and all this configuration is just
a click away just now because we have deployed this application through this Cloud shell it happened step by step and even the last step which was Set-AzResource is also done now let's say you do not want to do this through PowerShell and you want to do this in the portal you can click on create resource choose a web app provide your resource group because I have an existing resource group called training rg1 I'm going to choose this a
14:00 - 14:30 resource Group is a logical entity and in one Resource Group you can have multiple resources hosted on that my application name I'm giving maruti portal app because this is going to be deployed through portal and I'm giving some number like one and it's asking me how you want to deploy your code you want to do this thing directly as a code or you want to take these from a Docker container image I'm choosing code right now and in the runtime stack I'm going to select asp.net version 4.7
this is going to be an operating system based on Windows and let's say I do not want to host this thing on the free app service plan which we already created I have an existing service plan which is a standard S1 size which is giving me a dedicated 100 Azure compute units and 1.75 GB memory for that now let me choose this one remember we are hosting this thing in the same resource group the previous app was deployed on the West Europe region as we
know from this particular screen and this app I want to deploy on the Central US region now can I deploy multiple resources on different regions within the same resource group the answer is obviously yes and that's the reason we are doing this if I click on next we have the monitoring section of this app in which I am going to enable my Application Insights and it's going to create one Application Insights resource also for this we'll click on next next for tags and
review and create once this is done we'll click on the create button and this will take a few seconds to deploy this app service in that existing app service plan you can see it's showing me that my deployment is underway and while this is happening it is actually taking care of everything through an ARM template on the left side you can see we have a section called template which is actually giving me a JSON template created for this deployment and technically in this they are
deploying two resources one is an app service and the other one is my Application Insights these two resources require nine different parameters and they are going to deploy all those things in one shot it's showing me my deployment is complete I'm going to click on go to resource this resource is in Central US but still it is in the same resource group which is training rg1
and if I click on that you'll get to know that this is the same resource group in which I have certain resources coming from West Europe and certain resources coming in Central US let's go back to that app and this app is associated with the new service plan on the left side we have deployment Center and it's showing me that in this deployment Center I do not have anything configured if I want to configure a CI CD pipeline with Azure DevOps we can do that or I can associate a local git repository or a repository from OneDrive or Dropbox kind of thing so there are a couple of options available we have not set anything because it's a dummy app which was created just right now with a basic structure if I click on the overview tab and if I click on the URL of this app it is not having any code right now it's just having a dummy HTML page given by Microsoft and this young smart lady is developing in multiple programming languages at the same time this page is a dummy page if I want I
can download this publishing profile and I can publish my existing app which is developed in Visual Studio we'll see these kinds of things later on but right now this application is deployed and in this video we learned how we can deploy an application through the Azure portal and how we can deploy an app service plan and app services through PowerShell so we are still with the same application which we have created in the
previous video and now it's time to see some of the configuration and monitoring capabilities of this application while creating this app through the portal I have enabled Application Insights so make sure before you proceed with this you have also done the same step which I followed in the previous video on the left side if I focus on configuration and if I check the right side part we have an application settings section and by
default I got a couple of configurations which are already there in my app config if I want to do an additional configuration of a connection string with my databases then I can click on new connection string I can specify the name and value for that and I can choose the type of that database also maybe I can connect with SQL Server because it's a dotnet based website and I can also enable a deployment slot setting normally when I check mark this thing the deployment slots are going to have an additional database association with that what are the deployment slots well it actually depends on your service plan if I go to my service plan you can see that I have a service plan and Dev and test and production kind of tabs are there if I click on Dev and test we have a free service plan D1 B1 different kinds of service plans if I choose the free one I do not have any additional features for my deployment slots but the one which I
19:30 - 20:00 have selected which is S1 with 100 Azure compute units and 1.75 GB memory this service plan is giving me additional features like I can have a staging slots up to 5 staging slots and I can scale my application up to 10 instances left side we have a section which is deployment slots it's showing me that I do not have any deployment slots right now I have only one deployment slot which is my actual
production which is associated with the actual URL of this application if I want to add a deployment slot and I want to deploy applications on this we can do that same way if I click on scale out this is actually associated with my scaling where I can configure manual or custom auto scale with this other than this if I click on authentication and authorization of this application by default right now anonymous access is enabled for this application so technically we do not have any authentication or authorization
mechanism to configure authentication I can click on On and I can enable open authentication with websites like Microsoft Facebook Google and all all I need to do is just click on that configuration I need to provide an app ID and app secret from Facebook you can directly click on this and it will navigate to developers.facebook.com and you have to register your application for the different kinds of authorizations which you want to
give the scope which you select here is going to give that kind of rights for that application if I choose Azure Active Directory I can go to my Azure AD and the users of my Azure Active Directory with that specific tenant can access my application all this configuration we can see later on when we focus on the security part of my application other than this we have monitoring capabilities given in Application Insights if I click on Application Insights because this is
enabled you can see here we have an Application Insights resource for this one I can see the Application Insights data by clicking on this link and this will take me to the Application Insights of the same app you can check here it's showing me all the live data associated with this on the left side we have something called live metrics and smart detection all these tools are useful when your application is getting some issues and you want to keep track of that
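To give Live Metrics something to display, you can also generate a little traffic against the app's URL from PowerShell instead of refreshing the browser by hand. A minimal sketch; the URL below is a placeholder for your own app's address:

```powershell
# Hit the app a few times so Application Insights Live Metrics shows activity
$url = "https://myportalapp1.azurewebsites.net"   # placeholder: use your app's URL

1..10 | ForEach-Object {
    # -UseBasicParsing avoids the IE-based parser on older Windows PowerShell versions
    Invoke-WebRequest -Uri $url -UseBasicParsing | Out-Null
    Start-Sleep -Milliseconds 500
}
```

Each request should appear in the Live Metrics stream within a few seconds.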
within a few seconds this live metrics view will be ready and it's going to show me live data associated with my application it's not showing me anything because maybe nobody is using my application right now I have my application open in a separate tab and let's say I am going to hit a couple of requests here and I am going to just refresh my page a couple of times and if I come back to this particular live metrics view within a few seconds I am going to get the details about that now as you can see it started responding
to this and because I did two requests it is giving me those requests associated with that any application load any fault or any error which is happening in that you can keep track of the overall health of this application in this one single screen and as we know all these facilities are available to us as a platform as a service in this Azure app service your app service is hosted on an app
service plan and this app service plan is actually scalable you can see we have two options here scale up and scale out and these two terms are actually associated with vertical scaling and horizontal scaling remember in all the Azure resources whether it is an app service a virtual machine or storage services we have a scaling facility available when we talk about scaling which is vertical it is always about increasing
the capacity of the running instance now what I mean by that suppose if I choose S1 which is having 100 Azure compute units and 1.75 GB memory allocated for my app service plan and if I decide that I want to go for the premium one I'll get the higher capacity of this or if I decide that I'm going to go with the lower one let's say if I'm choosing D1 then I'm going to have 1 GB memory only associated with this thing so it's something like this I'm
downgrading the capacity of this service plan from S1 to D1 or maybe I can go with the free plan which is just giving me a shared infrastructure with 1 GB memory now the moment I choose any of these plans it is actually showing me that I can scale this up or down in short I can increase the capacity and the compute associated with that and obviously the cost which is associated with this service plan will change based on that now this is also adding and removing
24:30 - 25:00 features which are associated with customized deployment or a hardware when you're dealing with this thing this is something which is known as vertical scaling because it's just dealing with the one single running instance and increasing and decreasing the capacity of that while in other case if I click on scale out scale out is all about horizontal scaling you have an option to configure horizontal scaling by manual scale and
25:00 - 25:30 depends upon your service plan which you have selected you will have number of instance count which you can increase and decrease because we have selected S1 we have up to 10 instances which we can configure in this if I specify two instance right now and after that if I just change it to save it's going to create two different instances of my app service plan running in this other than this if you want to configure an auto scale you can just click on this and then you have to set a rule for that
25:30 - 26:00 if I click on ADD rule in this new blade I have to configure which kind of criteria I want to measure let's say the metric which I want to measure is a CPU percentage and I am saying that if my CPU percentage is having a threshold of more than 80 percent so it's every time when my CPU is hitting 80 percent so more than 80 percent for a duration of 10 minutes if the CPU usage is more than 80 for the duration of 10 minutes this is that
26:00 - 26:30 particular condition which I want to add the moment I do the thing I also want to configure that the minimum instance will be 1 while I do not want to go more than seven instances and the default is also going to be one the moment I do this and if I click on save this Auto scale configuration will be applicable on my instance and if the load of my application increase and if CPU is reaching to more than 80 limit automatically Azure is going to take care of one instance increment into that
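the autoscale rule just configured in the portal (CPU percentage above 80 for 10 minutes, scaling between 1 and 7 instances with a default of 1) is stored by Azure as an autoscale settings profile — this is an abridged JSON sketch of that shape, not a complete Microsoft.Insights/autoscaleSettings resource, and the `statistic` and `cooldown` values are assumptions for illustration:

```json
{
  "profiles": [
    {
      "name": "default",
      "capacity": { "minimum": "1", "maximum": "7", "default": "1" },
      "rules": [
        {
          "metricTrigger": {
            "metricName": "CpuPercentage",
            "statistic": "Average",
            "timeWindow": "PT10M",
            "timeAggregation": "Average",
            "operator": "GreaterThan",
            "threshold": 80
          },
          "scaleAction": {
            "direction": "Increase",
            "type": "ChangeCount",
            "value": "1",
            "cooldown": "PT5M"
          }
        }
      ]
    }
  ]
}
```

the portal blade is filling in exactly these fields for you: the metric trigger is the condition and the scale action is the one-instance increment Azure performs when it fires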
26:30 - 27:00 this is something which is a horizontal scaling where you are going to have multiple instances running at the same time okay so we are still with our same app service which we have deployed through portal and I have a URL of this app service which is open in a separate tab this app service URL is this name dot azurewebsites.net which is having this dummy page on that
27:00 - 27:30 now let's say I have a team of developers who's working on this application and they are working with visual studio and they have to deploy this application on azure when they want to deploy the thing I do not advise them that they are going to directly Deploy on this actual URL which is my production so it is advisable they go to deployment slots and I am going to provide a separate slot for them let's say I'm adding a new slot and I'm giving a name of this staging slot
27:30 - 28:00 I can add multiple staging slots so I'm giving staging slot 1. and then do I need to clone any settings from the existing application right now I do not want so I'm going to click on ADD the moment we do this thing this is going to add a new staging slot with the temporary URL dedicatedly assigned to this that URL and the configuration of that URL will be totally separate from the actual production you can provide the details of that URL with the FTP credentials and all to your
28:00 - 28:30 developers and they can associate this staging slot separately with the visual studio my staging slot is created I'm just going to click on close and you can see this staging slot is having a separate URL given here the traffic of this application is 100 percent given to my production only if I want I can even distribute some partial traffic to this staging slot let me click on the staging slot and this looks like a normal app service
28:30 - 29:00 only because this is also an app service where we can have a separate deployment this is also Associated on the same app service plan so performance wise I'll get the same performance let me click on this URL and this is a separate URL with the name hyphen staging slot one dot azurewebsites.net so this is the name of my staging slot associated with its app service and now this staging slot I will give to my Developers I'm going to click on this link which is
29:00 - 29:30 get publish profile and it's going to give me a full publish profile with the XML configuration associated with the slot this profile will be helpful for my developers when they associate this thing to my visual studio let me open my Visual Studio 2019 and we are going to develop one basic dotnet based application in that I'm going to click on create a new project selecting ASP.NET web application and then I'm going to click on next
29:30 - 30:00 the name of the application is web application triple one we are okay with that and the dotnet framework 4.7 is there we are okay with that also I'm selecting MVC as an application template if I want to choose I can choose API also it's ultimately the same thing do I need to configure any authentication in this right now no I do not want that thing everything else is fine we'll just click on create I'm creating a very basic MVC
30:00 - 30:30 application in my visual studio and once this application is created the code of this application will be in my local machine inside the visual studio itself the goal is that application code now we want to push to this staging slot and as we know that in the staging slot we have that dummy page with that image in front of that we are going to push this code from Visual Studio into that yes my application is created inside
30:30 - 31:00 visual studio and right now if I try to run this application this is going to run on local IIS if I just quickly run this thing to check whether it's not having any issues in that my application is loaded in the browser and you can check this is running on localhost and some port number
31:00 - 31:30 it's just a basic application with home about contact kind of tabs which is a default template of my MVC based application if I go back to my visual studio I have folders like models controllers and Views because this is an MVC application with some web configuration and Global.asax I'm not going into the core of this application right now but let me right click on this application and there is an option for publish
31:30 - 32:00 when we click on publish Visual Studio allows me to publish this application either as a normal app service and I can create a new app service or I can choose an existing app service also if I want to go for a deployment which is not a PaaS but infrastructure as a service I can also deploy this application directly onto my Azure virtual machines what I'm going to choose right now is I'm going to choose app service but in that I'm going to choose import profile
32:00 - 32:30 because I already have a profile downloaded from my portal I'll click on downloads and I'll select my profile which I have downloaded from that this is my profile of staging slot 1 and I'm going to click on publish this will take some time and is going to publish this full code into that particular staging profile when you deploy through this first time it's going to take some time because it's
32:30 - 33:00 going to deploy full project into that second time onwards is just going to take all the changes which you are doing in your code and it's going to quickly push that thing to that deployment slot now if I go back to my portal and I can check that that my staging slot is actually having this application deployed now let me go to deployment
33:00 - 33:30 slots and assume that my testing team has already tested everything on staging slot and now this staging slot is ready to move to production okay this is the production URL if I refresh this this is still having that same dummy page and this is a staging slot URL if I refresh this this is going to have my new application code in that now inside Azure we have a facility which is called swap when we click on Swap we can swap from
33:30 - 34:00 any slot to any other slot like I can choose that right now my source is my staging slot and from that I want to swap to a target one which is my production if I click on Swap this will take some time and this will swap my staging slot to production and my production slot to staging technically this is just going to do a URL rewriting internally and it's going to perform the swap so that immediately the code which is there in your staging slot will be swapped to the
34:00 - 34:30 actual production URL and your customers will get immediate effect on that code the moment this is done your dummy page which was available on your actual production is actually going to be shifted to the staging slot for the safety purpose they do this thing because after even swap if you have any issues in your production environment then you can quickly swap that thing in the reverse back and then you can take care of those things
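if you prefer scripting over the portal, the slot operations described above can be sketched with the Azure CLI — the resource group and app names here are placeholders for whatever you created, and the commands assume you are already logged in with `az login`:

```shell
# create a staging slot on an existing app service (names are hypothetical)
az webapp deployment slot create \
  --resource-group trainingrg \
  --name my-app-service \
  --slot stagingslot1

# swap the staging slot with production; run the same command again
# to swap back in reverse if something goes wrong after release
az webapp deployment slot swap \
  --resource-group trainingrg \
  --name my-app-service \
  --slot stagingslot1 \
  --target-slot production
```

this is the same swap the portal performs — traffic is redirected between slots so the staging code goes live immediately and the old production code stays parked in the staging slot as a rollback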
34:30 - 35:00 my swap is successfully done I'm going to click on close still production is running staging slot is running we'll go to our staging slot and we'll click on refresh it has a dummy page now we'll go to actual production URL and we'll click on refresh and now this is going to have my application code now assume that if something goes wrong with this code at that time you can again go back to this and you can swap to the previous one do all the remaining
35:00 - 35:30 testing which is there and you can deal with that additionally if you want to check the logs of this you can click on logs and you can see the detail logs associated with that particular swap which is just now happened with that all right so now it's a time to learn Azure functions Azure functions are allowing you to write your piece of code as a function on cloud
35:30 - 36:00 which can run independently in a stateless environment Azure functions are actually part of serverless computing if you choose to run this thing as a consumption plan to deal with Azure function we have two different flavors the platform as a service if you're choosing app service plans which you already have or if you're going with the function as a service then you can choose this thing as serverless in which Azure is going to allocate compute whenever you're going to execute the Azure function
36:00 - 36:30 and this is something which is one of the most cost-effective options because you do not need to pay for any compute which is continuously allocated to that function you only need to pay for the execution time whenever your function is actually executed Azure functions are actually going to provide an open-source WebJobs core now if you are familiar with web jobs and if you deal with web jobs in azure app Services then let me tell you one thing Azure functions are also served with the WebJobs
36:30 - 37:00 core only but the only thing is these functions will be totally independent and it can be integrated with any existing Azure resources it supports a wide variety of programming languages including C sharp Java PHP python JavaScript and all if you are not a developer and if you are an administrator and if you want to write some piece of logical things in a PowerShell or shell script that is also allowed in Azure functions in short if I have to explain this thing in one line
37:00 - 37:30 this is the cheapest way to execute your code on Microsoft Azure Cloud Azure functions can literally connect with almost everything on Azure if you want to connect this thing with Azure storages with table blobs or queues you can do that thing if you want to connect this thing with any kind of event triggers like notification Hub event grid and event hubs you can do that thing Azure function works on triggers and that's the reason most of the time the
37:30 - 38:00 piece of logic which you have written inside Azure function is going to get triggered by some existing event it can be a new file which you have uploaded into your blob storage or maybe a new record which is inserted just now in Cosmos DB you can associate Azure function with almost all the resources of azure as well as there is an important section in which you can configure in alerts also which are based on Azure functions so when an alert is getting triggered with that that is going to execute the
38:00 - 38:30 piece of logic which is written inside Azure function there are a couple of things which I want you to keep in mind when we are dealing with Azure functions remember functions are based on triggers and that's why you can trigger them either from a blob Cosmos DB service bus message queue or maybe it's a timer trigger based schedule function where you can configure that every certain period of time it is going to get executed again and again you should keep in mind that you should avoid long running functions because when you go with the
38:30 - 39:00 consumption plan Microsoft is going to give you a 10 minutes timeout it means if your function is running for 10 long minutes at the time of 10 minutes timeout it will get terminated same way if you are dealing with any HTTP request and using the HTTP request if you are doing request and response then the timeout will be reduced to only 230 seconds you have to remember that use queues for cross function communication we have a concept called durable
39:00 - 39:30 function which I will cover later on in this course and if you're dealing with the function to function communication durable functions are really the easier one or you can go with something called Azure logic app which is also a part of serverless Computing always try to write a stateless function because these are going to be idempotent and a stateless function or I can say a piece of code is going to be helping you in a number of requirements lastly I strongly recommend you to go through this GitHub repository where you will
39:30 - 40:00 find number of samples on Azure function host once you are familiar with these basics let's have a look at this Azure function app and Azure function inside that in action okay so it's the time to create our first function app through Azure portal and I'm going to click on create resource and we have function app here I'm creating a new Resource Group
40:00 - 40:30 and inside that will deploy this the name of the function app let's say I'm giving maruti first funk app one and that's fine we will publish through code runtime stack here has couple of options as we discussed in the previous video we have different options available here we have .NET Core which means C sharp node.js is for JavaScript python Java and Powershell is also there let's choose .NET Core so that we can get the c-sharp code in that and then in the
40:30 - 41:00 version we are choosing 3.1 the region which is associated with this thing let's choose Central us only we are okay with that and we'll click on next the moment I click on next you can see it's showing me that to store the code the piece of code which you're going to write inside your functions to store that code you need to associate this thing with the storage account you can choose your existing storage account if you have or in my case I'm creating new one which kind of operating system you want
41:00 - 41:30 to use while hosting these functions I am choosing windows and then here is that particular drop down which is making sure that this one is going to be serverless now if I choose consumption plan it's going to cost me only when I execute my function and this costing will be based on the time of that execution which is already occupied by that particular code now if I choose consumption technically I do not need to pay any amount until unless I execute that thing and suppose
41:30 - 42:00 if I go with app service plan it's something like this that if I have any existing app service plan I can choose that thing and in that case because I'm already having a app service plan technically function app will be free because it's not going to cost me anything extra even if I run functions inside that if you want to go with the premium one and you want really high compute with this you can go with the premium and you can choose a dedicated compute for your function app but in my case I'm choosing consumption so that it's going to be based on the execution
42:00 - 42:30 do we need monitoring right now no we do not want to do monitoring right now we'll move forward and we'll click on create remember the thing which we are creating right now is a function app and one function app can have multiple functions inside that so number of functions will be more inside this you can have as many number of functions you want to create in this but all these functions will follow the same runtime stack which is dot net core 3.1
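as a rough Azure CLI equivalent of the portal steps above — the resource, account, and app names are placeholders based on the demo, and a storage account name must be globally unique and all lowercase:

```shell
# storage account that will hold the function app's code and state
az storage account create \
  --resource-group trainingrg \
  --name marutifuncstorage1 \
  --location centralus \
  --sku Standard_LRS

# consumption-plan (serverless) function app on the .NET Core runtime,
# billed only for actual execution time
az functionapp create \
  --resource-group trainingrg \
  --name maruti-first-funk-app-1 \
  --storage-account marutifuncstorage1 \
  --consumption-plan-location centralus \
  --runtime dotnet \
  --functions-version 3 \
  --os-type Windows
```

swapping `--consumption-plan-location` for `--plan <existing-app-service-plan>` would give the app-service-plan flavor discussed above, where the functions run on compute you are already paying for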
42:30 - 43:00 let's click on go to resource and inside this is going to show me that I have functions proxies and Slots there are three kinds of things here inside the functions I will surely not have any function right now so the first thing which I'm going to do is I'm going to click on this plus icon it's asking me how would you like to create a function you want to use Visual Studio you want to use Visual Studio Code or you want to go with the in
43:00 - 43:30 Portal editor now if you do not want to use any of these tools you can go with in Portal and literally you can write your code test your code and you can see the logs associated with that within the portal itself not only that Microsoft is also giving you couple of ready-made templates available here as we know that Azure functions works with the triggers and every time the execution of the code will happen on the particular trigger by default right now we have two different templates which is webhook plus API and
43:30 - 44:00 timer and if I click on more templates we have more than 30 plus templates available within the portal itself let me click on webhook plus API and this is going to be like a normal REST based API where I can deal with the get post put delete operations with http I'm going to click on create my function is created and you can see this is a function which is HTTP trigger one function inside this list of
44:00 - 44:30 functions and it's having some basic C sharp code inside this as you can check we have libraries which are coming from .NET Core and also we have some libraries which are nuget packages as well as system.net which is a .NET library we can use .NET .NET Core or any nuget package which is based on C sharp into this particular code we have a very simple asynchronous task created inside this which is having a name run and every time when this function is getting triggered it's going to execute this function and it's going
44:30 - 45:00 to call this run method inside that this run method is trying to get the values from my query string and the name of the parameter associated with that is name and if I'm passing any name it's simply going to print hello plus that name or if I'm not passing anything then it's going to print please pass a name in the query string or in the request body so this is a very basic hello world kind of sample which they have given us the thing which I want you to understand is exactly below the section we have logs
45:00 - 45:30 and console when your function is getting called with the help of a trigger this log window is going to show you all the logs here same way if I scroll it right side we have a section which is for View files and Test this View files is very much similar to your solution Explorer of your Visual Studio which is showing you all the files you can add new files you can have multiple C sharp files and the extension of the C sharp files will be CSX do not get confused with the extension because
45:30 - 46:00 this is pure C sharp only the configuration of function is done by function.json file if you check on this left side the function.json file is telling me that this is a function which is HTTP triggered and it can do get and post kind of operations in this also if any other parameters are associated with this function we can Define this here every time when you change the template of the function or while creating a function from Portal if you choose a particular template this function.json
46:00 - 46:30 file configuration will get changed based on the template right now let's see whether this function is running or not let's click on this URL which is get function URL we have a full URL associated with that let me copy this URL and in a separate tab of my browser if I directly hit this URL if showing me please pass a name or a query string or a request body now I need to pass a name now if you notice this URL it's already having one query string parameter which is code which is
46:30 - 47:00 nothing but a unique token for Authentication I am passing one more parameter in this using ampersand and I'm saying the name which I want to pass is MentorStack which is the name of our website and if I hit enter it's going to show me Hello MentorStack now this is something which is telling me that just now when I have pasted this URL in the browser and if I'm hitting enter in that I am actually executing this function twice and logically I have to pay for
47:00 - 47:30 only two executions right now immediately the moment your code is working and the function is getting triggered you can utilize this thing same way if I click on this Plus I have plenty of templates available in this in which I have a normal Azure functions and as well as we have something called durable functions also like you can see I have a functions which are queue storage triggered we have Azure blob storage trigger functions we have Cosmos DB's trigger functions now let's say right now I do
47:30 - 48:00 not have any Cosmos DB in this account but if I scroll up we have something called blob storage trigger and we know that when we created this function app it was creating one storage account also with this so I know that there is a storage account which is associated with this function app and let me configure a blob storage trigger function now if I click on this it's going to ask me that what will be the name of that function and then which kind of path you're looking for a name of the function by default is blob triggered
48:00 - 48:30 one I'm okay with that and it's asking me that which kind of path you're looking for now this path will be the name of the container with which you want to associate whatever name by default they have given I can change it but I'm not changing it I'm just copying this name so that I can use it somewhere and then the name of the connection string while creating a connection with the storage account it will be this one so I'm okay with that we'll click on Create and then it's going to create a new function now inside the same function app so you can
48:30 - 49:00 see one function app can have multiple functions the first one was HTTP trigger and this one is a blob triggered one it's just having one line inside this is having a run method inside which we have parameters which are taking that blob and then printing the name of the blob with the size of the blob and if I just move it right side you can see that this time also in The View files we have run.csx file and we have function.json file but this time the configuration of this function.json file will be different because this is
49:00 - 49:30 not an HTTP trigger we do not have something like get post and all instead of that we have a type of the function which is blob triggered and the path of that function which is sample hyphen work items that's the name of the container which it is looking for this is showing me that if I change my function.json file then it's going to change the way the execution of this function is going to happen to check whether this is working or not I need to check one thing I'm just going to click on integrate at the left side panel and if I expand the documentation
49:30 - 50:00 section it's going to show me that the storage account which is connected with this function app is actually storage account train b031 okay so I want to go into the storage account and I want to do something with that uh let me open a resource groups in the new tab so I'm keeping my one tab open in the browser which is actually having this function Also let's do one thing up to this point we have not triggered this function anytime and I'm just keeping this logs open here
50:00 - 50:30 in the separate tab I am going inside my Resource Group Training RG and then inside that I'm just looking at this storage account which is storage account train b031 inside that we have storage explorers in which we can create containers tables file shares and all I'm going to right click on this and I'm going to say I want to create a new blob container let me give a name of the blob container which will be exactly same what I have copied from that sample hyphen work items the access level for this let's say I'm
50:30 - 51:00 giving container level access and then I'm going to click on create if you're not familiar with storage account just go through my Azure fundamental course otherwise in the same course also after one module will focus on the storage accounts if I expand I have a sample hyphen work item this is that container and my function is continuously looking for this particular container let me upload one file into this so I'm going to click on upload we'll choose one of the file from our machine
51:00 - 51:30 I'm choosing one of the file and then I'm going to say overwrite this file if it's already there and click on upload the moment we upload this thing the size of the file is around 2 MB and then the moment we upload this thing automatically this is a moment when my Azure function is going to get triggered because we have created function which is a blob trigger function so the moment this is getting uploaded into this immediately I want to switch to my function app and I just want to wait here now you can see the moment I came
51:30 - 52:00 to this function app it's showing me that okay you have uploaded one file which is with this name and the size of that file is actually this one so this name and size which we are printing into this they are just triggering that function and they are printing this thing here now this is showing me that my Azure function can technically connect with almost anything on Azure and it has couple of triggers associated with that thing I hope you understood this thank you
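the function.json behind the blob-triggered function shown above looks roughly like this — the `connection` value is the name of an app setting that holds the storage connection string, and the exact setting name the portal generates for you may differ from the `AzureWebJobsStorage` used in this sketch:

```json
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "sample-workitems/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

the `{name}` token in the path is what binds the uploaded blob's file name into the run method's parameter, which is why the log printed the file name and size the moment the upload happened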
52:00 - 52:30 okay now it's the time to deal with Azure storage accounts and especially Azure blob storage when I click on create resource and if I click on storage you'll get to know that I have a storage account which is simply telling me that it can store blob files tables and queues ultimately one storage account is going to have these four flavors of data which I can store in that if I click on the storage account inside that we have three types of storage available
52:30 - 53:00 let's say I am specifying that I'm going to create a new Resource Group with the name trainingrg and then I'm going to give a storage account name I'm giving a name of this thing maruti storage acc-123 is a storage account which I'm creating right now and seeing this is already there let me create something which is one after this in the location I am choosing East us and after that in the performance we have two options standard and premium
53:00 - 53:30 now before I proceed to any other sections I wanted to focus on this section where we have account kind and this is what a type of storage means when I click on this drop down we have three options we have general purpose V1 general purpose V2 and we have blob storage if I choose either V1 or V2 I will have table files blobs and queues all four options available in that but sometimes clients want to use only a blob storage then they can choose blob storage and in
53:30 - 54:00 that case they'll be able to store only in the form of blobs when we are focusing on blob we have something which is a replication and access tier the replication of this particular storage is giving me three options in that lrs GRS and ragrs in lrs we are going to have multiple copies of your blob storage locally most of the time it is going to be in one region and in that we are going to have multiple copies stored in separate racks of
54:00 - 54:30 the data center so that if any power failure kind of things happen I can get the availability associated with that in GRS we are going to have geographical copies across multiple regions but out of this three copies you are able to access only one and then in ragrs we are able to access geographical copies across the globe and out of three copies of GRS we are able to access two copies one which is a primary copy we can do read and write while the secondary copy we can do only
54:30 - 55:00 read if I choose RA-GRS it's going to show me that account with the selected kind replication and performance type only support block and append blobs page blobs file share tables and queues will not be available in this now this also adds one more thing that when you're dealing with blobs we have three kind of blobs in this we have block blobs append blobs and Page blobs to understand this thing let's see one slide one blob storage account is going to have this kind of
55:00 - 55:30 three kind of blobs block blobs are actually composed of blocks of different sizes that can be uploaded independently and in parallel also this kind of blobs mostly involves images and videos kind of files the append blobs are specialized block blobs that support only appending new data most of the time when we are storing something like logs and all or streaming data we do not want to update or delete the existing data we just want to append
55:30 - 56:00 the new data into the existing one and that's where append blobs are useful the third type is Page blobs which are designed for scenarios that involve random access reads and in this case which data you need to access that's not predefined randomly you can access this thing most of the time we store virtual hard disk kind of data into this so these are the three different types of blob storages which are available now as we know in the screen if I choose blob storage right now with RA-GRS it's going to tell me that I cannot store any
56:00 - 56:30 page blob file share tables and queues in this in my case I'm choosing storage V2 and the moment I do that it's going to show me that okay now you do not have the restrictions you can deal with table blobs files queues as well as all three kind of blobs also I'm choosing this below that we have something called access tier this access tier is having association with something called storage lifecycle and remember in this
56:30 - 57:00 access tier we have association only with the blobs you can see in this alert that they are saying that the access tiers are associated with the blob levels actually and basically we have three kinds of access tiers we have hot cool and archived now normally hot is actually that access tier which is allowing you to get the frequent data access as fast as possible if you have a data which you're going to access frequently then you have to
57:00 - 57:30 choose your access tier as hot suppose you have a data which is not frequently accessible but you want to store a huge amount of data and somehow you want to optimize the cost with this if you choose cool cost wise cool will be cheaper than hot but performance wise it is going to be not that good compared to hot tier also after this we have one more stage of the life cycle which is known as archived I will explain the life cycle of the storage account and
57:30 - 58:00 the blob storage account separately in the video in this course right now we know that we have a storage account creation like this we are choosing V2 and we are specifying RA-GRS while creating this thing I am choosing the access tier the default access tier for this is going to be hot in the networking section we are going to allow this thing to public endpoint I do not want any secure transfer required in this and I'm going to click on create
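a hedged CLI sketch of the storage account created above — note that a real account name must be globally unique, all lowercase, and without hyphens, so the name here is a placeholder:

```shell
# general purpose V2 account in East US with read-access geo-redundant
# replication (RA-GRS) and the hot access tier as the default for blobs
az storage account create \
  --resource-group trainingrg \
  --name marutistorageacc123 \
  --location eastus \
  --kind StorageV2 \
  --sku Standard_RAGRS \
  --access-tier Hot
```

choosing `Standard_LRS` or `Standard_GRS` for `--sku` instead would give the locally redundant or geo-redundant options discussed above, and `--kind BlobStorage` would restrict the account to blobs only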
the deployment of my storage account is completed and I'm going to click on go to resource. I want you to notice a couple of things: you can see that my performance is standard and the access tier is hot, and then, because I have chosen RA-GRS, it is showing me that the location of my storage account is not only East US; it is also showing me a secondary region, West US, which is also available, and that's what RA-GRS means. So I have
global geographical replication, and that is coming from both the East US region and the West US region. And because I have chosen the read-access option (the RA in RA-GRS), it allows me to read the data from the secondary region, which is West US. Now, storage is divided into four parts, especially since this is V2: we have containers, which allow me to store blob data, plus file shares, tables, and queues.
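With RA-GRS, the read-only secondary follows a documented naming convention: the account name with a `-secondary` suffix. A minimal sketch (the account name here is a made-up example, not one from the course):

```python
# Build the primary and read-only secondary blob endpoints for an
# RA-GRS storage account. "mystorageacct" is a hypothetical name.
def blob_endpoints(account_name: str) -> dict:
    return {
        "primary": f"https://{account_name}.blob.core.windows.net",
        # RA-GRS exposes a read-only secondary endpoint using the
        # "<account>-secondary" convention.
        "secondary": f"https://{account_name}-secondary.blob.core.windows.net",
    }

eps = blob_endpoints("mystorageacct")
print(eps["secondary"])  # https://mystorageacct-secondary.blob.core.windows.net
```

Applications can point reads at the secondary endpoint during a regional outage while writes still go to the primary.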
to access the storage account you always need to go through access keys, and that's the reason on the left side we have a section called access keys, in which it is showing me my storage account name, and below that we have two different access keys given to us. We also have connection strings created based on both of the keys. Anytime, if you want to regenerate these keys, you can just click on this and regenerate them, and the older keys will not be valid. If you want your storage blobs, tables, files, or queues
to be accessed inside any particular existing application, or if you are a developer and you want to develop code which allows you to store data in an Azure storage account, you have to use these keys and connect your application using them. We will see this in depth in the coming videos. Thank you. All right, so now I am inside my storage account which I have created in the previous video. This is the same storage
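The connection string shown in the portal is a semicolon-separated list of key=value pairs. A small sketch of pulling it apart (the account name and key below are invented for illustration):

```python
# Parse an Azure Storage connection string into its parts.
# Splitting on the FIRST '=' only matters: base64 account keys
# often end with '=' padding characters.
def parse_connection_string(conn: str) -> dict:
    return dict(part.split("=", 1) for part in conn.split(";") if part)

conn = ("DefaultEndpointsProtocol=https;"
        "AccountName=mystorageacct;"
        "AccountKey=ZmFrZS1rZXk=;"
        "EndpointSuffix=core.windows.net")
parts = parse_connection_string(conn)
print(parts["AccountName"])  # mystorageacct
```

This is what SDK helpers do internally before signing requests with the account key.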
account which has the hot access tier, and now it's time to understand lifecycle management of a storage account. Remember, we know that we have three kinds of storage accounts: version 1, version 2, and the blob storage account. On the left side, inside this version 2 storage account, if I scroll down, I have a section for blob services, and remember, all these services are only associated with blobs, so they're not applicable to files, tables, and queues.
if I click on lifecycle management: for all the blob files which are inside a general purpose V2 account or a blob storage account, these lifecycle rules are going to be applicable. You can see I do not have any rules right now, and I can add a rule. Now, to understand this lifecycle, you have to understand that a blob lifecycle has four stages in total. Think of it like a human life cycle: when we are born,
after that we have a couple of stages which we live through over a human life. The same way, when your storage account gets any data inside it, in the beginning the data will be hot, because while creating the storage account I set the default access tier to hot. If I'm not using that data over a period of time, then there's no point keeping the data always hot and always accessible, because if I keep the data hot then it's going to cost me more.
it is advisable that you decide a lifecycle for your data based on your own customized rules. So I'm clicking on add rule, and then I am specifying that this is my blob rule 1. I am going to check these three checkboxes. Now I'm specifying that if my data has not changed, has not been modified, in the last 10 days
— you can check, this is a property which says days after last modification — I am specifying that if my data has not changed in the last 10 days, I want to move my blob to cool storage. So my blob is now going to be shifted to cool storage, and those files will sit in cool storage so they do not cost me much. Suppose I access a file within the 10-day duration; then that file will remain hot. The same way, let's say the data has shifted to cool
and now you are still not using the data for another 30 days. So the data is in the cool stage and you're still not touching it. You're saying days after last modification, and this also means that the first 10 days are already part of the count: it counts from the last time you modified the file. If the file is now 40 days old, meaning the 10 plus another 30, then we are going to move this blob to archive storage. When the blob is moved to archive it
will still be accessible, but not immediately; it will take some time when you try to access archived blob files. And suppose after this I specify that if I upload any blob file and then, let's say, for 365 days — almost one year — I am not using that file, then I'm going to delete it, because obviously if I have uploaded this file and not used it for one year, then maybe it is a useless file. While I'm doing this,
we also have a rule for snapshots. Remember, in Azure you can create snapshots, which are a kind of point-in-time backup of the data; if you want to get a file back from a snapshot you can do that. And we can set a rule for that too: if I want to delete a snapshot after a certain number of days, we can do that. Now this customization is totally in your hands. Suppose I do not want to go for the delete blob option; I can just uncheck this, and if I uncheck it, the lifecycle of my storage is defined
only up to archive. It means after 40 days my file will be shifted into archive and it will never be deleted, so if I want to access it later on, it will be available after some delay. The same way, if I move forward to the next step, we have a filter set. We can specify the filter set and give a specific path of a container. Now, we do not have any containers right now inside this, but when you store blobs, if you want to define the rules for specific containers only, then you can
specify the path and give the name of the container here; whatever matches that path will have the rule applied. After this we click next, review and add, and then, if I click on add, this is internally going to generate a JSON configuration for me. That's the reason, when I do this, you can see we have one more tab, known as code view; when I click on code view it shows me the same configuration which I have done, in its JSON form.
if you want, you can directly change this later on too, or you can add more rules in here by clicking on add rule. I hope you're getting this lifecycle management, which starts with hot, then moves to cool, then to archive, and then to delete. Thank you. All right, now we want to create one application which
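The code view described above follows the documented lifecycle policy schema. A policy matching this walkthrough's rule (cool after 10 days, archive after 40, delete after 365) would look roughly like this; the rule name and the 90-day snapshot cleanup are illustrative values, not from the demo:

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "blobrule1",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 10 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 40 },
            "delete": { "daysAfterModificationGreaterThan": 365 }
          },
          "snapshot": {
            "delete": { "daysAfterCreationGreaterThan": 90 }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ]
        }
      }
    }
  ]
}
```

Note all the base-blob thresholds count from the same last-modified time, which is why 40 days means "30 more days after the move to cool".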
will have some code, and that application is going to allow me to push some data into my Azure blob storage. To do that, first I'm going inside my containers section, and I do not have any containers inside this, so let me create a new one. I'm adding a new container called images, and let's say for the access level we are choosing container, and then I'm going to click on create. The moment I do this, this new container will allow me to upload multiple images into it, so I can go
inside it, click on upload, and upload multiple images. But I do not want to upload images directly from the portal; I want to have my application somewhere, and through the code of that application I can upload things into this. To understand this, we need to use the Azure Storage APIs, and for that I have one sample which I have created and shared with you in this repository.
so for you also: just go to github.com, under trainer maruti, to the storage blob upload from web app repository. This is a web application's code which I have associated with blob storage. I request you to click on fork for this repository once you go there; if you have a GitHub account, this will automatically be forked into it, and you'll have access to this repository connected with your own GitHub account. Then you can just click on clone or download, and you'll have an option associated with Visual Studio.
remember, if you do not sign in with your GitHub account you will not have this middle option; you will have only options like open in desktop and download zip. That will also work if you download this manually first and then associate it with your Visual Studio. In any case, I am going to click on open in Visual Studio because I am connected with my GitHub repository. You can see it's showing me that you can connect this with a local git repository — do you want to clone this? I'm saying yes, clone it, and it's
going to create a new repository with this particular name; I'm okay with that. I'm going to click on clone. It takes all the files from that repository and gives me a local git repository associated with it. Once it is done, inside this folder we have a solution file: this image resize web app .sln is actually the solution file of my MVC web application project, which I have coded inside this. I double-clicked on that and now I'm inside that web app, which has
folders like models, views, controllers, and so on. Make sure that inside the dependencies you are not getting any yellow marks; if you are, just build your application once to make sure that you are not getting any errors. For me the build succeeded; it means my application and the libraries associated with it are proper. Inside the solution explorer, I am first going to open appsettings.json. In this appsettings.json I have
an association with my existing blob storage account, and there is an older account key in there. I want you to change this with respect to your own storage account name and account key. You can go to your portal: inside this storage account, on the left side you have a section called access keys, and that's the place where you have the storage account name and keys. Let me copy the storage account name first, and in my Visual Studio I'm going to paste it here. The same way, I'm going to copy this key 1
and inside my Visual Studio I'm going to replace the account key with it. I'm going to save this file, and now let me show you what kind of sections and code I have in this application. Once this is associated — remember, we are connecting this application with that storage account — other than this I do not want to change anything. I have a models folder, inside which I have a very simple C# class which just holds the account name and account key which we have provided in the configuration; with that I have one image container
property as well, which I am using somewhere in this application code. Inside the controllers of this application we have an images controller, which works like an API, and inside this we have two methods: one allows me to upload images, while the other one allows me to get thumbnails. Both of these methods have C# code in which we are calling into APIs of classes under the image resizer web app
dot helpers namespace. Now these helpers are also part of this project, and inside that we have a StorageHelper.cs in which I'm using Azure.Storage.Blobs; this is the main library which helps me connect with the storage and deal with it. You can go through this code and understand the C# part of it; if you need any help with this, you can message me or email me anytime from the contact section of the
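Under the hood, an upload through Azure.Storage.Blobs boils down to a REST PUT against the blob URL. A rough Python sketch of the request shape the SDK builds (account, container, blob names and the service version are illustrative, and the real call also signs the request with the account key):

```python
# Sketch of the "Put Blob" REST request an SDK upload issues.
# This only constructs the request shape; authentication headers
# (the signed Authorization header) are omitted for brevity.
def put_blob_request(account: str, container: str, blob: str) -> dict:
    return {
        "method": "PUT",
        "url": f"https://{account}.blob.core.windows.net/{container}/{blob}",
        "headers": {
            # Block blob is the appropriate type for ordinary files
            # such as the images uploaded by this sample app.
            "x-ms-blob-type": "BlockBlob",
            "x-ms-version": "2021-08-06",  # an example service version
        },
    }

req = put_blob_request("mystorageacct", "images", "photo1.jpg")
print(req["url"])
```

Seeing the raw request makes it clearer why the app only needs the account name, key, and container name in its configuration.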
website. Right now we want to publish this and see whether it is working fine or not. So let me right-click on this and click on publish. How would we like to publish? I'm going to click on app service, and let's say we are going to create a new app service. It gives me a default name for the app, image resizer web app, and I'm just adding to the name here: image resizer web app maruti one. I request you to also change it and put your name there so that it's unique. My subscription is like this, and
for the resource group I'm choosing the training RG resource group, and obviously the hosting plan is going to be the existing hosting plan which I already have. Once I'm done with this — I do not want Application Insights in this, so it is none — we'll click on create. On the publish tab of this deployment wizard, all we need to do is click on publish. This will take some time, and then it's going to get published to my Azure portal.
okay, now it's showing me that my application is successfully deployed on this URL. Within some time it is going to restart my web app and take me to that particular browser page, or alternatively I can also go there myself. And as we can see, our application is there, with this upload photos kind of section, and we have a list of photos as well. Let me upload one image from my machine.
from my pictures I'm just going to upload one simple image, and you can see that the moment this image is uploaded, we are also displaying it below. It will upload the image and also create a carousel here, which will show me a second image as well; anytime I can click on this and change the image. And actually both of these images were uploaded into that blob storage, so if I quickly go to my storage account,
click on overview, and go inside containers and then that images container: when we go there, we have these two images which we have just uploaded into this particular container, and the application is reading the uploaded images back from it. I hope you understood the sample, and I request you to go through its code. Thank you.
okay, so we are still with the storage account, and we have three containers now; inside each container I have one image. And we have access keys — I have already shown you that we have two different access keys available, and these two access keys are going to help you when you want to connect this storage with your applications or access it securely from somewhere. Now, as you can check here, on the left side of this section we have one section which
is shared access signature, which is responsible for generating SAS tokens for this particular storage. This is going to help me if I want to connect to my blobs, files, queues, and tables securely. Now, I want you to understand this properly, and that's the reason we have created three containers with the three different access levels: private, blob, and container. Now if I go inside this first container, which is test one, and let's say I go
to container properties, I'll have a full URL of this container. Now I'm going to copy this URL of the container, and in a separate tab I'm going to try to access that container. While accessing it — because this container can have multiple things, multiple blob files or images, inside it — I'm going to append a question mark and then comp=list to the URL, meaning I'm planning to get the list of
records inside it. And when I do that, it's saying that the resource is not found for test one; it's not allowing me to access anything. Now, logically, test one is actually the container which has private access, and private access simply means that I am not able to access anything inside this container unless I provide a valid token or authenticate myself; so obviously it's not going to work this way. Let me try test 2; the URL will be similar. In
this also it's not showing anything; it is saying resource not found, because the second container has the blob access level, so I can get a particular file's details but I cannot get the list of the files — it grants access only up to one particular blob, not the full container. The third container, though, has container-level access, so let me change this URL to test 3, and if I hit enter you can see it's showing me the full details of my blob file,
and if I have multiple items in this, it's going to show me multiple blobs in this listing. Right now I have a full URL of my image available in this; if I copy it and paste it somewhere in the UI, the container-level access also allows particular file access, and I'm able to see the full image. So, in short, the third level, which is container-level access, is something like giving full folder-level access, and you
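The two URL shapes tried in this demo can be sketched like this. One note: the full anonymous list operation takes `restype=container&comp=list` on the container URL (the video abbreviates it to `comp=list`); a single blob is just its direct path. The account and file names here are illustrative:

```python
# URL shapes for anonymous access to a public container vs a single blob.
BASE = "https://mystorageacct.blob.core.windows.net"

def list_url(container: str) -> str:
    # Works only when the container's access level is "container".
    return f"{BASE}/{container}?restype=container&comp=list"

def blob_url(container: str, blob: str) -> str:
    # Works at both "blob" and "container" access levels.
    return f"{BASE}/{container}/{blob}"

print(list_url("test3"))
print(blob_url("test2", "home.jpg"))
```

With the private level, both shapes return an error unless a valid SAS token or authenticated request is supplied.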
can have access to all the files in it. If I go inside my second container, which is test 2 — let me go inside it — I have one image in this, and if I scroll to the right side I have the three dots here; if I click on these I can go to the properties of this particular blob file. So you can see it's showing me all the properties of this blob file, including that its access tier is hot, and then I can just click on this URL, which is going to give me the full URL
of that particular image. So I cannot access the list for test 2, but if I paste the URL coming from test 2, giving the full path of the blob, then it will allow me to access that image. So the second container does not have container-level access, but if you have the address of a particular blob file then you'll be able to access it. My internet is a bit slow, so that's why it's taking time to load, and the image is also high quality, but anyhow, we have access to this.
I hope you understood the two container access levels, blob level and container level. Private, as I explained, is the secure one: even if I go inside this container, I can see that we have one image in it, and if I get the properties of this image I'll get its full URL, but this time, even if I copy this full URL and paste it in the browser, it shows me that I do not have access to this image inside test one, because test one is my
container which is private. So the logic says that you cannot access this without a proper token. Now, how can we actually use a token? To understand this, on the same page I am clicking on the tab which is generate SAS. This time, first I'm going to generate a SAS, and I can control which token permissions I choose: let's say I'm giving read access only right now. I can specify start and end expiry dates;
with that, I can specify particular IP address restrictions, so that only those IPs can access this thing, and I can also select the protocols over which it is accessible; I am choosing HTTPS and HTTP right now. And all the SAS token generation is going to happen with the keys which you have in the storage account; this is where either your key 1 or key 2 will be used. Now, most of the time I choose different keys when I have different environments or different clients. What I mean by this:
if I have two different clients accessing my same storage, then I can give key 1 to one client and key 2 to the other client. And let's say you have two different environments, like one is a testing or staging environment and the other is your production environment; then also you can use two different keys for the two environments. I'm choosing key 1 right now and clicking generate SAS token and URL. When I do this, below that I am getting a new token which is generated automatically based on the configuration I have given, and it is also showing me a fully qualified
URL which comes with the token. I'm just going to copy this URL, and now if I paste it — which is not only a URL but also carries the token — you can see that I'm able to access my home sweet home image. So yeah, this is what a SAS token is all about, and this is the security associated with the storage: not only container-level access, you even have file-level access as well. And now that you've understood the SAS token, if I just go back to my storage account —
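What actually gets appended to the blob URL is a set of query parameters: sp (permissions), st/se (start and expiry), spr (protocol), sv (service version), sr (resource type), and sig (the HMAC signature computed from the storage key). A sketch of pulling those apart; the token below is fake and purely illustrative:

```python
# Inspect the query parameters a service SAS carries.
from urllib.parse import urlsplit, parse_qs

sas_url = ("https://mystorageacct.blob.core.windows.net/test1/home.jpg"
           "?sp=r&st=2023-01-01T00:00:00Z&se=2023-01-02T00:00:00Z"
           "&spr=https&sv=2021-08-06&sr=b&sig=FAKESIGNATURE")

params = parse_qs(urlsplit(sas_url).query)
print(params["sp"][0])  # r -> read-only, as chosen in the portal
```

The server recomputes sig from the other parameters and its own copy of the key; tampering with sp or se invalidates the signature, which is what makes the token safe to hand out.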
this is my storage account; if I go back to it, this SAS token security is applicable on all four services. Now, I'm not showing you the tables, files, and the rest right now, but if I just scroll down to the section showing shared access signature: this one page allows you to control which kinds of services and which kinds of operations you want to allow or restrict. Like, let's say I want to allow blob and queue, not files and
tables, and I want to allow access only at the service level, not to containers and objects, and I want to allow all the permissions except the delete permission; the dates are fine, everything else is fine. And I'm saying that this time I want to use key 2, and if I click on generate SAS and connection string, it is going to show me the connection string and SAS token, with a SAS URL for both blob and queue.
so we got a connection string, we got a token, and we got a URL for blob and queue, because we selected only blob and queue. Now I can copy this URL and give it to my developer team or the testing team, and tell them they can use this; obviously they'll be able to use my storage only within the given configuration which I mentioned while generating this particular token. There is another way also to secure this particular storage: you can go to private
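The portal choices above map to documented single-letter codes in an account-level SAS. A sketch mirroring this demo's selections (blob and queue services only, service-level resource type, every permission except delete):

```python
# Account-SAS parameter values corresponding to the portal choices.
# Letter codes follow the Azure account-SAS convention:
#   ss  = signed services, srt = resource types, sp = permissions.
account_sas = {
    "ss": "bq",        # blob + queue (no file, no table)
    "srt": "s",        # service level only, not container/object
    "sp": "rwlacup",   # read/write/list/add/create/update/process, no 'd'
}
print(account_sas["ss"])  # bq
```

The exact permission letters offered depend on the services selected; the key point is that whatever you leave out here is simply rejected when the token is used.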
endpoint connections, as I already discussed while creating the storage, or you can go to firewalls and virtual networks, where you can just say that this storage will be accessible only on selected networks — you can either configure a new network, or if you already have an existing network you can add it here — or you can restrict this to a particular firewall with a specific IP range. Now, in all these ways we can secure our data inside the storage. And do
not forget, the V2 storage by default has encryption enabled, which is visible here in this encryption section, and you can see that it also has an encryption type, which is Microsoft-managed keys; or, if you want your own customer-managed keys, you can specify those key URLs here, which maybe you have generated through Azure Key Vault or some other third-party service. So you can secure your storage with all these options.
okay, so after the Azure storage account, it's time to deal with Azure Cosmos DB. Remember, per the definition, Microsoft is saying that Azure Cosmos DB is Microsoft's own multi-model distributed database, and by multi-model they actually mean the different options and flavors available in Cosmos DB. Also, there is one important word in that,
which is distribution. Inside Cosmos DB we have a facility of turnkey global distribution, which automatically replicates your data to other Azure data centers across the globe without the need to manually write code or build a replication infrastructure. It technically has something called multi-region writes. Normally, if you're using Azure SQL or any other data storage provider or database associated with Azure, we
mostly have multiple geographical copies from which we can read, but this one allows you to deal with this kind of global distribution automatically, for both reads and writes. When we talk about multi-model, we have five different APIs available for this: the MongoDB API, Table API, Gremlin API for graph DB, Cassandra API, and SQL API. No matter which particular database
platform or programming language you're familiar with, there are options for everyone. And when you choose Cosmos DB, ultimately it's going to allow you to store everything in the form of JSON documents; it is a kind of document DB where you store data as JSON, and your JSON documents have key-value pairs. But the way you deal with that is through these APIs, and that's the reason the structure of Cosmos DB is a bit different from other traditional databases.
if I focus on the hierarchical structure of this Cosmos DB, then you can understand that first we have a root level, which is the Cosmos DB account — remember, on your Azure portal also, you first create an Azure Cosmos DB account, and only inside that can you have multiple databases. In each database we can have multiple collections. Now there are different words for this level, actually: we can call it a collection, a container, a graph, or a table. This level is actually showing you
multiple options, or I can say multiple words, depending upon the API which you're using. For example, if you're using Gremlin and you're dealing with the graph DB kind of Cosmos DB, then you're going to say that one database will have multiple graphs, while in the other case, if you're using the SQL API, it's the same thing we call a container, so we'll say that one SQL database will have multiple containers. Ultimately, inside these containers, collections, or graphs we are going to
have multiple items, which will be JSON documents storing the data. Also, you can have the facility of triggers, stored procedures, and other user-defined functions with that. Let's see this in action now. To understand properly, let's see this in action in the Azure portal. In my Azure portal I am going to create two Cosmos DB accounts right now. When I click on create resource, we have an option for Azure Cosmos DB. I am creating a new resource group
called trainingrg, and then I'm going to click on OK. In the account name I am giving maruti cosmos db, and this time I'm going to choose the SQL API, so I'm appending sql123. You can see we have an option for which kind of API you want to choose: we have the Core (SQL) API, MongoDB, Cassandra, Azure Table, and Gremlin, which is for graph. I am choosing the Core (SQL) API.
I do not want to change any of the other options, but for the location we are going to choose East US. Which kind of account do you want — a production account or a non-production account? They give you these options because, they say, in a production account your geo-redundancy and your multi-region writes will be enabled. We are choosing a non-production account, which is only for learning right now, so we do not want to change anything in this. Let's click on next, which is networking; we will be allowing this on all networks, yes,
and then we'll just click on review and create. Just notice one thing: the estimated account creation time mentioned here is 11 minutes, and that's the reason in this video we are first creating both accounts, and then I'll show you how to deal with them. This is a SQL API based account; let's click on create. Now, once this deployment is submitted and it is processing my first Cosmos DB account, I am going to click on create one more resource, in which we will again choose Azure Cosmos DB.
for the account resource group I am choosing the same one, but for the name of the account I am giving maruti cosmos db graph, because this time I'm going to use the Gremlin API. So in the drop-down I'm going to choose Gremlin, which is for the graph DB, and everything else is fine; the region is still going to be East US only,
and I'm going to click on next, tags, review and create; this is also going to take 11 minutes, so I'm going to click on create. Once we have created these two different Cosmos DB accounts — remember, this is an account, so as we discussed, inside one account you can have multiple databases. All right, so now we have two different Azure Cosmos DBs already created for me:
you can see both the deployments have succeeded. I am going to click on go to resource on this one, and you can see this is the one associated with my SQL API. Actually, it's directly taking me to the quick start for that, but I don't want to go there; I'm just clicking on this overview tab. And the other Cosmos DB I want to keep open in a separate tab, so I'm just opening a new
tab, and in this, inside my training RG resource group, I'm going to open my graph DB as well. Okay, so both the Cosmos DBs are open: this is my graph one and this is my SQL one. Okay, now inside this I just want you to understand one thing: inside the SQL one, first, if I click on data explorer — this is something like the storage explorer of your Azure storage tables and so on — if I go into this
data explorer, it is trying to connect with the Cosmos DB right now, and as we know, this is the Cosmos DB account; inside this account we are going to have multiple databases. When I go into this, it is showing me that it's based on the SQL API, and at the top I have options like creating a new database or creating a new container. As we know, the hierarchy is like this: one Cosmos DB account will have multiple databases, and one Cosmos DB database is going to have multiple
containers in it. Now, if I say that I want to create a new database right now, then it's going to ask me for the name of the database. Let's say I am specifying some name, and then I have to provide something called throughput. Now, this throughput is actually a measurement of your request handling when you're dealing with Cosmos DB; Azure actually measures everything in terms of something called request units.
this is very similar, if you want to compare, to SQL databases, where we have something called DTUs, database transaction units; well, in this case we have RUs, which are request units. These RUs start from 400 and go up to 100,000 (1 lakh) request units per second, and this request-units-per-second figure gives you your throughput. It means the costing of your Cosmos DB account is not based on the account, actually; it's based on the database, and within that also, every database can have a
different throughput associated with it, and you have to pay hourly based on that. You can see right now they have shown me an estimated spend in US dollars, and it's starting at $0.032 hourly. Now, if I just continue like this, it's going to cost me around $23.04 monthly if I go with the minimum throughput. If I change the range to 1000, it will reach about $57.60 monthly, and
if I go to the maximum, it's going to reach $5,760 per month. Now, obviously we do not need this high a throughput right now; we can go with 400. And for the database ID, let's say I am giving the database ID todolist — the to-do list is the name of my database. I'm just specifying the database ID, and then I'm going to click on OK. The moment I do this, it adds a new database in this Cosmos DB,
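The pricing arithmetic above is just linear scaling of the hourly rate. A sketch using the figure quoted in the portal estimate ($0.032/hour for 400 RU/s — the number shown in this demo, not an official price list):

```python
# Estimate monthly Cosmos DB throughput cost from the demo's hourly rate.
RATE_PER_400_RU_HOURLY = 0.032  # USD/hour for 400 RU/s, as shown in the portal

def monthly_cost(rus: int, hours: int = 24 * 30) -> float:
    # Cost scales linearly with provisioned RUs, billed hourly.
    return round(RATE_PER_400_RU_HOURLY * (rus / 400) * hours, 2)

print(monthly_cost(400))      # 23.04
print(monthly_cost(1000))     # 57.6
print(monthly_cost(100_000))  # 5760.0
```

This is why the maximum of 100,000 RU/s comes out 250 times the minimum: 250 × $23.04 = $5,760 per month.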
and now inside this database I can have multiple containers. Okay, my to-do list database is already created; I can see it. If I expand this, I have options like scale and so on, and if I just refresh it once, you can see that we just have the one scale option right now, because I do not have any containers in this. Now let me click on new container. It first asks: do you want to use an existing database? Then you choose from that; otherwise you can again create a new one. I want to select the
92:00 - 92:30 to-do list one, which is the existing one. What will be the name of the container? I'm giving the container the name items. Do I need indexing in this? Yes, it's going to automatically index it, and I'm okay with that. I do not want to specify any partition key right now, but as you can see here, they are saying that the partition key is used to automatically partition the data among multiple servers for scalability; choose a JSON property name that has a wide range of values and is likely to have evenly distributed access
92:30 - 93:00 patterns. That's the reason that, whenever you define a partition key, it's going to be something like department or city name, some particular kind of key which is going to have a bunch of data under it, so each partition will hold multiple records. Let's say I am giving the partition key as name, because inside this maybe we're going to have a name associated with each item. It is asking whether I need my partition key to be larger than 100 bytes, and I do
93:00 - 93:30 not want that. Provision dedicated throughput for this container: we have set 400 as the throughput for this database; now if I want to provision a dedicated throughput for this container, I can check mark on this, and then I have to specify that particular throughput there. I do not want that right now either, so I'm going to click on OK. Once we're done with this, we will have one database with the name to-do list, and we will have one container inside it. This container has the name items
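(To make the portal's note about partition keys concrete, here is a toy Python sketch. This is not Cosmos DB's real partitioning algorithm; it only illustrates the idea that a hash of the partition key value decides which physical partition an item lands on, which is why a property with many evenly distributed values spreads data well, while a constant value would pile everything into one partition.)

```python
# Toy illustration of hash-based partitioning (NOT the real Cosmos DB
# algorithm): the partition key value is hashed, and the hash picks which
# physical partition stores the item.
import hashlib

def pick_partition(partition_key_value: str, partition_count: int = 4) -> int:
    """Deterministically map a partition key value to one of N partitions."""
    digest = hashlib.sha256(partition_key_value.encode("utf-8")).hexdigest()
    return int(digest, 16) % partition_count

names = ["Alice", "Bob", "Carol", "Dave", "Eve"]
for name in names:
    print(f"{name!r} -> partition {pick_partition(name)}")
```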
93:30 - 94:00 now because this is the SQL API, inside this we have things like stored procedures, user defined functions, and triggers, and also, items is actually a collection of all the items. I will not have any items right now, because it's a new container which we have just created, but this is the hierarchy, the Cosmos DB hierarchy for the SQL API. Now in parallel, if I just go to my second tab, which is my graph DB, inside this also, if I do the same thing and click on Data Explorer, it's going to show me that this is a
94:00 - 94:30 Gremlin API, this is not the SQL API, and if I click on this drop down, this is not saying that I have a container; it's saying you can create a new database and you can create a new graph. Now because this is a graph DB, they are treating this thing as a graph, not as an item or as a container. Let me say I want to create a new database. It's going to show me the same options which were there, or this time let's do one thing: let's create a graph with a new database. I'm saying that
94:30 - 95:00 it's going to have a new database, and I can give the name of the database; let's say this is my first DB, that's the name of my database, and the throughput for that we are specifying as 400, that's fine. And then I can specify a graph ID and partition key. Now this graph ID can be any particular key which you want, and based on that you can associate this. To understand the graph ID and partition key, you actually have to see the code, and that's the reason, instead of going here and creating this thing manually,
95:00 - 95:30 I want you to see that code in the coming video, and then we will see this thing. But we have the same kind of options here; you can specify this, and I can create a new graph also here, but that will not look like a graph. To see it as a graph, you need to first add proper data into it, and that's the reason we will see this Gremlin API with code in the coming video. Thank you. Okay, so I'm still with my
95:30 - 96:00 Cosmos DB which is based on the SQL API, and we know that inside this account we have one to-do list database of Cosmos DB, and inside that we have a container called items. We still do not have any records inside it, so it's still empty. And now what I want to do is, on the left side somewhere inside this Cosmos DB account, we have a section called Keys, like in our storage account. These keys are actually, again, nothing but a token, and we have a URI which is actually a unique
96:00 - 96:30 URI associated with my Cosmos DB account. I have two different sets of keys available here: we have read-write keys, and we have read-only keys, if you want to give access to only read-only things. And in these read-write keys we also have two keys, a primary key and a secondary key; so like in storage, where you have key one and key two, here also we have two different keys. What I'm going to do is, I want to connect this Cosmos DB now with my application code, and we'll see how we can connect it and how we can deal with this to-do list database which
96:30 - 97:00 we have created inside this account. Now I have a repository available here, which is on this URL, github.com trainer maruti slash web app Cosmos, and I have this project available here. So when you want, you can just fork this repository first and then associate it with your own account code. In my case this repository is already available in my Visual Studio, and this is actually a web application, an MVC based application. I'm not
97:00 - 97:30 using .NET Core this time; this is a pure .NET Framework based application. And if you check right now, I have my model, in which we have this Item.cs, and my goal is to use something like a code-first approach, in which I have a couple of JSON properties which are nothing but C# properties, and using the C# properties we are going to create a table-structure kind of thing. But ultimately this is not going to be a table; it is going to be stored inside
97:30 - 98:00 that particular container of my Cosmos DB. We have columns like id, name, description, and there is a Boolean column also, which is completed; we'll use it to check whether the to-do list tasks are completed or not, and then we'll just give a name and description and other properties. I have written the logic for this by providing this DocumentDBRepository, in which I am using classes which are associated with Azure DocumentDB. We have this class, which allows me to write LINQ
98:00 - 98:30 queries and some client-side association with that. So I have written code in this, and this code is trying to connect with my Cosmos DB account; it is looking for an endpoint and an authentication key. This endpoint and authentication key are set in my Web.config file, and inside this Web.config file I have my older account, from when I created this particular code. So I'm just going to do one thing: I'll go to my portal, I'll take the new URI of this Cosmos DB
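(For readers following along without the repository, here is a rough Python equivalent of the Item model just described. The actual project uses a C# class, but in Cosmos DB each item is ultimately stored as a JSON document like the one printed below; the sample values are illustrative only.)

```python
# Rough Python analogue of the C# Item model (id, name, description, plus a
# Boolean completed flag); Cosmos DB stores each item as a JSON document.
import json
import uuid
from dataclasses import dataclass, asdict

@dataclass
class Item:
    id: str
    name: str
    description: str
    completed: bool = False  # used later to hide finished tasks from the UI

item = Item(id=str(uuid.uuid4()),
            name="Prepare for AZ-204",
            description="Deploy app through Azure CLI")
print(json.dumps(asdict(item), indent=2))
```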
98:30 - 99:00 and I'll paste it here, inside this account setting. In the same way, this key is also older, so we will not use this key; instead of it I am going to put my primary key, so I'll copy it and paste it here in Visual Studio. Once that is done, I'm just going to click on Save. If you want, you can go through this code which I have written. Inside this we have an items controller, and we have done insert and create kinds of operations
99:00 - 99:30 associated with that. Also, somewhere we have a condition that if any particular to-do list item is completed, then we are not going to display it in the UI, and we have created a UI also inside this; so we have some pages, Index.cshtml, which is displaying all of this. Now to see this, let's just quickly run the thing on the local machine first. If I want, I can deploy it on the portal also, but by now I think you guys are familiar with how to deploy this
99:30 - 100:00 application on the Azure portal, so we are not doing that right now; I'm just running it on my local machine. Yes, my application is running on this local machine, but it's still connected with my Cosmos DB which is there on the portal. So I'm going to click on Create New, and let's say the to-do task which we are giving is, let's say, Prepare for AZ-204, and in the description we will say that
100:00 - 100:30 we will deploy the app through Azure CLI, which we will do, and this is not completed, so I'll not click on complete; let's click on Create. Let me do one more thing: let's click on Create New again. Let's say the second one is Call Maruti for Help; this is a task, and then you can say that, okay, the description is Hands-On. Maybe you are looking for some
100:30 - 101:00 extra hands-on; then you can just reach out to me for that. I am going to say Create. Once these things are done, you can also deal with the edit and details options and all; both tasks are not completed right now. Let's do one thing: let's quickly check this in the portal. So I'm going to Azure Cosmos DB, refreshing these items, and if I do that, I should have some records inside this items container. Actually, yes, I have two records, and if I click on
101:00 - 101:30 this, I'll be able to see them. You can see the first one is Prepare for AZ-204, and the property which we have, completed, is false. The second one is Call Maruti for Help, and then Hands-On. Yep, so both the records are inserted successfully. Not only that, this is super quick: if I click on Edit on this, say that this is the one which is now completed, and click on Save, then yep, that's not listed here, because we wrote that code: if it's completed, I don't want to list it
101:30 - 102:00 here on the home page. But this record is not deleted; it's actually edited. So if I just refresh this, and if I refresh it here also, I will see that this Call Maruti record has completed set to true. So in short, like with a normal database table, all your CRUD operations, create, read, update, delete, will work very
102:00 - 102:30 easily, and because you have higher throughput now, performance-wise also Cosmos DB will be much, much better. Okay, so now we are inside our Cosmos DB which is associated with the Gremlin API, so it's a graph DB. I still do not have anything inside this Data Explorer; as we have seen in the earlier videos, we have not created any graph inside it. Now to deal with
102:30 - 103:00 this, I'm just clicking on Quick Start. This Quick Start has given me three different options available here. The first one is actually based on .NET, and they are saying that if you want to deal with this graph DB and see how this Gremlin API allows you to access and store data and all, you can just do it with these first two steps. The first step is actually adding a graph, so you can click on this Create a Persons Container. The moment I do this,
103:00 - 103:30 it is going to create a new container for me, and it's going to create one new graph DB also for me. Now once this process is done, it will generate one application which is automatically going to have an association with this Cosmos DB account of mine. Now you can see that the first step is already done; the second step is showing that I can download this particular application to my machine. So I'm going to click on Download, and it's downloading one zip file. We will open it very soon, but before
103:30 - 104:00 that, let's click on this Open Data Explorer. When we do this, it's going to show me that it has a graph DB which is already created, inside which I have one persons graph, and then, if I click on Graph, I still do not have any data, actually. If I click on Execute Gremlin Query, which shows something like this, I do not have anything inside it. Yeah, so this is a graph DB, but I seriously do not have anything inside
104:00 - 104:30 this. Now let's understand how a graph DB actually creates structure and how it stores data. To do that, we have to understand the sample. Before we go into that application code which we have downloaded, I want you to understand three different terms, so that you can understand this graph properly. Remember, the graph which we are discussing right now, or the graph DB which we are discussing right now, is actually nothing but a structure that's composed of two things, which are known as
104:30 - 105:00 vertices and edges. Now in this case, this is not the graph which you are maybe visualizing, like a bar chart or a line chart kind of thing; this is actually a combination of vertices and edges. Now what is a vertex? A vertex denotes a discrete object, such as a person, a place, or an event, and an edge denotes a relationship between vertices. Logically, a graph is like a relational database without the foreign key/primary key kind of structure;
105:00 - 105:30 this graph DB allows you to store a discrete relationship between two different objects. Let's say you take any two persons, like I am one person and you are the other person, and I want to find out the common things between you and me: maybe we both belong to different countries, maybe we both belong to different professions, maybe we both have different characteristics, but still there are certain things which are common which you want to find out. Then maybe we can find out that we both
105:30 - 106:00 have a common interest, which is Azure. Now when we do this, we both are actually vertices, and the common edge which is connecting and showing the relationship between you and me is maybe Azure. In the same way, a person or a place can be a vertex, and the relationship associated with it is known as an edge. So for example, a person might know another person; that is also something which is known as an edge. Maybe we both do not know each other, but we both have visited the same restaurant at a different
106:00 - 106:30 time, and we both have reviewed that same restaurant; that is also an edge, a relationship between these two people. This kind of relationship is mostly useful when you are developing an application which is either e-commerce or, mostly, social media. If you are familiar with websites like Facebook and LinkedIn and all, they always relate one user with another user; like in your LinkedIn you have your first connection, second connection, third connection kind of thing. Or maybe this is also something which you can use when you're dealing with an e-commerce site
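(The vertex/edge structure just described can be sketched in a few lines of Python. This is a toy in-memory model, not how Cosmos DB stores graphs internally, but it shows how an edge such as "knows" records a relationship between two vertices.)

```python
# Toy in-memory graph: vertices are discrete objects (people here), and
# edges are labeled relationships between them.
vertices = {
    "thomas": {"firstName": "Thomas", "age": 44},
    "mary": {"firstName": "Mary", "age": 39},
}
edges = [("thomas", "knows", "mary")]  # Thomas knows Mary

def related(a: str, b: str) -> list:
    """Labels of all edges going from vertex a to vertex b."""
    return [label for src, label, dst in edges if (src, dst) == (a, b)]

print(related("thomas", "mary"))  # -> ['knows']
print(related("mary", "thomas"))  # -> [] (edges here are directed)
```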
106:30 - 107:00 like amazon.in: when your user is buying one product, you want to relate multiple products with it and give some suggestions to him. Cosmos DB is going to be widely useful in these kinds of cases. Now it's time we see the code, so let's go through it. That application is open in my Visual Studio right now, and it's a very simple console application which has just one C# class called Program.cs. If I go inside this, I have some
107:00 - 107:30 properties which allow me to connect with my Cosmos DB account with the authentication key. This part I have not done, and I have not copy-pasted this key either; it automatically got generated because I clicked on those buttons in the Quick Start in my Azure portal. It is connected with the database called graphdb, and inside that it is connected with the collection called persons. Now I know maybe you have never dealt with the Gremlin API and you have maybe never used it in your past, but still, as a
107:30 - 108:00 developer, I'm sure you are able to understand this code which I'm showing you. If you check right now, we have g, which is a global variable created with Gremlin, and then we are calling a couple of predefined functions with it: the V function is for vertices, and the E function is for edges. In the same way, we have a function for properties also, which can be associated with a vertex or an edge. Using this, they are trying to write some logical data inside this graph DB, and they have specified that first, whatever
108:00 - 108:30 is there, we are going to drop. Then we are trying to add a new person with the id property thomas; he has the first name Thomas and his age is 44. It means first we are adding one vertex. Then we are adding one more vertex with different values inside it: her name is Mary, her last name is Anderson, and she is 39 years old. In the same way we have two more persons. Then we are adding an edge between two persons: we are saying g.V, which
108:30 - 109:00 is thomas, and then we are adding an edge, which is knows, and we are specifying another g.V, which is mary; so we are saying that Thomas knows Mary. In the same way, Thomas knows Ben; now Ben does not know Mary, but Ben knows another person, who is known as Robin. Logically, when we do this kind of thing, this adds a relationship between one vertex and another vertex, and that is actually going to build up a graph; that's the basic reason this thing
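(As an aside, the Gremlin steps just described look roughly like the query strings below. This is a hedged reconstruction from the narration, not the exact quickstart sample file; the names, ages, and step order are as described in the video.)

```python
# The Gremlin steps narrated above, written out as query strings the way a
# console sample might store them (a reconstruction for illustration).
queries = [
    "g.V().drop()",  # the sample first drops whatever is there
    "g.addV('person').property('id', 'thomas')"
    ".property('firstName', 'Thomas').property('age', 44)",
    "g.addV('person').property('id', 'mary')"
    ".property('firstName', 'Mary').property('lastName', 'Anderson')"
    ".property('age', 39)",
    "g.V('thomas').addE('knows').to(g.V('mary'))",  # Thomas knows Mary
]
for query in queries:
    print(query)
```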
109:00 - 109:30 is known as a graph DB. In the same way, we have a couple of other properties in which they are just trying to connect everything somewhere in these logical queries. The last lines are actually dropping some things; they are dropping all the vertices and all. Let me do one thing: I actually want to remove the first two lines, so I'm just going to comment these two lines, because I do not want to drop anything. There is a drop edge for this Thomas also, and I do not want that either, so I'm going to comment these last three lines
109:30 - 110:00 from the bottom, and now let's try to run this. It's a console application, so it's just going to execute this code line by line and create all the vertices and edges and all the things with it. Once this code is done, I want to see whether the data which is there in my Cosmos DB looks like a graph or not. The code has executed; it is showing me "press any key to exit". I do not want to press any key; I want to quickly switch to my portal. If I go there and refresh this, the Gremlin
110:00 - 110:30 API is trying to fetch the containers, and now we have a persons collection in this; we have a graph inside it, and I want to again click on this button which is Execute Gremlin Query. You can see it's showing me that I have four vertices available: Thomas, Mary, Ben, and Robin, and we have
110:30 - 111:00 properties associated with each and every user; so we have properties associated with each and every vertex, actually, if I click on this. And now if I just try to show you this section, which is the graph, you can see it's showing me that we have a connection between Thomas and Ben, and Thomas and Mary, and then if I click on Ben, it shows me further details about him: Ben also has some other properties and other associations, and Ben knows another person, who is Robin. This full graph is actually
111:00 - 111:30 created using that particular JSON document which is associated with it. If I want to see the JSON data, I can do that also. Or, in this, if I want to change the style, or if I want to add a new vertex directly from here, that is possible too in this portal. I hope you're getting a basic idea about graph DB and Cosmos DB. Thank you. Hey friends, I hope you have enjoyed this
111:30 - 112:00 course so far, and as per the new guidelines given by Udemy, as you can see on screen, starting from March 17th onwards, for any of the free courses which we are publishing on udemy.com, as instructors we are not allowed to put more than two hours of video content in them. That's the reason, in this particular video course, we are not able to share all the modules with you, but you do not have to worry about this.
112:00 - 112:30 If you really like this course, and if you like the way we are teaching this particular video series and the technical concepts inside it, then you can go through our official site, which is mantastic.com, and when you go there you'll have lots of premium content available free of cost; you just have to sign up there as a member. So without wasting time, just go there and register yourself. Also, do not forget to sign up for our newsletter to get new updated content every week.
112:30 - 113:00 As of now, this is a goodbye from me, Maruti Makwana, and I want you to make yourself future-ready. Happy learning, take care.