Explore the World of IICS

Informatica IICS/IDMC Tutorial | Informatica Cloud Tutorial | IICS Full Course

Estimated read time: 1:20


    Summary

    This tutorial on Informatica IICS/IDMC provides an extensive overview of Informatica Cloud services, aimed at helping individuals grasp the ins and outs of these powerful data integration and management tools. Throughout the comprehensive video tutorial, led by Chandra from NIC IT Academy, learners are guided through installation processes, configuration settings, and practical use cases for IICS/IDMC, with a focus on addressing common issues and career opportunities in the field. The tutorial also covers key aspects of data warehousing, cloud migration, and how to navigate challenges with installation and integration, all provided in a structured, easy-to-follow format.

      Highlights

      • Chandra shares insights from his 14 years of IT experience, focusing on data integration and cloud services. ⚡
      • Learners are guided step-by-step through the intricacies of setting up and running IICS. 🔧
      • Real-time problem-solving advice is offered, helping to tackle installation and setup issues. 🌟
      • The tutorial discusses career growth and opportunities in the IICS field. 📈
      • Explore different data integration scenarios and practical use cases. 🗂️

      Key Takeaways

      • Discover the world of Informatica IICS/IDMC and its significance in today's data-driven industries. 🚀
      • Learn how to handle common installation and integration challenges like a pro! 🛠️
      • Explore career opportunities in data management with insights from an industry expert. 💼
      • Understand the fundamentals of cloud migration with practical examples. ☁️
      • Get well-versed in data warehousing concepts and their real-world applications. 📊

      Overview

      Informatica IICS/IDMC is a revolutionary toolset for data integration and management, offering scalable solutions for businesses moving to the cloud. In this tutorial, Chandra, a seasoned expert from NIC IT Academy, walks learners through the essential aspects of these tools with practical demonstrations and expert advice.

        The video series covers everything from installation challenges to advanced setups, addressing common pitfalls and providing solutions to enhance the learning experience. It emphasizes the importance of mastering IICS for anyone looking to excel in cloud data management, highlighting career paths and opportunities available to skilled professionals.

          Viewers are invited to explore various data scenarios and integration techniques, gaining insights into real-world applications and the significance of Informatica tools in the modern digital landscape. This in-depth tutorial is perfect for both newcomers and seasoned professionals looking to expand their knowledge in cloud services and data warehousing.

            Chapters

            • 00:00 - 01:30: Introduction and Agenda The speaker welcomes participants from different time zones to the webinar, which was scheduled mainly to address doubts about IICS (Informatica Intelligent Cloud Services), in particular installation issues and career prospects. The agenda is to spend the first 30 minutes on these topics.
            • 01:30 - 10:00: Data Warehousing Concepts The speaker invites questions from the audience and promises to answer all of them after the first 30 minutes of presentation. Everyone is muted to prevent background noise, but participants can raise their hands to be unmuted and speak.
            • 10:00 - 15:00: Creating an IICS Free Trial Account and Installing the Secure Agent This chapter introduces creating an IICS free trial account and installing the Secure Agent. The instructor reiterates that a question-and-answer session will follow, during which participants will be unmuted; for now microphones stay muted to minimize background noise.
            • 15:00 - 20:00: Sample Connections and First Mapping in IICS This chapter introduces Informatica IICS, particularly for people transitioning from Informatica Power Center, PL/SQL, or other technologies. The speaker, Chandra, addresses common queries about what Informatica is and discusses the career scope and opportunities over the next 5 to 10 years for people learning the tool.
            • 20:00 - 25:00: Flat File to Table Load in IICS The speaker introduces his background: he produces the YouTube videos and runs the training sessions for Informatica Power Center and IICS, has over 14 years of IT experience across offshore and onshore projects, and has worked extensively with PL/SQL and Informatica Power Center.
            • 25:00 - 32:00: Reading an Excel File into a Table in IICS The speaker describes his training background: he has conducted training sessions for nine years, his trainer profile has been visible on various platforms since 2015, and he has taught Informatica Power Center and PL/SQL, with IICS training added in the past year.
            • 32:00 - 37:00: File List Concept in IICS The speaker mentions that his 16th batch starts on September 6th, introduces IICS, and notes the trend of many companies adopting cloud services. He encourages participants to ask questions via chat and says he will unmute everyone later for further interaction.
            • 37:00 - 40:00: Fixed Width File Handling in IICS This chapter opens with the shift from on-premises servers to the cloud. It describes the traditional on-premises setup, where a company hosts its own database, data warehouse, ETL, and web servers on site, which sets the stage for the move to cloud services.
            • 40:00 - 45:00: Handling Unicode Characters in IICS The chapter discusses the transition of companies from on-premises server setups to cloud-based solutions. A typical company houses its various servers, such as database servers, ETL tools, and web servers, within its own premises; over roughly a 5 to 10 year span a shift towards cloud computing is observed, and the chapter explores the reasons driving this move.
            • 45:00 - 49:00: Replication Task in IICS This chapter explains why companies need large-scale analysis for AI and ML processes, which leads to the data lake. A data lake is defined as a repository larger than a traditional data warehouse that can hold all types of data: structured, semi-structured, and unstructured.
            • 49:00 - 54:00: Synchronization Task in IICS The chapter discusses the kinds of data a data lake handles, such as log files, cookies, and audio and video files from platforms like YouTube and Facebook. Environments such as banking originally dealt only with structured data and over time moved to semi-structured and unstructured data as well.
            • 54:00 - 55:00: Introduction to Cloud Application Integration (CAI) The chapter covers handling unstructured data and the role of the data lake in managing diverse data types, emphasizing how real-time data is streamed from source systems or front ends into the lake using streaming tools.
            • 55:00 - 57:00: Creating Processes and Using Web Services in CAI This chapter covers creating processes and using web services within Cloud Application Integration (CAI). Structured, semi-structured, and unstructured data are all streamed and loaded into the data lake as raw data, with no transformation logic applied up front, so that data science projects can run algorithms such as supervised learning on the raw data (a minimal loading sketch follows this chapter list).
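
The data lake chapters above describe streaming raw source data into the lake with no transformation applied. As a rough, illustrative sketch (not part of the tutorial itself), the following Python snippet lands a raw file in an S3-based lake using boto3; the bucket name, key prefix, and file path are hypothetical placeholders.

    # Land a raw extract in an S3-based data lake "as is"; any transformation
    # happens later, downstream. Bucket, prefix, and path are hypothetical.
    import datetime
    from pathlib import Path

    import boto3

    def land_raw_file(local_path: str, bucket: str = "my-company-datalake") -> str:
        """Upload a raw source file into a date-partitioned 'raw' zone of the lake."""
        today = datetime.date.today()
        key = f"raw/sales/load_date={today:%Y-%m-%d}/{Path(local_path).name}"
        s3 = boto3.client("s3")  # credentials are taken from the environment
        s3.upload_file(local_path, bucket, key)
        return f"s3://{bucket}/{key}"

    if __name__ == "__main__":
        print(land_raw_file("/tmp/orders_2024_01_31.csv"))
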

            Informatica IICS/IDMC Tutorial | Informatica Cloud Tutorial | IICS Full Course Transcription

            • 00:00 - 01:30 hello everyone, very good morning and good evening to all, and thanks for joining. I scheduled this webinar mainly for people who have doubts about IICS, who are facing a lot of issues with the installation, and who have asked me many questions from the career perspective, so that is today's agenda. For the first 30 minutes let me talk about these topics, then I will take your questions and answer all of them; you can come up with your own questions as well. As of now I have muted everyone to avoid background noise; if you have a question you can raise your hand and I will unmute you so you can ask it. So for the first 30 minutes I will explain these concepts, and after that we will have the question-and-answer session and I will unmute everyone.
            • 01:30 - 03:00 first of all, you might be working on Informatica Power Center, on PL/SQL, or on some other technology, or you might be new to Informatica and asking what Informatica is, what the scope of this tool is, and what your career looks like for the next 5 or 10 years if you learn it; you can ask me those questions too. Before starting, about myself: I am Chandra. I make the YouTube videos and handle the training sessions for Informatica Power Center as well as IICS. I have 14+ years of IT experience, have worked on various projects offshore and onshore, and have handled many projects on PL/SQL, Informatica Power Center, IICS, and also on the cloud side. I have been taking training sessions for the last nine years, and you can find my videos and my trainer profile on various platforms from 2015 onwards. I have been handling Informatica Power Center and PL/SQL training since 2015, and IICS training for the last one year.
            • 03:00 - 05:30 this is our 16th batch, which I am going to start on September 6th. If you ask me what IICS is and why companies are moving towards it: in between, if you have any questions, feel free to ask them through chat; I have muted everyone to avoid background noise and will unmute you after I explain these topics. So why IICS? Previously companies had on-premises servers. What is an on-premises server? If I have a company, I will have my database server, data warehouse server, ETL server, web server, all the different servers, inside the company itself, on premises. Then, over the last 5 or 10 years, companies started moving towards the cloud. Why? Because they need to do a lot of AI and ML processing, artificial intelligence and machine learning, and for that they need to do a lot of analysis on the data lake side. What is a data lake? It holds far more than the volume of a data warehouse and it handles all types of data: structured, semi-structured, and unstructured. Whatever we have on YouTube, Facebook, and the other social media platforms is mostly unstructured data: log files, cookies, audio files, video files. Previously a banking environment, or any other environment, had only structured data; after that came semi-structured data, which mixes structured and unstructured elements, and then completely unstructured data as well.
            • 05:30 - 09:00 so we have different kinds of data, and the data lake will handle all of it. Normally we take the real-time data from the source systems or from the front end, fetch it using streaming tools, and stream it into the data lake, which can handle structured, semi-structured, and unstructured data. Normally we do not apply any transformation logic here; we just stream the data and load the raw data into the data lake. You already know what a data warehouse is, so I do not need to explain that. Once all the raw data is loaded, if a data science project wants to run algorithms, supervised learning, unsupervised learning, whatever learning algorithms they want, they run them on the raw data and then send recommendations back to the source system for a better user experience. That is where the data lake comes into the picture. Who provides the data lake environment? Mostly the cloud providers: AWS, GCP (Google Cloud Platform), Azure, and Snowflake as well; a lot of cloud systems came into the picture. So companies are moving towards the data lake side, and the ETL side is following. They have on-premises servers and are going to the cloud, and cloud means somebody else handles it. For example, in my native place I can have my own house, but if I go to another country or another place I cannot build infrastructure everywhere, so I pay for whatever resources I use: I go to a hotel, pay the money, and come back. The cloud works the same way: if I want to use some resources, the provider handles all the infrastructure and I pay for what I use. That is how companies started storing huge volumes of data on the data lake side. Then they started thinking: why are we handling two different systems, cloud and on-premises? We can move our data warehouse to the cloud as well. So they decided to move: on AWS we have Redshift, on GCP we have BigQuery, on Azure we have the data analytics offerings, and Snowflake is also one of the data warehouse environments. All these cloud providers have cloud data warehouses, so companies decided to go ahead with the cloud.
            • 09:00 - 11:30 in this context, we had Informatica Power Center on premises, with versions up to 10.5. Informatica decided that since companies were likely to move to the cloud, they should move to the cloud as well. During 2017 they started a proof of concept and released the first version of IICS on the cloud. It was a very minimal, web-based tool with minimal transformations. Informatica Corporation's message was: you no longer need to host Informatica Power Center on your premises; we will take care of the upgrades and everything else, you pay only for the services you need, and we handle the remaining infrastructure. On the source side we have the OLTP systems, the online transaction processing databases, and the front end; from the front-end tools we can get data through APIs, so we can connect to the front end via API and get real-time data as well. We can also connect to all the RDBMS systems, Oracle, SQL Server, Teradata, and so on, and to cloud systems, and if third-party vendors are sending files we can use file systems as well.
            • 11:30 - 13:30 so that is what they decided with IICS: we will handle all the infrastructure, you pay for the amount of resources you use, and we take care of upgrades, network security, everything. The same applies beyond cloud storage: a company can start using whatever other services it wants. AWS, for example, has more than 100 services, S3, Redshift, and many more, and you pay for the services you use. Even AWS has an ETL service, called Glue, GCP has its own ETL options, and Azure has ADF, Azure Data Factory. But companies that are already working on Informatica Power Center want to migrate their components from the on-premises server, that is, from 10.5. Informatica has already announced that it will not release any more versions after 10.5 and will support it for the next four or five years, within which companies should move from the on-premises server to the cloud. So every company is in need of migrating its existing Informatica Power Center components to IICS, which is why we are getting a lot of openings on Power Center to IICS migration projects; most of our students are getting placed on these migration projects. For that you should know the concepts of both Informatica Power Center and IICS. That is the use case for Power Center and IICS.
            • 13:30 - 15:30 so if you take on premises versus cloud, Informatica now runs in the cloud. Say ABC is a bank: this bank wants to move from Informatica Power Center to IICS, and it has already moved its data storage to AWS, GCP, Azure, Snowflake, or some other cloud system. So the company will migrate Power Center to IICS. And IICS is not only ETL; it has other tools as well: for ETL we have the Data Integration service, for APIs we have CAI, Cloud Application Integration, the application integration service, and we also have Cloud Data Quality, CDQ, and MDM. Informatica is bringing all the other on-premises tools into this web-based tool: Power Center, IDQ, MDM, Address Doctor, EDC, Big Data Management, all of them, as services within IICS. That is why they changed the name from IICS to IDMC, Intelligent Data Management Cloud. They have announced IDMC, and a lot of boot camps are running on the IDMC tool as well. So the scope for Informatica IICS is very good.
            • 15:30 - 16:30 even if you have a doubt about whether this will still be used in 5 or 10 years: we are anyhow going to work on multiple cloud systems, AWS, GCP, or Azure, alongside our ETL tool and the storage side. That is the scope; even if you later move to a pure cloud role, you can learn the cloud concepts and move to the cloud side as well. From a career perspective, yes, there is a lot of scope, so no need for any negative thoughts; this tool has a lot of openings, and nowadays migration openings keep coming up. As of now, most of the students coming back to me in the WhatsApp group and in direct messages are saying they are trying to install IICS and facing a lot of issues. See what happened.
            • 16:30 - 19:30 previously Informatica had this tool set up one way, and now they have changed it, and because of the changes we are facing a lot of issues. Always go to the Informatica trials page. Previously, to install Informatica IICS we would go to Cloud Data Integration and enable the Asia-Pacific data center. There are different data centers; to cover the globe they place them in several regions: the US regions, the Canada regions in North America, the EU regions, and the Asia-Pacific regions. Whichever company you are, you connect to the appropriate data center. Previously we had two different agents, a cloud-hosted agent and a secure agent; now they have changed the architecture to provide only the secure agent, so the cloud-hosted agent is no longer visible to us. Previously we saw both; nowadays we see only one. If you need only the Data Integration services, choose that; if you need both, for example process-based application integration, choose Application Integration and Hyper Automation. We are also seeing some Java issues on Windows 11 machines. I am using Windows 11 myself and I am not seeing any issues, but if you face any, you can use the previously downloaded secure agent and you will not face them. That is the update I wanted to give. Anyhow, we handle all these issues in our live training sessions, and some of you have asked about joining the live training; if you join, we will have hands-on coverage of the installation as well. If not, go to my web page, NIC IT Academy; under IICS I have given all the steps for installing the secure agent, for CDI alone and for CAI. If you follow those steps you can install IICS very easily, and if you still have questions you can contact me and I will guide you.
            • 19:30 - 23:30 but if you join the live training sessions, they are interactive and I can guide you more. I am also getting a lot of questions like: why am I not getting calls for IICS developer roles? See, the calls are always there; even though a recession is happening in the US and other regions, in India we are not seeing any recession, so the calls are definitely there. What matters is how much you keep your skill set updated, how frequently you update your resume, and which technologies and keywords you have used; that is very important. After preparing your resume, here is a tip: remove your personal identification, your mobile number, your email ID, your name, and go to a resume-scoring website, for example one called Skillsyncer. There are many techniques like this to improve how often your resume gets picked up, because nowadays companies use the ATS method, the applicant tracking system. The ATS tool itself picks resumes from job portals and LinkedIn and passes them to the talent acquisition team, so your resume has to be picked up by it. For example, I can paste a job description, upload my resume (again with the mobile number, name, and email removed), and check it; the tool will show a score for my resume against that job, maybe 70, 80, or 100. It will tell you which keywords to include and how many times they appear, and it will give suggestions; based on those suggestions you can update your resume, and you should keep your profile and skill set updated so it keeps getting picked up. In our live sessions I will give more information on how to get your resume noticed, because our students are using these tools and telling me that after updating their resumes and posting on the job portals they are getting more calls. The other thing that drives calls is your notice period: if it is very short, you will definitely get more calls. Beyond that, the number of skills and tools and the projects you have worked on are all very important. So use these tools, remove your personal information, and try it; I have a lot of techniques like this that I will share in the live sessions so your resume gets picked up. You may also ask how the sessions themselves are structured.
            • 23:30 - 25:30 so this is the content: I take CDI, Cloud Data Integration, first, and then CAI, with sessions each day. For the current batch we have two more sessions on the API-related CAI concepts, and if I take the new batch I will start the sessions on Wednesday. There are 37 sessions on all these topics, and once they are completed I take mock interviews, certification preparation, and project-level sessions, which are very important, because you can learn anything from the internet, but the knowledge sharing at the project level really matters. I take three different project-level sessions to explain these concepts: the enterprise data warehouse project architecture, the life cycle, how we work on an Agile project, the different documents we prepare, the tools used in an Informatica project, how Jira is used to manage the projects, and how we raise incidents, service requests, problem tickets, and change requests in the ServiceNow tool. The third part is the scheduler, for example Control-M, and the different load strategies we have, such as incremental load, delta extract, and stage load [a small incremental-load sketch appears after the transcript], plus impact analysis, data analysis, code analysis, the source-to-target mapping sheet, unit test case preparation, and the deployment process: after preparing the code, how we deploy it to multiple environments, QA, then UAT, then production. Interviewers will ask questions like "can you explain a complex scenario you have faced in your project?"; that is what they ask in interviews.
            • 25:30 - 28:00 from the interview perspective I also cover production support roles and responsibilities, all of it, in the live training, so you can definitely get 100 percent of the knowledge from me. Once CDI is completed I start CAI. What is CAI? It is for taking data directly from the front end. Say I want to book a flight ticket: I enter my search details, and that information is passed as input to the service providers, the airlines such as IndiGo and others, and they send the data back. How do they send it? Through APIs only, because as applications they will not give us control of their database; we send a request, they send the response over the API, and we collect that information. With IICS/IDMC and Informatica Power Center we mostly take batch data, previously about 90 percent: data from the APIs and the front end gets loaded into the OLTP system, and from there we take daily loads, monthly loads, intraday loads and load them into our data warehouse, moving the data batch by batch. But nowadays we also take real-time data directly through APIs: we request whatever information we need through the API, the server sends the response, and we handle that response, the activities, the file connections, and the database connections in Cloud Application Integration. We also integrate with front-end applications such as Salesforce. The flight-booking flow is one of the best examples of APIs, and in the sessions I explain it using third-party public APIs [a minimal API-call sketch appears after the transcript].
            • 28:00 - 31:00 here, if you take IICS, the trial tool is free to use. This is my 16th trial account, and you can create any number of them; each time you simply create a new account, and whatever mappings you created in the previous account you can import into the new one. If you go to the Administrator service, the services should be up and running for Cloud Application Integration as well as Data Integration, and whatever components we created previously we can bring into the new account. In my system you can still see two different agents, the cloud-hosted agent and the secure agent, but nowadays only the secure agent is provided; they removed the other one, and in real projects this split exists anyway: dev and QA use one org link and production uses another. If you go to Data Integration you can see the data integration mappings; I have handled many of them, and I take around 50 hours of training for CDI and 10+ hours for CAI, and going forward I will add more on CAI. These mappings can be exported to a new account. In CAI we can create service connectors, app connectors, and different processes, and we will see all of that as well. Whether or not you join the live training, do more hands-on with this tool; do as much hands-on as possible so you are not dependent on anyone in the real-time environment. Work through a lot of scenarios; if you complete the scenarios it becomes very easy to answer the interview questions. I also want to tell our students: do not go straight to IICS interview questions. First learn the concepts: what JSON is, what dense rank is, what the Expression and Joiner transformations are, what the different tasks are, what a taskflow is.
            • 31:00 - 33:00 so try to understand the concepts first, do the hands-on, and only then go for the interview questions; if you go to interview questions directly you will always get stuck somewhere. In Informatica Power Center we have the workflow, but in IICS we have the taskflow, so we have to create a taskflow, and whatever we have in Power Center has to be migrated. If you are going to work on a migration project you should know this. For example, in Power Center we use a worklet, so how do you implement a worklet in IICS? A worklet becomes a subtaskflow: you take one taskflow and call another taskflow inside it. This kind of asset cannot be migrated automatically, so we have to recreate it manually. In some projects Informatica Power Center was running on a Unix machine but the IICS agent runs on a Windows machine, so all the Unix scripts have to be migrated to batch or Windows scripts [a small portable-script sketch appears after the transcript]. Some migrations run about 50 percent through automated tools and utilities, and in some projects we do about 50 percent manually. All new development happens in this tool, and it is a very easy tool; if you already know Power Center it will be very easy. I have also put a lot of videos on Power Center installation and related topics, and you can always check my web page.
            • 33:00 - 36:00 I also get questions like: I do not have two months to learn all these concepts, is there any crash course I can take immediately? Yes; I have already put all 48 hours of videos into the courses on my page, where you can get all the recorded live courses: Informatica IICS, Informatica Power Center, and Informatica Cloud CDI. It is recorded live training with lifetime access, and you can purchase it from the site. In that module I have uploaded 48 hours 39 minutes of training, from day 1 to day 41, along with the resume preparation and project-level sessions, the day-wise notes as a zip (day 1 to day 40), assignments, sample resumes, quizzes, and interview questions, everything in that one module with lifetime access; 12 members have already purchased it. Before this I used to share it as a Google Drive link, but now it is on this web-based platform so you can start learning from anywhere. Similarly, the Informatica Power Center course has 30 modules, and I have posted all 30; you can learn it self-paced in the same way. So if you do not have time to join the live training, if you are working night shifts or occupied with a project, you can go with the self-paced courses. Either way, you have to put in the effort to learn these concepts, and then you can be 100 percent confident with this tool. Just this morning I got a message from one of my students: he learned IICS with us and got a project in Informatica Power Center, and his project is a Power Center to IICS migration project. That is how I have trained a lot of people so far; this is my 16th batch on Informatica IICS, and I have been taking sessions for the last nine years.
            • 36:00 - 38:30 always check that page; going forward I will keep adding content, interview questions and more. As of now it is in draft form, but I am going to bring it all onto a single page. For Oracle SQL I have already put all the content in one place, and whenever you have time you can go through it; it is completely free. This is the same content I used to deliver for different learning centres in Chennai and online, where they charge around 15,000 for that training; I have taken that training content and posted it for our students at no cost. If you practise it, you will definitely be 100 percent confident in SQL, and the main thing is to practise: install the SQL Developer tool and start practising, create a lot of tables, and load data from one schema to another, one database to another. Once you join my live training I will explain all these tools, we will do the installation together, and we will do a lot of practice. If you ask what cloud content I am going to teach: I cover AWS and GCP, Google Cloud Platform, meaning an introduction to AWS and GCP, AWS S3 and GCP BigQuery, plus Snowflake, and on the CAI side we look at Salesforce. We will create accounts in AWS, GCP, and Snowflake. Snowflake is very easy: if you learn the SQL concepts well, you can position yourself for Snowflake work, not as a complete Snowflake developer but as an ETL developer who has loaded data into Snowflake, and I will also give you study materials on how to get started with Snowflake. That is how I can support you on that part.
            • 38:30 - 39:30 I think I have explained enough about the training and the scope for this tool. You can always contact me on this number, any time, or on WhatsApp; if you click the WhatsApp link it goes directly to WhatsApp, and you can connect with me and ask your questions, including any personal questions. We will have the question-and-answer session now; you do not need to mention your personal details, your name, or your company name, just ask your questions and I will answer all of them. Before that, I got many questions in the chat, so let me go through them one by one.
            • 39:30 - 41:30 "Please tell me the basic level": the basic point is why companies are moving towards Informatica IICS, which I explained already. "If I already know IICS, what should I learn next?" You can go with any cloud system, AWS or GCP, or with Tableau or Power BI, and develop yourself into a complete ETL consultant. I have worked on Tableau and Power BI projects myself, and since we work in Agile environments we may get different kinds of projects, so it definitely helps. "Is it mandatory to have very good knowledge of SQL?" Yes, definitely, because whether you work with AWS, GCP, or Azure, everything runs on a cloud data warehouse environment. For example, if I store data in AWS Redshift, I need to know SQL to query Redshift, and Snowflake is essentially Oracle SQL; if you know Oracle SQL well you can handle Snowflake easily. I have already provided all the SQL information: go to my website, click on Oracle SQL, and start preparing; all the notes are in that link, and going forward I will upload more pages with additional material. Give me some time and stay in the meeting; I will answer all your questions one by one.
            • 41:30 - 43:30 let me complete the chat questions and then I will unmute you for live questions. "Is there a separate setup for QA in IICS?" No; QA and dev will mostly share the same secure agent, while production usually has a separate secure agent. As a company they will install the secure agent on their own premises to create the secure connection. "How will you explain all these concepts?" We have all the content; in CDI I take all the sessions day by day, architecture-wise: installation, the different data loads, the AWS concepts, AWS S3, Google BigQuery, all of it, one by one in the training sessions, so no worries on the concepts and knowledge side, I will take care of it. I run weekday as well as weekend batches; the weekend batch started just last week, there is a weekend session this evening, so those who cannot join on weekdays can join at the weekend, and the weekday batch starts from September 6th. The fee is 12,000 for CDI and 5,000 for CAI; that is what I charge, and it has been the same from the start.
            • 43:30 - 47:00 if you want to join my training, please contact me; after this webinar I will guide you on how to join. I also got this question: "I need the CAI self-paced videos only, are they available?" This is the first batch where I am taking CAI and I have already completed six sessions; once this batch is completed I will upload it as a course, so check the courses page after about five days and you will find CAI there in the same way, at a lower fee for the self-paced version. These are lifetime-access courses you can use any time on any device. "I have no knowledge of Informatica, can I start learning IICS?" Yes, definitely, but you should know the Informatica Power Center concepts at least; I have already posted around 15 hours of free video in English, plus roughly 11 hours more, so start with those. I am also working on PL/SQL self-paced videos; I have prepared the pages and uploaded five sessions, and I am hoping the rest will be available by Monday, after which that course will be online as well. To register for the self-paced courses, go to the link and buy; you will get an email, and with that email and password you have lifetime access from any device. The self-paced course is nothing but the entire training material. "CDI and CAI": I have explained the scope for both, and I am going to upload this recording on YouTube, so you can watch it if you have time. "I registered for the self-paced IICS training; will the materials be shared through Google Drive?" No; once registered you can go inside the course, open each module, and use the download option to get the materials. "Will you cover API concepts?" Yes, I take the API concepts in the CAI module. "When is the weekend session on CAI?" The CAI weekend sessions will probably start in about a month, after CDI is completed; stay in my WhatsApp group and follow my YouTube page and I will post updates whenever I start a new batch for CDI or CAI, weekend or weekday.
            • 47:00 - 47:30 I'm going to start a new batch on CDA or CI either it's a weekend or weekdays batch ETL testing I have already posted five hours of videos on ETL testing and definitely you can watch that testing uh sessions okay even I a working professional right so I do not have much time to take all the sessions that that's why I'm not taking the other sessions as of now yeah so as of now I'm in Canada and uh
            • 47:30 - 48:00 yeah I'm taking that onshore responsibilities so I could see that my previous projects are moving from Informatica Power Center to ADF so how much scope will be there for IAC in future yeah IAC we have as I told IAC definitely we have the future okay so don't worry on that so if you go to this IAC right so you can search idmc so what is the scope for idmc all this Informatica so they Informatica Corporation is a big company right so they have already started all this idmc
            • 48:00 - 48:30 related stuff right so it's a completely on a and ml Pro process they are going to bring it see this their annual growth so how this idmc is going to impact on the project right so this is a realtime data only so I'm not saying that uh if you see the chart and Informatica on the top on the ETL process side okay so they're going to make they're going to make all this product in this idmc tool and so the A and ml process also they going to bring
            • 48:30 - 49:00 in this particular tool so anyhow you you have the good scope on this tool so don't worry on that so every companies are moving towards on the cloud right so that's a scope for this iacs also I'm not saying like so only the training perspective I'm saying like the scope of our future scope right so you have joined live training sorry you have joined self-as courses is there
            • 49:00 - 49:30 possibility to join live classes so no selfed courses is uh differently managed uh I'm not managing it and uh if you have joined there if you want to join live sessions you have to enroll for live sessions also okay so why because live session is different right and uh is idmc different from IDM IDM even I'm not sure what is IDM idmc is like intelligent data management Cloud so what is the difference between IAC and
            • 49:30 - 50:00 idmc there is no differences I already explained right so there is no differences only the naming change they have uh taken so will you provide any placement assistance see placement assistance I'm not providing the assistant means all this interview support resume preparation Mark interviews everything I will give you so with that training and everything right so if you are already I'm not giving any job directly I'm giving the training complete training so after getting the complete
            • 50:00 - 50:30 training you can update your resumes on job portals after attending my live uh training live Mark interviews and project explanation and everything you can explain the same thing to our interviewer you can get placed okay so not only for interview perspective you can independently work on projects okay so don't worry on that the project and this one you need to have a proper CER certification your experience certification that's it you can work any
            • 50:30 - 51:00 other tools, that is fine, but you need some IT-related experience letter; that is the important part. I will take your questions now. "Are the self-paced videos and the live training the same?" I already explained this; if you joined late, please watch the recording.
            • 51:00 - 51:30 Those who have doubts, I am unmuting everyone, so please ask your questions; the session is being recorded and there is no need to say your name. [Attendee] Hello sir, I have already installed PowerCenter on my laptop, and I am going to enroll for the IICS live training with you
            • 51:30 - 52:00 from September 6th. My question is, how much disk space will I need for IICS? [Chandra] For IICS there are no heavy constraints; we only install the Secure Agent on our machine, and the Secure Agent normally takes very little space. If you have 6 GB of RAM and around 50
            • 52:00 - 52:30 GB of free space on your C drive, that is fine. "If I have already installed PowerCenter on my machine, will that impact IICS?" No; I have 10.4 installed on my own system and I run both Informatica PowerCenter and IICS on the same machine. [Attendee] Okay sir, thank you, looking forward to the sessions next week. [Another attendee] Hello sir,
            • 52:30 - 53:00 I have experience in ETL testing but nowadays I get no calls; I have been without a job for two months, so how should I shape my career? [Chandra] Since you come from ETL testing, you already know the SQL concepts very clearly, and you probably have some knowledge of PowerCenter concepts as well, so start learning IICS.
            • 53:00 - 53:30 Start learning IICS and put your full skill set on your resume: Informatica PowerCenter, ETL testing, and IICS. You will definitely get calls, and since you are an immediate joiner you will get even more calls. [Attendee] One more question: can I buy the Informatica PowerCenter course as a recorded
            • 53:30 - 54:00 course? [Chandra] Yes, you can. As I mentioned, go to the courses page, click on Informatica PowerCenter, and buy it. There is no difference in content; the only difference is that I will not be there to interact, whereas the live sessions are interactive, and the recordings include the questions other students asked and my answers. [Attendee] Should I join Informatica PowerCenter training first or go directly to IICS? [Chandra] You want to take the Informatica PowerCenter
            • 54:00 - 54:30 self-paced course and the IICS live training? If you are joining both, start with Informatica PowerCenter, at least the basic concepts; once you know the basics you can start with IICS, because the concepts are the same. [Attendee] I am not creating mappings, but I know about mappings. [Chandra] You still need some hands-on practice, so I will tell you
            • 54:30 - 55:00 how to install Informatica PowerCenter as well. Start learning the tool and you will get a clear idea; start learning mappings, mapplets, worklets, and so on. It is very simple if you know PowerCenter, and in any case I am going to start from scratch, so you will get a complete picture of all the transformations; I am going to cover all of them. [Attendee] Is
            • 55:00 - 55:30 SQL the only language required, or is Python also needed? [Chandra] No, SQL is what matters, plus basic Unix commands and scripting; I have even posted two videos on Unix that you can start learning from. [Attendee] Thank you sir. [Another attendee] Good morning sir. Yesterday I messaged you on WhatsApp because I installed IICS, but the
            • 55:30 - 56:00 thing is, in your video four services were up and running after installing the Secure Agent, whereas when I installed it only two services were up and running. How is that? [Chandra] That is exactly what I explained at the beginning of our session, the common issues with IICS installation. I think you joined late, so I am not sure whether you heard it; please watch the
            • 56:00 - 56:30 recording, and if you join my live training I will explain how to install it. I have already created a web page for the IICS installation, so please follow it. If you need both CDI and CAI, you have to register with the Application Integration option; if you choose Application Integration and Hyperautomation with the data center as Asia Pacific, you will get all
            • 56:30 - 57:00 nine services; on my system I have all nine, and you may even get ten. I explained this already. [Attendee] Okay, no issue, thank you so much sir. [Chandra] Please follow that page and you will be able to install it. [Another attendee] I just need to know how we can migrate code from PowerCenter to IICS where I have existing
            • 57:00 - 57:30 mappings, workflows, and everything. [Chandra] This is a common real-time scenario, and we have different migration strategies. There is the migration factory, one of the utilities from Informatica Corporation; it does the migration, but it does not migrate 100 percent of Informatica PowerCenter mappings to IICS assets. (Please go on mute if you are not speaking.) So
            • 57:30 - 58:00 it does not migrate all the components completely. We have utilities such as the migration factory from Informatica Corporation, which they support, plus other third-party utilities, and in addition to that companies also migrate manually. I will explain the migration concepts in a separate session in our live training. [Attendee] When you say manually, sir, is it just creating the same replica of a
            • 58:00 - 58:30 mapping in IICS? [Chandra] Yes, correct. [Another attendee] I have one question: other than the general transformations, what advanced transformations will you cover as part of the training? [Chandra] We are going to cover whatever transformations are available, and in addition we will cover the cloud side and the APIs as
            • 58:30 - 59:00 well. Unlike Informatica PowerCenter, here we have both Data Integration and Application Integration, and we will cover all the transformations in scope for the tool. Two transformations are separately licensed, the Velocity transformation and Data Masking, so those two we cannot practice here, but we will see all the remaining transformations. The Web Services transformation is for API
            • 59:00 - 59:30 calls, so I will cover that, including connecting to third-party APIs, along with all the other transformations. I will cover the transformations as well as the tasks and taskflows, in fact all the components mentioned here: the taskflow steps, the command task, mapplets, mappings, and all the different task types.
            • 59:30 - 60:00 Whether from an interview perspective or a knowledge perspective, I will cover all of these concepts; it is around 50 hours of training, and for the last batch I took 50-plus hours. The sessions are interactive, so you can always ask me questions. "What is the difference between the self-paced course and the live training?" With self-paced courses, and I have enrolled in plenty of Udemy courses myself,
            • 60:00 - 60:30 we rarely complete all the sessions on time because of our work schedules. With live training, on the other hand, you join daily, you can ask the trainer questions every day, and you get them clarified; that is the advantage of live training. "Is the cloud version equally as powerful as on-premises PowerCenter,
            • 60:30 - 61:00 or more or less powerful?" We cannot say Informatica PowerCenter is not powerful. PowerCenter is on premises and IICS is on the cloud; IICS can integrate both cloud and on-premises sources into a data lake or data warehouse environment. Companies moving towards the cloud are moving towards IICS, and it
            • 61:00 - 61:30 has strong capability for building ETL logic. Even companies using other ETL tools such as DataStage or Ab Initio are migrating to IICS now, because Informatica has already moved to the cloud on the ETL side. That is why companies using SSIS, Ab Initio, DataStage, SAP BODS, or other ETL tools have started
            • 61:30 - 62:00 migrating to IICS as well. Scope-wise there is plenty of opportunity, both in the US and in India. Any other questions? [Attendee] There is a task called "import from PowerCenter"; what is that? [Chandra] That is a separate task type.
            • 62:00 - 62:30 Whatever PowerCenter workflows you have can be imported into IICS with it. But once imported this way, they cannot be edited. "If in future I want to edit it, can I edit the PowerCenter task there?" No, we cannot edit it, so you can use it as a stop-gap, but it is not a real migration; you can bring your Informatica workflow into IICS temporarily, and even then there are limitations, because not all
            • 62:30 - 63:00 sessions can be brought over: a workflow must have only one session and one associated mapping to be converted into a PowerCenter task. If it has two sessions it cannot be converted, and a PowerCenter task cannot be edited in IICS. For example, say today I migrate one particular piece of logic as a PowerCenter task; if I later want to change it, I have to change it in
            • 63:00 - 63:30 Informatica PowerCenter, export it as XML, and convert it into a PowerCenter task again. So this is only a temporary measure, not a migration. [Attendee] Got it, so it goes from PowerCenter to IICS. [Chandra] Yes, PowerCenter to IICS. But normally we do not use that task type in real-time projects, because to edit it in the future you would need to have
            • 63:30 - 64:00 Informatica PowerCenter available, and by then PowerCenter will have been decommissioned; that is why we avoid the PowerCenter task. Going forward, within another one or two years most companies will have migrated from PowerCenter to IICS. "Will you also take CDQ, data quality?" No, only Data Integration and Application Integration.
            • 64:00 - 64:30 "Are we going to cover any data warehousing concepts?" Yes. The first two sessions are on data warehouse concepts: OLTP, OLAP, what a cube is, and so on, followed by data warehouse versus data lake and the complete IICS architecture. The fourth session will be the IICS
            • 64:30 - 65:00 installation, one complete session. Then we will go through the different data loads: taking data from XML, JSON, and Excel files, Snowflake connections, AWS connections, working with GCP, and the different transformations, with hands-on exercises one by one. I will cover the Web Services transformation, how to connect to a REST API, what a SOAP API is, and then the different tasks,
            • 65:00 - 65:30 the different taskflows, the different components, parameters and variables, performance tuning, IICS developer resumes, and migration as I mentioned. All of this I will take in the live training; I cannot explain everything in the demo session, but I will cover it in the actual sessions. [Attendee] Will we get information on each and every option available with every
            • 65:30 - 66:00 transformation? [Chandra] Yes, definitely; that is why the training has run to 48 hours so far. "Is it possible to cover how to handle MQ data?" You are asking about message queues; we cannot do hands-on here because we cannot bring that environment in, but I will explain how it is done. "What about the recordings, do we
            • 66:00 - 66:30 get them every day?" Yes, you will get all the recordings for every session we have, all 37 sessions on Informatica IICS and the 10 sessions on CAI. [Attendee] Am I audible? I am working as a manual tester with no previous experience or knowledge
            • 66:30 - 67:00 of Informatica, and no coding knowledge either; is it the right step to start learning this IICS course? [Chandra] I understand. As a first step, start with SQL; SQL is very important for ETL. I have also posted a lot of Informatica PowerCenter videos, all the transformation logic and more, about 15 hours of
            • 67:00 - 67:30 video, completely free, so start learning from that and then join the IICS training; that would be a good option. If you want all the materials, buy the self-paced course I have already posted; it is the complete training material, with 40-plus hours of training and notes. "Are CAI-only videos available?" Yes,
            • 67:30 - 68:00 I have already started on them and I am going to upload them on this page, so check back after two to three days. Two more trainings, CAI and PL/SQL, will also be available here: the PL/SQL self-paced videos, CAI, CDI, and PowerCenter will all be on the courses page of the NIC IT Academy site, so you can always check there and
            • 68:00 - 68:30 go to that particular page; even from the front page you can click through to it. Surin, I am not able to hear you; I have given you access to speak, but I cannot hear you. Any other questions from others? [Attendee] Yes sir,
            • 68:30 - 69:00 I am currently working in Informatica PowerCenter but only on monitoring-type work, so I do not have any PowerCenter development knowledge. If I enroll in this IICS program, will I also need Informatica development knowledge? [Chandra] We are going to learn all the concepts hands-on, completely hands-on, with plenty of exercises on a daily basis, so
            • 69:00 - 69:30 you can pick it up. In parallel you can watch the PowerCenter sessions I have already posted and start learning from there; that will help. But for IICS, SQL is mandatory; you need to know the SQL concepts. [Attendee] I got some calls related to IICS but I do not have knowledge of it. [Chandra] Then start learning it. See, we spend a lot of money on other
            • 69:30 - 70:00 things, and whatever we invest in learning will definitely help your career. Even though I am a developer, I am still learning new concepts; I started learning AWS and paid 30,000 to a trainer. In the same way, since we are in IT, we have to keep enhancing our knowledge. [Attendee] I am confused; you are saying SQL is very much mandatory for IICS. Is that
            • 70:00 - 70:30 also true for PowerCenter, or is it required for both? [Chandra] SQL is mandatory for both Informatica PowerCenter and IICS. He wanted to join the IICS course directly, which is why I said that for IICS I will start from scratch; whatever we have in Informatica PowerCenter, all the transformations and so on, we also have in IICS, so he can get that knowledge in the
            • 70:30 - 71:00 IICS course itself. In parallel he can check the training videos I have posted; everything is covered there. In any case, I am going to start Informatica PowerCenter training again after some time, since some students have already been asking; I have been teaching PowerCenter for the last nine years and you can see my videos on the YouTube platform as well. I am going to start the
            • 71:00 - 71:30 Informatica PowerCenter live training too after some time. [Attendee] Hello, I am working as a SQL developer; what is the scope of IICS for my career development? [Chandra] Definitely good, as I have already explained. In addition to SQL, if you learn this cloud tool: the projects we build in PL/SQL and similar are already
            • 71:30 - 72:00 being migrated into ETL projects, and in ETL most banking, insurance, and retail companies use Informatica PowerCenter. If you check most US projects, their ETL runs on Informatica PowerCenter and IICS, and nowadays they are migrating from PowerCenter to IICS, so the scope is definitely there. You cannot learn all these concepts in a single day, so take
            • 72:00 - 72:30 two months, learn everything from scratch to a complete level, and then go for interviews; it will help you. [Attendee] Thank you, Chandra. [Chandra] There is definitely scope; I am at onsite myself, and it is because of Informatica that I am here, so it will certainly help your career. Thank you.
            • 72:30 - 73:00 "Is Unix or shell scripting required to learn IICS?" Not at a full shell-scripting level, but it depends on the project architecture and framework. Different projects have different frameworks: based on the framework, some projects trigger Informatica itself from a script, and some trigger
            • 73:00 - 73:30 Informatica from a scheduler. The way of triggering Informatica and handling files differs from project to project, along with the architecture, framework, and layers. Some projects use Unix heavily and some use very little, perhaps just logging into the Unix server to check whether the files have landed, and that is it. In such a project you need only very basic Unix knowledge (something like the file check sketched below); I have already posted two videos on Unix to help you get started.
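As a rough illustration of that basic file check, here is a minimal sketch written in Python (to keep all the examples in this document in one language; on a real project this is typically just ls and wc -l on the Unix server). The landing directory and file pattern are hypothetical placeholders.

```python
# Sketch: the kind of "did today's files land?" check done on the Unix server,
# expressed in Python; in practice this is often just `ls` and `wc -l`.
from pathlib import Path

landing_dir = Path("/data/landing/claims")      # hypothetical landing directory
expected_pattern = "claims_2023-09-06_*.csv"    # hypothetical daily file pattern

files = sorted(landing_dir.glob(expected_pattern))
if not files:
    print("No files have landed yet for today's load.")
else:
    for f in files:
        line_count = sum(1 for _ in f.open())   # rough equivalent of `wc -l`
        print(f"{f.name}: {f.stat().st_size} bytes, {line_count} lines")
```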
            • 73:30 - 74:00 You can start with those Unix videos and pick it up that way, but note that some projects will expect shell scripting as well. I have some questions in the chat, so let me answer them. "Can you say a few words about Kafka, just as an introduction?" Kafka is for live streaming. Just as in an ETL project we take data from a source system, here, from
            • 74:00 - 74:30 live sources, we stream the data and load it. If you take Facebook or any similar system, live data is coming in from data centers all over the world; information is collected from IoT devices, mobile devices, sensors, and so on. The data is live, arriving from everywhere, continuously. In that case Kafka acts as
            • 74:30 - 75:00 the collector: it stores all the streaming data, and through Kafka we stream the data into our data lake environment. The raw data is pulled using Spark; through Spark we connect to Kafka, stream the data, and the raw data gets loaded into the data lake. Then transformation logic and machine-learning processing are applied to the raw data, and the output is
            • 75:00 - 75:30 fed back to the source system as recommendations. On YouTube, based on your search history, they suggest the next video; on Amazon, when you search for or buy a product, they suggest the next product list. How do they do that? They collect all the information, process it, run it through a learning process, and feed it back to the system as recommendations for a better user experience; that is the essence of the streaming flow sketched below.
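A minimal sketch of the Kafka-to-data-lake flow described above, using PySpark Structured Streaming; the broker address, topic name, and output paths are hypothetical placeholders, and a Spark installation with the Kafka connector is assumed.

```python
# Sketch: stream raw events from Kafka into a data-lake folder with PySpark.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("kafka-to-data-lake-sketch")
         .getOrCreate())

# Read the live event stream from a (hypothetical) Kafka topic.
raw_events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
              .option("subscribe", "clickstream-events")           # placeholder topic
              .load()
              .selectExpr("CAST(value AS STRING) AS event_json",
                          "timestamp AS event_time"))

# Land the raw data in the data lake; transformations and ML run on it later.
query = (raw_events.writeStream
         .format("parquet")
         .option("path", "/datalake/raw/clickstream")              # placeholder path
         .option("checkpointLocation", "/datalake/_checkpoints/clickstream")
         .start())

query.awaitTermination()
```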
            • 75:30 - 76:00 "Does an IICS developer need CMI knowledge?" CMI is Cloud Mass Ingestion; yes, some projects use mass ingestion as well, typically for big data and streaming scenarios, so it depends on the project and on whether the job description asks for data integration, application integration, or both. I will explain the real-time project setup, but we cannot
            • 76:00 - 76:30 bring the real-time environment here, and I cannot share any real project documents; that is a compliance issue, and I do not want to make any false claims. I will explain the project architecture, and with that you can certainly explain your project in interviews; you will get the full knowledge, because I am a working professional myself, working in data lake environments, Informatica Power
            • 76:30 - 77:00 Center, and all the ETL environments, so you will definitely get the knowledge; no worries there. "Is PL/SQL needed for Informatica PowerCenter or IICS?" A few projects will expect PL/SQL, because some stored procedures run in the back end and are called from Informatica PowerCenter or IICS, so those projects expect some PL/SQL knowledge. If you need the PL/SQL concepts, as I explained, they will be
            • 77:00 - 77:30 available here after two to three days at a very minimal rate, and you can start learning them as well; some projects expect PL/SQL knowledge, but not all. "I have taken the IICS self-paced learning; can I get access to your videos on the website?" I do not handle that website myself; the back end handles it automatically, so ping me and I will tell you how to proceed.
            • 77:30 - 78:00 Ping me directly and I will tell you. "After the IICS training, will we be capable of working with other cloud ETL tools in the market?" Yes, you can; ETL is similar concept-wise and only the tool differs. I myself have worked on SSIS, Informatica PowerCenter, IICS,
            • 78:00 - 78:30 and SAP BODS; the concepts are all the same, only the tool is different. "One question on ETL testing: does it apply to both PowerCenter and IICS, and is PL/SQL needed for it?" Whatever data we load into the data warehouse as ETL developers, per the requirements, the ETL tester tests, mostly through SQL queries, so SQL query knowledge is very important for an ETL tester (see the sketch below).
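A minimal sketch of the kind of SQL-based check an ETL tester runs, comparing a source table against the loaded target; the table names and the in-memory SQLite database are hypothetical stand-ins for the real source and warehouse connections.

```python
# Sketch: SQL-based ETL test, comparing row counts and a column sum
# between a source table and the loaded target table.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for real source/warehouse connections
cur = conn.cursor()

# Toy source and target tables (in a real test these already exist).
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# Typical reconciliation queries an ETL tester would run.
src_count, src_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
tgt_count, tgt_sum = cur.execute("SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()

print("row counts match:", src_count == tgt_count)
print("amount totals match:", src_sum == tgt_sum)
```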
            • 78:30 - 79:00 "Is it mandatory to have an idea of PowerCenter before joining IICS?" A basic idea definitely helps, but in any case I will start from scratch. "ETL plus Power BI, or ETL plus IICS: which has more scope?" Power BI is a different kind of tool, a reporting tool, with its own capability and its own scope, and ETL plus IICS has its own scope.
            • 79:00 - 79:30 Even testers are onsite along with us; it does not mean testing has no scope, since ETL testers are equally present onsite across projects. What matters is how much knowledge, depth, experience, and hands-on practice you have; that speaks more than the tool. Power BI has its own
            • 79:30 - 80:00 scope as well, so no worries on that front. Whatever you choose to learn, learn it completely, that is it. If you know Informatica PowerCenter, learn it fully; learn IICS completely, learn Tableau completely, go in depth on Tableau or in depth on AWS. Depth will help you more than shallow knowledge of every tool. I think I have answered all the
            • 80:00 - 80:30 questions. "Will the live session recordings be available on the cloud, and will you provide doubt-clearing sessions separately or only the daily sessions?" Daily we have 1 hour 15 minutes: 45 to 50 minutes of teaching, and the remaining 20 to 30 minutes are interactive, so you can always ask your questions in the session itself. If you run into difficulties, the WhatsApp group is there, so you can post there
            • 80:30 - 81:00 and I will respond. There is a separate WhatsApp group for the training program. The thing is, you need to put in some effort to keep pace with me; it is not only the trainer's job to explain everything, and simply watching will not work, so you have to put in the effort. [Attendee] Chandra, can you hear me? Suppose we are
            • 81:00 - 81:30 using Informatica PowerCenter with close to 1,000-plus jobs, mappings, and workflows; how are we going to migrate those into IICS? [Chandra] That is exactly what I will explain in the live training, and I also covered it earlier in this session.
            • 81:30 - 82:00 We already discussed that concept; I am going to upload this video on YouTube, so if you have time, walk through the recording and you will get it. I have already covered how we do migrations in real-time projects in this session. Any other
            • 82:00 - 82:30 questions? If you have questions, please use the raise-hand option and I will unmute you so you can ask. [Attendee] Hi Chandra, which is better for career growth, CDI or CAI? [Chandra] Both; it depends on the project.
            • 82:30 - 83:00 Both have good scope. Companies doing cloud data integration use CDI, and companies that need real-time data integration and application integration use CAI. It is not a set of isolated tools: Informatica has brought everything into one platform, and you use whichever services you need. That is the iPaaS model, Integration
            • 83:00 - 83:30 Platform as a Service; they bring all the services into one place, the IDMC tool, and you simply use the services you need. Data quality needed in my project? Start working on it. Application integration needed? Work on that. Which service we use depends on the project requirement, and scope-wise everything is good. Note, though, that application integration is more on the API
            • 83:30 - 84:00 side, closer to the front end, so you need to know APIs and REST APIs; I will cover that in our sessions. "I worked as a developer and may get a chance to do Informatica admin; can I handle it?" Yes. For Informatica admin you need to know some Unix concepts, but for IICS admin not much is needed, because everything is handled by the cloud: only user creation, user
            • 84:00 - 84:30 management, and connection creation; that is what an IICS admin does in a real-time project. "Can we expect PowerCenter 10.6?" No; they have already announced that 10.5 is the last version, so there will be no 10.6. Why would they bring out 10.6 when they have already moved to IICS? If they released 10.6 it would
            • 84:30 - 85:00 contradict what they have announced: no further versions, 10.5 is the last, and after about five years even 10.5 will no longer be supported. Informatica Corporation has already advised all customer companies and vendors to migrate their business use cases from Informatica PowerCenter to IICS. "Should we have in-depth knowledge of Unix?"
            • 85:00 - 85:30 Yes, it is better if you have some knowledge of Unix. [Attendee] Hi Chandra, one question: suppose we have Informatica PowerCenter and want to connect to cloud databases; is that possible in PowerCenter or not? [Chandra] Around 2017 or 2018 they started adding connections to PowerCenter, such as the S3 and Snowflake connections
            • 85:30 - 86:00 and the PowerExchange connectors, in the PowerCenter tool itself. But then they decided not to continue down that path and to build the cloud services instead, so all the cloud connections were taken into IICS. If you ask whether there is any cloud connection in Informatica PowerCenter: yes, there is an S3 connection and a PowerExchange connector through which you can connect to the cloud, but we do not use them much,
            • 86:00 - 86:30 because companies have already moved from on premises to cloud servers. In IICS we have 100-plus connectors: GCP connections, AWS connections, REST API connections, Snowflake connections, and more. I will explain in our sessions how to connect, and how to extract data from our on-
            • 86:30 - 87:00 premises servers and databases and load it to the cloud. The admin handles user roles, user groups, users, and licenses. [Attendee] One more question, Chandra: while learning this IICS course, do we need to install SQL Developer on our system? [Chandra] Yes, we need SQL Developer, because you need to look at the data and verify how the
            • 87:00 - 87:30 data is being loaded. [Attendee] I hope you will provide a link to download SQL Developer. [Chandra] Yes, we will do the installation together in a hands-on session. Here are the timings: weekdays, Monday to Friday, 7:00 a.m. to 8:15 a.m. IST; the weekend session is 7:00 p.m. to 10:00 p.m. IST. You can join whichever session suits your schedule; the duration is two months, and the fees are shown here. Every day you will get
            • 87:30 - 88:00 the recordings and notes, as I already explained. If you need more information, please contact me on my WhatsApp number and I will explain. I hope I have answered all your questions. "What is the duration of the IICS training?" As explained, the duration is two months, with 1 hour 15 minutes daily. If you still have questions, you can message me on WhatsApp at this number.
            • 88:00 - 88:30 I am in Canada right now, so you can ask me your questions in the morning or evening and I will respond; I will be available on WhatsApp, so you can message or call me. Thank you all very much for your time. Start learning this tool; it will definitely help you, and even if you do not have time as
            • 88:30 - 89:00 of now, you can join any forthcoming batch. You can always go to my website, NIC IT Academy, or search for it on YouTube; the channel already has all my SQL sessions, Informatica sessions, and IICS sessions posted. Thank you all. The CAI weekend session will tentatively start after about one month, because the current batch started only last week, so for the next
            • 89:00 - 89:30 month, about five or six weekends, I will run CDI, and then I will start CAI. "Is CAI included in the self-paced course?" I do not want to keep repeating the same answer: as of now it is not available; I am working on uploading both PL/SQL and CAI,
            • 89:30 - 90:00 and they will be available after two to three days on the courses page of the NIC IT Academy site; just go to that page and you will find them. CMI, mass ingestion, I am not covering, because there is no license for it here; certain concepts cannot be covered here and can be covered only
            • 90:00 - 90:30 in a real-time environment. Thank you all. If you have any other questions you can always ask through WhatsApp. If you are willing to join the live training, please contact me; I am going to start the live training on Wednesday, 6th September, with sessions from 7:00 to 8:15 a.m., and it is a two-month program. If you still have any questions, you
            • 90:30 - 91:00 can always contact me on WhatsApp. Thank you all. In today's session we are going to learn what business intelligence is, what a data warehouse is, what IICS is and why we need it, and why companies moved from on-premises servers to cloud servers.
            • 91:00 - 91:30 Take any organization: it has management people, and if the company runs a business it will have a front-end application. Business happens online and offline, and whatever transactions occur are captured through that front-end application, which Java developers, .NET developers, or full-stack developers build for the end user,
            • 91:30 - 92:00 the customer. The end user works with the front-end tool, and data is captured day by day and stored in a database, whether the transactions happen online or offline: gateway transactions, banking transactions, all of them are captured there.
            • 92:00 - 92:30 If you are a Java, .NET, or other application developer, you work on this front-end development. For example, I go to a site, register myself, and order a product; for that, the customer portal, user account creation, and so on must all exist. All of this serves the business, and to run the business I need a database. This database will
            • 92:30 - 93:00 capture all the data, and we call it an OLTP system, a transactional database: Online Transaction Processing. The OLTP system captures all the details needed to run the business, and every company must have one. Now think about the business
            • 93:00 - 93:30 people: managers, senior managers, directors, VPs, and the CEO; there is a hierarchy. What does the CEO do day to day? They hold meetings where they discuss the business: how is it going, what are today's sales, what is today's profit, and they need to take decisions. In a banking system, for instance, the questions are: how much have we
            • 93:30 - 94:00 lent today, how much have we received in deposits? They analyze all of it: across India there are so many branches, regional zones, and states; region-wise and state-wise, what is the profit, what is the target, what are the marketing trends. There are many VPs, many business areas, and business development teams working
            • 94:00 - 94:30 to grow the business. Day by day, when they start their day, they look at reporting, for example Tableau reporting. They open a Tableau or Power BI dashboard, look at the data presented that way, and then they
            • 94:30 - 95:00 take decisions. There might be a bar chart or pie chart for region-wise sales; different dashboards are created, and from these dashboards the business people, the managers, senior managers, directors, and VPs, analyze the numbers and charts. They analyze the data and statistics, and only then do they take decisions. Alongside Tableau we have the
            • 95:00 - 95:30 Power BI dashboards; Power BI is from Microsoft and Tableau is from Salesforce, and there are many other business intelligence tools: QuickSight from Amazon, and Looker from Google, which is typically used when the whole stack is on GCP. Different tools are used,
            • 95:30 - 96:00 and the discussion happens over those statistics: what are the sales for this week, for this month. This is aggregated data, not single-day data, and even a single day can hold a huge number of transactions; in banking, many transactions happen per day and per hour, especially during peak business hours. Multinational companies have
            • 96:00 - 96:30 business in multiple countries, so they deal with many currencies, many languages, government rules and regulations, and local culture, and they must customize products to sell them. So there are different products, different decisions, and different business rules, and all of these have to be applied when we load the data into a
            • 96:30 - 97:00 separate database. We cannot analyze the business on the transactional system itself: if you run analysis there, the system gets impacted, meaning you will feel the slowness. That is why in real-time projects the instruction is: do not do analysis and do not run heavy or aggregated queries on the production database, on the business servers,
            • 97:00 - 97:30 during business hours. IRCTC, for example, restricts even PNR-status checks during peak booking windows, because they expect a huge number of transactions on those particular days and times. US-based companies have moratoriums around Thanksgiving and Christmas, when they will not allow deployments,
            • 97:30 - 98:00 not even on the OLTP systems, because they expect more transactions on those days; they announce no-deployment days, a moratorium. Since more transactions are expected, the transactional system must not be disturbed, and that is why we go for another database called the OLAP system.
            • 98:00 - 98:30 OLAP stands for Online Analytical Processing. Analytical processing means that whatever business analytics we want to do, we do on this analytical database, not on the transactional one, so we have to transfer the data from the transactional database to the analytical database. Both sides are databases, but the left side is the OLTP
            • 98:30 - 99:00 system and the right side is the OLAP system. The OLAP system holds a huge volume of data: everything from day one until now is stored there, and in OLAP we never overwrite or delete data; historical data is kept for future reference. We have to transfer and load the data from the OLTP side to the OLAP side, and that is the ETL process: ETL means Extraction, Transformation,
            • 99:00 - 99:30 and Loading. The ETL process runs as a pipeline; just as water flows from one end of a water pipeline to the other, data flows from the OLTP system to the OLAP system, and from the OLAP system the reports are taken (a minimal sketch of such a pipeline follows below).
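A minimal sketch of that extract-transform-load flow in Python, with SQLite standing in for both the OLTP source and the OLAP/warehouse target; the table names and the daily aggregation rule are hypothetical.

```python
# Sketch: tiny ETL pipeline; extract orders from an OLTP table,
# transform them into a daily aggregate, and load into a warehouse table.
import sqlite3

oltp = sqlite3.connect(":memory:")   # stand-in for the transactional database
olap = sqlite3.connect(":memory:")   # stand-in for the warehouse database

# Toy OLTP data.
oltp.executescript("""
    CREATE TABLE orders (order_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2023-09-01', 100.0),
        (2, '2023-09-01', 250.5),
        (3, '2023-09-02', 75.25);
""")

# Extract + Transform: aggregate sales per day (done in SQL here).
rows = oltp.execute("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_sales
    FROM orders
    GROUP BY order_date
""").fetchall()

# Load: write the aggregated history into the warehouse table.
olap.execute("CREATE TABLE daily_sales (order_date TEXT, order_count INTEGER, total_sales REAL)")
olap.executemany("INSERT INTO daily_sales VALUES (?, ?, ?)", rows)

print(olap.execute("SELECT * FROM daily_sales ORDER BY order_date").fetchall())
```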
            • 99:30 - 100:00 Previously we had on-premises servers. What does on-premises mean? Say you have an organization and you want a data warehouse for your company. You approach vendors such as Oracle, Teradata, or Amazon, and they give you a quote: for this data volume, this many users, and this many transactions per day, you can go for this particular
            • 100:00 - 100:30 system. Based on the volume, the business, and the budget, the company decides which database to go with, which data model, and what server capacity, and on that basis they set up the OLTP system, the OLAP system, and the reporting on top. On-premises means that within your own company you deploy the servers: the database server,
            • 100:30 - 101:00 the Unix server, the data warehouse, and the reporting server, with everything maintained by the organization itself. The company maintains the data securely, runs its own data center, buys the licenses, and manages everything within those licenses, and the data is
            • 101:00 - 101:30 kept on whichever servers they run, Oracle, SQL Server, DB2, or Teradata, all maintained by them. Nowadays companies are moving towards the cloud, because they want to store the data in the cloud and do not want to maintain the servers themselves. With a cloud system they are not maintaining the servers; the cloud
            • 101:30 - 102:00 providers do. Amazon provides AWS with its various services, such as Redshift, S3, and Athena, and you pay based on usage. Then we have Google with GCP, the Google Cloud Platform, and Microsoft Azure with services such as Azure Data Factory, and then we have Snowflake
            • 102:00 - 102:30 as well, so there are several cloud systems. You can choose any cloud platform; you pay for the amount of data and the services you use, and they maintain the servers, the availability, and everything else. The only difference is that the data is stored on their servers and they manage it; they have operational control, and
            • 102:30 - 103:00 data security and the rest are taken care of by the cloud provider. Compared with maintaining on-premises servers and ensuring availability themselves, companies are deciding to go for cloud servers: data can be transferred faster, more users can work concurrently, and a larger audience can be served. That is why companies
            • 103:00 - 103:30 are moving to the cloud. So a company can run either on-premises servers or cloud servers. With cloud servers, all the ETL processing runs on the cloud too; if the entire data lake is on the cloud, Informatica PowerCenter is a poor fit, even though PowerCenter now has the S3 plug-in, Snowflake and PowerExchange connectors, and we also have BDM,
            • 103:30 - 104:00 Big Data Management, which provides big data connectivity and connectors; even then, the organization is still maintaining the servers. In IICS, who maintains the ETL server? Informatica Corporation. The cloud providers above are data warehouse or data lake providers, while Informatica IICS provides Cloud Data Integration,
            • 104:00 - 104:30 CDI, and Cloud Application Integration, CAI. During the IICS course I will explain CDI and CAI in more detail; CAI is for connecting different applications to each other in real time, online. They are improving it a lot; they started this around five years ago, but only now
            • 104:30 - 105:00 is Informatica IICS really booming. Those who know Informatica PowerCenter should start learning IICS; it will definitely be helpful. Even if you are just starting with PowerCenter, after some time at least get some idea of IICS, because nowadays in interviews most companies look for IICS as well; they will
            • 105:00 - 105:30 ask "do you know IICS?" and only then process your resume. If you know Informatica PowerCenter clearly, then IICS is very easy; the only thing to learn is how to use the tool and how to build business logic in it. Business intelligence, then, is about making good decisions by effectively managing data. Take a manufacturing company with plants in different locations, or take
            • 105:30 - 106:00 any car or bike manufacturer, or life sciences. In a life sciences or healthcare project, they collect clinical data from different regions and analyze it, and before launching a product they need to fix its price, the cost associated with the product from clinical research all the way to the end product.
            • 106:00 - 106:30 How much have we spent overall, how much are we going to produce, and what are the supply and demand? Based on that they fix the price. If they go for mass production they can obviously reduce the cost, and based on supply, demand, and competition they decide the product price. All of this depends on various analyses of the data, and from those analyses they
            • 106:30 - 107:00 take decisions accordingly. Any CEO or VP, for their own business area, starts the day by looking at the business: say the business day has started; up to 7:00 a.m., how much volume did we get, how many transactions happened, what was the amount? They check everything
            • 107:00 - 107:30 online, nowadays even on mobile. Tableau, Power BI, and the other business intelligence tools all have mobile apps, so executives can watch the business on their phones, with the appropriate security and access controls for the mobile service as well. That is why they need a business intelligence tool. Likewise, a manufacturing company has
            • 107:30 - 108:00 manufacturing, supply and demand, and cost to analyze, and decisions are taken accordingly. For example, take a hospital; this is a real-time scenario involving data mining and data forecasting. Data forecasting means you have, say, the last 10 years of business, a huge volume of data, and based
            • 108:00 - 108:30 on that historical data you can analyze it and predict how the business will look next year; you forecast it. You need a minimum of around four years of data, and the more data points you have, the higher the accuracy. This is where data analysis, data science, and artificial intelligence come in: forecasting and AI analyze all the data, draw
            • 108:30 - 109:00 conclusions, predict the values, and give you the forecast: on this particular date, this is what the sales will be. Tableau has forecasting and Power BI has forecasting; give them four, five, or ten years of data and they will project next year's numbers: if the business continues like this, you will reach this target by this date.
            • 109:00 - 109:30 If you grow at this rate, this is what you will achieve: this much in sales, this many products, this many customers on this particular day. How do they forecast? Entirely from previous data: they analyze the historical data with certain algorithms and generate the report. It is not perfectly accurate, but it is roughly 90 percent accurate, and nowadays businesses rely on forecasting (a minimal sketch of the idea follows below).
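A minimal sketch of the idea behind such forecasting: fitting a simple trend to historical yearly sales and projecting the next year. The numbers are made up, and real BI tools such as Tableau and Power BI use more sophisticated models than a straight line.

```python
# Sketch: project next year's sales from a simple linear trend over past years.
import numpy as np

# Hypothetical yearly sales history.
years = np.array([2019, 2020, 2021, 2022, 2023])
sales = np.array([120.0, 135.0, 150.0, 170.0, 185.0])

# Fit a straight-line trend: sales ~= slope * year + intercept.
slope, intercept = np.polyfit(years, sales, deg=1)

next_year = 2024
forecast = slope * next_year + intercept
print(f"Forecast for {next_year}: {forecast:.1f}")
```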
            • 109:30 - 110:00 Power BI and Tableau have this forecasting built in, so we can use it to predict the data. Even ten years ago we saw one company submit a data-forecasting requirement: I have this many hospital facilities, this many locations, this many countries, and this many patients as of now, so
            • 110:00 - 110:30 how many patients can I expect tomorrow? They split the forecast disease-wise, inpatient versus outpatient, and produce projections: on this date, at this hospital, expect this many patients, and this many of them for this condition. They even schedule doctors before getting confirmation from the patients, because they are expecting that many patients on that date.
            • 110:30 - 111:00 At an innovation summit they presented all of this and said they were getting only about 83 percent accuracy and wanted to go beyond 83 to 85 percent, so they keep refining the business logic and algorithms to produce more accurate predictions, which means collecting more data. Think of Alexa, Siri, or any such assistant:
            • 111:00 - 111:30 Google and the rest collect all of this data, learn from us, train the machines, and then answer our questions; all of it is possible because of the data. Where is all that data stored? In the OLAP systems, on the data lake side, and the forecasting, recommendations, and all the artificial intelligence and machine learning
            • 111:30 - 112:00 algorithms are applied there; business intelligence and reporting also run on the OLAP system. A data warehouse, in short, holds all the data in tables so that business analytics and reports can be built on it; that is what a data warehouse looks like. I hope you are getting a basic idea of why we need a data warehouse. Now look at the left-hand side of the diagram. "Is a database a
            • 112:00 - 112:30 server?" Yes, a database runs on a server; a database server means something like an Oracle server or a SQL Server instance, and there are many different servers. On the left-hand side you have the OLTP systems, and not just one kind of server: Oracle, Microsoft SQL Server, SAP, DB2, flat
            • 112:30 - 113:00 files, and so on. For example, in the healthcare industry the clinical data comes from many different vendors in different places. Where do they place that data? The files are placed on remote servers, SFTP or FTP servers, and from those remote servers we have to pick up and consume the data and then load it into
            • 113:00 - 113:30 the data warehouse for reporting purposes. In data warehousing we call this a heterogeneous source system: heterogeneous sources means data coming from various source systems such as Oracle, SQL Server, DB2, and flat files. "What is the difference between a data warehouse and a data lake?" Yes, I will explain that. With heterogeneous
            • 113:30 - 114:00 sources this is what in IAC we have the replication so we can replicate the data from any sources to the data link directly okay you can replicate so you can synchronize so all this you can do like you can go for backup synchronization all this you can can do in IAC we can do uh very easily so that's what we'll go for iacs okay this is what heterogeneous sources this is nothing but oltp system oltp means
            • 114:00 - 114:30 online transactional processing so oltp in order to run the business I need to have online transactional processing so transactional database transaction processing that means oltp system to run business I need to have olp system to run the business so in order to analyze the business in order to analyze the business I need to have oil AP system okay so I need to
            • 114:30 - 115:00 have oil AP system so if you if you see here this is what my oil AP system oap means we have online analytical processing analysis I want to do so here we have the oil AP system online analytical processing okay so analytical processing that's what we'll go for oil AP system in order to analyze the business left hand side we have the to run the business right to analyze the business okay so I will go for I go for oil AP
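            As a rough illustration of the difference (table and column names here are made up for the example, not taken from the course), an OLTP statement records one transaction while an OLAP query aggregates years of history:

                -- OLTP: record a single transaction (runs the business)
                INSERT INTO orders (order_id, customer_id, amount, order_date)
                VALUES (1001, 42, 250.00, SYSDATE);

                -- OLAP: analyze the business across years of history
                SELECT EXTRACT(YEAR FROM order_date) AS order_year,
                       SUM(amount)                   AS total_sales
                FROM   sales_fact
                GROUP  BY EXTRACT(YEAR FROM order_date)
                ORDER  BY order_year;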
            • 115:00 - 115:30 Sometimes the OLTP side itself is hosted in the cloud, and that is fine; companies with huge volumes of data do that. They send data from the source system to the target system day by day, or several times within a day, through different batch jobs. Think of when you cancel a product order, or you are
            • 115:30 - 116:00 transferring money: they tell you the amount will be reversed or the settlement will happen only after 48 hours or two to three days. Why don't they do it immediately? Because in the back end they perform reconciliation: they check all the balances, and intraday and intra-bank
            • 116:00 - 116:30 communication takes place. A single transaction involves a lot of banks: you have a card from one bank, you pay through a different gateway such as UPI, PayPal or another payment gateway, and the merchant may use yet another bank, so many banks are
            • 116:30 - 117:00 involved. If you cancel a product after receiving it, a reversal, cancellation or dispute has to be processed; all of that data is captured and flows through a lot of ETL: many table entries are made before the amount is reversed. So whenever they say 48 hours or one day, it is because back-end ETL processes are running, and
            • 117:00 - 117:30 that takes time; it runs on a schedule. We extract the data from the source, transform it, and load it: extract, transform and load, the ETL process, and the same terminology applies in the cloud. Sometimes we call it an ELT process instead. ELT means extract the
            • 117:30 - 118:00 data, much like a replication server: you extract the data, load it into the data lake or data warehouse as raw data, and only afterwards process and transform it. That is extract, load and then transform: the ELT process (a small sketch follows below).
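            A minimal SQL sketch of the ELT idea described here, assuming hypothetical staging and target tables (none of these names come from the course): land the raw extract first, then transform inside the warehouse or lake, away from the OLTP source:

                -- E + L: land the raw extract as-is into a staging table
                INSERT INTO stg_orders_raw
                SELECT * FROM ext_orders_csv;   -- e.g. an external table over the raw file

                -- T: transform afterwards, inside the warehouse, without touching the OLTP source
                INSERT INTO dw_orders (order_id, customer_id, amount, order_date)
                SELECT order_id,
                       customer_id,
                       TO_NUMBER(amount_str),
                       TO_DATE(order_date_str, 'DD/MM/YYYY')
                FROM   stg_orders_raw;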
            • 118:00 - 118:30 Most of the ELT work happens on the data lake side, i.e., in the cloud. Platforms like YouTube, Instagram, Twitter or Facebook generate huge volumes of data every second all over the world, so they cannot rely only on batch processing. Batch data processing means there is some frequency, say once an hour, at which data is extracted from the OLTP system,
            • 118:30 - 119:00 transformed, and loaded into the data lake or data warehouse; that is batch processing. It runs batch by batch: intraday jobs every half hour, one hour, four hours or six hours, plus daily, monthly and quarterly jobs and backup jobs; many different jobs keep running. Bulk data is captured from the source, we apply
            • 119:00 - 119:30 some transformation logic and then load it, usually through several stages rather than loading directly; that is batch processing. We also have live (streaming) processing, but batch processing is what you mostly see on the data warehousing side. So what is a data warehouse? A warehouse is simply a place where things are stored; in the same way,
            • 119:30 - 120:00 a data warehouse stores data for historical analysis: where were we five years ago and where are we now? That kind of analysis is done on the data warehousing side, so historical data is stored there; that is the data warehouse. Traditionally the data warehouse sits on an RDBMS. What is an RDBMS?
            • 120:00 - 120:30 A relational database system such as Oracle. Some companies still use an RDBMS on the data warehousing side, with Oracle or Microsoft SQL Server, and the source side uses the same kind of RDBMS. Even in the cloud, Amazon offers RDS, the Relational Database
            • 120:30 - 121:00 Service, where you can create Oracle or SQL Server instances. On the classic data warehousing side we also have Netezza and Teradata; these RDBMS-based warehouses handle only batch-processed data, and they run
            • 121:00 - 121:30 on on-premises servers. On-premises means the company maintains the servers itself: Oracle, SQL Server, Netezza, Teradata and so on. The enterprise data warehouse is an enterprise-wide, commercial, licensed product, and companies go for the
            • 121:30 - 122:00 enterprise data warehouse; and then there is the data lake. The data lake also holds historical data, but it lives mostly on the cloud (the term lakehouse is also used) and it handles far more volume than a data warehouse. The RDBMS-based warehouse handles only structured data, i.e., tables:
            • 122:00 - 122:30 all the data is stored in tables, such as transaction tables, dimension tables and fact tables; only structured data. On the data lake side, however, you can store structured data from any RDBMS as well as semi-structured and unstructured data, like all
            • 122:30 - 123:00 the data coming from cloud systems. On the cloud, as I mentioned, you can pick whatever services you need: the various AWS services, BigQuery and other services on GCP, and Azure. Snowflake is like a framework built on top of any of these data lakes; they build this
            • 123:00 - 123:30 framework on the data lake side. You can also create RDS instances there, or go for a cloud data warehouse: Redshift in AWS is essentially a data warehouse, and in the same way you can create the different RDS services. On top of that you can stream the data. Normally they
            • 123:30 - 124:00 choose between batch processing and streaming. Streaming means data is collected continuously from the source system to the target system, flowing like water, packet by packet, from different servers without stopping; WhatsApp, Facebook and the others send data this way continuously. They
            • 124:00 - 124:30 then run a lot of business analysis, business intelligence and artificial intelligence on top of it. Data scientists handle one kind of processing and ETL developers another; data engineers and data scientists each process the data in their own way. This continuous way of handling data is called streaming. We stream the
            • 124:30 - 125:00 data as it is generated, run it through streaming processors, and load it here; recommendation systems and similar outputs are generated and fed back to the OLTP system for a better user experience and better recommendations, with many machine learning and AI algorithms running here. Now, if you take Informatica Power Center, it
            • 125:00 - 125:30 connects mostly to RDBMS systems. Informatica Power Center is one ETL tool among several in the market; I will list others later. Power Center is the tool; Informatica is the company name, and they have several products such as Power Center, MDM, Address Doctor, Data Quality and EDC (Enterprise Data Catalog).
            • 125:30 - 126:00 Power Center is one of these ETL tools, and then we have IICS, Informatica Intelligent Cloud Services. With IICS the ETL server is maintained by Informatica itself instead of on-premises; the only thing we have to do is install a Secure Agent, even on our own machine, and real-time projects also
            • 126:00 - 126:30 install the Secure Agent on an on-premises server and connect their own RDBMS to the cloud databases through it. IICS itself does not stream data; it takes data from source systems: you can connect to RDBMS sources, various flat files, FTP
            • 126:30 - 127:00 and SFTP servers, Dropbox and so on, and cloud systems too; there are around 50 cloud connectors, and Informatica keeps adding connectors and enhancing the servers on the IICS side. That is why most companies are moving towards IICS, and in future we will see far more IICS than Informatica Power Center.
            • 127:00 - 127:30 Already today companies run both Power Center and IICS on their projects and are migrating Power Center workflows to IICS workflows. IICS even has a facility to convert a Power Center XML export into an IICS asset, so you can convert it into a task flow and run it in IICS directly.
            • 127:30 - 128:00 That is why companies are moving towards Informatica Intelligent Cloud Services; we will cover that conversion later. Coming back to the data warehousing flow: after loading the warehouse we build data marts. What is a data mart? Based on my business area, my subject of the business, I will have
            • 128:00 - 128:30 different data marts; a data mart is just a logical division of the data warehouse or data lake, much like splitting a 1 TB hard disk into multiple drives. In a real-time project we have different data marts, for example a customer data mart and a merchant data mart; all our merchant-related
            • 128:30 - 129:00 reporting is served from the merchant data mart. From the data mart we produce merchant reporting and business intelligence reporting: BI reporting is for the business people, while merchant reporting is for the merchants, the end users (a sketch of such a mart is shown below). In banking, the CEO, the VPs and others use the business intelligence reports within their own company.
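            One way to picture a data mart as a logical slice of the warehouse is a view over the warehouse tables; the table and column names below are purely illustrative, not from the course:

                -- A merchant data mart carved out of the warehouse as a logical subset
                CREATE VIEW merchant_mart_daily_sales AS
                SELECT m.merchant_id,
                       m.merchant_name,
                       TRUNC(f.txn_date) AS txn_day,
                       SUM(f.amount)     AS daily_amount,
                       COUNT(*)          AS txn_count
                FROM   dw_transactions_fact f
                JOIN   dw_merchant_dim      m ON m.merchant_key = f.merchant_key
                GROUP  BY m.merchant_id, m.merchant_name, TRUNC(f.txn_date);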
            • 129:00 - 129:30 They analyze the business internally, but the bank also has many merchants, and every day it sends them merchant reports: using our bank, your customers purchased this many products, and this is the gateway processing fee we have taken, some small amount for every 100 rupees processed for using the banking service. That
            • 129:30 - 130:00 report is sent to the merchant at the end of the day. There is merchant reporting, customer reporting, and for the bank's own customers a monthly PDF statement listing all the transactions. Behind the scenes, intraday, intra-bank and inter-bank communication and settlement keep running. All of these reports are taken from the data mart, and the business intelligence reports come from there too.
            • 130:00 - 130:30 Applications that consume data from us are called downstream applications. That, in summary, is the ETL process. The difference between a data warehouse and a data lake: a data warehouse mostly holds structured data, while a data lake on a cloud system holds structured, semi-structured and unstructured data. We will see more on the data
            • 130:30 - 131:00 lake and data warehouse later. Will the IICS target be a cloud data warehouse? Yes, definitely. OLTP and OLAP are both just servers; the OLTP server is the live online server, so we do not disturb it: we keep a replication server, collect the data from that replica, and do all the analysis and business reporting on the OLAP side. I will give you some report names and
            • 131:00 - 131:30 metrics later so you get a clearer idea; that reporting is done on the data warehousing side. The role of an ETL tester: whatever development the ETL developer does, transferring data from OLTP to OLAP, the tester has to verify against all the requirements whether it was done correctly. That is the ETL tester's responsibility. In an
            • 131:30 - 132:00 organization, do they have both? Yes, many companies have both a data warehouse and a data lake, and they do not migrate everything from Informatica Power Center to IICS at once: new projects are built on IICS while the old projects keep running on Power Center. They
            • 132:00 - 132:30 have a migration plan with a cut-off date by which all components must be moved, then they decommission the old servers and move to the new ones, step by step. What is the difference between an MS SQL database and Oracle? MS stands for Microsoft, so MS SQL
            • 132:30 - 133:00 Server is Microsoft's database for storing data, and Oracle is another database. It is like having an HP laptop or a Dell laptop: all laptops serve the same purpose, but people choose a brand based on budget and preference. In the same way, companies choose a database based on their project, their budget, the data
            • 133:00 - 133:30 volume and so on. What is ELT versus ETL? ETL is done batch-wise; with ELT we also use replication, a one-to-one load from source to target where the raw data is captured. Most data lakes follow the ELT approach: consume the data directly, load it into the data lake, and process it afterwards. Why
            • 133:30 - 134:00 do they use ELT? If you extract from Oracle, transform, and only then load, and something fails, you have to keep reconnecting to the heterogeneous source systems, and data availability becomes difficult. Instead, connect once, do nothing to the data, just load the raw data into a staging layer, and then do all the processing on that raw data; that way it will not
            • 134:00 - 134:30 impact the source. The analytical database is huge, and the business depends on the OLTP system, so we avoid touching it repeatedly; that is why we keep OLTP and OLAP separate. Is IICS an ETL tool or an ELT tool? Both: tool-wise, IICS can do ETL as well as ELT, replication and
            • 134:30 - 135:00 synchronization; it is the company's choice whether to use it as an ETL process or an ELT process. The difference is in the process, not the tool; the tool can do both. Tables hold structured data; semi-structured data, for example, is an email: the To address, CC, BCC and subject are structured, but
            • 135:00 - 135:30 the body of the message and the attachments are unstructured, so the email as a whole is semi-structured. Unstructured data is things like logs: when you visit a site it asks permission to use cookies, which means it collects data such as where I hover my mouse, how much time I spend
            • 135:30 - 136:00 on the site and which products I view, the site impressions, and streams that data. Cookies, logs, audio files and video files are all unstructured data. In our retail business processes we mostly deal with structured data. Why is ELT used when we
            • 136:00 - 136:30 already have ETL? ELT extracts and replicates the raw data from source to target, and then all the transformation logic and recommendation systems run on the target side. Do not run transformation logic on the OLTP side: that system cannot be hit again and again at all hours, so pull the raw data once, replicate or stream it across, and do all the analysis
            • 136:30 - 137:00 on the data lake side; that is why we go for the ELT process. Hi everyone, welcome to NIC IT Academy. In this session we are going to learn how to install IICS on a Windows machine. It is a web-based tool, so installation is very easy; just watch the session and start installing in parallel. If you have any
            • 137:00 - 137:30 issues during the installation or any other queries, add them in the comments and I will answer them. If you haven't subscribed to the channel, please subscribe and click the bell icon so that you get all the notifications. Let us start. In today's session we are going to learn how to create an IICS free trial account and how to install the Secure Agent on our machine. In order to work
            • 137:30 - 138:00 with IICS we need an IICS free trial account. It is web-based and can be used for the next 30 days. Let us start with how to create the free trial account; just follow these steps and you will be able to create it without any issues. As the first step go to the Informatica site, www.informatica.com (you can just Google it); you
            • 138:00 - 138:30 can go to that link and you will see the free trial options. Click on the free trial and you will see "choose your free cloud platform trial"; scroll down a little, you will see one more free trial option, click on it, and this is the page for creating the free trial account, free for the next 30 days. We have to provide all the details here:
            • 138:30 - 139:00 first name, which I am entering here; job title, which you can give as Developer; and work email, where you do not actually need a work address, a personal email ID is fine, so provide your own email ID. I am just providing a trial email ID
            • 139:00 - 139:30 here; you can provide your own. For user role you can choose any role, I am choosing the Developer role. Then provide your phone number and an organization name (any name will do). Next go to the country:
            • 139:30 - 140:00 select the country, state, city, postal code and your nearest data center location, tick "I have read and agree to the subscription agreement", and the Start Your Free Trial button will be enabled. Click it and you will get the message "Congratulations on starting your trial account, check your inbox for more information". Go to your email and
            • 140:00 - 140:30 you should have received one email. Open it and verify the account by clicking "Verify me now". Then set a password: the email ID is our username, choose and confirm a password, pick a secret question and provide the secret
            • 140:30 - 141:00 question answer, then submit. You will land on a welcome page; you can click "Don't show this again". This is the first page you will
            • 141:00 - 141:30 see. We have successfully created an IICS account. Based on the role you chose you will see different services; if you chose the Administrator role or another role you may find other services enabled. For a data integrator these services are enough: click on Data Integration to create a mapping, and to create a Secure Agent click on
            • 141:30 - 142:00 Administrator; go to Monitor to see the jobs. So we have created a 30-day free trial account; that is how we create an IICS free trial account. To create a Secure Agent, after successfully creating the Informatica Cloud account, go to the
            • 142:00 - 142:30 Administrator service and open Runtime Environments. You will see that the Informatica Cloud Hosted Agent is up and running; that hosted agent runs on a cloud machine in the data center we chose. To run any IICS mapping locally we need our own local Secure Agent, so click on Download Secure Agent. For a Windows machine we can
            • 142:30 - 143:00 choose the appropriate OS here, then copy the installation token and click Download. Depending on your internet speed the download will take a minute or two. Once downloaded, run the agent installer and you will get this
            • 143:00 - 143:30 installation wizard. You can leave the install location as it is, click Next, then Install. Note: if you have already installed the IICS Secure Agent before, uninstall it first, go to the C drive under Program Files, and if an old folder
            • 143:30 - 144:00 is still there, delete it before installing. Next we have to provide the username, which is the email ID we registered with, and the install token, which is whatever token we copied while
            • 144:00 - 144:30 downloading; paste the token and click Register. If you have forgotten the token or copied something else, go back and generate a new install token. After clicking Register it shows that the Secure Agent is starting up; you may see that not all the services are ready, which is
            • 144:30 - 145:00 not an issue. Reload the Runtime Environments page and you should now see one more Secure Agent added for your desktop. It may still show that not all services are up and running; go inside and you can see it is starting up. Wait a minute or two, refresh
            • 145:00 - 145:30 the page, and one by one the services will come up. The very first time, the Data Integration service takes longer to start, so wait a little more and it will come up automatically. You can see here that all the services are up and running now. So we have installed the Secure Agent, and you
            • 145:30 - 146:00 can see both the Cloud Hosted Agent and the local Secure Agent are up and running; both should be up and running before you create mappings and run them. If you face any issues during installation, please watch my session on issues with the IICS account where I have explained how to solve them. Thank you. In our previous session we saw
            • 146:00 - 146:30 how to create a trial account in IICS and how to install the Secure Agent on our machine. In this session we will see how to create sample connections and our first mapping in IICS. For this example I am going to read data from an Oracle table and load it
            • 146:30 - 147:00 into another Oracle table. On the left-hand side we have the source and on the right-hand side the target, both Oracle tables. I have installed Oracle 21c on my machine; you can install Oracle 21c or any other database. In real projects the source and target can be different data sources, but in this example it is an Oracle-to-Oracle load. In between I am going
            • 147:00 - 147:30 to bring in Informatica IICS: through IICS I will read the data from the source and write it into the target table. In the source we have a schema called HR, and in the HR schema there is a table called EMPLOYEES, so I am going to read the data
            • 147:30 - 148:00 from this EMPLOYEES table through IICS and write it into the CORE user, where we have a table called T_EMPLOYEES. It is a table-to-table load, RDBMS table to RDBMS table; in real-time projects you will see different kinds of connections. From here I read the data and there I write
            • 148:00 - 148:30 it into the target table. In a real-time project you may have RDBMS to cloud, flat file to cloud, cloud to cloud or other varieties of sources and targets. If you look here I have
            • 148:30 - 149:00 SQL Developer open (you can also install the Oracle or SQL Server database yourself) and you can see the connections; in the HR schema I have the EMPLOYEES table with 107 records in the source. I am going to load this data into the target table, which as of now has no data (a couple of quick sanity checks are sketched below).
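            Before building the mapping you can check both sides in SQL Developer; the schema and table names follow what is shown in the video:

                -- Source: should return 107 rows in the sample HR schema
                SELECT COUNT(*) FROM hr.employees;

                -- Target: should be empty before the first load
                SELECT COUNT(*) FROM core.t_employees;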
            • 149:00 - 149:30 Whatever data sources we have, we first go to the Administrator service (make sure both agents are up and running) and open Connections. We will create a new connection: in the right-side corner you can see New Connection, click on it. Through IICS we create one connection for the source
            • 149:30 - 150:00 and one for the target. Give the connection a name, I am using ORACLE_SRC; add a description if you want; set the type to Oracle; choose the local machine as the runtime environment; the Oracle subtype is on-premises server; and for the username provide the HR user
            • 150:00 - 150:30 with its password (HR in my case, or whatever credentials you have). For the host you can use localhost, port 1521, and the service name; on my system it is XEPDB1, on yours it may be XE or ORCL, so provide whichever applies. For code page I am using UTF-8. Whatever your source connection
            • 150:30 - 151:00 details are, enter them and click Test Connection; you could equally use SQL Server or any other connection, just supply its details. The test should be successful; click Save. The same way we will create one more connection: go to Connections and create ORACLE_TGT. Here you
            • 151:00 - 151:30 create another Oracle connection in the same way: choose the local runtime environment, use CORE as the username and password, and provide localhost, port 1521 and service name XEPDB1. Please check the video description,
            • 151:30 - 152:00 where I have provided the installation steps for Oracle 21c or 11g; follow them to install Oracle, SQL Server or whatever database you want. Provide the details here, set the code page, and test the connection; it should be successful, then click Save. We have now created two connections, so next we will create a mapping to read the data from the EMPLOYEES table
            • 152:00 - 152:30 and load it into T_EMPLOYEES. The connection names we used are ORACLE_SRC for the source and ORACLE_TGT for the target: whenever you use ORACLE_SRC it connects to the source database, and whenever you use ORACLE_TGT it connects to the target database. Now we will
            • 152:30 - 153:00 create a mapping. Go to the Data Integration service homepage. Before creating a mapping we will create a project and, inside it, a folder: create a new project here, and inside the project you can create a folder. I have created a project,
            • 153:00 - 153:30 and now I am creating a mapping: click New, then Mapping. Initially the mapping will be in an invalid state. Give the mapping a name with the m_ prefix; it can be a test mapping or an employee-details mapping, so I am naming it
            • 153:30 - 154:00 m_employees_details. The location is our project, and you can optionally add a description such as "read data from HR.EMPLOYEES and load it into CORE.T_EMPLOYEES". Then click on the Source transformation. If you
            • 154:00 - 154:30 want, rename the source to "employees"; it is not necessary, but you can. Maximize the window, go to the Source tab, set the connection to the source connection, set the source type to Single Object, and click Select to pick the source object:
            • 154:30 - 155:00 choose whatever object you want; I have selected the EMPLOYEES table. In the Fields tab all the fields are pulled from the source we defined. I will cover the Filter, Sort and other options later; there are many options, just like in Power Center. Now go to the Target, name it t_employees, maximize the page, and go
            • 155:00 - 155:30 to the Target tab. Choose the target connection we created, single object, and select the target table we created, T_EMPLOYEES; click OK. Then tick Truncate Target Table: this option truncates the target table before loading, so it will
            • 155:30 - 156:00 empty the data and then load. Go to Target Fields to verify the target fields, then go to Field Mapping: you can use Automap or map manually. Under Automap choose Smart Map to generate the mapping; this is the column-level mapping we provide in IICS. Just
            • 156:00 - 156:30 click Save and the mapping should now be valid. For the mapping we can create a task and run it, or we can run it directly without a task: just click Run. Make sure the runtime environment is our local machine; by default Informatica Cloud is selected, and if you choose Informatica Cloud the connections will not be
            • 156:30 - 157:00 reachable and it will fail, so choose the desktop agent instead and click Run. Under My Jobs you can see the job running; click Refresh and you will see it succeeded and loaded all the data: all 107 records have been loaded into the target table. We
            • 157:00 - 157:30 just read the data from the source table and loaded it into the target table. Do the same yourself: install Oracle 21c (the installation steps are in the description), create the two schemas, the HR sample schema and another user called CORE, and load the data (a sketch of the target-side setup follows at the end of this session). Thank you; practice this, and we will see the next concept in the next
            • 157:30 - 158:00 session.
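            If you want to set up the target side yourself as suggested above, a minimal sketch follows; the CORE password, grants and tablespace are just examples, and the HR schema comes from Oracle's sample schemas:

                -- Run as a DBA user inside the pluggable database (e.g. XEPDB1)
                CREATE USER core IDENTIFIED BY core;
                GRANT CONNECT, RESOURCE TO core;
                ALTER USER core QUOTA UNLIMITED ON users;

                -- Create the target table with the same structure as HR.EMPLOYEES, but empty
                CREATE TABLE core.t_employees AS
                SELECT * FROM hr.employees WHERE 1 = 0;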
            • 158:00 - 158:30 Hi everyone, welcome to NIC IT Academy. In today's session we are going to learn how to read data from a flat
            • 158:30 - 159:00 file into a table in IICS. We have seen the same scenario in Informatica Power Center, and now we will see it in IICS. To read data from a flat file in a real-time project we would keep the file on a remote or Unix server and read it from there, but here we will keep the file in a local path; any path you choose is fine. I am
            • 159:00 - 159:30 going to keep the file in this particular path, read the data from the flat file, and write it into a table. So this is our CSV file, and through IICS I am going to read it and write the data into an Oracle table; this is the table I am going
            • 159:30 - 160:00 to write to. In a real-time project the target might instead be a cloud system such as Amazon S3, GCP or any other cloud, and on the source side the file would sit on a Unix or Linux server. From here we are
            • 160:00 - 160:30 going to read it and write the data into the table. First I need to create the source file. I will take all the columns (EMPLOYEE_ID, FIRST_NAME, LAST_NAME, EMAIL and so on) from the HR schema's EMPLOYEES table and export that data as a file. Since I have a date-and-time column, and in
            • 160:30 - 161:00 IICS the date format has to match one of the supported formats, I am selecting the HIRE_DATE already converted to that format in my query; the query takes care of it. I execute it, right-click on the result data and click Export to create the source CSV file. Choose CSV as the format, set the left enclosure
            • 161:00 - 161:30 to none (the right enclosure will automatically be none as well), and choose the path, here a folder on the C drive called SRC files. I am saving the data as source_flatfile_employees.csv; click OK and Save and the CSV file is created. You
            • 161:30 - 162:00 can open the file in Notepad++ and see the date format and all the data. I am going to rename one of the employees to "NIC IT Academy" so that we can confirm later that the data really came from this file (the export query used here is sketched below).
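            The export query behind this step might look like the following; the TO_CHAR on HIRE_DATE is what forces the DD/MM/YYYY text format discussed above (the exact query is not shown in the video, so treat this as an illustrative reconstruction):

                -- Export query: format HIRE_DATE as text so IICS can parse it later
                SELECT employee_id,
                       first_name,
                       last_name,
                       email,
                       phone_number,
                       TO_CHAR(hire_date, 'DD/MM/YYYY') AS hire_date,
                       job_id,
                       salary,
                       commission_pct,
                       manager_id,
                       department_id
                FROM   hr.employees;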
            • 162:00 - 162:30 So this is the source file, and the target table is in our Oracle database; I will read from the file and write into the table. Now go to IICS, check that both runtime environments are up and running, go to Explore, open the project we created, and click New to create a new mapping. Click New Mapping and Create; the
            • 162:30 - 163:00 mapping will be created. Provide the mapping name following the naming convention, m_source_flatfile_employees; the location is the project we already created, and you can optionally add a description such as "read the data from flat file to table". I am
            • 163:00 - 163:30 adding the description here. Then click once on the Source transformation to select it, maximize the window, go to the Source tab and click on the connection. If you have already created a flat file connection you can reuse it; if not, we create a new one. Since the date format is different here, I will create a
            • 163:30 - 164:00 new connection in the Administrator page: go to New Connection, give it a name (I am calling it Source_FlatFile_EMP), add a description if you like, choose Flat File as the type, choose the local runtime environment, give the source directory, and then the date format, which is very important.
            • 164:00 - 164:30 That is why we created the file in this format, DD/MM/YYYY. As of now IICS does not support custom date-and-time formats, only the listed ones, so whoever is giving us the file has to supply dates in one of these formats. If they provide any
            • 164:30 - 165:00 other format I cannot read it as a date; instead we would read it as a string and then convert it with an Expression transformation, which is the other option (a sketch of that conversion is shown below). For now we can stay with the supported date format. Click Test; it pings the directory and validates it, then click OK. Now the connection is created.
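            The fallback mentioned here (read the date as a string, then convert it in an Expression transformation) would use a conversion along these lines; the port name is an assumption, and the same TO_DATE function can be checked in Oracle SQL:

                -- In an IICS Expression transformation the output port would be something like:
                --   TO_DATE(HIRE_DATE_STR, 'DD/MM/YYYY')
                -- Quick sanity check of the format mask in Oracle SQL:
                SELECT TO_DATE('17/06/2003', 'DD/MM/YYYY') AS hire_date FROM dual;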
            • 165:00 - 165:30 Back in the mapping we have provided the connection; the source type is single object, a single file. Click Select Object and choose the CSV file, then click OK. Since it is a file, the other options are disabled. Go to Fields: because it is a flat file you will see all 11 fields with a string data type, so we have to define the correct
            • 165:30 - 166:00 data types. Click Options, then Edit Metadata, and provide the native data type for each field, i.e., how the flat file data types map to Informatica data types. What is the data type of each source field? EMPLOYEE_ID we can keep as a number; you can
            • 166:00 - 166:30 give it a precision of four or six, whatever precision you have. FIRST_NAME we keep as string, LAST_NAME as string, EMAIL as string, PHONE_NUMBER as string; HIRE_DATE is date/time, so it automatically takes the date/time precision. Then JOB_ID, and the SALARY column is a number; COMMISSION_PCT is a number too, but it is a
            • 166:30 - 167:00 decimal, so make sure you set the decimal point correctly; that is very important. For a decimal number you provide the native data type with precision and scale, for example 8,2; providing the scale matters. MANAGER_ID is a number of four digits, and DEPARTMENT_
            • 167:00 - 167:30 ID is also a number of four digits. Verify once more that all the data types are given correctly and click Save. The source is now defined. Since it is a direct load, reading from the source and loading into a table, go to the Target transformation, maximize it, and open the Target
            • 167:30 - 168:00 tab. I have already created the ORACLE_TGT connection; choose it, single object, and select the object T_EMPLOYEES, then click OK. The operation is insert only; if you want you can truncate the target table, which means that before loading the data into the table I
            • 168:00 - 168:30 will truncate the target table and then load. Here we can do truncate-and-load, but for an incremental load we should not truncate. Go to Target Fields to see the fields from the database, then do the field mapping, which maps the source flat file fields to the target table columns in IICS. Use Automap and click Smart
            • 168:30 - 169:00 Map and it will map automatically; you can see that EMPLOYEE_ID and all the other columns are mapped. Save it and the mapping should be valid. In IICS we can simply run the mapping, or we can create a mapping task for it; a mapping task is the equivalent of a session in Informatica Power Center. I give the
            • 169:00 - 169:30 mapping task a name following the same convention (..._source_flatfile_employees), set the location, choose the runtime environment, and choose the mapping we just created by clicking Select. Click Next; if you want you can add a schedule, and if you want email notifications you can
            • 169:30 - 170:00 provide those details too; the rest of the options we will see later. Click Finish and the mapping task is created and saved. So this is the mapping and this is the mapping task; click Run. Before running, note that there is no data in the table; now I run it and it will load the data. You can see it
            • 170:00 - 170:30 starting; click the Refresh button, and if it shows that an update is available, click it again. In the next state it should show 107 rows processed and Success; if it shows a warning you have to go inside and check the log, where you can see how many records came from the source, how many were written to the target, and whether there were any errors. All this you
            • 170:30 - 171:00 can check, and if you want you can download the session log to see where the error is. If you need to restart, you can restart it from here as well. This is how we confirm the data has been loaded: query the target and you will find the data loaded from the flat file into the table. That is how you do a flat
            • 171:00 - 171:30 file to table load. This was a simple comma-separated file with a single delimiter; in the next session we will look at different delimiters and different options. Thank
            • 171:30 - 172:00 you.
            • 172:00 - 172:30 Hi everyone, welcome to NIC IT Academy. In today's session we will learn how to read data from an Excel file into a table in IICS. In real-time projects we mostly get data as CSV, TXT, JSON or XML files
            • 172:30 - 173:00 or other formats, but in rare scenarios we may get the data as an Excel file. How do we read from an Excel file in that situation? In IICS we need two components: first an Intelligent Structure Model, which defines the Excel sheet structure
            • 173:00 - 173:30 and, to actually read the data from the Excel file, a Structure Parser transformation; these two components are mandatory. First we create the ISM. This is the Excel file, kept under the C drive SRC files folder. Open it and you can see that in a single workbook I have two sheets: one sheet
            • 173:30 - 174:00 contains employee details and the other contains transaction details with columns such as transaction date, transaction ID, account number, name, credit or debit, amount and balance amount. I have to read this Excel workbook with its two sheets through IICS and write the data to Oracle tables; instead of
            • 174:00 - 174:30 Oracle you could also write into any cloud system, but here we will write to Oracle tables. First I create the intelligent structure model: make sure both runtime environments are up and running, then go to Explore. We have created a project called NIC IT
            • 174:30 - 175:00 Academy, and in this project I will create everything. Go to New; the very first component we create is the Intelligent Structure Model, which the structure parser will use to read the data from the Excel file. Click on it, click Create, and it asks for the name of the ISM; you can provide
            • 175:00 - 175:30 ism_emp_txn. For the location, keep it in this project folder. You can base the model on file sampling or on the entire file: with file sampling it takes sample data and builds the model, otherwise it reads the entire file; here we choose file sampling. Then choose the file; this is the file
            • 175:30 - 176:00 we pick. It has two sheets: displayed as EMP you can see all the employee data, and you can also see the TXN (transaction) data. If this data looks correct, click Discover Structure and it will discover the structure. You will find a pencil icon: if a column name or header is not correct you can change it however you want, for example if an underscore, a space
            • 176:00 - 176:30 or a special character is not read correctly, click Edit and define it properly. As of now it looks good. Scroll down a little and it shows the Excel elements: there are two sheets, EMP and TXN. Under EMP you can see the elements; if each column appears with the plus symbol as an element, it means the Excel sheet is being read correctly, and you can see it
            • 176:30 - 177:00 lists EMPLOYEE_ID and all the other columns. Click on the transaction sheet and it displays the transaction elements as well, all showing correctly, so the ISM is done. Click Save and it is saved. That is the first step, the intelligent structure model; now we create a mapping to read the data from the Excel sheet. Go to New,
            • 177:00 - 177:30 click Mapping, New Mapping, and create a mapping named m_emp_txn_excel_source. For the location choose the project folder and click OK; for the description you can optionally write something like "read the data from Excel
            • 177:30 - 178:00 sheet"; it is not mandatory. Now the source: for this we have to create a small config file. Wherever you keep the source file, create a config file containing a file_path header and, under it, the path of the source file; it has to be provided in this format so the mapping knows where
            • 178:00 - 178:30 the source file is kept (a sketch of the config file follows below). The file extension in the path is very important, so provide it correctly. First we go to the Source transformation to define the source path. I have already created a connection for this directory, but you can create a new one if needed.
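            A sketch of what that one-line config file could contain: the file_path header follows what is described in the video, while the directory and Excel file name here are assumptions, so substitute whatever your workbook is actually called:

                file_path
                C:\SRC files\emp_txn.xlsx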
            • 178:30 - 179:00 While creating the connection, you can see here that we have to make sure about the date and time format; the date format is very, very important. In the Excel file, if you open it, you can find the date format: for date of birth we have the DD/MM/YYYY format in EMP as well as in TXN, the same format in both, so that's why
            • 179:00 - 179:30 here I have given the DD/MM/YYYY format. We have to make sure this format is provided correctly, and the directory is the C drive, SRC files; that is the path I have given. These two are very, very important. You can test it; the test just pings the path and finds out that it is correct. So now click on close and select; we can just select the input, this
            • 179:30 - 180:00 file path. This file is open, that's why we are getting this second entry, so you can close it. Then you can select this path, this file; it will not have any fields, it will simply have only one field, the file path, that's all. Now you can go to the structure part: the Structure Parser is the transformation we have to use, so choose a Structure Parser. Just click on it; initially it will not have any
            • 180:00 - 180:30 connection. Click on the Structure Parser once and maximize it, and go to the Structure Parser tab. Here you can choose the ISM which we have created already: go to the project and choose this ISM, and then you can see the incoming fields. There is no incoming field as of now; you can see this is the file path, and then it will show you the data, how many characters are there, it will read it. Then you can
            • 180:30 - 181:00 go to this connector and connect it now, from here to here, from source to Structure Parser. Then go to the Structure Parser; now you have the file path in the field mapping. You just have to map this file path to that file path, and you will have the mapped field here: in the mapped fields you should have the file path. In the output fields, yes, you can see here you'll have
            • 181:00 - 181:30 element one, that is sheet number one, and element two, which is nothing but sheet number two. You can choose all these elements, all the fields from element one and from element two also. This is the way you choose them, and you can see all the fields and data types as well; if everything is correct, yes, it will read correctly. You can see here it is showing as a string only, but it will take it
            • 181:30 - 182:00 correctly at run time. Now, since the Structure Parser output contains the date as a string in that format, you cannot load it directly, so we will use an Expression transformation: we will convert the date and time format and then load it. Otherwise it may fail; that's why we will take Expression transformation one, and then this is Target one. We have two targets, right; I have created two targets, T_
            • 182:00 - 182:30 EMP with the same structure, and one more table, T_TXN, I have created to load the data into the target. If you look at the structure of this table, you can see it: employee ID (number), employee name, date of birth, salary and skill set. The same way, you can see the
            • 182:30 - 183:00 structure of this particular table. You can also create this table to load the data, or you can load it into a flat file, that's also fine. I am loading into a table. So in order to read the data from here, how do we read the date and time? That is the point. I can take element one from here, since it has two Excel sheets; you can take element one now, click on OK, then you can take Target
            • 183:00 - 183:30 one. Now open this Expression; in the Expression you can go to the incoming fields and find all the incoming fields from here: employee ID, employee name, date of birth, all of this you are seeing. Here, if I wanted to read the date of birth in some other format, instead of a string, we would have to handle it in the Structure Parser;
            • 183:30 - 184:00 but we have already read the data, so in the Expression, go to the Expression tab here. Since the date of birth comes as a string, in order to load the data into the table we have to convert the transaction date as well as the date of birth column. So for this column I'll go here and create o_date_of_birth for EMP, and this should be a date and time
            • 184:00 - 184:30 type, date and time. Then the description, yeah. Here I have to click on Configure; go to the configuration. Here you have to use the TO_DATE function: which column is it? date_of_birth. And what is the format we are seeing in the file? It's DD/MM/YYYY, that's the format we have seen. You can validate it. You
            • 184:30 - 185:00 have to convert this one. You may ask the question, where did you see this format? You can go to the source, to the Structure Parser here, and you can find it. If you look at this one, the transaction date, the format is MM/DD/YYYY; the same way in EMP also it is
            • 185:00 - 185:30 reading the same format, MM/DD/YYYY. So go to the mapping now and we can define it: go to the Expression, MM/DD, right, so I can change it, the format is MM/DD/YYYY. Copy this and click on OK, then go to one more
            • 185:30 - 186:00 expression for the next sheet. You can take this one and take element two; the same way you can go for the target here, take the target and drag and drop. In expression one, click and go to the Expression tab, and you can create a new field, o_txn_date, a date and time type. Click
            • 186:00 - 186:30 on OK, then go to Configure. In the configuration you provide it the same way: what is the format? Transaction date, MM/DD/YYYY format. If the format is not correct it will fail; in that case we can check the log and then change the format. Okay, it is validated, the expression should be valid, so click on OK, and click on OK again; then go to the target. A sketch of these two conversion expressions is shown below.
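            To make this concrete, here is a minimal sketch of the two conversion expressions, one in each Expression transformation. The output field names and the incoming column names are illustrative; they may differ from the exact names generated by the ISM in the demo:

                o_date_of_birth = TO_DATE(date_of_birth, 'MM/DD/YYYY')
                o_txn_date      = TO_DATE(transaction_date, 'MM/DD/YYYY')

            If an incoming string does not match the format mask, the row fails at run time, which is why the format is checked against the source file first.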
            • 186:30 - 187:00 For the target we have to do the field mapping. Go to the first target; I'll go here. You can go to the target and find that I have already created a target connection for Oracle. Choose the target: T_EMP is the first table,
            • 187:00 - 187:30 so I can choose the table here and click on OK, and if you want you can truncate the target table and load it. IICS is trying to connect to the local database on my system and is establishing the connection. If you enable Truncate Target Table, before loading the data into the target table it will truncate the entire data and then load. You can go to the Target Fields and find that these are
            • 187:30 - 188:00 the fields from the table. Go to the field mapping and do all the field mapping; you can use Smart Map here, and instead of the incoming date of birth you remove it and take the date-of-birth column we created in the Expression. The same way, you can go to the second
            • 188:00 - 188:30 one: go to the target here, click on Oracle_TGT, select the object, it's the T_TXN table, click on OK. If you want you can truncate the target table; this will also create a connection. Go to the Target Fields, you can see all the fields here. In the field mapping, the same way, you can do Smart Map, and we remove the incoming transaction date
            • 188:30 - 189:00 column and take the o_transaction_date column instead, and then save this. The mapping should be valid; now the mapping is valid. If you want, you can create a mapping task. This is the mapping that we have created; then I will go to New, Mapping Task. A mapping task is nothing but a session in Informatica PowerCenter. Just
            • 189:00 - 189:30 create mt_emp_txn_excel_source; I have created one mapping task. I have chosen the runtime environment, and then under Mapping I select the mapping which we have created already. Go to Next: do I need to run the task on a schedule? No. Then email notification? No. And the other
            • 189:30 - 190:00 details: if you have any, we have to provide them, otherwise we can click on Finish and it will be created. The mapping task has been created; the mapping is available, the ISM is available. Now I'm going to run the mapping task: click on Run, go to My Jobs, and you can see this job is starting. You can refresh this page while it executes. It's running; if the status is changing, you can find here that an update is
            • 190:00 - 190:30 available, so you can refresh it again. You can find the job has completed; it's a success. Now we will go and check whether the data has been loaded or not. It has loaded the data; you can see here all the data has been loaded from the Excel file. Previously the table was empty, now it has been loaded. This is the way we can read the data from an Excel sheet into a
            • 190:30 - 191:00 table. Why did I create the Expression? Because the Excel file has a date column and the target table has a date and time column, I used the Expression transformation to convert it. If I wanted to load the data to a flat file, there would be no need for this Expression transformation; you could just create a target and load it. This is the way you have to do it, so practice this
            • 191:00 - 191:30 session and you'll get a clear idea. If you haven't subscribed to the channel, please subscribe; I will post more videos on IICS. Keep watching my channel, thank you.
            • 191:30 - 192:00 Welcome to NIC IT Academy. In today's video we will talk about the file list concept in IICS. This is the agenda for today's session: what is a file list, how can we call more than one file in the source, how to dynamically create the file names, and how to call an automation script in IICS. If you are looking for Informatica IICS, PowerCenter and PL/SQL self-paced courses, please go to nicitacademy.com and go to the courses page;
            • 192:00 - 192:30 just click on Courses, you can find all the courses there, and you can purchase the courses and start learning. Thank you. We will start our session. Hello everyone, very good morning and good evening to all. Now we'll see one more concept called file list in IICS. What is a file list in IICS? It is similar to indirect file loading in PowerCenter; we have indirect file loading there,
            • 192:30 - 193:00 the same concept, here called a file list in IICS. Assume that our customer is placing the source file; here I have the source file. We have already seen how to read the data from a source file. Assume it's some flat file in which they are loading the customer data or the employees' data; they have employees' data here, one file they are loading. So first the data we have loaded here, yes,
            • 193:00 - 193:30 they just place the file here, we start reading the file and we load it into our data warehouse system; we process it and load it into the target system, right, this is what we do. They keep the file in the drop landing zone and we start reading the file through IICS. But assume that they have a larger number of files and more teams; it's not only
            • 193:30 - 194:00 one single team, some other teams are also placing files. Assume that different teams are placing them, so it is not one single file; they are going to send more files: one file, a second file, a third file, a fourth file and a fifth file. This way they are going to send multiple files for today. Assume multiple files for today, so they are going to send files four and five as well, so five
            • 194:00 - 194:30 files they are going to send for today. Then we have to read all the files and load them into one single table here; we have one single table, T_EMPLOYEES. So how can I read all the files at a single time and load them here? What we rely on is that all the file structures are the same,
            • 194:30 - 195:00 meaning that if the first file contains 10 columns, the remaining files should also have the same number of columns and the same data types. We have seen this kind of scenario in our set operations, right, UNION, UNION ALL, INTERSECT, MINUS, where the tables must have the same structure; it's the same idea here. We are going to keep all these files in the same structure, and they are placing the files in the same structure. Then, with one
            • 195:00 - 195:30 file's structure we will create a mapping, but we will inform IICS that it is not a single file; we have a list of files, and that is what is called a file list here. We'll inform IICS, okay, it's a file list, and the input files are listed in this particular config file. You can give the config file some other name too; config.txt, for example,
            • 195:30 - 196:00 is what we can place. What is config.txt? We are informing IICS that it is a list of files, and instead of one single file we have to read five different files today, all with the same structure. All of them are employee details; assume it that way, one HR person is creating employees 1, another HR person is creating employees 2, and they have some 100 records, 200 records, 150 records, 500 records, 600
            • 196:00 - 196:30 records. All these records we have to combine and load here. How do we load them? This is a table here, an Oracle table; assume it's an Oracle table or any other target system, it might be Teradata, a cloud target or anything. So how can I read from all the files at one time and how can I load them into the target table? We'll see now. All the file structures should be the same; they should not have
            • 196:30 - 197:00 different date and time formats: if one file is in this format, all five files should be in the same format only. If I get a different format, then we have to inform the source team. All the files should have the same format, not different formats. Assume the customer is placing the files; I will take the files in this particular path, Input Files. So how many files are there today? 1, 2, 3,
            • 197:00 - 197:30 four; four files are there and we have to read all four. We'll check it here. This is the structure of our file, and this is the delimiter: the tilde, at and hash symbols. That is the delimiter for one file, and we have to read all the files; all the files are in the same format only. Then I will create one config file outside; outside I
            • 197:30 - 198:00 will create one config file, in this way. You can name it differently as well, a master file, a config file or any name. If you open the file you can see it, this is the file we have. Make sure that the file path does not have any space in between; we should not have a space like this, because then Informatica cannot
            • 198:00 - 198:30 read it. The path should be written this way, or you can put an underscore, that's fine. Wherever you are keeping the files, you have to keep all the files in that same path; if you are keeping them in any other path, then you have to mention the fully qualified path of wherever you keep them. As of now we have four files, so we have defined four file paths in this config file, something like the sketch below.
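            For illustration, assuming the input files sit in a folder such as C:\SRC_Files\Input_Files (the folder and file names here are only examples), the file-list config could look like this, one fully qualified path per line and no spaces inside the paths:

                C:\SRC_Files\Input_Files\employees_1.csv
                C:\SRC_Files\Input_Files\employees_2.csv
                C:\SRC_Files\Input_Files\employees_3.csv
                C:\SRC_Files\Input_Files\employees_4.csv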
            • 198:30 - 199:00 Tomorrow, if I'm keeping five files, I have to put all five file names in it. This is the config file we just created. Now I will go to IICS and inform it, okay, this is my file list. I'll go to New, Mapping, New Mapping, and just create a new mapping the normal way: m_file_list; this concept is called file list, and it is for employees' data,
            • 199:00 - 199:30 so I will take the employees' data here. Then I will take the source: go to Source, take the source here, and for the connection, wherever we are keeping the file, we have to use the same connection we already created; I'll take this connection. Now the source type: the source type is Single Object for normal files, but here we have to choose File List. Which option do we have to choose? File List. And what is the file here? The config file is what we have to
            • 199:30 - 200:00 choose, so I choose the config file. Do we need to configure anything for the structure? No need; now Informatica starts reading that particular file and it will define the structure here. Go to the formatting options; it's delimited formatting, go for this one, and you can define what the delimiter is here. The delimiter is this one, so we can choose the delimiter;
            • 200:00 - 200:30 under Other Delimiter we can put the delimiter here, and all the files should have the same delimiter. For each file which we process, do we need to create a connection? No need. Wherever you have placed the files, this connection we already created, so we can use it. If a file is in a different path, a different path but under the same
            • 200:30 - 201:00 root path, that is, the root path is the same but inside we have some different subfolder, then there is no need to create a different connection; we can use this path. In a real-time project we will keep one root directory for the whole file system and inside it we will have multiple folders at the project level; this is the way we will create it. If the root folder itself is different, then we have to create a new connection, otherwise there is no need. We have defined the delimiter characters, and the import
            • 201:00 - 201:30 starts from row number one, with the first data row read from the second line, so we put 2 there and click OK; we are reading the data from the second line. After defining all this it's very simple: you can go to Preview Data and just check whether it is reading correctly or not. Yes, see, it is reading correctly; from all the files
            • 201:30 - 202:00 it has started reading. We can click OK here, then we can go to the fields. Don't forget, this is very, very important: since it is a flat file, we have to define each field's data type. Go to Options, Edit Metadata; based on one file we will define the data types. Employee ID is a number of 10 digits, and we'll go for hire date, which is a date and time format. And what is the format here? If it has a
            • 202:00 - 202:30 different date and time format, see here, one notable point: my date and time is MM/DD/YYYY. Then we have to change the connection, right? Yes, we have to change it, and I will tell you where; we have to make the connection change because we created the connection for a file with a different format and now the file has a different format,
            • 202:30 - 203:00 so the connection setting is different and we have to change it; I will tell you where. Then job ID and salary: salary is a number only, we can go for 10 digits, and commission percentage is a number where we can use precision 4, scale 2; the scale is nothing but the decimal places, so the total is four digits and out of those, two digits are after the decimal point. So we have number(4,2), and department ID is a number of four digits.
            • 203:00 - 203:30 Based on the data we can define all the column data types; this is the origin it is coming from. Now I will go to the source: this connection is MM/DD/YYYY, yes, it's correct. If you have doubts, just click on View on the connection to see what date format we defined; we defined the date format as MM/DD/YYYY, the same
            • 203:30 - 204:00 format, MM/DD/YYYY, so no issues. We created the connection and everything the correct way, and we have defined the data types, everything we have defined. And here, if you see this after saving it, you are seeing only the type, precision and scale, but previously we saw the native type as well; yesterday I explained it, so
            • 204:00 - 204:30 what is this native type? One more time I will tell you. The native type is nothing but the data type at the source; our source is a flat file, so we are getting the data from the flat file, and that is the native data type, while this is the Informatica data type it converts into. The native types are the ones we define on the flat file; it reads from there and converts. If you are going to save
            • 204:30 - 205:00 it, it will automatically save this way. So we defined this one. Now I will go to the target; here I will take the target table. I will choose the target connection, and this is the target table I'm going to take: in our course schema we have the table called T_EMPLOYEES, so this is the table I have to load. We already have some data in it, so what I will do is
            • 205:00 - 205:30 truncate it manually, just to show you that there is no data as of now before we load. You could ask, okay, you can do the truncate in the mapping itself, so why are you doing it here? I'm just showing you that as of now we do not have any data; anyway, I'm going to select truncate and load. So here I select the target, T_EMPLOYEES, truncate and load, then field mapping; just
            • 205:30 - 206:00 do the field mapping. Remember, all four files should have the same structure; if any has a different structure, the mapping will fail. We have created one mapping, so can I run it? Do we need a mapping task for this? No need, you can just run it. If you have any task-level properties you want to set, such as email notifications, a pre-processing command or a post-processing
            • 206:00 - 206:30 command, then you can use the task, but for now we can run with the mapping. So I'm just going to run the mapping; it will execute, it is starting now, it will be queued on the cloud server and then start running; it will allocate the resources and start running. It succeeded. How many records? 428 records
            • 206:30 - 207:00 have been processed. We can check here whether the data has been loaded or not; yes, all the data got loaded. Now, for tomorrow, assume that there are some more files, say six files, or seven files, assume that,
            • 207:00 - 207:30 six and seven. As of now I just copied the same file, but in a real-time project the files are different and may have different numbers of records; the structure is the same, though. With seven files, if I'm going to run, will it automatically read all seven files? No. Why? Because in the configuration file we put only four files, so it will not read all seven files; even
            • 207:30 - 208:00 though the customer placed seven files, it will not read all of them. In that case we have to change the config file. What changes do we have to make? After the fourth, add the fifth file, the sixth file, the seventh file. Now you can ask me the question: we do not know how many files the customer is going to place; this is the landing zone, they are going to place the files there, so can I do this manually on a daily basis? No,
            • 208:00 - 208:30 we should not do it on a daily basis, it should be automated; this config file we have to automate. How to automate it, I'll tell you, but first I will run this mapping and check whether it is loading all seven files. One file contains 107 records, so with seven files, 749 records should get loaded; if it loads 749 records, that means it's correct.
            • 208:30 - 209:00 So we check for the 749 records; it has loaded, we can check here and count the number of records: 749 records have been loaded. But we cannot take this approach on a daily basis; we cannot create the config file by hand every day, so we have to generate it. See here, the Secure Agent has been installed on a Windows machine; the Secure
            • 209:00 - 209:30 Agent, where has it been installed? It is installed on a Windows machine, so in that case we have to use a batch command, a .bat command. We have batch commands for Windows; if it is Windows we have to use a batch command, and if it is Unix or any other system then we have to use a shell command, shell scripting. For Unix or Linux we would use that type of command; here we will use the
            • 209:30 - 210:00 batch command. So what is the batch command to create? We have the files in this particular path, so we have to navigate to this particular path, check how many CSV files are available there, and create one file listing them; here or inside a subfolder, anywhere, but you have to keep the input files in that path itself. How many files are there? See, to avoid
            • 210:00 - 210:30 confusion I created one folder, and I will ask my customer to place all the input files there; then we have to ask them to create the files. What I will do today is place only three files, and from these three files the configuration file should be created. Do we have a command for this? Yes, we have a command to list the files; we can keep it as a Windows batch file, or we can take this one as a command, and then
            • 210:30 - 211:00 we can use this command as well. If you see this, this is what it does: it goes to the directory here, C drive, SRC files, Input Files, lists down the files over there and creates one file called config.txt. Where will it create it? In this particular path, C drive, SRC files, the config path. We will check whether it is creating the file correctly or not; a sketch of such a command is shown below.
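            The exact command is not spelled out in the transcript, but a minimal Windows batch sketch that produces the same result, one fully qualified CSV path per line written into config.txt (folder names are only examples), could be:

                dir /b /s "C:\SRC_Files\Input_Files\*.csv" > "C:\SRC_Files\config.txt"

            Here /s makes dir print full paths (recursing into subfolders) and /b keeps the output bare, so the generated file can be used directly as the file list.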
            • 211:00 - 211:30 Just copy the command. I will remove this particular config file now with Shift+Delete; see here, I do Shift+Delete, so there is no file as of now. I'll go to the command prompt; go to the command prompt here, just paste the command and run it to see whether it works. How are you running the command here?
            • 211:30 - 212:00 It's a batch command, a Windows command; we can just run it. If it executes, yes, it will create the file here; it has been created, see here. How many files should it list? We kept three files, see here, three files, and it has created the entries with the fully qualified path. This file will be the source file, the input for our Informatica mapping, and then it will start reading. Tomorrow, if they
            • 212:00 - 212:30 are going to place more files, say five files, assume that. Here I have different names, a fifth one, and I'm naming one something like 61; a different name is not an issue. How many files do we have? Five. So I will run the same command one more time and it should create the config file with the five different files; see here, it has been created
            • 212:30 - 213:00 correctly. We should write the command this way, but you can use any other command as well; if you are going to have other file types here, you can filter so that only CSV files are listed. I just created this particular command; in a real-time project, based on where the Secure Agent has been installed, you have to use the appropriate command there. Now, where can I call this command? First we have to
            • 213:00 - 213:30 create a task for this; I'll go to the task here, that is what the task is used for. Go to Task, Mapping Task; for the mapping task, I'm going to name it along the lines of file list employee data, then I will choose this particular mapping, and this is the runtime environment. Go to Next, schedule details: we are not going
            • 213:30 - 214:00 to schedule it. If you need email, yes, you can put an email notification. Here we have the pre-processing command. What is meant by pre-processing? Before the mapping starts running, before it starts reading the data from the file, this command will be executed. So in this case I can use the pre-processing command here: I will copy the command, whatever command we used earlier, Ctrl+C, and I will paste it here.
            • 214:00 - 214:30 This is the command I can use. You may also ask, can we put this command in one particular batch file and call the batch file? Yes, you can call the batch file also; enter it as the pre-processing script and it will be executed before this mapping task runs. Then post-processing: what is post-processing? After loading the data into the target table, if I want to zip all the files, or
            • 214:30 - 215:00 make a zipped copy, or move all the files into an archive folder, you can put those commands in the post-processing command; you can try those options. I will finish it here; if you want, you can set any other options. Now I'm going to keep more files this time, four, five and so on; here I'm going to keep a fourth, and this is 5, 6, 7, seven
            • 215:00 - 215:30 files; I'll put one more file also, so I'll keep eight files, a larger number of files here. Assume that today the customer, the source team, has placed eight files. Now I'm going to run this mapping and check whether it will read them, the same way. This time the mapping
            • 215:30 - 216:00 task is running, starting. I'll tell you, sometimes in IICS this command will not execute properly; if you have executed the command directly in Windows and it runs fine there but it is not working in the pre-processing or post-processing command, in that case you can just restart your machine and the next time it will work. Just restart your machine, then it will
            • 216:00 - 216:30 work. See here, I kept the files and it is running; it has executed and 856 records have been loaded into the target table, you can check here, 856. Now assume that the next day our source team kept only two files, and assume that I have scheduled the mapping; the next day
            • 216:30 - 217:00 it runs at the scheduled time, that's it, the mapping task. It is running and at that particular time I see that only two files have arrived; today there is not much data, we have only two files, it starts reading those files and runs only those records, so only 214 records are loaded. Based on the number of files it will automatically create the
            • 217:00 - 217:30 config file. This is the way you have to put in the automation script. This is one example I have explained for the pre-processing command, and in a similar way you can use the post-processing command: after loading the data, if everything loaded successfully, then take the files, append some date to the file names and move them into an archive folder; a small sketch is shown below.
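            As an example of that post-processing idea, a very small Windows sketch (the folder names are assumptions, and in practice you would also stamp a date onto each file name before moving it) could be:

                move "C:\SRC_Files\Input_Files\*.csv" "C:\SRC_Files\Archive\"

            A fuller version would typically live in a .bat file that renames each file with the current date and then moves it, and that batch file would be called from the post-processing command.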
            • 217:30 - 218:00 This is how you can set it up. So do we have to check the source every time for updated files? No; whatever number of files is available in this particular path, we schedule this particular job and automatically it will go and check there and find, okay, the number of files today is two, and start reading them. If there is no file, then it will fail. Is a manual check required once you have automated the population of the config
            • 218:00 - 218:30 file? No, it is not required, because we have already automated it by using the command. Here we just created the Windows command; you can check it in the command prompt yourself, try it on your own, and then in a similar way you can try zipping the files and so on, and play with the pre-processing and post-processing commands. I hope you are clear; this is the way we automate this particular task.
            • 218:30 - 219:00 Okay. Hello everyone, very good morning and good evening to all, let us start our today's session. Yesterday we saw how to read the data from a flat file and load it into a table, so we'll continue from that. In what we saw yesterday, the source was a CSV file,
            • 219:00 - 219:30 a comma-separated value file. Mostly our customer, our source team, will place the CSV file in the source path, some folder; they will place the file there. They may also have a cloud system, so they can place the file in a cloud system as well, like AWS S3, or they can place the file on Google Cloud
            • 219:30 - 220:00 Platform, where we have GCS, Google Cloud Storage, so there also they can keep the file. And it is not only CSV files: they can place CSV files, JSON files, XML files, Excel files, Avro files, Parquet files, different kinds of files they can keep. CSV is nothing but a comma-separated value file, so we have that kind of file. Can we have any other
            • 220:00 - 220:30 type of file? Yes, we can have one more type called the fixed-width file. What is this fixed-width file? Each and every field in this file has a fixed width; we have a different width for each field. We may get such a source file, so I will show you how we can load this particular data. I will take some fixed-width file here and we'll try
            • 220:30 - 221:00 to load this particular data. I will take this customer file, this is the customer file we have, and I will show you what a fixed-width file is. If you see this, this is the fixed-width file. Do we have a header in this file? No, this particular file doesn't have a header. Yesterday we saw a file with a header, so today we'll also see how to read the data from a file without a header, and it's also a different
            • 221:00 - 221:30 format; here the format is the fixed-width file. What is fixed width? Normally, how will Informatica identify each and every field? We need to have some identifier, some delimiter; if Informatica finds that particular delimiter, it will assume, okay, this is my employee ID, this is my first name. But fixed width is nothing but defining, for example, that the employee ID is five characters. You see this,
            • 221:30 - 222:00 this is five characters for the employee ID, so the first five characters they are allocating for the employee ID. Next, for example, 12 characters: see, these 12 characters we are allocating for the first name, so up to here is the first name; the maximum number of characters we are allocating for the first name is 12, so it should hold the first name. If I have a last name,
            • 222:00 - 222:30 it should start from this particular position only. It's like a boundary; we will set a boundary for each and every field. From this particular position up to here is nothing but the last name, so whatever name we keep there, that should be the last name. Then from here, for example, starting from here I will take it as, okay, this is my date of
            • 222:30 - 223:00 birth, the customer's date of birth or date of joining or whatever it may be. How many characters here? I will take 10 characters, so the next 10 characters will be the date of birth. Then from here, on the right-hand side, how many characters can we take? 10 characters, something like a customer name or city field. This way we will define it, for each and every field. For example,
            • 223:00 - 223:30 you may ask the question, if I have more than 10 characters in this last name, what should I do? We have to define that in the structure itself, at the beginning, in the design phase itself: we define, okay, the maximum we expect for the last name is 50 characters, say, to be on the safer side, and then I will put 50 characters there. This way,
            • 223:30 - 224:00 up to here it will be the last name, sized for the safer side; whether you're getting four characters or 40 characters is not an issue, because we have defined the boundary. This way we define it, and this particular data file is called a fixed-width file. In order to read this kind of file: do we get this kind of file in real time? Yes, definitely we do; a small sketch of such a layout is shown below.
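            As a rough illustration of what such a layout looks like (the 5, 12, 50 and 10 character widths come from the explanation above; the exact positions of the demo file may differ):

                positions  1-5     customer_id     (5 characters)
                positions  6-17    first_name      (12 characters)
                positions 18-67    last_name       (50 characters, the safer-side maximum discussed)
                positions 68-77    date_of_birth   (10 characters)
                positions 78-87    city            (10 characters)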
            • 224:00 - 224:30 So how can I read this file? We kept the file here, it's a customer file, and I will go to the target table. I will create one table for this, and you can also create this particular table: it's the customer fixed-width table with customer ID, first name, last name, date of birth and city, five columns with their numbers of characters; in the target we can also allow more characters. What I will do is delete all the rows of this table; I just
            • 224:30 - 225:00 deleted them, manually I did it, so as of now there is no data in this particular table. This table you have to create; I will give you the create table statement (a sketch is shown below), or you can also create it yourself. FW is nothing but fixed width, that is how I named it. You can see there is no data in this particular table. What I'm going to do is read this file, and in order to read this file, first we have to define the structure of the file.
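            A minimal sketch of what that create table statement could look like in Oracle is below; the table name T_CUSTOMER_FW and the column sizes are assumptions based on the widths discussed, not the exact DDL from the video:

                CREATE TABLE t_customer_fw (
                    customer_id    NUMBER(5),
                    first_name     VARCHAR2(30),
                    last_name      VARCHAR2(50),
                    date_of_birth  DATE,
                    city           VARCHAR2(30)
                );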
            • 225:00 - 225:30 Where can we define it? Go to Data Integration; just go to Data Integration and click on New, it's taking some time to load. Go to New; normally we would create a new mapping and then start mapping, but for a fixed-width file we first have to define the fixed-width file format. Go to Components here; in this particular asset list just go to Components, and under Components you can find Fixed Width
            • 225:30 - 226:00 File Format: configure reusable formats for fixed-width flat files to use in a mapping or mapping task. We'll create it; we are informing IICS, okay, this is the format we will get. Here the name is Fixed Width File Format 3; we can keep this name itself, no issues. This is the location where I'm going to keep it; if you want to put some description you can. Then the sample file connection: where is the connection for the sample file? We can take
            • 226:00 - 226:30 this connection itself, the one we created yesterday; we can take this connection. And for the sample object, we select the object here in this particular path: this is the file, I can select it. See, this file doesn't have any header. Now you can see these column boundaries; you can select from zero, the zero position, up to five characters. Up to here, you just keep the
            • 226:30 - 227:00 cursor, and first we are selecting 0 to 5 characters as one boundary; it's nothing but the employee ID or customer ID, whatever the field is. Then up to here, 5 to 17, 12 characters for the first name, then keep the cursor here. If you have selected wrongly, then you can remove it, you can remove this one, and you can go up to
            • 227:00 - 227:30 here, to the next boundary. How many fields? One, two, three, four, five fields we have. It doesn't have any header, right, so we will define the header here: go to Edit Columns; in Edit Columns we can put the column names. What is the column name? The first column is customer ID, then for the second column we have the first name, then we have last name, date of birth, city; we'll take them in this order: last
            • 227:30 - 228:00 name, date of birth, then the city. Now, what is the data type here? We have to define each and every field's data type: customer ID is a number, first name we define as a string, last name is a string, this one is date and time, and city is a string only. But for the date and time field alone it is asking for the format. What is the format of this particular date? We'll open the file and see; this
            • 228:00 - 228:30 format we can define: it's like MM/DD/YYYY or DD/MM/YYYY with slashes, so we can define it this way; we can put the MM/DD/YYYY format here, whatever format we are getting from the source we can define. Then you may ask the question, why haven't you described the column length for customer ID? See, customer ID we have defined here itself as five characters, so automatically it will take that
            • 228:30 - 229:00 boundary; there is no need to define it here, it will be defined automatically. This way we define the characters for this one. If a field is a decimal, for decimals also we go for number only and define the total characters here; or first you can read the decimal as a character and then convert it in the Expression transformation, using the TO_NUMBER function or whichever conversion function is needed.
            • 229:00 - 229:30 Okay, and if the file contains any header, for example, you have to go to the additional attributes; go to these additional attributes and edit the additional fixed-width attributes here. If the file contains a header, then we have to set the number of rows to skip: if the file contains a header we make it 1, and if it
            • 229:30 - 230:00 doesn't have any header then we can leave it as zero. Based on the file properties we have to define these extra properties. We just defined it, click OK here, that's it; we haven't changed anything else, save it and it will get saved. So we have defined the fixed-width file format. Now we will go to the mapping: go to New, Mappings, and in a similar way we have to create a mapping here; go to the mapping and
            • 230:00 - 230:30 create a new mapping, m_fixed_width_file; we'll go for the fixed-width file, and the data is customer data, so this is how we define it. First I'll go to the source: here I have to take the source, and I will take the source connection, the connection we created yesterday, so you take the same connection. But for the source type here we have to choose Single Object only; as of now
            • 230:30 - 231:00 we are using Single Object, and we have to choose the object here: the object is this source file, we'll choose it. Then where do we define whether it is a fixed-width file or a CSV file? Go to the formatting options here; in the previous session we have seen how to define the format. Go to the formatting options: here we have the flat file type, and in the flat file type we have previously seen the
            • 231:00 - 231:30 delimited file; now it's a fixed-width file, so we just have to define that it is a fixed-width file. To use a fixed-width file you must have already configured a fixed-width file format; yes, we already configured it, this is the configuration we did today, File Format 3, so you can just choose this and click OK here. It will populate automatically, and you
            • 231:30 - 232:00 can, if you want, just preview the data; you can run the preview and check here itself whether it is reading the data or not. It is reading correctly. Even if you have three records or any number of records it doesn't matter; the only thing that matters is how we define the columns, that is the important part. We have defined them; we'll go to the fields now, and you can see the field customer ID, decimal, five characters. So we have it
            • 232:00 - 232:30 here, and if I want to change the scale or something, yes, you can; for a decimal, the precision and scale we can define here. For example, if you have some salary column, 10,2 means the total is 10 digits and out of that two digits are decimals; that we can define here. String, 12 characters, all this we have defined, and the date and time format we defined already, that's it. So we'll go
            • 232:30 - 233:00 to the target next. We take the target here and define it: I will go to the target table, take the Select option, and we take the target table here. This is the target table, you can choose it, click OK here, just search for it, and then we'll have the table here. Then, if you want to truncate the target table before loading, yes, you can
            • 233:00 - 233:30 choose truncate and load. Go to the field mapping; here, if the names match exactly, you can go for Smart Map, and even if a field has only a partially matching name it will automatically map it. But for lookups and so on we will not go for Smart Map. We have created the mapping now. Do we need to create a mapping task for this compulsorily? In a real-time project we would create one to define some task-level
            • 233:30 - 234:00 properties, but here it is not necessary, so we can just run. I'll click on Run, it has started, and you can see here, we'll check, it is starting. Three records got loaded; you can see here all three records have been loaded now. Even if you have millions of records, this is the same process we have to follow. But one notable point here: see, this is my first name; if
            • 234:00 - 234:30 you see the first name, since we defined 12 characters, for the remaining characters it is taking blank spaces on the right-hand side. See here, John: how many characters? Four characters, but we defined 12, so it is padding the remaining eight with spaces. That means we are actually loading junk data, and we should not load it this way; you can see some spaces here, and if you load them with the data, it's junk data, so
            • 234:30 - 235:00 we have to handle this. How can I handle it? Instead of four characters it has loaded some 12 characters padded with spaces, so we have to handle this one, we have to make some changes. Where can I make the changes? I could put an Expression transformation in between, and I will take an Expression transformation later, but here I'm going with the expression directly. Where can I change it? I will do it for one
            • 235:00 - 235:30 particular field, o_last_name, or rather I will take the first name for this: I can take first name as a string of some 30 characters, and I'll go to Configure here. In the configuration I have to base it on first name. So how can I remove the spaces on both sides? We have to go for the RTRIM function, we have to use RTRIM; and if you want to
            • 235:30 - 236:00 remove spaces on the left-hand side you have to put LTRIM. There is no plain TRIM function directly in Informatica; you have to use either RTRIM or LTRIM. Just validate it, it is validated; this is the o_first_name output. Then we have to configure it here, and this should be done for the other fields as well, the last name as well. Go to first name in the field mapping, remove the one coming directly from the source, and take this o_first_name here; it's a manipulated column. A sketch of these trim expressions is shown below.
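            A small sketch of the trimming expressions described here; the output field names are examples, and to strip padding from both sides the two functions are nested:

                o_first_name = RTRIM(first_name)
                o_last_name  = RTRIM(last_name)
                o_city       = LTRIM(RTRIM(city))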
            • 236:00 - 236:30 We changed it, and then I'm just trying to load. What will happen now? It will truncate the existing data in the table and load it. Here you may ask the question, what is the job status, why does it go to Starting and then Queued and so on? See, it's a cloud server, and in the cloud there will be not only
            • 236:30 - 237:00 our jobs, there will be a lot of jobs. Based on the organization ID, the resources and everything, first it will be started, then it will sit in the queue on the cloud server, and the queue is cleared one by one; then it automatically gets executed. Why? Because it's a cloud server; there will be a lot of jobs, grouped by organization ID, and they have to run
            • 237:00 - 237:30 all the jobs based on priority, so it will be waiting in the queue, but the queue gets cleared quickly; they have multi-node clusters, so they execute the jobs quickly, and that's why we see the different stages. Now it got loaded, you can check here: there is no space on the right-hand side. In a similar way we have to handle all of this; based on the customer's need we can handle it in Informatica IICS, based on the
            • 237:30 - 238:00 requirement. So this is the way we have to load the fixed-width file; I hope you are clear on how to load a fixed-width file. You can take some fixed-width file yourself and try to load this particular
            • 238:00 - 238:30 data. Hello everyone, let us start our today's session. Assume that the source team is placing a file, a CSV file, a comma-separated value file they are sending; this is our source file, it's nothing but CSV format. We have already seen how to read the data from this kind of file and how to load it into a table.
            • 238:30 - 239:00 Here I'm going to bring in Informatica IICS, and here I'm going to bring in the target; the target might be a cloud system or any RDBMS. We are going to read the data from the source system; the source this time is a file. This is our IICS, and we are going to load the data into any RDBMS or cloud, so we can take it as an RDBMS here, or even cloud, anything; to the cloud also we can
            • 239:00 - 239:30 load, or the target could also be a flat file. In the RDBMS I'm going to have the table; we already created one table for the flat-file-to-table load, this T_EMPLOYEES table. In today's session, this particular file contains some Unicode characters; that may mean different characters, Chinese characters, Japanese
            • 239:30 - 240:00 characters, different characters it may have. If it has different characters, how do we read the data? We need to consider the code page. In IICS we need to consider the code page, which we have seen in our connection. How do we use this code page in our connection, and what is the significance of this code page for reading this kind of data? If you're going to work on a real-time project, yes, you will definitely receive this
            • 240:00 - 240:30 kind of file from the source; in that case, how can we read it? Assume that this is our source file, this is our CSV file; I kept the file in our C drive, SRC files. I will open the file and show you: if you see this file, it has some Unicode characters, you can see here some Unicode characters, it has some characters here, a different character set. Here we have
            • 240:30 - 241:00 some different characters, some symbols, and then a different character set you can see here. Based on the character set, our source team will inform us, okay, this is the character set we have, and based on that we have to set the code page and read the data from the source. How can we read it, and what are the things we need to make sure of? We will see one by one. For this I'm going to create a target table; as of now
            • 241:00 - 241:30 there is no table, so I'm going to create it. Using the admin user I will create the table in the course user; I'm going to create T_EMPLOYEES as a table created as SELECT * FROM hr.employees WHERE 1 = 2, so I'm just creating an empty table in the target; a sketch of that statement is shown below.
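            The statement described here is the standard Oracle "create table as select" pattern for copying only the structure; a minimal sketch (schema and table names are the ones mentioned in the demo):

                CREATE TABLE t_employees AS
                SELECT * FROM hr.employees
                WHERE 1 = 2;   -- WHERE 1=2 copies the column definitions but no rows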
            • 241:30 - 242:00 Why am I creating the table when we already have one? I dropped it and created it newly again. If you see this, this is the table structure and it has 11 columns; if you describe the table, these are the columns and data types. Why I dropped and recreated it, I will tell you: see, the employee ID is NUMBER and first name is VARCHAR2, all character columns. We will start reading the data. How can I read the data? The source is a file system.
            • 242:00 - 242:30 I'll go to the Data Integration service. We already created the connections; we'll go to New Mapping, create a new mapping here, and I can name it m_source_flatfile_employees, with Unicode data. What is Unicode, first of all? If you want to learn more about these Unicode characters, just check this
            • 242:30 - 243:00 page, you'll get more of an idea of what these Unicode characters are. If you go down you can see different character sets; from here I have taken some Unicode characters, the copyright symbol and all the different kinds of symbols I have taken. These are UTF-8 characters, different character sets. Just go through this and you will
            • 243:00 - 243:30 get some idea about what these Unicode characters are; from here I have taken different characters. We may get the thumbs-up symbol, emojis and everything from the source, so in that case how can we proceed? Go to Source here and select the connection; we'll see the connection. First of all we'll check the date and time format here: this is the MM/DD
            • 243:30 - 244:00 /YYYY format, the date format is MM/DD/YYYY, and we have Unicode characters also; MM/DD/YYYY is what we have given. If you go to View Connection, you'll see the connection details here: what is the code page we have given? MS Windows Latin 1, which is the default code page. And also the date format we have given here, that's
            • 244:00 - 244:30 fine, we'll keep it. We'll select the object here; the object is nothing but this file, so we'll take the file. In the file, this is the delimiter, Ctrl+C to copy it, and I will select this particular file, click OK, and go to the formatting options. Click on the formatting options: the file type is a delimited file, and for the delimiter, under Other Delimiter we
            • 244:30 - 245:00 have to provide the delimiter characters; it is not a single delimiter, we have a multi-character delimiter here, so that is what I'm mentioning, the multiple delimiter characters. Then the text qualifier: if you have double quotes, you can mention that. And the first data row number is 2, since the first row is the header we have to provide 2; these are the usual settings. Click OK here and just check Preview Data; for our
            • 245:00 - 245:30 reference we can just test it. Is it reading? See here, it is not reading properly; we'll see how to fix this particular issue. As of now, if you preview the data here itself it is not reading properly, and that means the connection is not correct. Go to View Connection, edit the connection, and here change the code page
            • 245:30 - 246:00 to UTF-8. You can check with the source team what the code page of the data is, and then change it accordingly. For example, the UTF-8 code page can read this particular data; it can read the symbols and the different characters here, these characters it can read. So how can I
            • 246:00 - 246:30 read it? Click OK here; we changed the code page, so now we will test it. We can just preview; it is not refreshing now because it has already been previewed, that's why it's not showing. Formatting options, click OK here, it should refresh again. Even now it is not showing clearly, so we'll go to the fields; in the fields go to Options, Edit
            • 246:30 - 247:00 Metadata. Here, make sure every character field uses the nstring data type — wherever we have character data it should be nstring. EMPLOYEE_ID stays a number (10 digits), but the other character fields are nstring, because in Informatica nstring is what handles
            • 247:00 - 247:30 Unicode characters. Then JOB_ID, and HIRE_DATE should be date/time; the other fields we adjust as well, since the source is a flat file — the salary as a decimal with two decimal places, and MANAGER_ID as a four-digit number. Once everything is defined,
            • 247:30 - 248:00 save it, and if you want you can run a preview here too, to check whether the data is being read correctly from the source. Generating the preview before running the mapping is how we
            • 248:00 - 248:30 test or debug the mapping after changing the code page. Now it is reading correctly — you can see the copyright symbol and the other special characters come through properly. Next we will try to load the data. If the mapping has any other transformations, make sure every transformation also uses nstring for the character
            • 248:30 - 249:00 fields. Then go to the Target. The source side is reading correctly, but we still have to verify that the target loads properly. The code page always depends on the Unicode characters in the data — Japanese, Chinese and so on; UTF-8 and UTF-32 are different encodings, so reviewing a few Unicode character tables gives you a clear
            • 249:00 - 249:30 idea of what is involved. If the file has no Unicode characters — as we saw yesterday — you can keep the default Latin code page and read it as is. But if the source team is deliberately sending Unicode characters, we need to ask them which characters those are and select the code page accordingly. I have selected the target table, set Truncate and Load, and gone to
            • 249:30 - 250:00 Field Mapping, where I map the fields and save. Remember, this file contains Unicode characters encoded as UTF-8. There are other encodings too — Chinese character sets, for example — and if you open the connection's code page list you will see them; which one to pick depends on the Unicode
            • 250:00 - 250:30 characters in the data. I have chosen UTF-8, which means this particular character set can be handled. If the source uses a different code page you choose accordingly — Chinese, Japanese, Taiwanese, German and many other
            • 250:30 - 251:00 character sets are listed, including various European ones, so pick the code page that matches the data. Now I'm going to run it. The mapping is done; the source is reading fine, and we'll check whether the target writes the data. As I said, if you have multiple transformations, make sure all of them use nstring for the character
            • 251:00 - 251:30 fields. UTF-8 and UTF-16 normally cover most special characters, which is why we usually go for one of those — again, go through a Unicode character list and you will see which characters belong to which set. And here you can see the data is not loading properly: the mapping
            • 251:30 - 252:00 read the data from the source correctly, but while writing to the target table it did not load properly. So how do we fix that? If you describe the table you can see the columns are VARCHAR2, but for Unicode they need to be NVARCHAR2. So I have to change every column that is VARCHAR2
            • 252:00 - 252:30 to NVARCHAR2. Can I change the data type while the table still holds data? No — so I will truncate it first. In a real project you cannot simply truncate: you would take a backup, check constraints, triggers, indexes and other dependencies, and make sure no data that is still needed
            • 252:30 - 253:00 gets lost. Here there is no data I care about, so I just truncate, describe the table, and change the column I need: ALTER TABLE T_EMPLOYEES MODIFY the column to the new data type. Then I check the
            • 253:00 - 253:30 other character fields — LAST_NAME, EMAIL, PHONE_NUMBER — because any column that may carry special characters has to change as well, and JOB_ID too. I take
            • 253:30 - 254:00 each of those fields and change them, so all the character columns have been altered. Now when I describe the table, everything shows as NVARCHAR2, and we can run the mapping one more time to check.
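            The exact statements are only partly visible on screen, but the column changes described here would look roughly like this (table name and lengths assumed from the HR-style schema used in the demo; the table was truncated first because Oracle will not change a populated column's character data type):

            ```sql
            -- Change every character column that may hold Unicode data
            -- from VARCHAR2 to NVARCHAR2 (lengths are illustrative).
            ALTER TABLE t_employees MODIFY (first_name   NVARCHAR2(20));
            ALTER TABLE t_employees MODIFY (last_name    NVARCHAR2(25));
            ALTER TABLE t_employees MODIFY (email        NVARCHAR2(25));
            ALTER TABLE t_employees MODIFY (phone_number NVARCHAR2(20));
            ALTER TABLE t_employees MODIFY (job_id       NVARCHAR2(10));

            DESCRIBE t_employees;  -- all character columns should now show NVARCHAR2
            ```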
            • 254:00 - 254:30 is the uni code characters in it right so we have to decide which uni code characters are we have to check with them if they are sending the data what kind of data they going to send so it will have some Chinese character or Japanese character based on that we have to decide we have to work with business analyst and the source team and we have to ask these questions whether the source file contains any unic code characters then what is the unic code characters okay so do we need to handle
            • 254:30 - 255:00 any Unicode characters whether they going to send a file as azip file encrypted file all that seven days they are going to send the file so what is the format what is the D limiter whether the data will contain the D limiter going forward so what is the maximum length for each and every column so these type of questions we need to ask them okay to the source team so while in the requirement itself we have to ask them get it clarified okay so we have executed and then we'll check it now whether it got loaded or
            • 255:00 - 255:30 not we changed the data type we'll check it one more time okay still it is not loaded so what is the mistake we did it so we changed the column in the database level see here so this is the mistake we might be doing in realtime project also that is why so first I didn't do that see here this is the target table I just changed the data type in the Target database database I changed so what is
            • 255:30 - 256:00 the data type of the target table at the Informatica level? If you go to the target fields you still see string, and if you edit the metadata you still see varchar — it should be nvarchar. So whenever you alter the structure or the data types in the database, you have to
            • 256:00 - 256:30 synchronize them back into the mapping. There is a Synchronize option; run it, confirm for all fields, and the fields now show nstring/nvarchar. Any design or DDL change on the target table has to be
            • 256:30 - 257:00 synchronized into IICS as well. We save it. That is the point: from source to target, everything should be NVARCHAR2 at the Informatica level as well as at the database level. Now I will run it again; it will truncate the target table, reload it, and we will check whether it has loaded
            • 257:00 - 257:30 properly or not. All 107 records have been loaded, and when I check the data it has loaded correctly with all the special characters. The UTF-8 code page handles this kind of character set, which is why I used it — based on the source, we choose the code page.
            • 257:30 - 258:00 As an exercise, you can create another file with a different character set — enter some Chinese or Japanese characters — try to load it and check whether it loads properly, choosing the source code page accordingly. If different files use different code pages, yes, we have to create
            • 258:00 - 258:30 different connections — one per source file as needed. Hi everyone, welcome to NIC IT Academy. In today's session we are going to learn the replication task in IICS. In IICS we have different tasks: we
            • 258:30 - 259:00 have the mapping task, replication task, synchronization task, PowerCenter task, data transfer task, masking task and so on. Today's session covers the replication task. If you have any questions on IICS or on the replication task, please add them in the comments and I will respond. Thanks for watching; let us continue our
            • 259:00 - 259:30 session. Here I have a database with a source, and a target. The replication task is used to replicate data from the source to the target, with IICS sitting in between. One difference from other tasks is that it will not accept every kind of source: it supports only
            • 259:30 - 260:00 relational database tables and Salesforce connections — those two source types. On the target side we can use a relational database or create a flat file. So 'replicate' here means making a copy — for example, taking a backup of some data
            • 260:00 - 260:30 from one region into another. We can replicate based on filter conditions — for instance COUNTRY equal to some value — or run it as a backup job: a backup workflow runs, creates a copy, and loads whichever tables we have chosen to replicate. That is what the
            • 260:30 - 261:00 replication task is for. If I have many tables I can replicate all of them, and for a replication task the target table may or may not already exist. That is the main difference from the synchronization task: a synchronization task requires the target table to exist, while a replication task does not. First let us look at what the
            • 261:00 - 261:30 replication task is. You use the replication task to replicate data to a target, typically to back up data or to perform offline reporting — much like creating a materialized view on another server for reporting. If I want backup tables, or a separate copy for offline reporting, I can go for the
            • 261:30 - 262:00 replication task. Let me create one: go to New, then Tasks — we have already seen the mapping task, so this time I pick Replication Task, which is described as 'copy data from a Salesforce or database source to a database or file target.' As of
            • 262:00 - 262:30 now that is the only facility it provides; maybe in future they will add more functionality. Create it, and you get a five-step wizard. In step one I name it replication task 1, choose the location — the project or folder —
            • 262:30 - 263:00 and optionally write a description, for example 'replication task for these tables.' Remember that the source can only be a relational database connection or a Salesforce connection; a flat file connection cannot be used as the source of a replication task. So I pick the source
            • 263:00 - 263:30 connection. The next option asks which objects to replicate. Do we want all the objects in this connection? No — so I choose Include Objects, which lets us select exactly which objects to replicate. Click Select and it lists everything available; this connection currently has 16 objects.
            • 263:30 - 264:00 Do I have to replicate all 16? No, I only want a few tables, so I pick DEPARTMENTS, EMPLOYEES and LOCATIONS. You can choose any number of objects, and this is also the order in which they will be processed. Click Select and those three objects are added. Alternatively,
            • 264:00 - 264:30 if out of the 16 objects there are only a few you do not want, you can exclude just those — the excluded ones are skipped and all the remaining objects are replicated automatically. Then there is an option for what to do if an error occurs: if DEPARTMENTS or EMPLOYEES fails while replicating, should the task stop or continue with the other objects? Configure that and go to Next, where it asks for
            • 264:30 - 265:00 the target definition. Choose the target connection — as I said, the target can be a table or a flat file; I am selecting a table connection. Next it asks for the target table prefix: we selected three source tables, so what prefix should the target tables get? I am giving RT_, for 'replication
            • 265:00 - 265:30 task.' Then Enable Bulk Load: bulk load, as in Informatica generally, means data is loaded batch by batch, while bulk load false means a normal load that inserts rows one by one; by default bulk load is off. Then Load Type, where we have different options, starting with 'incremental loads after initial full
            • 265:30 - 266:00 load.' The very first run of the replication task does a full load. Say each of our three tables has a thousand records: on day one, all thousand rows of each table are replicated. When the task runs again the next day, there is no need to reload those thousand rows — only whatever
            • 266:00 - 266:30 has been newly inserted should be picked up. For that we need audit columns; if audit columns are defined, IICS itself works out which records are new and replicates only those. That is the 'incremental loads after initial full load' option — but note that it is not available for relational database tables as of now, only for Salesforce connections.
            • 266:30 - 267:00 For relational sources the load type is effectively a full load every time. The next option is 'incremental loads after initial partial load.' What is an initial partial load? Suppose the source already holds a thousand records: the first run loads only the data created or updated after a date you specify — so it might load, say, 500
            • 267:00 - 267:30 rows as the partial initial load — and from then on it goes incremental, picking up whatever new rows are added. For example, if there are five years of history and I only want the last year plus ongoing changes, I mention that cut-off date under 'initial load rows created or modified after this date.'
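            Conceptually, the incremental option boils down to a date filter on an audit column. A rough sketch, assuming a LAST_MODIFIED_DATE audit column and a bind variable holding the previous run time (IICS tracks this internally for Salesforce sources; you would not normally write it by hand):

            ```sql
            -- First run (full or partial initial load): everything after the cut-off date.
            SELECT *
              FROM employees
             WHERE last_modified_date >= DATE '2023-01-01';   -- illustrative cut-off

            -- Subsequent incremental runs: only rows created or changed since the last run.
            SELECT *
              FROM employees
             WHERE last_modified_date > :last_run_timestamp;  -- tracked by the task
            ```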
            • 267:30 - 268:00 IICS then pulls only the data created after that date on the first run and switches to incremental afterwards. As of now, though, relational connections support full load only, so every run is a full load. Next are the delete options. 'Remove deleted columns and rows' means that if records are deleted from the source, or columns are dropped — say the source used to have 11 columns but
            • 268:00 - 268:30 later some were removed — the same columns and rows are removed from the target. The alternative is 'retain deleted columns and rows in the target,' which keeps them. (Part of this, too, is not available for relational database tables today and may be added in future.) Then Commit Size:
            • 268:30 - 269:00 commit size, as in Informatica generally, defaults to 10,000, meaning a commit is issued every 10,000 records. You can set it to whatever you want — 100,000, a million — and it will commit after that many rows. I am leaving it at every 10,000 records. That completes the target options, so go to Next.
            • 269:00 - 269:30 The next step is Field Exclusions. We have three objects; for the EMPLOYEES object, do we need all 11 columns or should some be dropped? If you want to drop columns you specify them here — for example, exclude JOB_ID from EMPLOYEES, and that field will not exist in the replicated employees table. For
            • 269:30 - 270:00 DEPARTMENTS I do not want MANAGER_ID, so I exclude that field as well. You can add more exclusions — for instance also excluding MANAGER_ID from EMPLOYEES — so here we have exclusions on two objects,
            • 270:00 - 270:30 and those fields will be excluded from the replication. Go to Next: Data Filters. Here you set the row limit — Process All Rows, or only some rows. The first run is a full refresh, so do we need all rows, or should we restrict them? We can also add filter conditions: on the EMPLOYEES table, for example, I can apply a
            • 270:30 - 271:00 filter such as SALARY greater than 5,000, so only those records are replicated. Or click Advanced and write an advanced condition — for instance COMMISSION_PCT is not
            • 271:00 - 271:30 null — and you can combine multiple clauses with AND/OR. I am keeping just a single filter condition here, but you can add more if needed.
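            For reference, the two filters mentioned here correspond to conditions you could express in plain SQL like this (column names follow the HR-style schema; in the task the UI applies them for you):

            ```sql
            -- Equivalent of the simple filter plus the advanced condition on EMPLOYEES.
            SELECT *
              FROM employees
             WHERE salary > 5000
               AND commission_pct IS NOT NULL;
            ```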
            • 271:30 - 272:00 Next is scheduling. If this is a one-time activity there is no need to schedule it, but if the task should run daily at 7 a.m. and take a backup of the tables onto the remote server, then set up a schedule. You can also configure email notifications: a failure notification if anything goes wrong, a warning notification, and a success notification — for example 'replication done for today' or 'replication failed, please look into it.' The email is generated by the cloud configuration, so you can just
            • 272:00 - 272:30 try it and the notification will be triggered. For this example I am just adding a success notification address, so once the task completes we get the success email. Execution mode is Standard; Verbose mode writes the data into the log, producing
            • 272:30 - 273:00 very large log content — useful if something goes wrong and you need to see exactly which data is flowing, but as the screen itself warns, logging all the data degrades performance. There is also a pre-processing command, where you can run a shell script or similar, and 'maximum number of log files,' which keeps the last 10 files
            • 273:00 - 273:30 by default. Click Finish and the replication task is complete — all the details are in place, so I can go and run it. Right now there is no table with the RT_ prefix in the target connection, but the task will create them, and you can see the tables being created now.
            • 273:30 - 274:00 Two tables have been created so far and the task is running — three sub-tasks, and if you click on a sub-task you can see them processing one by one. The EMPLOYEES table processed only 59 records (because of the filter), while DEPARTMENTS and LOCATIONS take all records, just with the excluded column dropped. Two tables are there; one more
            • 274:00 - 274:30 should appear shortly. In RT_EMPLOYEES the MANAGER_ID column is not present, so only 10 columns are available, and if you look at the data only rows with salary greater than 5,000 were loaded — sort ascending and you will see values starting just above 5,000 and going up to 24,000.
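            A quick way to verify the filter on the replicated table, assuming the RT_ prefix and column names shown in the demo (a sketch, not taken verbatim from the video):

            ```sql
            -- Only rows that passed the salary filter should be present,
            -- and MANAGER_ID should not exist in the replicated table.
            SELECT employee_id, first_name, salary
              FROM rt_employees
             ORDER BY salary;        -- lowest value should be just above 5000

            SELECT MIN(salary), MAX(salary), COUNT(*) FROM rt_employees;
            ```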
            • 274:30 - 275:00 DEPARTMENTS was replicated in full as well, just without the MANAGER_ID column, and likewise LOCATIONS. In this way, whatever objects we want to replicate, we can. Now, if I run the task one more time, what happens? It simply runs again — let me go and run it one more
            • 275:00 - 275:30 time, and it is running. In a real-time project we would normally go for an incremental load, but here it performs a full load each time. And again: for a replication task the target table may or may not already exist — it will be created if needed — whereas a synchronization task requires the target to exist. I hope the replication task is clear: whenever we
            • 275:30 - 276:00 want to create a backup, we can go for a replication task. Hi everyone, welcome to NIC IT Academy. In
            • 276:00 - 276:30 today's session we are going to learn the synchronization task in IICS. The agenda: what a synchronization task is, which properties are available in it, and a complete hands-on walkthrough. If you watch the whole session you will get everything you need on the synchronization task. If you haven't
            • 276:30 - 277:00 subscribed to our channel, please subscribe and click the bell icon so you get all the notifications. Without wasting time, let's start. What is a synchronization task? It is used to synchronize data between a source and a target: if I want to synchronize from source to target I can use it, but the task requires
            • 277:00 - 277:30 the target to exist — we must already have a target. It supports flat file, Salesforce and RDBMS connections. Let me open a new synchronization task; it is described as synchronizing 'data between a source and target to integrate applications, databases and files' — application here meaning something like
            • 277:30 - 278:00 Salesforce. The wizard has six steps. You choose the location, write a description if you want, and then set the task operation, which is the important part. Insert means every run simply inserts; Update means that if a record is updated in the source it
            • 278:00 - 278:30 gets updated in the target; Upsert means that if the data is not yet in the target it is inserted, and if it already exists the target row is updated. For update, upsert and delete logic we definitely need a primary key. This time I will eventually use upsert (update or insert), and for upsert we need a
            • 278:30 - 279:00 primary key. I will use the T_EMPLOYEES table itself; if it has no primary key, the task will fail. So first I will go for insert logic, and later we will switch to upsert logic. Choose insert and go to Next, where we pick the
            • 279:00 - 279:30 source connection. The source can be a relational database table, a Salesforce connection or a flat file connection; as of now other cloud connections are not supported. The source type can be a single object, or a join — for example joining the EMPLOYEES and DEPARTMENTS tables
            • 279:30 - 280:00 inside the synchronization task — or a saved query; remember we created saved queries earlier, like views, and ran them. For now I am going for a single object. The source shows the 16 objects we saw before, and I select one table from
            • 280:00 - 280:30 the Oracle database. It previews the columns and rows — 11 columns and all the data — and that is what I want. With the source definition done, we move to the target definition. For a synchronization task you definitely need a target, at least a flat file or a table. If I am choosing a
            • 280:30 - 281:00 table, it lists all the tables and I choose my target table. There is an option to truncate the target and load every time — set it to true or false; the first run can be a plain insert, and later we can switch to upsert logic. You can also enable target bulk load if you want. Then you can preview the target
            • 281:00 - 281:30 table data. I have truncated the table separately, so assume it holds no data right now: SELECT * FROM the target table returns nothing, and refreshing the preview confirms it is empty. The column preview is
            • 281:30 - 282:00 downloaded and shown once, but refreshing it again will not show any data since there is none. Can I select a flat file as the target instead? Yes — pick a flat file connection and a target object, the same way we work with flat files elsewhere: choose an existing
            • 282:00 - 282:30 flat file to write into, or use Create Target to create one at run time. You specify the file — CSV or whatever format you want — and the target fields and their order, and it will
            • 282:30 - 283:00 create the target flat file at run time; after that the task keeps synchronizing into that same file. When choosing a flat file you can also set the format — comma-separated, tab-separated, semicolon or another delimiter — plus a text qualifier such as double quotes for character values, and the
            • 283:00 - 283:30 escape character; provide all that information and the file is created. For this demo I am selecting the Oracle target object itself, T_EMPLOYEES, setting Truncate Target Table to false, and going to Next: Data Filters. Does the synchronization task need any data filters? If so, you
            • 283:30 - 284:00 can apply them here, the same way as before — a simple filter condition on the EMPLOYEES object or an advanced filter condition. For now I am not applying any filters and I go to the next step, Field Mapping. Since I selected insert logic, no primary key is needed here.
            • 284:00 - 284:30 You can see the field mapping — EMPLOYEE_ID, FIRST_NAME, LAST_NAME and so on. The source has a primary key, but the target has no primary key as of now. I accept the mapping and go to Next: Schedule. Do we need a schedule? You can define one here if you want, and if you want to
            • 284:30 - 285:00 create a mail alert you can do that too. Click Finish and the synchronization task is created. Before executing it, you can see the target has no data. I run
            • 285:00 - 285:30 it, and it has executed: 108 records loaded into the target table. Check the target and you can see all 108 records are there. Now I will run it one more time. What happens? The task uses insert logic, so I just
            • 285:30 - 286:00 run it again, and it 'synchronizes' by loading all the data one more time — even though we have not enabled truncate and load, it inserted
            • 286:00 - 286:30 everything again. The run details show inserted rows: 108 applied to the target, all records inserted a second time. Now let us switch to upsert logic. I edit the task, and in the operation I choose upsert — update or insert — but as of now
            • 286:30 - 287:00 the table has no primary key. I finish anyway, and when I run it, it fails: the warning says the target table has no keys specified. So we alter the table and add a
            • 287:00 - 287:30 primary key on EMPLOYEE_ID. But wait — the table now has duplicate records, because we executed the synchronization task twice with insert logic, so every row was inserted a second time. Had we selected upsert logic
            • 287:30 - 288:00 from the start, it would not have loaded this way — that is exactly what I wanted to show. So with duplicate records present, how do we remove the duplicates? We could truncate and reload, but in Oracle there are easy ways to delete just the duplicates: DELETE FROM the table WHERE
            • 288:00 - 288:30 ROWID is not in the set of MAX(ROWID) values grouped by EMPLOYEE_ID — keep one row per key, delete the rest. Running that deletes the 108 duplicate rows, and a SELECT confirms it.
            • 288:30 - 289:00 Remember to commit after the delete; without a commit, Informatica's session still sees the pre-delete row count. Now I alter the table and add the primary key — you can see the primary key is in place.
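            The clean-up and the key described above would look roughly like this in SQL (a sketch — the exact statements typed in the demo are only partly visible, and the table and constraint names are assumed):

            ```sql
            -- Keep one row per EMPLOYEE_ID and delete the duplicates created by the
            -- second insert-only run.
            DELETE FROM t_employees
             WHERE ROWID NOT IN (SELECT MAX(ROWID)
                                   FROM t_employees
                                  GROUP BY employee_id);
            COMMIT;   -- until you commit, other sessions still see the duplicates

            -- Add the primary key needed for upsert logic.
            ALTER TABLE t_employees
              ADD CONSTRAINT t_employees_pk PRIMARY KEY (employee_id);
            ```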
            • 289:00 - 289:30 Now, with the primary key in place, even insert logic will not run cleanly. Instead of upsert I deliberately keep insert logic and execute — and we get a unique constraint violation
            • 289:30 - 290:00 warning on the synchronization task: the target table has a primary key, and the task is trying to insert every row a second time. That is the difference between insert and upsert. So instead of insert I edit the task and switch to upsert logic — but upsert logic
            • 290:00 - 290:30 needs a primary key in the field mapping too. As of now there is no primary key at the Informatica level: the database has one, but the mapping does not know about it. So what happens if I run it?
            • 290:30 - 291:00 Even though the database has a primary key, the Informatica level does not, and it still fails. So I have to declare the primary key at the Informatica level as well. A question I received: can we create a table with a primary key from the Informatica level? No, we cannot as of now — you can go to the target definition and run the task, and it
            • 291:00 - 291:30 will create the table, but it will not create the key; the primary key has to be defined on the target table in the database. Since we have now created it at the database level, just refresh the fields: it re-reads the table, and the primary key appears in the field mapping. Finish. Now what I am going to do is
            • 291:30 - 292:00 truncate the table, so it has a primary key but no data, and we have chosen upsert logic. What will Informatica do? The first run finds no data, so it loads all 108 records into the target table — and all 108 records have been
            • 292:00 - 292:30 loaded. I run it one more time. Since it is upsert logic and we have the primary key, nothing fails: all 108 records go through the update path this time. If you download the session
            • 292:30 - 293:00 log for the synchronization task and open it, you can see updated rows: all 108 records were updated. Next I am going to insert one record in the source and update one record — what will happen?
            • 293:00 - 293:30 And if I delete one record on the source side, will it be deleted in the target when I run the synchronization task? Yes or no? We have chosen upsert
            • 293:30 - 294:00 logic. I delete one record in the source and run the task again with upsert logic: 107 records are updated. And if you check the target table, how many records are there? Still 108.
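            The source-side test here is just a single-row delete, something like the following (the source table name and the employee ID are illustrative — the video does not show which row was removed):

            ```sql
            -- Remove one row from the source table to see how the sync task reacts.
            DELETE FROM employees           -- source table name assumed
             WHERE employee_id = 206;       -- illustrative key
            COMMIT;
            ```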
            • 294:00 - 294:30 The deleted record is not removed from the target. Why? Because we have not chosen the delete option. If the delete operation had been chosen, it would have deleted the 107 target rows whose employee IDs still exist in the source, leaving only the one row — but we are going with upsert logic here.
            • 294:30 - 295:00 So next, I refresh the source and update one record there, and I will also
            • 295:00 - 295:30 insert one record. Employee ID 207 already exists, so I will create 209 by simply copying and pasting a row — changing the email, because it has a unique constraint, and the phone number, which is also unique.
            • 295:30 - 296:00 Then I commit. I am doing this manually, but in a real project it would be done with a script. Now I run the task: one record should go for insert and all the remaining records for update. It has executed, and when you check the target, both changes are there.
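            Roughly, the manual source changes described here amount to the following (the new key 209 and the copied row 207 follow what is said on screen; which row gets updated and the replacement values are not readable, so those are placeholders):

            ```sql
            -- Update one existing source row (row and new value are illustrative).
            UPDATE employees
               SET salary = salary + 100
             WHERE employee_id = 100;

            -- Insert one new row by copying an existing one; EMAIL and PHONE_NUMBER
            -- must be changed because both carry unique constraints.
            INSERT INTO employees (employee_id, first_name, last_name, email,
                                   phone_number, hire_date, job_id, salary,
                                   commission_pct, manager_id, department_id)
            SELECT 209, first_name, last_name, 'NEW_EMAIL_209',
                   '515.123.9999', hire_date, job_id, salary,
                   commission_pct, manager_id, department_id
              FROM employees
             WHERE employee_id = 207;

            COMMIT;
            ```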
            • 296:00 - 296:30 The updated record is indeed updated and the new record has been inserted. If you download the session log you will see it does not update only the one record we changed in the source: inserted rows shows 1 (that row is omitted from the update path), and the update
            • 296:30 - 297:00 path processes all the other 107 records — that is how the task behaves. Hello everyone, today we'll start with the
            • 297:00 - 297:30 CAI process. CAI is Cloud Application Integration; so far we have seen Cloud Data Integration (CDI), and in most projects an ETL developer works on the CDI part only, though some projects use CAI as well. So, Cloud Application
            • 297:30 - 298:00 Integration. First, what is a web service? And before that, what is a service? Imagine you go to a shop looking for a product but cannot find it
            • 298:00 - 298:30 on the shelves, so you go to the billing counter and ask the representative where the product is. That question is a request; the reply you get is a response. Now picture a client on one side — in web service terms,
            • 298:30 - 299:00 we have a client on one side and a server — a service provider — on the other. The client sends a request
            • 299:00 - 299:30 to the service provider — it asks for some service — and for that request the service provider returns a response to the client. That request/response exchange is the general pattern, and
            • 299:30 - 300:00 when it happens over the web or the Internet, it is called a web service. The client here is us: we sit at the front end, type something, and
            • 300:00 - 300:30 send a request to the server or service provider, which responds. If the service is delivered over the web, it is a web service. So if you ask what a web service is: a service provided from one application to another
            • 300:30 - 301:00 application over the web or Internet. Around web services you will have heard of several languages and formats — XML, WSDL, Java — and
            • 301:00 - 301:30 a number of protocols, such as UDDI and SOAP (Simple Object Access Protocol). These are the building blocks used in web services. Let me give
            • 301:30 - 302:00 an example to make it easy to understand. Say we want to travel from one place to another — India to the USA, or India to Australia. To book a flight ticket there are two ways: we can go to the respective airline directly and
            • 302:00 - 302:30 book the ticket there, or we can go through a service provider such as MakeMyTrip. Assume I open MakeMyTrip — the website or the app — from my mobile or
            • 302:30 - 303:00 my computer. From MakeMyTrip, how does the booking reach the airline? We search for a route and a date, and a request is sent from this client application to the airline. Do you think the
            • 303:00 - 303:30 airline will give its database access to MakeMyTrip or to any other application — to any third-party vendor? No, they will not, just as one bank would never give another bank direct access to its database. So what happens instead?
            • 303:30 - 304:00 MakeMyTrip sends its request to the airline over the web, and the airline provides a service over the web in return — and that service is called a web service. The request says, in effect, 'I need to book a ticket for my customer from this location to
            • 304:00 - 304:30 this location,' and the airline sends back a response — some code or confirmation — as a service over the web. That is the web service; the airline never exposes its database. MakeMyTrip has its own database too, of course. Now suppose for a
            • 304:30 - 305:00 particular route the airline has 100 seats available. I book one from MakeMyTrip, the airline responds that this customer's ticket is confirmed, and the count should now show 99 — and the same figure must be reflected on MakeMyTrip and on any other
            • 305:00 - 305:30 application that sells the same flight. Whenever one of them books, an API call is made against the airline's system; the airline never gives direct database access, it gives a service over the web — the web service. The interface through which this happens is called an API — an application programming
            • 305:30 - 306:00 interface. This is how it works in real-time projects whenever we deal with API-related work. So a web service is a service provided over the web; but what exactly is an API, and can we give an example? Say I want to build a website:
            • 306:00 - 306:30 I publish some information and I want users to register, so I put a sign-in button and ask for some details. On many sites you will have seen 'sign in
            • 306:30 - 307:00 with Facebook' or 'sign in with Google': they do not ask for all your details again, because you have already signed in with Google or Facebook. Instead of asking for an email and password on their own sign-in form, they offer sign-in via Facebook, Twitter or Google. What is actually happening there?
            • 307:00 - 307:30 If I am building a website, instead of implementing the whole sign-in flow myself, there are readily available APIs from these service providers: I can take their API code and use it in my page,
            • 307:30 - 308:00 embedding it so that the sign-in or login button appears. That is reusability: I build nothing myself, and a customer who is already logged in with Google can log in here too, with the information coming from the respective provider. Similarly,
            • 308:00 - 308:30 companies like Zomato, Ola and Uber use location services. Suppose I also want to use my customers' locations, or show exactly where a place is: I put an address and a map location on my page, and when users click it they can easily find
            • 308:30 - 309:00 my office or my location. To do that I use the Google geolocation API. You have seen the prompt 'this website would like to share your location' — if you agree, Google shares your location information with the site.
            • 309:00 - 309:30 That is one of the APIs we embed in our site; the location is tracked through the geolocation API. If I want to use it on my website, Google provides API code into which I put my own API key, and based on that key they return the
            • 309:30 - 310:00 location. You insert that code into your backend and get the response from the geolocation API. These are everyday examples of APIs, and in real-time projects we embed such details into our websites. Then you may ask: what is a REST API, or RESTful API? What
            • 310:00 - 310:30 is a REST API? Using the same architecture: on the left-hand side I have my client — the front end, which can be a mobile app, a website, a desktop application, anything — and on the right I have the
            • 310:30 - 311:00 server, specifically a web server (an application server is a different thing; assume this is the company's web server). The client makes a request — for example I type a website address, www-dot-something, into the browser
            • 311:00 - 311:30 and I get some response back; that response comes from the web server. What the client sends is an HTTP request, and we
            • 311:30 - 312:00 have a number of methods. If I want to get information from the server I use the GET method; if I want to post something — say I add a comment on Facebook — that goes as a POST method; a like, or deleting a post, uses
            • 312:00 - 312:30 the DELETE method, and there is PUT as well. So HTTP/HTTPS requests come in different methods — GET, POST, PUT, DELETE and a few others — and these are what get sent to the web server; collectively they are called
            • 312:30 - 313:00 methods. Whatever method the HTTP or HTTPS request uses, the web server has to respond to it, and we get the response
            • 313:00 - 313:30 in different forms. The request is our REST request, and the responses come back in specific formats. Look at any website: you see pictures, you see
            • 313:30 - 314:00 videos, forms and so on — these are all responses, returned in particular formats. You may get an HTML page as a response, or images, or videos, or
            • 314:00 - 314:30 on other sites a JSON response or an XML response — different kinds of responses. These are called resources: we receive resources from the web server. In addition to the resource, we also get something called
            • 314:30 - 315:00 the status code. So the first part of the response is the resource and the second part is the status code. Sometimes you type a URL, the HTTP request goes to the web server, and the page is not available, or the host itself is not available; then the response code we get is 404. In the same way, if the request is successful, we get response code 200.
            • 315:00 - 315:30 So we may get different response codes from the web server to the client. In other words, the server transfers resources and a status back to the client, and that pair is the "representational state" being transferred. So what is Representational State Transfer? The REST API
            • 315:30 - 316:00 is nothing but Representational State Transfer: the client requests with GET, POST, PUT, or DELETE, and the web server responds with a representation of the resource as XML, JSON, an HTML page, or some other format, along with a status. That transfer is what REST describes.
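As a hedged illustration of "resource plus status", the sketch below (again using the requests package against a placeholder URL) shows a client checking the status code and then reading the representation in whatever format came back:

```python
import requests

r = requests.get("https://api.example.com/items/42")   # placeholder URL

if r.status_code == 200:
    # Success: the body carries the resource in some representation
    if "application/json" in r.headers.get("Content-Type", ""):
        print(r.json())        # JSON representation
    else:
        print(r.text[:200])    # HTML, XML, plain text, ...
elif r.status_code == 404:
    print("Resource or host not found")
else:
    print("Other status code:", r.status_code)
```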
            • 316:00 - 316:30 So that is the REST API: Representational State Transfer. Now, if you look at this client-and-web-server architecture, there is an interface sitting in between the two;
            • 316:30 - 317:00 the client and the web server talk to each other through that interface, and that interface is what we call the REST API. Whatever interface you see here between client and server, that is the REST API. I hope it is clear now what a REST API is; sometimes
            • 317:00 - 317:30 people call it a RESTful API, which means the same thing. You can also expand it as the Representational State Transfer API, where API stands for Application Programming Interface; the interface itself is the REST API. So how is this used in IICS? That is what we are going to see. We have already seen that a web service is a service offered over the network,
            • 317:30 - 318:00 that API stands for Application Programming Interface, and what a REST API is. So how is this used in IICS? For example, when IICS needs to consume data from a front-end application, or needs to send or post data to an application, I have to use CAI
            • 318:00 - 318:30 or the Web Services transformation. We have already seen what a web service is and what the REST API is: Representational State Transfer. There is also the SOAP API, but SOAP is the legacy option; in most real-time projects nowadays
            • 318:30 - 319:00 we use the REST API. We have the web server here and the client there, and the two systems communicate through API calls; we have already seen that. Now let's continue with how to connect to different processes. You can search online for public APIs.
            • 319:00 - 319:30 There are a lot of public APIs you can connect to, and many of them are free. For example, there is a public COVID data API: it exposes a few operations, and you can see them listed. This one is a GET method; I want the region-wise names
            • 319:30 - 320:00 and a list of today's COVID reports. Just go there, click on the GET operation, pass the two parameters, and it sends you the response. You know how to test this through Postman, right? I have already explained Postman, including how to install it, so open Postman,
            • 320:00 - 320:30 go to a new tab, and to test the operation click "Try it out" to get the request URL. If you look at this URL, I am passing a country code and a date, and based on those I get a response.
            • 320:30 - 321:00 If you pass the country as India, you get a different response, different data. The method is GET, so I paste the URL here; those two values are the parameters, and just by sending them I get the response. You can pass a different date or a different country; I am passing India to check whether I get different data, and yes, I do get
            • 321:00 - 321:30 different data. So based on the parameters, I get a different response from the server. Two things to remember: with the GET method we can get data from APIs, and the web server's response can be brought into IICS; that is what we saw previously with the Web Services transformation.
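For reference, the same kind of parameterized GET call can be made from code as well as from Postman. The endpoint and parameter names below are assumptions standing in for whichever free public API you find online; only the pattern of passing query parameters is the point:

```python
import requests

# Illustrative only: the endpoint and parameter names are assumptions,
# standing in for whichever free public API you pick from an online search.
url = "https://api.example.com/covid/reports"
params = {"iso": "IND", "date": "2022-04-16"}   # country code and report date

r = requests.get(url, params=params)   # same as appending ?iso=IND&date=... to the URL
print(r.status_code)
if r.ok:
    print(r.json())                    # different parameters return different data
```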
            • 321:30 - 322:00 Now, how do we create our own API from Informatica Cloud, and can we read data from such web services into IICS? Yes, we can do all of this. If you go to the runtime environment, you can see the Process Server is up and running; I did not do anything special, I just reinstalled one more time, and on this new system the Process Server is up. The Process Server must be up and running in order to create a CAI process. Just as CDI has a lot of topics, CAI has plenty of its own:
            • 322:00 - 322:30 for CDI we have the Data Integration, Administrator, and Monitor services, and in the same way for APIs we have Application Integration, Application Integration Console, API Manager, and API Portal. Here I am going with the Application Integration service. In it you can create a process, an app connection, and service connections;
            • 322:30 - 323:00 just go to New and you can create a process, an app connection, or a service connector. A service connection is used to connect to web services, whether REST or SOAP. I will create a process, so let's see how to create a process, our own API. Just go to Processes.
            • 323:00 - 323:30 I will give you a document; just follow it and you will be able to complete this, because I have given step-by-step explanations for everything. This is the first process I am going to create, so for the process name I will use something like process_MyFirstProcess. It is similar to a taskflow: you can see the available task steps here,
            • 323:30 - 324:00 steps such as Assignment, Service, and Subprocess, and you can create new objects; if you want to receive an event, there is a Receive step. We will look at these one by one. If you open the process, you first have the General tab: the step type is Start, you need to give a name, and the API name does not need to be
            • 324:00 - 324:30 given. The location is the folder or project you are going to use, and if you want you can write a description. I am not doing anything else here; I have only given the name. Now go to the Start step and look at Binding. We have two types of API, REST and SOAP.
            • 324:30 - 325:00 For a REST or SOAP API, REST/SOAP is the binding you have to use; if you are triggering the process from an event, you use the Event binding. Here we are only going to use REST. Next is Allowed Groups: in real-time projects we have user roles and user groups. Everyone working on a particular project is put into a group, a role is assigned to that group, and whoever has that role can access
            • 325:00 - 325:30 this particular process. That is what Allowed Groups means; in a real project it is handled by the admin team, who create the allowed groups and allowed users, and we just use the group name. Here, though, I am going to use my own user: I will enter my username under Allowed Users, which means only this user can access the process and nobody else can. That is
            • 325:30 - 326:00 the meaning of this setting, and I am not going to configure anything else. If you want to run the process on your local server you can select your local Secure Agent, but since this is a public API I am going to run it on the Cloud Server; in a real-time project you would normally use the local Secure Agent. The reason I explained the COVID API is that, if you remember, last time we looked at an API
            • 326:00 - 326:30 very similar to this one; you can use public API data for free, and if you search in Google you will find lots of such APIs. In the same way, we are now going to create our own API. After selecting Cloud Server, don't change anything else here; just set the Allowed Users
            • 326:30 - 327:00 if you want to restrict access to a particular user. Now go to Input Fields. What are the input fields for an API? For the COVID API, the inputs were the report date, that is, the date for which I want the report, and the country's ISO code; sending those two values returns
            • 327:00 - 327:30 a response. We are going to do the same thing here: I am going to pass a username and a password, and the process will respond with something like "authentication successful". So those are my inputs. Just click on the plus symbol to add an input field,
            • 327:30 - 328:00 and add the inputs: I am creating username, and you can mark it as a required field if you want it to be mandatory; I am also adding password. So username and password are the inputs, and that is all I am doing here. Now go to Output Fields: what output do I need to get back from the API?
            • 328:00 - 328:30 I want to get some response back, so I will add an output field for the API response; I am calling it ResponseMessage. Its type is Text, and you can set an initial value if you want; that would be a default value used if the server does not respond, but normally you will not set one. So just create these fields: the two input
            • 328:30 - 329:00 fields are the input parameters we pass in, and when the process executes it sends back the response in this output field. I have given the steps in the document; just follow it, everything is explained step by step, including how to use each option. There are also Temporary Fields: if I want to do any
            • 329:00 - 329:30 intermediate calculation, like the temporary variables we use in an Expression transformation, I can create a temporary field and later assign it to an output field. There are also Messages, Advanced, and Notes tabs if you need them. For now, we are only configuring the input fields, the output field, and the Start step. But how is the response
            • 329:30 - 330:00 actually produced? We have to process the request in between, and to produce a simple standalone output I am going to use an Assignment step. I will give the step a name such as API Response, go to Assignments, and click on the plus symbol; everything is documented here.
            • 330:00 - 330:30 Which field do I have to select? The one I want to return, so I select the ResponseMessage output field. Then you choose what to assign to it: Content or Formula. Content gives you a constant value; Formula computes the value, so I will use a
            • 330:30 - 331:00 formula here. I want the message to say "Hi" or "Hello" followed by the username; you can use the current user, or the username being passed in. I will use the current user for now: the current user, then a concatenation, then the text "user authenticated successfully". That is the response this API will return.
            • 331:00 - 331:30 If you wanted to actually check the credentials against a database, you would have to write that logic, but we are not touching any database connection here; we are simply printing "Hello <username>, authenticated successfully" as the response. Click OK and save. So we have created one simple process that returns an output.
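The assignment formula itself is written in the CAI expression editor, so its exact syntax is not reproduced here. Purely as a conceptual sketch of what the published process ends up doing (field names as configured above, logic simplified into plain Python; not Informatica's own expression language):

```python
def my_first_process(username: str, password: str = "") -> dict:
    """Conceptual stand-in for the published CAI process: there is no
    database check, it just builds ResponseMessage from the input."""
    return {"ResponseMessage": f"Hello {username}, user authenticated successfully"}

print(my_first_process("testuser"))
# {'ResponseMessage': 'Hello testuser, user authenticated successfully'}
```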
            • 331:30 - 332:00 Every CAI process has to be published. For CDI there is no publish step, but a CAI process must be published to the server because it is exposed as a web service through the API. So click Publish, and it shows "Asset was published successfully". After publishing, go
            • 332:00 - 332:30 to the process and open the properties detail. Click on the properties detail and you can see who published it, when, and that the status is still active. You also get two endpoints: one is the REST API URL and the other is the SOAP URL. Those are the URLs we get. Just as we
            • 332:30 - 333:00 depended on a public URL earlier, we have now created our own URL in the same way. Last time we used the Web Services transformation to read data from an API; now we are the ones sending the data. Take the REST API URL, copy it, and paste it into the browser, but you have to append a question mark,
            • 333:00 - 333:30 followed by the username parameter, like ?username=<value>; you can also add the password, but that is not mandatory here. So the URL is the endpoint, then a question mark, then your input parameters; additional parameters are appended with an ampersand (&), but I am only using username here. Copy this and run it in the browser,
            • 333:30 - 334:00 and I get the response: "Hello <current user>, the user authenticated successfully." Since I used the current user in the formula, it has taken my logged-in username; but I want only the name I pass in the URL, so later I will change the formula to use the username input field instead. That is the response I am getting.
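Building the call URL is just the published endpoint plus a query string. A small sketch, assuming a placeholder endpoint (in practice you paste the REST API URL copied from the process properties):

```python
from urllib.parse import urlencode

# Placeholder endpoint: use the actual REST API URL shown in the
# published process's properties in place of this example value.
endpoint = "https://your-pod.example.com/process/MyFirstProcess"
query = urlencode({"username": "testuser"})   # add "password" here too if you defined it
print(f"{endpoint}?{query}")
# https://your-pod.example.com/process/MyFirstProcess?username=testuser
```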
            • 334:00 - 334:30 If you go and execute this particular URL from somewhere else, you will not get the result. Why? Because on my system I am already logged in with my credentials; that is why the browser is able to display it. If I want to get this from outside, for example from Postman, I open a new tab there, choose the GET method, paste the URL,
            • 334:30 - 335:00 and send it. We do not get the expected response; instead we get the 401 status code, Unauthorized. 401 means the request has not been applied because it lacks valid authentication credentials for the target resource. Remember, I configured an authentication restriction while
            • 335:00 - 335:30 creating the process: under Allowed Users I listed my user, so only that user can access it. So I go back to Postman, which is one of the front-end API testing tools that API testers normally use, open the Authorization tab, and choose an authentication type.
            • 335:30 - 336:00 I am going with Basic Auth, so I enter my username and a password (just a dummy value here), and you can see the request has now been authorized: we get status code 200, along with the response time from the server. So where is this actually running? When I
            • 336:00 - 336:30 send the request, it goes to the Informatica Cloud Server; the Cloud Server runs the published process and returns the data. Is that clear? That is how you create your own API: you build and publish a CAI process.
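The same Postman test can also be reproduced from code. This is only a sketch under assumptions: the endpoint is a placeholder for your published process URL, and the credentials stand in for your own IICS username and password:

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder endpoint: use the REST API URL from your published process,
# and your own IICS credentials in place of the dummy values below.
url = "https://your-pod.example.com/process/MyFirstProcess"
params = {"username": "testuser"}

# Without credentials, the allowed-users check rejects the call
r = requests.get(url, params=params)
print(r.status_code)                      # expect 401 Unauthorized

# With Basic Auth (like Postman's Authorization tab) the call is accepted
r = requests.get(url, params=params,
                 auth=HTTPBasicAuth("my_iics_user", "my_password"))
print(r.status_code, r.text)              # expect 200 and the response message
```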
            • 336:30 - 337:00 You can go much deeper into CAI; just start practicing it. This part was entirely about APIs. If you want to unpublish the process, you can click Unpublish, make your changes, and then publish it again. For example, I no longer want the current user; I want to do it a different way, using the username input field, so I select username in the formula, click OK, and save. But now I have
            • 337:00 - 337:30 to publish it again; without publishing, if I go and execute it, the change will not take effect. And see, now it prints the response with the username that was passed.