
21 Courses

Apache Iceberg
Cloud Computing

HRDC Reg. No: 10001547565
Duration:  3 Days (24 Hours)

Course Overview

Apache Iceberg is a high-performance open table format designed for analytic workloads on cloud object stores and distributed data lakes. This hands-on course explores Iceberg’s architecture, table design, time travel, partitioning, schema evolution, and integration with modern big data tools such as Spark, Flink, Trino, and Presto. Real-world labs focus on streaming ingestion, rollback, governance, and security.
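
For illustration only (not part of the course outline): a minimal PySpark sketch of the Iceberg features named above, namely partitioned table creation, schema evolution, and snapshot-based time travel. The catalog name, warehouse path, and table are assumptions.

```python
# Assumes a local Spark 3.x session with the Iceberg runtime JAR available.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-demo")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Create a partitioned Iceberg table and append a row.
spark.sql("CREATE TABLE IF NOT EXISTS demo.db.events (id BIGINT, ts TIMESTAMP) "
          "USING iceberg PARTITIONED BY (days(ts))")
spark.sql("INSERT INTO demo.db.events VALUES (1, current_timestamp())")

# Schema evolution: add a column without rewriting existing data files.
spark.sql("ALTER TABLE demo.db.events ADD COLUMN source STRING")

# Time travel: list snapshots, then query the table as of an earlier snapshot
# (VERSION AS OF requires Spark 3.3+).
spark.sql("SELECT snapshot_id, committed_at FROM demo.db.events.snapshots").show()
# spark.sql("SELECT * FROM demo.db.events VERSION AS OF <snapshot_id>").show()
```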


Who Should Attend

  • Data Engineers

  • Data Platform Architects

  • Big Data Developers

  • Lakehouse Engineers

  • DevOps Engineers

Targeted Industries

  • Cloud-Native SaaS Platforms

  • Financial Services and Banking

  • Retail and E-Commerce Analytics

  • Telecommunications and IoT

  • Healthcare and Pharma

  • Government & Public Sector Data Platforms


Why Choose This Course

HRDC Claimable [TBD]
Master the open table format powering modern data lakes and lakehouses, with real-world training in Apache Iceberg—ideal for secure, efficient, and scalable analytics on cloud-native infrastructure.


Learning Outcomes

Participants will be able to:

  • Understand Iceberg’s architecture and benefits over Hive, Hudi, and Delta Lake

  • Perform schema/partition evolution, rollback, and metadata pruning

  • Ingest batch and streaming data with Spark and Flink

  • Optimize Iceberg performance via compaction and predicate pushdown

  • Secure data with Apache Ranger and encryption

  • Deploy Iceberg in multi-engine environments (Spark, Trino, Flink)


Prerequisites

  • Basic understanding of OLAP/OLTP and SQL (recommended)

  • Familiarity with Hadoop, Linux, and Python

  • Awareness of ETL and Java stack concepts


Lab Setup

Tools & Stack:

  • Apache Iceberg (latest), Spark 3.x or Flink 1.14+

  • Trino or Presto, MinIO/S3 emulation

  • Kafka (for streaming), Docker or cloud (optional)

  • Jupyter, Zeppelin, VS Code, Apache Ranger


Teaching Methodology

  • Instructor-led walkthroughs with diagrams

  • Hands-on lab sessions using real-world datasets

  • Daily knowledge checks and a capstone project

  • Apache Flink
    Cloud Computing

    HRDC Reg. No: 10001548574
    Duration:  4 Days (28 Hours)

    Course Overview

    Apache Flink is a high-throughput, fault-tolerant, real-time stream processing framework. This course equips Data Engineers and Developers to master Flink’s DataStream and Table APIs, stateful stream processing, windowing, checkpointing, and integration with systems like Kafka, JDBC, S3, and Elasticsearch. Participants will gain hands-on experience through code labs and a capstone project using real-world data processing scenarios.
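
    For illustration only (not part of the course outline): a minimal PyFlink DataStream sketch of keyed, stateful processing; the event tuples and job name are made up, and a real pipeline would read from Kafka instead of a small collection.

    ```python
    # Requires Apache Flink 1.14+ with the apache-flink Python package installed.
    from pyflink.datastream import StreamExecutionEnvironment

    env = StreamExecutionEnvironment.get_execution_environment()

    # Stand-in source; swap for a Kafka source in a real pipeline.
    clicks = env.from_collection([("alice", 1), ("bob", 1), ("alice", 1)])

    # Keyed, stateful aggregation: running click count per user.
    counts = (clicks
              .key_by(lambda e: e[0])
              .reduce(lambda a, b: (a[0], a[1] + b[1])))

    counts.print()
    env.execute("per_user_click_count")
    ```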


    Who Should Attend

    • Data Engineers

    • Backend Developers

    • Streaming Engineers

    • Big Data Architects

    • DevOps and Platform Engineers

    Targeted Industries

    • FinTech and Banking (fraud detection, payments)

    • Telecommunications and IoT (real-time telemetry)

    • E-Commerce and Retail (clickstream analytics)

    • Logistics and Manufacturing (sensor and vehicle tracking)

    • Media and Advertising (audience measurement, engagement)


    Why Choose This Course

    HRDC Claimable [TBD]
    Get up to speed with production-grade stream processing using Apache Flink—ideal for building scalable, low-latency pipelines with integrations into real-time and batch ecosystems.


    Learning Outcomes

    Participants will be able to:

    • Understand Flink’s architecture and distributed processing model

    • Develop streaming and batch applications using Flink APIs

    • Apply windowing, event-time processing, and state management

    • Integrate Flink with Kafka, S3, JDBC, Elasticsearch, and others

    • Secure Flink deployments with SSL, Kerberos, and RBAC

    • Optimize and deploy real-world streaming pipelines


    Prerequisites

    • Basic database and SQL understanding (recommended)

    • Knowledge of Python and Linux shell

    • Familiarity with ETL workflows and optionally Hadoop

    • JVM/Java knowledge is helpful but not mandatory


    Lab Setup

    Minimum Requirements:

    • RAM: 8 GB (16 GB recommended)

    • CPU: Quad-core

    • OS: Ubuntu/CentOS preferred; Windows with WSL

    • Tools: Flink, Kafka, Python 3.8+, IntelliJ/VS Code, Docker (optional), sample datasets


    Teaching Methodology

    • Instructor-led architecture and code walkthroughs

    • Hands-on labs and scenario-based exercises

    • Daily practical assignments

    • Final capstone project and quiz

  • Big Data on AWS
    Cloud Computing

    HRDC Reg. No: 10001547674
    Duration:  4 Days (28 Hours)

    Course Overview

    This hands-on course provides a comprehensive guide to designing and implementing Big Data and Machine Learning solutions on AWS. It covers key AWS services such as EMR, Redshift, Glue, Kinesis, Athena, DynamoDB, and Airflow. Participants will learn to build scalable, secure, and cost-effective data pipelines and architectures using cloud-native services for ingestion, transformation, analysis, and orchestration.
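
    For illustration only (not part of the course outline): a hedged boto3 sketch of querying S3 data with Athena; the region, database, table, and results bucket are placeholders, and AWS credentials must already be configured.

    ```python
    import boto3

    athena = boto3.client("athena", region_name="ap-southeast-1")

    # Start an Athena query over a table defined in the Glue Data Catalog.
    response = athena.start_query_execution(
        QueryString="SELECT event_type, COUNT(*) AS cnt FROM events GROUP BY event_type",
        QueryExecutionContext={"Database": "demo_db"},
        ResultConfiguration={"OutputLocation": "s3://demo-athena-results/"},
    )
    print("Query execution id:", response["QueryExecutionId"])
    ```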


    Who Should Attend

    • Data Engineers

    • Cloud Architects

    • Data Analysts and Scientists

    • DevOps Engineers

    • ETL Developers

    Targeted Industries

    • Finance and Insurance

    • Retail and E-Commerce

    • Healthcare and Life Sciences

    • Telecom and Media

    • Government and Defense

    • Logistics and Manufacturing


    Why Choose This Course

    HRDC Claimable [TBD]
    Designed for teams moving their data workloads to AWS, this course equips participants with both foundational knowledge and advanced practical skills to harness the full potential of AWS Big Data services.


    Learning Outcomes

    By the end of this course, participants will be able to:

    • Understand AWS cloud architecture and Big Data ecosystem

    • Ingest and analyze structured/unstructured data using AWS tools

    • Build data pipelines with Kinesis, Glue, Athena, and EMR

    • Optimize storage with S3 and databases with Redshift, RDS, and DynamoDB

    • Automate workflows using Airflow and Lambda

    • Secure and monitor AWS Big Data environments


    Prerequisites

    • Familiarity with Hadoop, Spark, Hive, and HDFS

    • Programming in Python

    • Knowledge of SQL/NoSQL and database design


    Lab Setup

    Access & Infrastructure:

    • Free-tier AWS account recommended

    • Connectivity to AWS over HTTP, SSH, and TCP

    • Public IP whitelisting and AWS Infra Readiness guidance provided

    Lab Activities:

    • Guided exercises with S3, Athena, Glue, EMR, Redshift, and Airflow

    • Real-world data sets and project simulations


    Teaching Methodology

    • Instructor-led sessions

    • Hands-on labs and use-case-driven exercises

    • Daily knowledge checks and real-time demos

  • Apache Kafka/Confluent Kafka
    Cloud Computing

    HRDC Reg. No: 10001547563
    Duration: 5 Days (35 Hours)

    Course Overview

    This intensive hands-on course provides a comprehensive understanding of Apache Kafka and the Confluent platform. Participants will gain expertise in building scalable, real-time streaming architectures, managing Kafka clusters, creating custom producers/consumers, integrating Kafka with Spark, and leveraging Confluent tools like Kafka Connect and KSQL DB for streamlined data pipeline development.
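
    For illustration only (not part of the course outline): the course labs build producers and consumers in Java, but the same produce/consume flow is sketched here with the kafka-python client; the broker address, topic, and consumer group are assumptions.

    ```python
    from kafka import KafkaConsumer, KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("payments", key=b"txn-001", value=b'{"amount": 42.50}')
    producer.flush()

    consumer = KafkaConsumer(
        "payments",
        bootstrap_servers="localhost:9092",
        group_id="fraud-check",
        auto_offset_reset="earliest",
    )
    for record in consumer:
        print(record.key, record.value)
        break  # stop after the first message in this demo
    ```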


    Who Should Attend?

    • Data Engineers

    • Backend Developers

    • DevOps Engineers

    • Solution Architects

    • System Integrators

    Targeted Industries

    • Financial Services & Banking

    • Telecommunications

    • E-Commerce & Retail

    • Media & Entertainment

    • Government & Smart Infrastructure

    • Logistics & Manufacturing


    Why Choose This Course

    HRDC Claimable [Insert HRDC Claimable ID once registration number is available]
    A practical guide to building enterprise-grade, real-time data platforms with Kafka and Confluent tools, optimized for mission-critical event-driven systems and data pipelines.


    Learning Outcomes

    By the end of this course, participants will be able to:

    • Deploy and manage multi-node Kafka clusters

    • Create custom Kafka producers and consumers using Java

    • Build and manage streaming pipelines using Spark and Kafka Streams

    • Secure Kafka with SSL, SASL, and ACLs

    • Use Kafka Connect for integration with external systems

    • Query and process real-time streams using KSQL DB

    • Monitor and tune Kafka for performance and reliability


    Prerequisites

    • Knowledge of distributed computing

    • Basic understanding of Hadoop and Spark

    • Programming experience in Java or Python

    • Familiarity with Linux and command-line tools

    • Awareness of enterprise architecture concepts


    Lab Setup

    Each participant will receive a dedicated environment with:

    • 3-node Kafka cluster (includes Zookeeper, Kafka, Spark, and connectors)

    • Hardware Requirements:

      • Processor: Intel i5 (8 cores)

      • RAM: 32 GB

      • Storage: 200 GB SSD (2,000 IOPS, 100 Mbps)

    • OS: Ubuntu 22.04

    • Software: IntelliJ, PyCharm, Docker, Java 8/11, Maven, Python 3.8+, Chrome

    • Access: Internet (GitHub, Google Drive), SSH, sudo access

    • Note: AWS setup, IP whitelisting, and proxy configuration as needed


    Teaching Methodology

    • Instructor-led architecture deep dives

    • Hands-on coding and labs with real-time data

    • Project simulations and daily quizzes

    • Scenario-based exercises using Twitter or finance data streams

  • Apache Ozone File System (Next Generation of Hadoop HDFS)
    Cloud Computing

    HRDC Reg. No: 10001548281
    Duration: 3 Days (24 Hours)

    Course Overview

    This hands-on course is designed for Data Engineers and Big Data Architects aiming to adopt Apache Ozone—a high-performance, scalable object store that is a next-generation replacement for HDFS. Participants will explore architecture, cluster setup, security, integrations with big data tools (Hadoop, Hive, Spark, Kafka, Flink), and deployment of real-world data pipelines. Labs and a capstone project reinforce each learning objective.
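
    For illustration only (not part of the course outline): a hedged sketch of Ozone's S3 compatibility, pointing a standard boto3 client at an Ozone S3 Gateway; the endpoint, credentials, and bucket name are assumptions (the gateway listens on port 9878 by default).

    ```python
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="http://ozone-s3g.example.com:9878",
        aws_access_key_id="testuser",
        aws_secret_access_key="testsecret",
    )

    s3.create_bucket(Bucket="demo-bucket")
    s3.put_object(Bucket="demo-bucket", Key="logs/app.log", Body=b"hello ozone")

    objects = s3.list_objects_v2(Bucket="demo-bucket").get("Contents", [])
    print([o["Key"] for o in objects])
    ```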


    Who Should Attend

    • Data Engineers

    • Hadoop Administrators

    • Platform Architects

    • Big Data Engineers

    • Cloud and Storage Architects

    Targeted Industries

    • Banking and Financial Services

    • Telecom and Media

    • Healthcare and Pharma

    • Retail and E-Commerce

    • Government and Defense

    • Industrial IoT and Manufacturing


    Why Choose This Course

    HRDC Claimable [TBD]
    Future-proof your data architecture with hands-on training in Apache Ozone, a cloud-native object store compatible with HDFS and S3, and optimized for next-gen data lakes and hybrid big data platforms.


    Learning Outcomes

    By the end of this course, participants will be able to:

    • Understand the evolution and advantages of Apache Ozone over HDFS and S3-compatible object stores

    • Deploy and manage multi-node Ozone clusters

    • Perform volume and object-level operations using CLI and APIs

    • Tune performance and enable high availability configurations

    • Secure Ozone using Kerberos, Ranger, and TLS

    • Integrate Ozone with Spark, Hive, Flink, Kafka, and Presto

    • Implement enterprise-grade data lake pipelines using Ozone


    Prerequisites

    • Basic knowledge of Big Data and distributed storage systems

    • Familiarity with HDFS, YARN, and Linux file operations

    • Understanding of Hive, SQL, and object stores like AWS S3, MinIO

    • Basic CLI skills and networking concepts (SSH, ports)

    • Experience with Spark or Flink (recommended)


    Lab Setup

    Pre-configured Lab Environment (provided):

    • 3-node Ozone + Hadoop cluster

    • Hive, Spark, Flink, Kafka pre-installed

    • Configured with Kerberos, Ranger, Prometheus, Grafana

    • Sample datasets for log, financial, and clickstream analysis

    Manual Deployment Requirements:

    • RAM: 16 GB per node

    • vCPUs: 4

    • OS: Linux (Ubuntu/CentOS)

    • Java: 8 or 11

    • Access: SSH, sudo, open ports for Web UI/S3


    Teaching Methodology

    • Instructor-led conceptual and practical sessions

    • Scenario-based labs and integrations

    • Real-world use cases

    • End-of-course capstone project and certification quiz

  • Apache Cassandra / DataStax Cassandra
    Cloud Computing

    HRDC Reg. No: 10001547630
    Course Duration: 35 Hours (5 Days)

    Course Overview

    Apache Cassandra is a fault-tolerant, distributed NoSQL database designed for large-scale data management. This training focuses on both open-source Cassandra and the DataStax Enterprise (DSE) version, equipping participants with the knowledge to deploy, manage, and integrate Cassandra with enterprise tools like Spark, Kafka, and Java SDKs. Topics include architecture, replication, security, monitoring, advanced querying, and data modeling.
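
    For illustration only (not part of the course outline): a minimal sketch with the Python cassandra-driver showing keyspace and table creation, an insert, and a CQL read; the keyspace, table, and contact point are assumptions.

    ```python
    from cassandra.cluster import Cluster

    cluster = Cluster(["127.0.0.1"])  # default CQL port 9042
    session = cluster.connect()

    session.execute(
        "CREATE KEYSPACE IF NOT EXISTS demo "
        "WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}"
    )
    session.execute(
        "CREATE TABLE IF NOT EXISTS demo.users (user_id uuid PRIMARY KEY, name text)"
    )
    session.execute("INSERT INTO demo.users (user_id, name) VALUES (uuid(), 'alice')")

    for row in session.execute("SELECT user_id, name FROM demo.users"):
        print(row.user_id, row.name)

    cluster.shutdown()
    ```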


    Who Should Attend

    • Java Developers

    • Database Administrators

    • Data Architects

    • Big Data Engineers

    • DevOps and System Engineers

    Targeted Industries

    • Telecommunications

    • Banking and Financial Services

    • E-commerce & Retail

    • Healthcare & Life Sciences

    • Public Sector & Defense

    • Media and Streaming Services


    Why Choose This Course

    HRDC Claimable [TBD]
    Ideal for organizations looking to adopt scalable NoSQL systems, this course blends architecture mastery with real-world integration skills using Java, Spark, and DevOps tools.


    Learning Outcomes

    By the end of this course, participants will be able to:

    • Design and administer Cassandra clusters

    • Perform replication, data modeling, and backup/restore

    • Write efficient queries using CQL

    • Secure and monitor Cassandra deployments

    • Use OpsCenter, nodetool, and Prometheus for administration

    • Integrate Cassandra with Java SDK, Apache Spark, and Kafka


    Prerequisites

    • Familiarity with Linux and shell commands

    • Basic Java programming knowledge

    • Awareness of Big Data tools and SQL concepts


    Lab Setup

    Minimum System Requirements:

    • Processor: Intel i5 (8 cores, 2.5GHz+)

    • RAM: 32 GB

    • Storage: 200 GB SSD (2,000 IOPS, 100 Mbps bandwidth)

    • Internet: Access to GitHub, Google Drive

    • OS: Ubuntu 22.04

    • Software: IntelliJ, PyCharm, VirtualBox, Docker & Compose, Java 8/11, Maven 3.6+, Python 3.8+, Chrome, Git Bash, Putty (Windows)

    • Administrative Access: Required

    • AWS Labs (optional): SSH access, Elastic IP whitelisting, proxy setup (if applicable)


    Teaching Methodology

    • Instructor-led demos and lectures

    • Hands-on labs with real-world datasets

    • Integration projects with Java and Spark

    • Performance monitoring and debugging sessions

  • Big Data Using Hadoop Ecosystem
    Cloud Computing

    HRDC Reg. No: 10001548285
    Course Duration: 35 Hours (5 Days)

    Course Overview

    This practical course introduces Big Data concepts and the Hadoop ecosystem. Participants will learn to process, store, transform, and analyze large-scale data using tools such as HDFS, Hive, Pig, Sqoop, Oozie, Kafka, HBase, and Spark. Delivered via Cloudera's environment, the course combines lectures and hands-on labs with real-world datasets.
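
    For illustration only (not part of the course outline): the MapReduce model is often demonstrated with a Hadoop Streaming word count; the two small Python scripts below would be saved as separate files, and the paths in the commented command are hypothetical.

    ```python
    # Run (example, paths assumed):
    #   hadoop jar hadoop-streaming.jar \
    #     -input /data/books -output /out/wordcount \
    #     -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py

    # --- mapper.py: emit (word, 1) for every word read from stdin ---
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

    # --- reducer.py: input arrives sorted by key; sum the counts per word ---
    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")
    ```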


    Who Should Attend

    • Data Architects

    • Enterprise Architects

    • Developers and Engineers

    • System Administrators

    • Data Analysts

    • Technical Architects


    Why Choose This Course

    HRDC Claimable (subject to HRDC registration). Participants gain an end-to-end understanding of how to architect and build Big Data pipelines using Hadoop tools and frameworks. This course blends foundational theory with hands-on practice for enterprise readiness.


    Learning Outcomes

    Upon completion, participants will:

    • Understand Big Data frameworks and ecosystem tools

    • Use HDFS, MapReduce, and YARN

    • Perform ETL with Pig and Hive

    • Build workflows using Oozie

    • Utilize HBase for NoSQL storage

    • Use Kafka for real-time data ingestion

    • Analyze and process data using Apache Spark


    Prerequisites

    • Basic Linux command-line skills

    • Understanding of databases and SQL

    • Familiarity with Java is beneficial but not required


    Lab Setup Requirements

    Hardware:

    • CPU: Intel i5 or higher

    • RAM: Minimum 8 GB (16 GB recommended)

    • Disk: 20+ GB free

    Software:

    • Cloudera QuickStart VM

    • Java JDK 8+, IntelliJ/Eclipse

    • MySQL for Sqoop

    • Kafka, Spark pre-installed

    • Real datasets: Yahoo Finance, SFPD Crime Data


    Teaching Methodology

    • Instructor-led architecture walkthroughs

    • Hands-on labs with real data

    • Use of Cloudera VM sandbox

    • Daily guided exercises and demonstrations

  • Apache Spark using Java
    Cloud Computing

    HRDC Reg. No: 10001548585
    Course Duration: 35 Hours (5 Days)

    Course Overview

    This hands-on course focuses on using Apache Spark with Java to develop large-scale distributed data applications. Participants will learn how to process batch and streaming data using RDDs, DataFrames, Spark SQL, and Structured Streaming. The course also explores integration with Hadoop, Hive, Kafka, and Delta Lake to build robust real-time and versioned data pipelines.


    Who Should Attend

    • Java Developers transitioning into Big Data roles

    • Data Engineers and Architects

    • ETL Developers working with Hadoop/Spark stack

    • Backend Developers integrating Spark into systems

    • Engineers developing real-time analytics pipelines


    Why Choose This Course

    HRDC Claimable. This course delivers practical, Java-centric expertise in Spark-based Big Data applications. It includes real-world lab exercises using IntelliJ IDE, Apache Kafka, Delta Lake, and Hive, preparing participants for data-intensive enterprise environments.


    Learning Outcomes

    Participants will be able to:

    • Understand Spark’s distributed architecture and execution model

    • Develop Spark applications in Java using IntelliJ

    • Leverage RDDs, DataFrames, and Spark SQL

    • Integrate Spark with Hive, Kafka, Delta Lake, and Hadoop

    • Optimize Spark jobs through performance tuning

    • Build real-time applications using Structured Streaming and Kafka

    • Implement ACID-compliant data lakes using Delta Lake


    Prerequisites

    • Strong Java programming knowledge

    • Familiarity with Linux OS

    • Understanding of databases and data pipelines

    • Basic exposure to Big Data and messaging systems helpful


    Lab Setup Requirements

    Hardware:

    • Intel i5 CPU or higher

    • 8 GB RAM minimum (16 GB recommended)

    • 25 GB free disk space

    Software & Tools:

    • Java JDK 11+

    • IntelliJ IDEA

    • Apache Spark 3.x, Hadoop, Hive

    • Apache Kafka, MySQL/PostgreSQL

    • NoSQL: HBase or Cassandra (optional)

    • Preconfigured VM or Docker image (provided)


    Teaching Methodology

    • Instructor-led theory and live demonstrations

    • IntelliJ-based Java development

    • Hands-on labs with real-world data

    • Optional integration with BI tools and databases

  • Apache Spark using Python (PySpark)
    Cloud Computing

    HRDC Reg. No: 10001547561
    Course Duration: 35 Hours (5 Days)

    Course Overview

    This intensive hands-on course covers Apache Spark with Python (PySpark), designed to equip participants with the skills to build and scale data processing applications for Big Data. Through real-world labs, learners will explore Spark Core, SQL, Streaming, and advanced topics like Apache Iceberg for scalable, fault-tolerant data lakes.
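
    For illustration only (not part of the course outline): a minimal PySpark sketch of the DataFrame and Spark SQL workflow covered in the course; the CSV path and column names are made up.

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("pyspark-demo").getOrCreate()

    orders = spark.read.csv("/data/orders.csv", header=True, inferSchema=True)

    # DataFrame API: total revenue per country.
    summary = (orders
               .groupBy("country")
               .agg(F.sum("amount").alias("revenue"))
               .orderBy(F.desc("revenue")))
    summary.show(5)

    # Equivalent Spark SQL over a temporary view.
    orders.createOrReplaceTempView("orders")
    spark.sql("SELECT country, SUM(amount) AS revenue "
              "FROM orders GROUP BY country ORDER BY revenue DESC").show(5)

    spark.stop()
    ```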


    Who Should Attend

    • Data Engineers

    • Big Data Developers

    • ETL Developers

    • DevOps professionals working in Hadoop/Spark ecosystems

    • Data Analysts building distributed data solutions


    Why Choose This Course

    HRDC Claimable (HRDC Registration Number required). Gain in-demand data engineering skills using PySpark through an immersive training experience. The course offers real-world use cases, from ETL pipelines to real-time analytics with Kafka and Iceberg integration.


    Learning Outcomes

    Participants will learn to:

    • Understand Spark architecture and integrate with Hadoop

    • Develop applications using PySpark and Spark SQL

    • Perform distributed data processing, aggregations, and ETL tasks

    • Stream data with Spark Streaming and Kafka

    • Manage modern data lakes using Apache Iceberg

    • Optimize Spark applications for performance and scalability


    Prerequisites

    • Basic Python and Linux skills

    • Familiarity with databases and ETL workflows

    • Basic knowledge of Hadoop and SQL recommended but not essential


    Lab Setup

    Hardware Requirements:

    • CPU: Intel i5 or higher

    • RAM: 8 GB minimum (16 GB recommended)

    • Storage: 20 GB free space

    Software Environment:

    • Preconfigured VM with Hadoop, Spark 3.x, Hive, Kafka, Iceberg

    • Python 3.8+, Jupyter Notebook or VS Code

    • Additional libraries: PySpark, pandas, numpy, matplotlib, kafka-python


    Teaching Methodology

    • Instructor-led sessions

    • Guided hands-on lab exercises

    • Real-world scenarios and architecture walkthroughs

    • Use of VMs or containers for practical exposure

  • AZ-305: Designing Microsoft Azure Infrastructure Solutions
    Cloud Computing

    HRDC Reg. No: TBD
    Duration: 4 Days

    Course Overview

    This Microsoft Azure expert-level course provides hands-on training for designing and implementing Azure-based infrastructure solutions. Participants will learn how to design governance, compute, storage, networking, security, and data integration solutions on Azure. The course aligns with industry best practices and prepares professionals for the AZ-305 certification exam.


    Who Should Attend?

    This course is ideal for:

    • Cloud Architects – Designing scalable and secure Azure infrastructure.
    • IT Professionals – Managing enterprise cloud environments.
    • Solution Architects – Creating cloud-based solutions for businesses.
    • Database Administrators – Designing scalable and available data solutions.


    Why Choose This Course?

    • HRDC Claimable (Check with HRDC for eligibility)
    • Covers real-world Azure infrastructure solutions
    • Hands-on training with practical design scenarios
    • Prepares for Microsoft AZ-305 certification


    Prerequisites

    • Knowledge of Azure Active Directory
    • Understanding of Azure Compute Technologies (VMs, Containers, Serverless)
    • Familiarity with Azure Virtual Networking and Load Balancers
    • Understanding of Azure Storage (structured & unstructured data)
    • Basic knowledge of application design concepts (messaging, high availability)

  • Advanced SQL Azure
    Cloud Computing

    HRDC Reg. No: 10001465529
    Duration: 14 hours (2 days)

    Course Overview

    This advanced course provides a comprehensive understanding of SQL Azure, focusing on performance optimization, security, scalability, and automation within Microsoft's Azure SQL Database. Designed for database administrators, developers, and IT professionals, this course equips participants with the skills needed to manage and optimize cloud-based databases effectively.
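
    For illustration only (not part of the course outline): a hedged pyodbc sketch of two tasks the course covers, index management and Query Store monitoring, against an Azure SQL database; the server, database, credentials, and table are placeholders, and the Microsoft ODBC Driver 18 is assumed to be installed.

    ```python
    import pyodbc

    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=tcp:demo-server.database.windows.net,1433;"
        "Database=demo_db;Uid=demo_user;Pwd=<password>;Encrypt=yes;"
    )
    cursor = conn.cursor()

    # Index management: add a covering index for a frequent lookup pattern.
    cursor.execute(
        "CREATE INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId) "
        "INCLUDE (OrderDate, Total)"
    )
    conn.commit()

    # Monitoring: surface the slowest recent queries from Query Store.
    cursor.execute(
        "SELECT TOP 5 qt.query_sql_text, rs.avg_duration "
        "FROM sys.query_store_query q "
        "JOIN sys.query_store_query_text qt ON q.query_text_id = qt.query_text_id "
        "JOIN sys.query_store_plan p ON q.query_id = p.query_id "
        "JOIN sys.query_store_runtime_stats rs ON p.plan_id = rs.plan_id "
        "ORDER BY rs.avg_duration DESC"
    )
    for row in cursor.fetchall():
        print(row)
    ```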

    Who Should Attend

    Ideal for:

    • Database Administrators
    • Data Engineers
    • SQL Developers
    • Cloud Solution Architects

    Learning Outcomes

    By the end of this course, participants will be able to:

    1. Optimize database performance and scalability in SQL Azure.
    2. Implement advanced security features like encryption and threat detection.
    3. Automate database management tasks using Azure tools.
    4. Design high availability and disaster recovery solutions.
    5. Integrate SQL Azure with other Azure services.
    6. Perform query tuning and index management.
    7. Utilize monitoring and alerting tools for database health.

    Prerequisites

    • Basic understanding of SQL and SQL Azure.
    • Familiarity with cloud computing concepts.
    • Experience with database management and development.

    Lab Setup

    • Access to Azure Portal with SQL Database services.
    • SQL Server Management Studio (SSMS) installed.
    • Pre-configured Azure SQL Database for hands-on labs.

    Teaching Methodology

    • Instructor-led presentations and demonstrations.
    • Hands-on labs and real-world scenarios.
    • Interactive discussions and Q&A sessions.
    • Case studies and problem-solving exercises.

  • Architecting Microsoft Azure Solutions
    Cloud Computing

    HRDC Reg. No: 10001465531
    Duration: 35 hours (5 days)

    Course Overview

    This 5-day course provides a comprehensive understanding of Microsoft Azure architecture, focusing on designing and implementing secure, scalable, and efficient cloud solutions. Participants will learn about Azure infrastructure, networking, security, compute, storage, data management, and application services using best practices and Azure tools.
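
    For illustration only (not part of the course outline): a hedged Azure SDK for Python sketch that creates a resource group, one of the first building blocks in any Azure architecture; the subscription ID, names, and region are placeholders.

    ```python
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    client = ResourceManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",
    )

    rg = client.resource_groups.create_or_update(
        "demo-rg", {"location": "southeastasia", "tags": {"env": "dev"}}
    )
    print(rg.name, rg.location)
    ```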

    Who Should Attend

    Ideal for:

    • Cloud Architects
    • Solutions Architects
    • IT Professionals
    • Developers seeking Azure architecture expertise

    Learning Outcomes

    Upon completing this course, participants will be able to:

    1. Understand core architectural components of Microsoft Azure.
    2. Design secure and scalable Azure solutions.
    3. Implement networking configurations for Azure environments.
    4. Integrate on-premises infrastructure with Azure services.
    5. Optimize performance and cost-efficiency.
    6. Monitor, manage, and maintain Azure infrastructure.
    7. Develop Azure applications and microservices.

    Prerequisites

    • Basic understanding of cloud computing concepts.
    • Familiarity with the Microsoft Azure platform.
    • Knowledge of networking fundamentals.
    • Experience with IT architecture and design principles.

    Lab Setup Requirements

    • Microsoft Azure subscription.
    • Configured environment with resource creation permissions.
    • Tools: Azure CLI, Visual Studio, Azure Portal.

    Teaching Methodology

    • Instructor-led lectures.
    • Hands-on labs with practical scenarios.
    • Real-world case studies.
    • Interactive discussions and Q&A sessions.

  • AWS Business Essentials
    Cloud Computing

    HRDC Reg. No: 10001465533
    Duration: 7 hours (1 day)

    Course Overview

    This 1-day course provides a high-level overview of Amazon Web Services (AWS) and its key services, focusing on cloud computing benefits for business decision-makers. It covers financial advantages, security and compliance, and cloud migration strategies, helping participants make informed decisions on cloud adoption.

    Who Should Attend

    Ideal for:

    • Business Leaders
    • IT Decision-Makers
    • Finance Managers
    • Cloud Adoption Stakeholders

    Learning Outcomes

    By the end of the course, participants will be able to:

    1. Understand the key concepts and benefits of cloud computing with AWS.
    2. Identify the main AWS services and their business applications.
    3. Evaluate the financial advantages of cloud adoption with AWS.
    4. Understand AWS security, compliance, and the shared responsibility model.
    5. Make informed decisions about cloud migration strategies and service selection.

    Prerequisites

    • Basic understanding of IT concepts and business fundamentals.
    • No prior AWS knowledge is required.

    Lab Setup

    • AWS Free Tier account for demonstrations.
    • Basic internet-enabled computer for AWS Management Console access.

    Teaching Methodology

    • Interactive lectures with real-world examples.
    • Live demonstrations of AWS services.
    • Group discussions and Q&A sessions.
    • Case studies highlighting successful AWS adoption.

  • AWS Cloud Practitioner Essentials
    Cloud Computing

    HRDC Reg. No: 10001468114
    Duration: 7 hours (1 day)

    Course Overview

    This one-day course offers a comprehensive introduction to AWS Cloud concepts, services, security, architecture, pricing, and support. Designed for business professionals and those new to cloud technologies, it covers key AWS services and foundational principles of cloud computing.

    Who Should Attend

    Ideal for:

    • Business Professionals working with AWS.
    • Sales, Marketing, Legal, and Management Teams.
    • Individuals new to the cloud and AWS technologies.

    Learning Outcomes

    By the end of this course, participants will be able to:

    1. Describe the AWS Cloud infrastructure and its global reach.
    2. Explain AWS core services and their use cases.
    3. Understand cloud architecture principles.
    4. Articulate the value proposition of AWS.
    5. Identify security and compliance features and the shared responsibility model.
    6. Understand billing models and cost management tools.
    7. Locate technical support resources and documentation.

    Prerequisites

    • No prior AWS or cloud computing experience required.

    Lab Setup

    • AWS Free Tier account or instructor-provided account.
    • Stable internet connection and web browser.

    Teaching Methodology

    • Interactive lectures with real-world demonstrations.
    • Hands-on labs and guided exercises.
    • Scenario-based discussions to reinforce concepts.
    • Q&A sessions for participant engagement.

  • Cloud Computing with AWS
    Cloud Computing

    HRDC Reg. No: 10001465535
    Duration: 35 hours (5 days)

    Course Overview

    This comprehensive 5-day course covers AWS Cloud Computing concepts, focusing on essential services, architecture best practices, and hands-on experience with AWS tools. Participants will gain the skills to design, deploy, and manage secure, scalable, and resilient cloud-based solutions using Amazon Web Services (AWS).
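
    For illustration only (not part of the course outline): a hedged boto3 sketch of basics used throughout the labs, creating an S3 bucket, uploading an object, and listing running EC2 instances; the bucket name and region are placeholders, and AWS credentials must already be configured.

    ```python
    import boto3

    region = "ap-southeast-1"

    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(
        Bucket="demo-course-bucket-12345",
        CreateBucketConfiguration={"LocationConstraint": region},
    )
    s3.put_object(Bucket="demo-course-bucket-12345", Key="hello.txt", Body=b"hello aws")

    ec2 = boto3.client("ec2", region_name=region)
    reservations = ec2.describe_instances(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )["Reservations"]
    for r in reservations:
        for instance in r["Instances"]:
            print(instance["InstanceId"], instance["InstanceType"])
    ```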

    Who Should Attend

    Ideal for:

    • IT Professionals and System Administrators
    • Developers and Cloud Engineers
    • AWS Certification Candidates

    Learning Outcomes

    By the end of the course, participants will be able to:

    1. Understand core cloud computing concepts and AWS architecture.
    2. Set up and manage AWS accounts and services.
    3. Design and deploy scalable cloud architectures.
    4. Implement security and compliance best practices.
    5. Utilize AWS services like EC2, S3, RDS, Lambda, and VPC.
    6. Automate cloud deployments using AWS tools.

    Prerequisites

    • Basic knowledge of networking and server management.
    • Familiarity with command-line interfaces and scripting.
    • Experience with IT infrastructure concepts.

    Lab Setup

    • AWS Free Tier account setup.
    • Virtual lab environment for hands-on practice.
    • Internet access for AWS Management Console.

    Teaching Methodology

    • Instructor-led lectures for theoretical understanding.
    • Hands-on labs for practical experience.
    • Group discussions and Q&A sessions.
    • Case studies for real-world applications.

  • Cloud Computing with Google Cloud
    Cloud Computing

    HRDC Reg. No: 10001465537
    Duration: 35 hours (5 days)

    Course Overview

    This 5-day comprehensive course provides an in-depth exploration of Google Cloud Platform (GCP), focusing on core cloud services, data management, security, DevOps practices, and machine learning. Participants will gain hands-on experience in building, deploying, and managing scalable applications using GCP tools and best practices.
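
    For illustration only (not part of the course outline): a hedged google-cloud-bigquery sketch that runs a SQL query against a BigQuery public dataset; the project ID is a placeholder and application-default credentials are assumed.

    ```python
    from google.cloud import bigquery

    client = bigquery.Client(project="demo-project")

    query = """
        SELECT name, SUM(number) AS total
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        WHERE state = 'TX'
        GROUP BY name
        ORDER BY total DESC
        LIMIT 5
    """
    for row in client.query(query).result():
        print(row.name, row.total)
    ```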


    Who Should Attend

    Ideal for:

    • IT Professionals and Developers
    • Cloud Architects and System Administrators
    • Data Engineers and Data Scientists

    Learning Outcomes

    By the end of the course, participants will be able to:

    1. Understand Google Cloud architecture and core concepts.
    2. Utilize GCP's Compute, Storage, and Database Services.
    3. Deploy and manage scalable applications on GCP.
    4. Implement networking and security best practices.
    5. Manage data using BigQuery, Cloud SQL, and Spanner.
    6. Automate infrastructure using Cloud Build and Terraform.
    7. Explore Machine Learning services on GCP.

    Prerequisites

    • Basic understanding of cloud computing concepts.
    • Familiarity with networking and databases.
    • Programming knowledge (Python/Java) recommended but not mandatory.

    Lab Setup

    • Access to Google Cloud Console with permissions for creating projects and billing.
    • Stable internet connection and modern web browser.

    Teaching Methodology

    • Interactive lectures with real-world case studies.
    • Hands-on labs and guided exercises.
    • Group activities and collaborative projects.
    • Quizzes and assessments for concept reinforcement.

  • Cloud Patterns and Architecture
    Cloud Computing

    HRDC Reg. No: 10001465542
    Duration: 35 hours (5 days)

    Course Overview

    This 5-day course explores the principles, design patterns, and best practices essential for building scalable, resilient, and cost-efficient cloud architectures. Participants will gain hands-on experience designing solutions using AWS, Azure, and Google Cloud while addressing real-world architectural challenges.
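
    For illustration only (not part of the course outline): one widely used cloud design pattern, retry with exponential backoff and jitter, sketched in Python; flaky_call is a stand-in for any remote call that can fail transiently.

    ```python
    import random
    import time

    def call_with_backoff(fn, max_attempts=5, base_delay=0.5, max_delay=8.0):
        """Retry fn() on failure, doubling the delay each attempt and adding jitter."""
        for attempt in range(1, max_attempts + 1):
            try:
                return fn()
            except Exception:
                if attempt == max_attempts:
                    raise
                delay = min(max_delay, base_delay * 2 ** (attempt - 1))
                time.sleep(delay * random.uniform(0.5, 1.5))

    def flaky_call():
        # Simulates a transient failure roughly half of the time.
        if random.random() < 0.5:
            raise ConnectionError("transient failure")
        return "ok"

    print(call_with_backoff(flaky_call))
    ```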


    Who Should Attend

    Ideal for:

    • Cloud Architects
    • DevOps Engineers
    • Software Developers
    • IT Professionals managing cloud migration or deployments

    Learning Outcomes

    By the end of this course, participants will be able to:

    1. Understand core cloud architecture principles and concepts.
    2. Implement cloud design patterns for scalability, security, and resilience.
    3. Build highly available and fault-tolerant cloud applications.
    4. Design architectures to meet business and technical requirements.
    5. Implement disaster recovery and auto-scaling strategies.
    6. Optimize performance and cost-efficiency in cloud solutions.
    7. Leverage cloud-native tools for automation and deployment.

    Prerequisites

    • Basic knowledge of cloud platforms (AWS, Azure, GCP).
    • Familiarity with networking, databases, and operating systems.
    • Basic understanding of software development and system architecture.

    Lab Setup

    • Access to AWS, Azure, or GCP accounts.
    • Tools: IDE, SDKs, CLI, and cloud-native services.
    • Virtual environments for simulating real-world use cases.

    Teaching Methodology

    • Lectures and interactive discussions.
    • Hands-on labs and live demonstrations.
    • Case studies and real-world scenarios.
    • Group projects for architectural design challenges.

  • Developing Applications with Google Cloud Platform (GCP)
    Cloud Computing

    HRDC Reg. No: 10001465544
    Duration: 21 hours (3 days)

    Course Overview

    This 3-day course is designed for developers who want to design, develop, and deploy applications using Google Cloud Platform (GCP) services. Participants will gain hands-on experience with cloud storage, compute engines, Kubernetes, serverless services, and APIs, enabling them to build scalable, secure, and cloud-native applications.
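
    For illustration only (not part of the course outline): a hedged sketch of a Python HTTP Cloud Function written with the functions-framework library; the function name, response shape, and deployment flags are assumptions.

    ```python
    import functions_framework
    from flask import jsonify

    @functions_framework.http
    def hello_gcp(request):
        """HTTP entry point: echoes a 'name' query parameter as JSON."""
        name = request.args.get("name", "world")
        return jsonify({"message": f"Hello, {name}!"})

    # Local test (assumed command):
    #   functions-framework --target=hello_gcp --port=8080
    # Deployment (assumed command):
    #   gcloud functions deploy hello_gcp --gen2 --runtime=python311 --trigger-http
    ```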


    Who Should Attend

    Ideal for:

    • Application Developers
    • Cloud Engineers
    • DevOps Professionals
    • IT Professionals seeking GCP certification

    Learning Outcomes

    By the end of this course, participants will be able to:

    1. Understand GCP core services for application development.
    2. Develop and deploy cloud-native applications using GCP tools.
    3. Implement Google Compute Engine, App Engine, Kubernetes, and Cloud Functions.
    4. Apply security and performance optimization best practices.
    5. Use Cloud SDK, Cloud Build, and Cloud Monitoring for efficient deployment.
    6. Integrate GCP APIs and databases into applications.

    Prerequisites

    • Familiarity with application development (Python, Java, or Node.js).
    • Basic knowledge of cloud computing.
    • Experience with web frameworks and databases (optional).

    Lab Setup

    • Google Cloud Platform account (trial or paid).
    • Google Cloud SDK installed.
    • IDE (Visual Studio Code, IntelliJ IDEA).

    Teaching Methodology

    • Instructor-led presentations with demonstrations.
    • Hands-on labs for real-world application.
    • Group discussions and Q&A sessions.
    • Project-based learning and case studies.

  • Developing Windows Azure Solutions
    Cloud Computing

    HRDC Reg. No: 10001468116
    Duration: 28 hours (4 days)

    Course Overview

    This 4-day course provides a deep dive into Microsoft Azure, focusing on developing, deploying, and managing cloud applications. Participants will gain expertise in Azure compute, storage, networking, and security services, ensuring they can build scalable, secure, and resilient Azure-based solutions.


    Who Should Attend

    Ideal for:

    • Software Developers
    • Cloud Engineers
    • Solutions Architects
    • IT Professionals working with Azure applications

    Learning Outcomes

    By the end of this course, participants will be able to:

    1. Understand Azure core services and their applications in cloud development.
    2. Design and implement Azure compute services (App Services, Functions, VMs).
    3. Manage Azure Storage solutions for cloud-based applications.
    4. Deploy and optimize Azure applications using Azure DevOps.
    5. Implement security best practices with IAM and encryption.
    6. Utilize advanced services like API Management, Logic Apps, and AKS.

    Prerequisites

    • Basic understanding of cloud computing concepts.
    • Familiarity with .NET programming and web development.
    • Experience with databases and RESTful APIs.

    Lab Setup

    • Microsoft Azure subscription with resource group creation permissions.
    • Visual Studio or Visual Studio Code installed.
    • Azure CLI or Azure PowerShell installed.

    Teaching Methodology

    • Lectures: Interactive concept explanations.
    • Hands-on Labs: Real-world Azure exercises.
    • Case Studies: Practical use cases and problem-solving.
    • Group Activities: Collaborative learning projects.
    • Q&A Sessions: Interactive discussions and clarifications.

  • Google Cloud Architect-Engineer on Apigee
    Cloud Computing

    HRDC Reg. No: 10001468849
    Duration: 35 hours (5 days)

    Course Overview

    This 35-hour comprehensive course is designed to equip cloud architects and engineers with the foundational knowledge and practical skills required to design, build, and manage API gateways and microservices using Google Cloud Apigee. Participants will explore the essential concepts of API design, development, security, and deployment while understanding the architecture and best practices of Apigee within Google Cloud Platform (GCP).
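
    For illustration only (not part of the course outline): a hedged sketch of calling an Apigee-proxied API from Python, first obtaining an OAuth 2.0 token from a token endpoint exposed by the proxy, then calling a protected resource; every URL, path, and credential shown is a placeholder.

    ```python
    import requests

    base = "https://demo-org.example.apigee.net"  # placeholder proxy host

    # Client-credentials grant against a token endpoint exposed by the API proxy.
    token = requests.post(
        f"{base}/oauth/token",
        data={"grant_type": "client_credentials"},
        auth=("CONSUMER_KEY", "CONSUMER_SECRET"),
    ).json()["access_token"]

    # Call a protected resource with the bearer token.
    resp = requests.get(f"{base}/v1/orders", headers={"Authorization": f"Bearer {token}"})
    print(resp.status_code, resp.json())
    ```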


    Who Should Attend?

    • Cloud architects
    • DevOps engineers
    • API developers
    • IT professionals looking to specialize in API management with Apigee

    Why Choose This Course?

    This HRDC-claimable course (HRDC Reg. No: 10001468849) provides hands-on experience in designing, securing, and managing APIs using Google Cloud Apigee, with real-world case studies and projects.


    Learning Outcomes

    By the end of this course, participants will be able to:

    • Understand Apigee architecture and its components
    • Design, secure, and manage APIs using Google Cloud Apigee
    • Implement API policies for rate-limiting, quotas, security, and caching
    • Utilize OAuth 2.0, SAML, and OpenID Connect for securing APIs
    • Monitor, troubleshoot, and optimize API performance on Apigee
    • Integrate Apigee with other Google Cloud services
    • Implement and manage API monetization models
    • Design scalable, secure, and reliable API gateways for enterprise solutions


    Prerequisites

    • Basic knowledge of cloud computing and networking
    • Familiarity with REST APIs and their implementation
    • Basic programming knowledge (preferably in Java, Python, or Node.js)
    • Experience with GCP is an advantage but not mandatory

    Lab Setup

    • A GCP account with administrative privileges
    • Apigee Edge account with developer permissions
    • IDE for API development (VS Code, IntelliJ, or similar)
    • Sample APIs and datasets for hands-on practice
    • (Optional) Docker for running microservices and API services locally

    Teaching Methodology

    • Instructor-led Lectures – In-depth explanations of Apigee and API concepts
    • Hands-on Labs & Exercises – Practical implementation of API management techniques
    • Group Discussions & Problem-Solving – Collaborative approach to real-world API issues
    • Case Studies & Project-Based Assignments – Industry use cases and applications
    • Continuous Assessments – Quizzes, feedback sessions, and a final project
