Maxtrain.com - info@maxtrain.com - 513-322-8888 - 866-595-6863
AI and ML Foundations with Python
Ohio TechCred Approved Credential: Python Machine Learning
Description
Learn to design, build, and apply machine learning and generative AI models through hands-on, scenario-based labs. This course blends core machine learning foundations—data wrangling, visualization, regression, and classification—with modern generative AI techniques using open-source LLMs and Transformers.
Students explore Python’s data-science ecosystem with Pandas, NumPy, and Matplotlib, build and evaluate models with scikit-learn, examine interpretability and ethical AI, and progress into deep learning using PyTorch. By course end, participants will have built and deployed an AI-powered application capable of running locally or in the cloud.
Audience
- Python Developers
- Data Analysts and Aspiring Data Scientists
- DevSecOps Engineers
- AI-Curious Managers and Architects
Prerequisites
- Python – PCEP Certification or equivalent experience
- Familiarity with Linux command-line environment
Learning Outcomes
- Prepare and visualize data with Pandas, NumPy, and Matplotlib
- Train and evaluate classical ML models with scikit-learn
- Interpret models using SHAP and understand responsible AI concepts
- Train and optimize Transformer models using PyTorch
- Master advanced prompt engineering
- Understand Transformer architecture, embeddings, and self-attention
- Work with open-source LLMs such as LLaMA using the llama.cpp framework
- Apply model quantization and fine-tuning techniques
- Compare CPU vs GPU hardware acceleration
- Develop a real-world AI application powered by open-source models
AI and ML Foundations with Python Outline
Foundations of Machine Learning and Data Handling
Python for Data Science Refresher
- Lecture: Python’s role in Data Science and AI
- Lecture: Overview of Pandas, NumPy, and Matplotlib
- Lab: Solve basic data-science problems using Pandas and NumPy
- Lab: Create visualizations (histograms, scatter plots, correlations) with Matplotlib and Seaborn
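The refresher labs above work along these lines; a minimal sketch using a synthetic dataset (the column names are illustrative, not the lab's actual data):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the lab dataset (column names are illustrative)
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(18, 65, size=100),
    "income": rng.normal(50_000, 12_000, size=100),
})

# Summary statistics and a simple correlation with Pandas/NumPy
print(df.describe())
print("Correlation:", df["age"].corr(df["income"]))

# Visualization step (uncomment where a display is available):
# import matplotlib.pyplot as plt
# df["income"].plot.hist(bins=20); plt.show()
```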
Exploratory Data Analysis (EDA) and Preprocessing
- Lecture: Understanding datasets, features, labels, outliers
- Lecture: Data wrangling and preprocessing (missing data, normalization, encoding)
- Lab: Clean and preprocess a real dataset (e.g., Titanic or Housing)
- Lab: Visualize distributions and relationships to identify predictive features
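The three preprocessing steps named above (missing data, normalization, encoding) can be sketched on a tiny Titanic-style frame; the values and column names below are illustrative:

```python
import numpy as np
import pandas as pd

# Small frame standing in for the lab dataset (Titanic-style, illustrative)
df = pd.DataFrame({
    "age": [22.0, np.nan, 35.0, 58.0],
    "fare": [7.25, 71.28, 8.05, np.nan],
    "embarked": ["S", "C", np.nan, "S"],
})

# 1. Missing data: impute numeric columns with the median, categoricals with the mode
df["age"] = df["age"].fillna(df["age"].median())
df["fare"] = df["fare"].fillna(df["fare"].median())
df["embarked"] = df["embarked"].fillna(df["embarked"].mode()[0])

# 2. Normalization: min-max scale numeric columns to [0, 1]
for col in ["age", "fare"]:
    df[col] = (df[col] - df[col].min()) / (df[col].max() - df[col].min())

# 3. Encoding: one-hot encode the categorical column
df = pd.get_dummies(df, columns=["embarked"], prefix="embarked")

print(df.head())
```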
Classical Machine Learning with scikit-learn
- Lecture: Introduction to supervised learning, regression and classification
- Lecture: Unsupervised learning overview, clustering basics
- Lab: Implement linear regression and evaluate R² and MSE
- Lab: Train a classification model (e.g., Logistic Regression or RandomForest) and visualize confusion matrices
- Lecture: Model evaluation metrics, accuracy, precision, recall, F1-score
- Lab: Experiment with k-means clustering and compare cluster performance
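The regression lab pattern above, sketched on synthetic data so the R² and MSE steps are concrete (the data is a stand-in, not the lab dataset):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic regression data standing in for the lab dataset
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

# The two metrics this lab evaluates
print("R^2:", r2_score(y_test, pred))
print("MSE:", mean_squared_error(y_test, pred))
```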
Explainability and Ethical AI
- Lecture: Why explainable AI matters, bias, transparency, and trust
- Lab: Use SHAP or feature importance visualization on a small model
- Lecture: Responsible AI practices and data ethics in machine learning
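The lab uses SHAP or built-in feature importances; a related scikit-learn technique, permutation importance, sketches the same idea without extra dependencies (the data below is synthetic, with only the first feature carrying signal):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic data: only feature 0 determines the label
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = (X[:, 0] > 0).astype(int)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Permutation importance: how much accuracy drops when each feature is shuffled
result = permutation_importance(clf, X, y, n_repeats=10, random_state=0)
for i, imp in enumerate(result.importances_mean):
    print(f"feature_{i}: {imp:.3f}")
```

A model that leans on a feature it shouldn't (a proxy for a protected attribute, say) shows up clearly in this kind of plot, which is the bridge to the responsible-AI discussion.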
Transition from Classical ML to Generative AI
Deep Learning and the Rise of Transformers
- Lecture: From neural networks to Transformers, key differences
- Lecture: Feed-Forward Neural Networks and the role of embeddings
- Lecture: What is Intelligence? Introduction to Generative AI
- Lecture: The Transformer model explained
- Lab: Tokenization, breaking text into tokens
- Lab: Word embeddings, numerical representation of language
- Lab: Positional encoding, enabling sequence understanding
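The positional-encoding lab can be sketched with the standard sinusoidal formulation from "Attention Is All You Need" (dimensions below are illustrative):

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding: each row is a unique position signature
    that gets added to the token embeddings."""
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(d_model)[None, :]          # (1, d_model)
    angle = pos / np.power(10000, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])     # even dimensions: sine
    pe[:, 1::2] = np.cos(angle[:, 1::2])     # odd dimensions: cosine
    return pe

pe = positional_encoding(seq_len=8, d_model=16)
print(pe.shape)
```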
Build a Transformer Model from Scratch
- Lecture: Introduction to PyTorch
- Lab: Construct a Tensor from a dataset
- Lab: Orchestrate Tensors in blocks and batches
- Lab: Initialize PyTorch generator function
- Lab: Train the Transformer model with positional encoding and self-attention
- Lab: Attach feed-forward neural network and build decoder block
- Lab: Review the complete Transformer model as runnable code
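The labs build the model in PyTorch; the core self-attention computation can be sketched in NumPy for clarity, with identity Q/K/V projections (a real model learns these as weight matrices):

```python
import numpy as np

def self_attention(x: np.ndarray) -> np.ndarray:
    """Scaled dot-product self-attention with identity Q/K/V projections."""
    d_k = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_k)                   # similarity between positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ x                                # weighted mix of value vectors

x = np.random.default_rng(0).normal(size=(4, 8))      # 4 tokens, d_model = 8
out = self_attention(x)
print(out.shape)
```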
Model Evaluation and Comparison
- Lecture: Comparing classical ML vs deep learning, trade-offs in complexity and data needs
- Lab: Re-run earlier regression/classification datasets using a simple neural network to compare results
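The comparison lab follows this shape: same split, two model families, one scoring call each (the dataset below is a synthetic stand-in for the earlier lab data):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Same dataset, two model families, one comparison (illustrative stand-in data)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

linear = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
neural = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                       random_state=0).fit(X_tr, y_tr)

print("logistic:", linear.score(X_te, y_te))
print("mlp:     ", neural.score(X_te, y_te))
```

On a small tabular dataset like this, the neural network rarely beats the linear model by much, which is exactly the trade-off the lecture discusses.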
Generative AI, Prompt Engineering, and Real-World Applications
Prompt Engineering
- Lecture: Introduction to prompt engineering concepts
- Lab: Getting started with Gemini or OpenAI API (depending on environment)
- Lab: Develop basic prompts and explore role/context patterns
- Lab: Intermediate prompts — define tasks, inputs, outputs, constraints, and style
- Lab: Advanced prompts — chaining, role setting, feedback loops, and exemplars
Hardware Requirements and Optimization
- Lecture: GPUs vs CPUs — why acceleration matters
- Lecture: TensorCore architectures vs legacy GPU designs
- Lecture: Cost and performance analysis for training and inference workloads
Working with Pre-Trained LLMs
- Lecture: A history of neural network architectures
- Lecture: Introduction to the llama.cpp interface
- Lecture: Preparing an A100 GPU for server operations
- Lab: Operate Llama 2 models with llama.cpp
- Lab: Select quantization levels balancing performance and perplexity
- Lecture: Running the llama.cpp package
- Lab: Experiment with interactive and persistent contexts
- Lab: Constrain outputs with grammars
- Lab: Deploy a LLaMA API server and interact via a client
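A client for the server lab can be sketched with the standard library against llama.cpp's built-in HTTP server and its `/completion` endpoint; the host, port, and `n_predict` default below are assumptions to adjust for your deployment:

```python
import json
from urllib import request

# Assumed endpoint of a locally running llama.cpp server; adjust host/port
SERVER = "http://localhost:8080/completion"

def build_payload(prompt: str, n_predict: int = 64) -> dict:
    """JSON body in the shape llama.cpp's /completion endpoint expects."""
    return {"prompt": prompt, "n_predict": n_predict}

def complete(prompt: str) -> str:
    """POST a prompt to the server and return the generated text.
    Only call this with a server actually running at SERVER."""
    req = request.Request(
        SERVER,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]

# Inspect the request body without needing a live server
print(json.dumps(build_payload("Explain quantization in one sentence.")))
```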
Fine-Tuning and Application Integration
- Lab: Fine-tune a model using PyTorch on a small dataset
- Lab: Apply advanced prompt-engineering techniques
- Lab: Build a simple web application (Flask or FastAPI) that integrates your fine-tuned or pre-trained model
- Lab: Implement basic error handling and response caching
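The error-handling and caching lab boils down to a pattern like the following, shown framework-free with the standard library; `run_model` is a hypothetical stand-in for the real model call:

```python
from functools import lru_cache

def run_model(prompt: str) -> str:
    """Stand-in for the model call; a real app invokes the fine-tuned model here."""
    if not prompt.strip():
        raise ValueError("empty prompt")
    return f"response to: {prompt}"

@lru_cache(maxsize=128)
def cached_inference(prompt: str) -> str:
    """Cache responses so repeated prompts skip the expensive model call."""
    return run_model(prompt)

def handle_request(prompt: str) -> dict:
    """Shape of a web handler body: catch model errors, return a JSON-able dict."""
    try:
        return {"ok": True, "answer": cached_inference(prompt)}
    except ValueError as exc:
        return {"ok": False, "error": str(exc)}

print(handle_request("What is quantization?"))
print(handle_request(""))   # handled error, not a crash
```

In the Flask or FastAPI app, `handle_request` becomes the route function body and the returned dict becomes the JSON response.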
Testing and Pushing Limits
- Lab: Optimize model parameters and prompt patterns for best results
- Lab: Evaluate latency and resource utilization across different quantization levels
Capstone Project
Build a Real-World AI Application
Students design and deploy a small but functional AI-driven application using open-source LLMs. The capstone incorporates:
- Dataset preparation and EDA from Day 1
- Model fine-tuning and deployment from Day 3
- Integration of ethical AI principles and explainability visualizations
Certification
- Alta3 AI & ML Foundations with Python Certification
Price: $2495.00
Length: 3 Days