Basics-of-Artificial-Intelligence/C2/Generative-AI/English
Revision as of 04:21, 29 December 2025
Tutorial Name: Generative AI
Keywords: Generative AI, Predictive AI, Transformer, LLM, Prompt, Model, Training, Inference, Spoken Tutorial, Video Tutorial, EduPyramids
Pre-requisite Tutorial: Introduction to Machine Learning and Deep Learning
| Visual Cue | Narration |
| Title Slide
“Introduction to Generative AI” |
Welcome to this Spoken Tutorial on Introduction to Generative AI. |
| Slide 2
Learning Objective Bulleted list |
In this tutorial, you will learn about:
* Generative AI and Predictive AI
* Differences between Predictive AI and Generative AI
* Some important Generative AI terms |
| Slide 3
System Requirements |
To practice this tutorial, you will need:
* A computer, a laptop, or a smartphone

You do not need coding or technical skills. |
| Slide 4
Pre-requisite |
For the prerequisite Artificial Intelligence tutorials, please visit this website.
In a previous tutorial, we saw how AI helps computers to think and act smart. But have you ever wondered how AI makes all this happen? |
| Slide 5
Generative AI Text highlight: “Create new things” |
Generative AI can do creative tasks that we thought only humans could do.
Generative AI, or GenAI, doesn’t just analyze data but can also create new things. |
| Montage: AI-generated poem, painting, code snippet | It can generate original content such as poems, images, music, code, and more. |
| Split screen: Spam filter vs AI painting | Most AI we use daily is Predictive AI.
Predictive AI classifies data and makes predictions. |
| Text on screen: “Predictive AI → What is this?” | Predictive AI helps us understand two things.
It can give answers to “What is this?” and “What may happen next?” Predictive AI recognizes things and makes smart guesses about the future. |
| Spam filter animation | For example, consider a spam filter in an email inbox.
Here, Predictive AI checks each email and decides whether it is spam or not. |
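The spam-filter idea above can be sketched in a few lines of code. This toy keyword rule is only an illustration of Predictive AI answering “What is this?” for each email; real spam filters learn their rules from data rather than using a hand-made word list, and the `SPAM_WORDS` set here is invented for the example.

```python
# Toy spam check: classify an email subject as spam or not spam.
# Real Predictive AI learns these patterns from data; this hand-made
# keyword list is purely illustrative.
SPAM_WORDS = {"lottery", "winner", "free", "prize"}

def is_spam(subject):
    """Return True if the subject line contains a known spam word."""
    words = set(subject.lower().split())
    return bool(words & SPAM_WORDS)

print(is_spam("You are a lottery winner"))  # True  -> spam
print(is_spam("Meeting at 10 am"))          # False -> not spam
```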
| Recommender system visual | We use Predictive AI in many everyday apps.
Some of them are weather apps, movie apps, map apps, and photo apps. |
| Transition: Text “Generative AI → Create something new” | On the other hand, Generative AI, or GenAI, can create something new.
It creates content based on the knowledge it has learned earlier. |
| Astronaut riding horse (AI art) | It can generate a unique image, like an astronaut riding a horse. |
| Poem generation example
Code snippet generation |
It can also write a short poem or generate working code. |
| Text highlight: “Predictive → classify / Generative → create” | So remember this difference.
Predictive AI classifies and predicts data, whereas GenAI creates new content. |
| Timeline animation → 2017 marker | GenAI uses a model called Transformer.
Transformers process language in parallel, rather than word by word. This makes them faster, smarter, and more efficient. |
| Sentence visualization using a story analogy | Transformers understand how words relate to each other, like we follow a story. |
| Side-by-side: before vs after Transformer | This lets them generate well-structured, meaningful, human-like text.
GenAI grew rapidly after Transformer architecture made a breakthrough in 2017. |
| Transition slide: “Core Terminology” | Now, let’s learn a few important GenAI terms. |
| Brain icon | First, let's understand the term Model.
Think of the model as the “brain” of the AI system. A model can be small and task-specific. It can also be large and general-purpose. For example, ChatGPT is a model trained on large amounts of text. |
| Training animation | Next term is Training.
It means teaching the model using a large amount of data. The model is given numerous examples, including entire books and sample texts. The model learns patterns from this data during training. |
| Prompt box + user typing | Next term is Prompt.
A prompt is the instruction you give to an AI model. It tells the model what task it has to perform. For example, “Write a story about a robot.” |
| Text cum Visual | Next term is Tokens.
AI models break the input into small pieces called tokens. A token is a unit of data. It can be in any form, such as text, image, or audio. The model reads, understands, and generates information token by token. Think of tokens like puzzle pieces. AI first breaks the information into small pieces. Next it understands each piece. Then it puts the pieces together to form a meaningful response. |
| LLM logo examples (GPT, Gemini, LLaMA) | Next term is LLM.
It stands for Large Language Model. LLM is a powerful model trained on billions of text tokens. For example: ChatGPT, Gemini, and LLaMA. |
| Model responding to prompt | Next term is Inference.
Inference is when the model takes a prompt and generates an output. It’s the stage where the model produces a response after it has been trained. So, when you give a prompt, the model generates an output. |
| Summary Slide
Bulleted list |
Let’s quickly summarize what we learned:
* Predictive AI is used to classify or predict.
* Generative AI is used to create new content.
* Some key terms used in Generative AI. |
| Assignment Slide
Bulleted list |
As an assignment, do the following:
* Pick one Predictive AI example and one Generative AI example from your daily life.
* Write down one line explaining why each fits its category.
* Also, reflect on any risks or biases that may exist in your examples. |
| EduPyramids logo | This Spoken Tutorial is brought to you by EduPyramids Educational Services Private Limited, SINE, IIT Bombay. |
| Closing Slide | Thank you for watching! |