A: Text generation with GPT-2 is typically probabilistic, so it doesn't produce identical outputs for the same input. The model itself is deterministic: given the same input, it always computes the same probability distribution over possible next tokens. The decoding step, however, samples from that distribution rather than always picking the most likely token, so repeated runs on the same incomplete sentence yield different, creatively varied story completions.
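
Here is a minimal sketch of this behavior, assuming the Hugging Face `transformers` library (the question doesn't name a specific framework, and the prompt and sampling parameters below are illustrative):

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The old house at the end of the street"  # hypothetical incomplete sentence
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling-based decoding: each call draws from the model's next-token
# distribution, so repeated calls on the same prompt produce different text.
for _ in range(3):
    output = model.generate(
        **inputs,
        max_new_tokens=30,
        do_sample=True,      # sample instead of greedy argmax
        top_k=50,            # restrict sampling to the 50 most likely tokens
        temperature=0.9,     # <1 sharpens, >1 flattens the distribution
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If you need reproducible output, either fix the random seed before generating (`torch.manual_seed(42)`) or pass `do_sample=False` for deterministic greedy decoding, which always picks the single most likely next token.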