A Generative Pretrained Transformer (GPT) is a type of large language model (LLM) that uses deep learning to generate human-like text.
Generative - they can generate new text based on the input they receive.
Pretrained - they are trained on a large corpus of text data before being fine-tuned for specific tasks.
Transformer - they use a transformer-based neural network architecture to process input text and generate output text.
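The "generative" behavior above can be sketched as an autoregressive loop: the model repeatedly predicts the next token and feeds its own output back in as context. The sketch below is illustrative only; `toy_next_token` is a hypothetical stand-in (a fixed lookup table) for a real transformer's next-token prediction.

```python
# Toy sketch of autoregressive generation. In a real GPT, the next
# token is predicted by a transformer network; here a fixed bigram
# table stands in for the model (hypothetical, for illustration).
def toy_next_token(context: list[str]) -> str:
    bigrams = {"the": "cat", "cat": "sat", "sat": "down"}
    return bigrams.get(context[-1], "<end>")

def generate(prompt: list[str], max_tokens: int = 5) -> list[str]:
    tokens = list(prompt)
    for _ in range(max_tokens):
        nxt = toy_next_token(tokens)
        if nxt == "<end>":
            break
        tokens.append(nxt)  # feed the output back in as new context
    return tokens

print(generate(["the"]))  # ['the', 'cat', 'sat', 'down']
```

The key point is the feedback loop: each generated token becomes part of the input for the next prediction, which is how a GPT produces coherent multi-token text.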
Some generative AI models have been trained on large amounts of data found on the internet, including copyrighted materials. For this reason, responsible AI practices have become an organizational imperative.
OpenAI’s release of ChatGPT, which caused a viral sensation and reached a million users in just five days, has been described as breaking ground across a much broader range of tasks. Use cases currently under discussion include new search-engine architectures; explaining complex algorithms; creating personalized therapy bots; helping build apps from scratch; explaining scientific concepts; and writing recipes and college essays, among others.
Text-to-image programs such as Midjourney, DALL-E and Stable Diffusion have the potential to change how art, animation, gaming, movies and architecture, among other fields, are rendered. Bill Cusick, creative director at Stability AI, believes that the software is “the foundation for the future of creativity”.
Anticipating a new era of human-machine cooperation, optimists claim that generative AI will aid the creative process of artists and designers: existing tasks will be augmented by generative AI systems, speeding up ideation and, ultimately, creation.
GPT-3.5, a foundation model trained on large volumes of text, can be adapted for answering questions, text summarization, or sentiment analysis. DALL-E, a multimodal (text-to-image) foundation model, can be adapted to create images, expand images beyond their original size, or create variations of existing paintings.
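The idea of adapting one foundation model to many tasks can be sketched with prompt templates: the same underlying model handles question answering, summarization, or sentiment analysis, and only the prompt changes. In the sketch below, `call_model` is a hypothetical stand-in for a real LLM API call; the template names and wording are assumptions for illustration.

```python
# Sketch: one foundation model, many tasks, adapted via prompting.
# `call_model` is a hypothetical placeholder for an LLM API call.
def call_model(prompt: str) -> str:
    return f"<model output for: {prompt}>"

# Hypothetical task templates; only the prompt differs per task.
TASK_TEMPLATES = {
    "qa": "Answer the question: {text}",
    "summarize": "Summarize in one sentence: {text}",
    "sentiment": "Classify the sentiment (positive/negative): {text}",
}

def adapt(task: str, text: str) -> str:
    # The same underlying model serves every task.
    prompt = TASK_TEMPLATES[task].format(text=text)
    return call_model(prompt)

print(adapt("summarize", "Generative AI is changing creative work."))
```

This is the essence of adaptation without retraining: the foundation model stays fixed, and task behavior is steered by the instructions wrapped around the input.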
Refer to the additional references and resources below for exam preparation.