GPT-3


GPT-3, or Generative Pre-trained Transformer 3, is an advanced language model created by OpenAI[2]. As the third iteration of the GPT series, it stood out at release for its unprecedented size, making it the largest non-sparse language model of its time. It surpasses its predecessor, GPT-2[4], as well as Microsoft[1]'s Turing NLG, with roughly ten times the capacity of the latter. GPT-3 is known for its ability to generate text, such as news articles, and to assist with coding tasks. However, it also carries misuse risks, such as spreading misinformation or enabling phishing. Various versions of GPT-3 serve different needs, the largest being davinci with 175 billion parameters. The later GPT-3.5 series introduced new models and capabilities. GPT-3 is instrumental in industry and research, underpinning products like GitHub[3] Copilot and several Microsoft products. However, its use also prompts ethical and academic concerns.

Definitions of Terms
1. Microsoft: Microsoft is a globally recognized technology company, well known for its software, hardware, and other digital services. Founded in 1975 by Bill Gates and Paul Allen, it launched revolutionary products such as the Windows operating system, the Microsoft 365 suite, and the Xbox gaming consoles. Under its current CEO, Satya Nadella, Microsoft broadened its scope to include cloud computing and pursued a policy of active acquisitions, including GitHub and Mojang, to enhance its product offerings. Despite criticism and legal challenges over monopolistic behavior and usability issues, Microsoft has maintained solid financial performance, reaching a trillion-dollar market capitalization in 2019. Today, it continues to innovate and expand, holding a significant position in the global tech industry.
2. OpenAI: OpenAI is a prominent artificial intelligence (AI) research organization established in December 2015. It was founded by a group of technology entrepreneurs, including Elon Musk and Sam Altman, to develop and promote friendly AI for the benefit of all humanity. OpenAI places significant emphasis on openness, collaboration, and transparency, often partnering with other institutions in its research. It has been funded with over $1 billion and is based in San Francisco. The organization has developed AI platforms such as OpenAI Gym and Universe, and has introduced several groundbreaking AI models, including GPT-3 and DALL-E. In a significant shift in 2019, OpenAI transitioned to a capped-profit model to attract more funding, with investor returns capped at 100 times the investment. It has also received a $1 billion investment from Microsoft. OpenAI's research and models have wide-ranging commercial applications, driving the future of AI technology.
GPT-3 (Wikipedia)

Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer deep neural network, which replaces recurrence- and convolution-based architectures with a technique known as "attention". This attention mechanism allows the model to selectively focus on the segments of input text it predicts to be most relevant. GPT-3 uses a 2,048-token context window and float16 (16-bit) precision, and has an unprecedented 175 billion parameters, requiring 350 GB of storage since each parameter occupies 2 bytes. It has demonstrated strong "zero-shot" and "few-shot" learning abilities on many tasks.
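The 350 GB storage figure follows directly from the parameter count and precision; a quick back-of-the-envelope check in Python (assuming exactly 175 billion parameters at 2 bytes each):

```python
# Each float16 parameter occupies 2 bytes.
params = 175_000_000_000
bytes_per_param = 2  # float16 (16-bit) precision

total_bytes = params * bytes_per_param
total_gb = total_bytes / 1_000_000_000  # decimal gigabytes

print(f"{total_gb:.0f} GB")  # 350 GB
```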

Generative Pre-trained Transformer 3 (GPT-3)
Original author(s): OpenAI
Initial release: June 11, 2020 (beta)
Predecessor: GPT-2
Successors: GPT-3.5, GPT-4
Website: openai.com/blog/openai-api
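The "few-shot" ability mentioned above works by placing worked examples directly in the prompt rather than updating the model's weights. A minimal sketch of assembling such a prompt (the sentiment task and demonstration pairs here are illustrative, not from the source):

```python
def build_few_shot_prompt(examples, query):
    """Format (input, output) demonstration pairs followed by a new
    query, leaving the final completion for the model to fill in."""
    blocks = [f"Input: {text}\nOutput: {label}" for text, label in examples]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

demos = [("The movie was wonderful.", "positive"),
         ("I hated every minute.", "negative")]
prompt = build_few_shot_prompt(demos, "What a fantastic day!")
print(prompt)
```

With zero demonstration pairs the same function yields a "zero-shot" prompt: the model must infer the task from the query alone.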

On September 22, 2020, Microsoft announced that it had licensed GPT-3 exclusively. Others can still receive output from its public API, but only Microsoft has access to the underlying model.
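Public access works over an HTTP API: a client POSTs a JSON body containing a prompt to a completions endpoint and receives generated text back. A minimal sketch of assembling such a request body (the parameter names follow the original completions-style API, and the model name "davinci" matches the largest variant described above; treat both as illustrative rather than guaranteed current):

```python
import json

# Illustrative request body for a completions-style endpoint.
payload = {
    "model": "davinci",
    "prompt": "Write a one-sentence news headline about renewable energy:",
    "max_tokens": 32,    # cap on the number of generated tokens
    "temperature": 0.7,  # sampling randomness; 0 is near-deterministic
}

# The body is sent as JSON, authenticated with a Bearer API key
# header (omitted here); the response carries the generated text.
body = json.dumps(payload)
print(body)
```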
