GPT-3: Hype or Hyper-useful to the Globalization and…


2021-07-07 01:25 GALA



By Dr. Patrice Caire

In June 2020, the US research lab OpenAI, backed by Elon Musk among others, launched its breakthrough language model: GPT-3 (Generative Pre-trained Transformer 3). Why was it such a big event? Because GPT-3 may just change the way we work: we're talking about a tool that can summarize emails and news articles, generate tweets, create chatbots, and yes... translate language.

GPT-3 versus Other Language-prediction Programs

A language-prediction program built with neural networks, GPT-3 is by far the most powerful language model ever created. It is an artificial-intelligence system trained on a gigantic corpus of text: thousands of digital books, all of Wikipedia, and roughly a trillion words from blogs, social media, and the internet at large. Given enough text, and enough processing, the machine learns probabilistic connections between words. In a nutshell: GPT-3 can read and write.

GPT-3 processes text far better than its predecessor, GPT-2 (2019). It is also distinguished by its sheer size, with more than 100 times as many parameters as the previous version. GPT-3 uses 175 billion parameters, compared with the 17 billion of its closest competitor, Microsoft's Turing NLG, making it the largest neural network ever built.

GPT-3: The Power and the Glory

Get this: GPT-3 can correct English. It can write poetry. It can write like Faulkner, Proust, or Goethe; any author you "feed" it, really. Even more relevant to our industry, GPT-3 can also:

- translate legal text into plain language, and generate legal text from simple "instructions";
- write the code to create a website, based on your directions;
- generate well-formatted charts and infographics from your summary; and
- fill in the missing parts of a spreadsheet, computing formulas on its own.
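The "probabilistic connections between words" mentioned above can be illustrated with a toy sketch. The following bigram model is a drastic simplification (GPT-3 itself is a transformer with 175 billion parameters, not a word-pair counter), but the underlying statistical idea — predict the next word from what came before — is the same. All names and the tiny corpus here are illustrative, not from OpenAI's code.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count how often each word follows each other word in the corpus."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequently observed next word, or None if unseen."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

# A toy "training corpus" -- GPT-3's was roughly a trillion words.
corpus = "the model reads text the model reads code the model writes text"
bigrams = train_bigrams(corpus)
print(predict_next(bigrams, "model"))  # 'reads' (seen twice, vs. 'writes' once)
print(predict_next(bigrams, "the"))    # 'model' (the only word seen after 'the')
```

Scale the corpus up by twelve orders of magnitude, replace word-pair counts with a deep neural network, and you have the intuition behind why GPT-3 can "read and write."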
What We Love about GPT-3

First, using easily understood natural language, GPT-3 can answer questions on any topic while retaining the context of previously asked questions. GPT-3 can also translate between spoken languages and programming languages such as Python, JavaScript, and CSS (very useful for websites!). And it can generate questions and answers for UX work, due-diligence document search, and report generation.

GPT-3: Dealbreakers

At first, GPT-3's generated text can be impressive, but longer texts tend to dissolve into meaninglessness. Another problem: like many chatbots, the program amplifies biases inherent in its training data, including racism and sexism. Then there's the enormous amount of processing power that GPT-3 consumes: the program is far more "compute-hungry" than even bitcoin mining. Finally, there's cost: conservative estimates place a single GPT-3 training run at a staggering $4.6 million, well beyond the reach of most companies.

GPT-3 and You

Right now, GPT-3 isn't ready to replace the leading conversational interfaces on the market, such as Amazon's Echo and Apple's Siri, but it can be used as a powerful tool to improve them. For now, GPT-3 is a crucial stepping stone toward the language models of the future.

So, How Can GPT-3 Be Used Effectively?

It's unclear. After all, if GPT-3 is eventually shown to generate the right text only half of the time, will it satisfy professionals? Will it be of use as more than a stepping stone? Let's see!
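One concrete way professionals experiment with GPT-3 today is "few-shot prompting": showing the model a handful of input/output pairs and letting it continue the pattern, for instance to turn legal text into plain language as described earlier. The sketch below only builds such a prompt string; the helper name and the example sentences are my own illustrations, and the resulting string would be sent to the model through OpenAI's API.

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: demonstration pairs first, then the new
    input, leaving the final 'Plain:' line for the model to complete."""
    lines = []
    for legal, plain in examples:
        lines.append(f"Legal: {legal}")
        lines.append(f"Plain: {plain}")
    lines.append(f"Legal: {query}")
    lines.append("Plain:")
    return "\n".join(lines)

# Illustrative demonstration pairs (not from any real contract).
examples = [
    ("The party of the first part shall indemnify the party of the second part.",
     "You must cover the other side's losses."),
    ("The lessee shall vacate the premises forthwith.",
     "The tenant must move out immediately."),
]
prompt = build_few_shot_prompt(
    examples, "Notwithstanding the foregoing, the agreement remains in force.")
print(prompt)
```

Whether the model then completes that final "Plain:" line correctly often enough to satisfy professionals is exactly the open question posed above.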