Friday, December 09, 2022

 The ChatGPT chatbot is blowing people away with its writing skills. An expert explains why it’s so impressive


The Conversation
December 07, 2022


We’ve all had some kind of interaction with a chatbot. It’s usually a little pop-up in the corner of a website, offering customer support – often clunky to navigate – and almost always frustratingly non-specific.

But imagine a chatbot, enhanced by artificial intelligence (AI), that can not only expertly answer your questions, but also write stories, give life advice, even compose poems and code computer programs.

It seems ChatGPT, a chatbot released last week by OpenAI, is delivering on these outcomes. It has generated much excitement, and some have gone as far as to suggest it could signal a future in which AI has dominion over human content producers.

What has ChatGPT done to herald such claims? And how might it (and its future iterations) become indispensable in our daily lives?

What can ChatGPT do?

ChatGPT builds on OpenAI’s previous text generator, GPT-3. OpenAI builds its text-generating models by using machine-learning algorithms to process vast amounts of text data, including books, news articles, Wikipedia pages and millions of websites.

By ingesting such large volumes of data, the models learn the complex patterns and structure of language and acquire the ability to interpret the desired outcome of a user’s request.
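To make the idea concrete, here is a deliberately tiny Python sketch of what "learning the patterns of language" means at its most basic: it counts which word tends to follow which in some example text, then continues a prompt by sampling likely next words. This is only an illustration of the principle – OpenAI's models do something conceptually similar, but with neural networks, billions of parameters and vastly more data.

```python
# Toy illustration of the core idea behind text generators like GPT-3:
# learn, from example text, which word tends to follow which, then use
# those statistics to continue a prompt. Real models use large neural
# networks over enormous corpora; this bigram counter is only a sketch.
import random
from collections import defaultdict, Counter

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count how often each word follows each other word (bigram statistics).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def generate(start_word: str, length: int = 8) -> str:
    """Continue a prompt by repeatedly sampling a likely next word."""
    words = [start_word]
    for _ in range(length):
        candidates = following.get(words[-1])
        if not candidates:
            break
        next_words, counts = zip(*candidates.items())
        words.append(random.choices(next_words, weights=counts)[0])
    return " ".join(words)

print(generate("the"))   # e.g. "the cat sat on the rug . the dog"
```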


ChatGPT can build a sophisticated and abstract representation of the knowledge in the training data, which it draws on to produce outputs. This is why it writes relevant content, and doesn’t just spout grammatically correct nonsense.

While GPT-3 was designed to continue a text prompt, ChatGPT is optimized to conversationally engage, answer questions and be helpful. Here’s an example:




A screenshot from the ChatGPT interface as it explains the Turing test.

ChatGPT immediately grabbed my attention by correctly answering exam questions I’ve asked my undergraduate and postgraduate students, including questions requiring coding skills. Other academics have had similar results.

In general, it can provide genuinely informative and helpful explanations on a broad range of topics.




ChatGPT can even answer questions about philosophy.

ChatGPT is also potentially useful as a writing assistant. It does a decent job drafting text and coming up with seemingly “original” ideas.




ChatGPT can give the impression of brainstorming ‘original’ ideas.

The power of feedback


Why does ChatGPT seem so much more capable than some of its past counterparts? A lot of this probably comes down to how it was trained.

During its development, ChatGPT was shown example conversations in which human AI trainers demonstrated the desired behaviour, playing both the user and the AI assistant. Although a similar model, called InstructGPT, was trained in this way, ChatGPT is the first model using this method to become widely popular.

And it seems to have given it a huge leg-up. Incorporating human feedback has helped steer ChatGPT in the direction of producing more helpful responses and rejecting inappropriate requests.
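To illustrate the principle (and only the principle), the toy Python sketch below "learns" from a handful of invented trainer preferences which of several candidate answers looks more helpful. ChatGPT's actual reward model is a large neural network trained on far richer human comparisons, so treat this purely as a conceptual sketch with made-up examples.

```python
# Conceptual sketch of learning from human feedback: trainers mark which of
# two candidate answers they prefer, simple word-level statistics are learned
# from those choices, and new candidates are ranked with the learned scores.
# ChatGPT's real reward model is a neural network; these examples are invented.
from collections import Counter

# Each record: (answer the human trainer preferred, answer they rejected)
feedback = [
    ("The Turing test asks whether a machine can converse indistinguishably from a person.",
     "idk, google it"),
    ("Happy to help! Photosynthesis converts light, water and CO2 into sugars and oxygen.",
     "That question is boring."),
]

# "Train" a scorer: words from preferred answers get +1, rejected get -1.
word_scores = Counter()
for preferred, rejected in feedback:
    for word in preferred.lower().split():
        word_scores[word] += 1
    for word in rejected.lower().split():
        word_scores[word] -= 1

def helpfulness(answer: str) -> int:
    """Score a candidate answer using the word preferences learned above."""
    return sum(word_scores[w] for w in answer.lower().split())

# Use the learned scorer to pick the more helpful of several draft answers.
candidates = [
    "idk",
    "Happy to help! The Turing test asks whether a machine can converse like a person.",
]
print(max(candidates, key=helpfulness))
```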




ChatGPT often rejects inappropriate requests by design.

Refusing to entertain inappropriate inputs is a particularly big step towards improving the safety of AI text generators, which can otherwise produce harmful content, including bias and stereotypes, as well as fake news, spam, propaganda and false reviews.


Past text-generating models have been criticized for regurgitating gender, racial and cultural biases contained in training data. In some cases, ChatGPT successfully avoids reinforcing such stereotypes.



In many cases ChatGPT avoids reinforcing harmful stereotypes. In this list of software engineers it presents both male- and female-sounding names (albeit all are very Western).


Nevertheless, users have already found ways to evade its existing safeguards and produce biased responses.

The fact that the system often accepts requests to write fake content is further proof that it needs refinement.




Despite its safeguards, ChatGPT can still be misused.

Overcoming limitations

ChatGPT is arguably one of the most promising AI text generators, but it’s not free from errors and limitations. For instance, programming advice platform Stack Overflow temporarily banned answers generated by the chatbot because too many of them were inaccurate.

One practical problem is that ChatGPT’s knowledge is static; it doesn’t access new information in real time.

However, its interface does allow users to give feedback on the model’s performance by indicating ideal answers, and reporting harmful, false or unhelpful responses.

OpenAI intends to address existing problems by incorporating this feedback into the system. The more feedback users provide, the better ChatGPT should become at declining requests that would lead to undesirable outputs.

One possible improvement could come from adding a “confidence indicator” feature based on user feedback. This tool, which could be built on top of ChatGPT, would indicate the model’s confidence in the information it provides – leaving it to the user to decide whether they use it or not. Some question-answering systems already do this.
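As a purely hypothetical sketch of how such a confidence indicator might sit on top of a chatbot, the Python snippet below records users' helpful/unhelpful votes per topic and attaches the resulting score to each answer. Every name in it is invented for illustration, and it does not call any real ChatGPT API; the text generator is a placeholder.

```python
# Hypothetical sketch of the "confidence indicator" idea described above:
# a thin wrapper that records thumbs-up/thumbs-down feedback per topic and
# attaches a confidence estimate to each answer. Nothing here calls a real
# ChatGPT API; ask_chatbot() is a placeholder for whatever generates text.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class RatedAnswer:
    text: str
    confidence: float  # share of past users who found answers on this topic helpful

class ConfidenceWrapper:
    def __init__(self, ask_chatbot):
        self.ask_chatbot = ask_chatbot            # placeholder text generator
        self.votes = defaultdict(lambda: [0, 0])  # topic -> [helpful, unhelpful]

    def ask(self, topic: str, question: str) -> RatedAnswer:
        helpful, unhelpful = self.votes[topic]
        total = helpful + unhelpful
        confidence = helpful / total if total else 0.5  # no data yet -> neutral
        return RatedAnswer(self.ask_chatbot(question), confidence)

    def record_feedback(self, topic: str, was_helpful: bool) -> None:
        self.votes[topic][0 if was_helpful else 1] += 1

# Usage with a dummy generator standing in for the chatbot:
bot = ConfidenceWrapper(lambda q: f"(model answer to: {q})")
bot.record_feedback("history", was_helpful=True)
bot.record_feedback("history", was_helpful=False)
answer = bot.ask("history", "When was the Turing test proposed?")
print(answer.text, f"confidence={answer.confidence:.0%}")
```
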
A new tool, but not a human replacement

Despite its limitations, ChatGPT works surprisingly well for a prototype.

From a research point of view, it marks an advancement in the development and deployment of human-aligned AI systems. On the practical side, it’s already effective enough to have some everyday applications.

It could, for instance, be used as an alternative to Google. While a Google search requires you to sift through a number of websites, and often dig within them, to find the information you need, ChatGPT answers your question directly – and often does this well.



ChatGPT (left) may in some cases prove to be a better way to find quick answers than Google search.

Also, with feedback from users and a more powerful GPT-4 model coming up, ChatGPT may significantly improve in the future. As ChatGPT and other similar chatbots become more popular, they’ll likely have applications in areas such as education and customer service.

However, while ChatGPT may end up performing some tasks traditionally done by people, there’s no sign it will replace professional writers any time soon.

While they may impress us with their abilities and even their apparent creativity, AI systems remain a reflection of their training data – and do not have the same capacity for originality and critical thinking as humans do.

Marcel Scharth, Lecturer in Business Analytics, University of Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Op Eds

Why Using ChatGPT to Write this Op-Ed Was a Smart Idea

By Christos Porios, Contributing Opinion Writer
Harvard Crimson
December 9, 2022


Editor’s note: The following op-ed was entirely written and edited by ChatGPT, a recently released artificial intelligence language model that is available for anyone to use. No manual edits were made; all changes were made by the author and op-eds editor providing feedback to ChatGPT on the drafts it generated.

— Guillermo S. Hava and Eleanor V. Wikstrom, Editorial Chairs

— Raquel Coronell Uribe, President


As students, we are constantly challenged to produce high-quality written work. From papers to presentations, our assignments require extensive research and careful analysis. But what if there was a tool that could help us with these tasks and make the learning experience even better?

Enter ChatGPT, the helpful (and maybe even self-aware) language model trained by OpenAI. With its ability to generate human-like text based on a prompt, ChatGPT can be a valuable asset for any type of written work. In fact, this very op-ed was written with the help of ChatGPT!

Using ChatGPT for written assignments does not mean that students are taking shortcuts or avoiding the hard work of learning. In fact, it can help students develop their writing skills and deepen their understanding of the subject matter. By providing suggestions and ideas, ChatGPT can serve as a virtual writing coach, guiding students as they craft their own original work.

But what about the issue of authorship? Isn't using ChatGPT like copying someone else's work? Not at all. Copying is the act of reproducing someone else’s work without giving credit. When using ChatGPT, students are still required to do the intellectual parts of their own research, analysis, and writing to provide the necessary input for the tool to generate text. These are the parts of the writing process that require critical thinking, creativity, and insight, and they are the key to producing high-quality work. ChatGPT is a tool to assist in these tasks, rather than replacing them entirely.

The concept of authorship is complex and often misunderstood. At its core, authorship is about the creation of original ideas and the expression of those ideas in a unique and individual voice. When using ChatGPT, students are still responsible for their own ideas and voice. The tool simply helps them organize and present their thoughts in a more effective way. In fact, using ChatGPT can actually help students better maintain their own authorship by providing them with a tool to support their writing and avoid potential pitfalls such as plagiarism.

Some may argue that using a tool like ChatGPT stifles creativity. However, using ChatGPT can actually support and enhance creativity in the same way that a camera can in the art of photography. Just as a camera allows a photographer to capture and manipulate light and shadow to create a unique image, ChatGPT allows a writer to capture and manipulate words and ideas to create a unique piece of writing. Traditional writing without the aid of ChatGPT can be compared to painting, where the writer must carefully craft each word and sentence by hand. In contrast, using ChatGPT is like using a camera to quickly and easily capture and organize ideas, allowing the writer to focus on the creative aspects of their work.

It is important to remember that ChatGPT is a tool, not a replacement for the hard work of learning. As with any new tool, there will always be those who are skeptical or hesitant to embrace it. Some professors in academia may view the use of ChatGPT in written assignments as cheating. However, this is a misguided perspective. Using ChatGPT is no different from using a thesaurus to find more interesting words or using a dictionary to check the definition of a word. These tools are essential for the writing process and can help students reach their full potential in their writing.

The widespread use of Google searches has revolutionized the way students conduct research. In the same way, using ChatGPT can revolutionize the way students write. It offers a new and powerful tool that can support the learning process and help students produce their best work. So let's embrace this new technology and see what it can do for us. After all, if ChatGPT can come up with a catchy title like this one, just imagine what it can do for your next paper!

Christos Porios is a first-year Master in Public Policy student at the Harvard Kennedy School.

