Two days ago, OpenAI released ChatGPT. It's built for conversational interaction and is based on GPT-3.5.
I've already seen some incredible threads on Twitter of examples using it - I've collated them here.
I've seen several interesting examples where people have used AI to create. In the example below, someone combines ChatGPT with an AI image generator: using ChatGPT to create a plot and character descriptions, then generating images of the characters based on those descriptions.
This example wrote a Seinfeld-style script about a niche Computer Science topic.
Collaboratively writing a sci-fi story with ChatGPT: https://andrewmayneblog.wordpress.com/2022/11/30/collaborative-creative-writing-with-openais-chatgpt/
Impact of this technology on education
Teachers have been concerned about the ease of plagiarism via the Internet for the last decade; schools used detection tools to guard against it while I was a student. Large Language Models present an entirely different challenge: they can now generate an entirely original essay in response to a question. Such an essay would not be flagged by plagiarism checkers, and the generated output could be edited and stitched together into a finished piece.
Some are suggesting the answer is to transition testing to oral exams, where the assessment is purely a dialogue between examiner and student. Others suggest this might be the time to reduce the significance of standardised testing in education altogether.
I touched on the topic of Aristocratic Tutoring as an alternative to instruction-based learning in this blog post, drawing on the thoughts of Erik Hoel.
On the flipside, the example below demonstrates replacing what would formerly have been Google search queries with ChatGPT. Instead of receiving a list of links, you get the answer written in plain English, along with an in-depth explanation.
This can be great for someone who is new to a topic, as it can lead you through the steps needed to reach a final answer or conclusion. But there's a catch-22: beginners are the least likely to spot mistakes in a domain that's new to them, and I have seen several examples where the answers are incredibly convincing but incorrect. This is more problematic than an answer that is nonsensical or clearly wrong, because it could give someone a false understanding of a concept, or sow confusion while they're trying to learn something new.
ChatGPT as a coding assistant
There have already been incredible AI releases this year to help with coding: GitHub Copilot and Replit Ghostwriter being the most notable. These tools tend to focus on auto-completing code intelligently based on the context, or dynamically refactoring particular passages of code.
The examples below go further: they not only provide code, but also offer explanations to aid understanding.
The example below provides code, then refactors that code based on natural language feedback. As you can see, the feedback given is not just instructional or technical, but creative too.
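To make the workflow concrete, here's a minimal sketch of the kind of prompt you'd assemble when asking a completions-style model to refactor code from natural-language feedback. The function name and template are my own illustration, not part of any official API; the resulting string would then be sent to a model such as OpenAI's GPT-3.5 via its completions endpoint.

```python
def build_refactor_prompt(code: str, feedback: str, language: str = "python") -> str:
    """Combine a code snippet with natural-language feedback into a
    single prompt for a completions-style language model."""
    return (
        f"Here is some {language} code:\n\n{code}\n\n"
        f"Rewrite it with this feedback in mind: {feedback}\n"
        "Return the updated code, followed by a short explanation."
    )

# Example: the feedback can be creative, not just technical.
prompt = build_refactor_prompt(
    code="def greet():\n    print('hello')",
    feedback="make the greeting more whimsical and add a docstring",
)
```

Because the model also returns an explanation, the same loop doubles as a teaching tool: you see both the revised code and the reasoning behind the changes.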
Combining AI tools
I wrote in a previous post about how it was possible to use GPT-3 to improve existing prompts for AI image generators, leading to better results. This is an example of extending that idea, using ChatGPT as a way to create inspiration and write high-quality prompts that can then be used to generate images matching the description.
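A minimal sketch of that two-stage idea, with names of my own invention: first ask the language model to expand a bare idea into a rich, detailed prompt, then hand its completion to an image generator. Only the instruction-building step is shown; the actual model calls are left as comments since they depend on whichever services you use.

```python
def build_image_prompt_request(idea: str) -> str:
    """Ask a language model to turn a bare idea into a detailed,
    high-quality prompt for an AI image generator."""
    return (
        "Write a single, highly detailed prompt for an AI image generator "
        f"depicting the following idea: {idea}. "
        "Describe the subject, setting, lighting, mood and art style."
    )

request = build_image_prompt_request("a lighthouse keeper's cat at dawn")
# 1. Send `request` to the language model (e.g. ChatGPT or GPT-3).
# 2. Feed the model's completion, verbatim, to an image generator
#    such as DALL-E or Stable Diffusion.
```

The point of the chain is that the language model is better at writing elaborate, style-rich descriptions than most humans are off the cuff, which tends to produce better images than the bare idea would.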