By now I'm sure you have heard of GPT-3. But if you haven't: simply put, it is an AI API released in private beta by OpenAI. You give it a bit of text, and it returns text (code, SQL, or writing). Justin Gage has an excellent primer here and so does MIT Tech Review here. But more importantly, you should just watch demos like these:
GPT-3 writing an email response for you based off a few bullet points.
GPT-3 completing an essay as Paul Graham. (Bold text is the input; non-bold text is generated by GPT-3.) Note how it mimics his writing style with decent accuracy.
GPT-3 generating a resume for you.
GPT-3 generating a prototype in Figma based on plain text description.
Disclaimer: I do not have access to the API yet. I'm still wrapping my head around the details of how it works. All my thoughts are based on what I'm seeing in the demos, and looking at some of the documentation. So from here on out, take these thoughts with a grain of salt.
The Good
There is a lot of menial work across marketing, sales, and customer service, and my guess is that this is where we'll see the most immediate impact. I imagine the first-order applications will be things like:
Content Creation: Give GPT-3 an outline/primer and it generates a V1 of a fully written blog post, plus copy for an email, tweet thread, and LinkedIn post to promote it. It probably even generates a bunch of title ideas for you.
Competitive Analysis: Prompt with "people love using competitor xyz because..." and GPT-3 returns a synthesized competitive analysis (see the sketch after this list). Credit: Serge Doubinski
Customer Service: Auto-generate a draft response to someone's request; a rep checks it over.
Sales Outreach: Auto-generate unique sales outreach by prompting with bullet points about the customer, the company, and other context.
Anywhere you are writing copy, doing basic research on the web, etc., GPT-3 looks like it can generate a decent V1. This is the low-hanging fruit: the first-order use cases. There will be plenty of second- and third-order use cases.
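To make the "give it a bit of text, get text back" idea concrete, here is a rough sketch of what the competitive analysis prompt above might look like as an API call. Remember, I don't have access yet, so the endpoint, parameter names, and response shape below are pulled from the public documentation and demos and should be treated as assumptions rather than verified usage.

```python
# Rough, untested sketch of calling the GPT-3 completions endpoint.
# Endpoint, parameters, and response shape are assumptions based on
# public docs/demos -- I don't have beta access to verify them.
import os
import requests

API_KEY = os.environ["OPENAI_API_KEY"]  # key issued with private beta access

prompt = "People love using competitor XYZ because"

response = requests.post(
    "https://api.openai.com/v1/engines/davinci/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "prompt": prompt,       # the bit of text you give it
        "max_tokens": 150,      # roughly how much text it returns
        "temperature": 0.7,     # higher = more varied, lower = more predictable
    },
)

# The generated text continues the prompt, e.g. a list of reasons customers
# prefer the competitor -- raw material for a synthesized analysis.
print(response.json()["choices"][0]["text"])
```

The point isn't the specific parameters; it's that the entire "integration" is a prompt and a completion, which is why the cost and friction of these workflows could drop so dramatically.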
These tasks might sound small, but when you add up the hours that professionals in marketing, sales, and customer service spend on them, the total is significant. So the optimistic view is that GPT-3 takes this work off their plates and frees them up to focus on more complex, higher-impact problems for the org.
The Bad
It would be unreasonable to expect it to be all good; there will definitely be some bad. For years, the majority of the content marketing ecosystem has been focused on aggregating other people's content in a more comprehensive way in order to rank in SEO (popularized by Brian Dean's Skyscraper Technique). We've been living in a giant game of telephone, which has been driving noise up and average quality down. To make things worse, companies have a hard time understanding the value of unique content, and therefore a hard time justifying its cost.
GPT-3 is going to make this 1000X worse by drastically decreasing the cost and friction. One key will be if/how Google deals with this in SEO ranking. Compounding returns in content marketing stem from ranking in search, so even if you repeatedly produce unique content, if you don't rank, it is hard to make the ROI equation work.
Moving From Execution → Complex Problem Solving
Even if it turns out that GPT-3 can't do the above things with decent quality, the message is clear: AI with the ability to do so is extremely near. What that means for a lot of professionals is that execution starts to get commoditized, and the real value will be placed on those who can solve complex problems.