Today in Tedium: AI is a fascinating topic that seems to dominate the cultural conversation lately. Whether it’s AI being used to write anything imaginable or one of the many AI image generators making waves everywhere, there’s no denying that AI is here to stay—and people are utilizing it in some very strange ways. AI inspires conversations on everything from copyright issues to regulation and potential job losses. But there’s also a distinct novelty to AI, and if you’ve been reading Tedium for a while, you probably already know where we’re going with this. In today’s Tedium, we’re looking at generative AI through our grainy, red-and-gray tinted glasses. — David @ Tedium
Insightful business news that respects your time and intelligence. The Daily Upside is a business newsletter that covers the most important stories in business in a style that's engaging, insightful, and fun. Started by a former investment banker, The Daily Upside delivers quality insights and surfaces unique stories you won't read elsewhere. Sign up for free here.
The year the then-new writing assistant, Articoolo, arrived on the market. (Ernie wrote a feature on it that year as well.) The idea behind it was simple: type in a prompt and let it write a quick, 500-word article. Sounds familiar, right? In this case, Articoolo offered algorithm-generated content, built on meticulously designed (and proprietary) algorithms. Although similar in spirit to GPT-2 and GPT-3, it wasn’t the same thing. Despite that, the desire to create an automated writing tool persisted, and seven years later, we’ve arrived at an entirely new generation of generative AI tools. Generative AI is taking off in big ways right now. Originally, I was going to start this piece by discussing early AI and offer some interesting stats or trivia about the subject. But then the piece just sort of took on a mind of its own, and using AI to craft this section seemed like a good choice for getting the ball rolling.
Isn’t it weird how quickly AI came to dominate the conversation? That’s because it’s everywhere
Whether you realize it or not, you use AI every day. Google uses it in just about everything. Autocorrect, for example, is an AI technology. Many enterprise systems—from Coupa to Workday—use machine learning and AI in some capacity.
Even Grammarly—one of the most popular tools many of us use every day—is based on AI. According to their own website, Grammarly is constantly learning from its users and adapting. It works well (for some things) and never tries to sell itself as a magic writing tool. Whether it’s rewriting suggestions or its built-in plagiarism checker, Grammarly is a fantastic example of how helpful AI can be in our daily lives.
Over the past few years, a wide range of other AI writing tools have popped up. In 2022, image generation tools started showing up as well. This year, Microsoft started testing a tool that can simulate anyone’s voice with just a small audio snippet. They also inked a deal with OpenAI to implement GPT-3/ChatGPT into their products. So it’s fair to say the future will be full of these sorts of tools in one form or another.
The marketing for these tools suggests using them to streamline workflows and get things done more quickly. They’re also being marketed as useful for research, brainstorming, outlining, and creating quick messages. But there’s another aspect to them that seems to irk some people.
Schools and academics are already voicing concern about the technology being used to write essays for students. But even OpenAI’s CEO, Sam Altman, says it’s not ready to be used for important tasks. Like that’s stopping anyone. For now, it’s kind of fun to play with it and apply it to our favorite hobbies (songwriting, for instance).
As many of our readers know, I’m a musician and songwriter. This piece began as an exploration to see if AI could help me finish writing an unfinished 20-year-old song. It couldn’t. It just kept wanting to write stilted genre fiction instead of song lyrics. Other people have tried to use ChatGPT to write songs in the style of certain musicians.
Someone used ChatGPT to create a Nick Cave song, and Nick himself responded to the person through his newsletter. Let’s just say it did not go well. We had a similar idea, but applied it to a different band: They Might Be Giants.
Here’s what it came up with:
It couldn’t capture even a hint of what makes TMBG great (and it gave me the chords for “Don’t Let’s Start”). I hope the Johns don’t skewer us for that one—we’re still massive TMBG fans—and let’s just say we won’t be using ChatGPT for such things again any time soon.
“It’s still early, and the AI tech has quirks. But ultimately it should serve to express human ideas, not machine ones. If generative AI were a typewriter, we’d be in the awkward pre-QWERTY era. There will be significant refinements ahead. Of course, the ability to communicate more easily brings its own set of problems, such as a lower signal-to-noise ratio, more inter-group conflict, and other unintended consequences. No communications technology advance has ever been 100% good.”
— Benj Edwards, a friend of Tedium and currently the AI reporter for Ars Technica, in a series of recent tweets. Speaking to some of the anxiety creative people (ourselves included) have about generative AI, Benj brings home an important point: we still have a long way to go as these tools develop. And when they do reach their zenith, they’ll be a supplement to human creativity, not a replacement or a means of serving the desires of a machine.
The way we’re using AI is changing fast
New technology always has a period where it’s novel and fun, until it ultimately disappears or gets implemented somewhere. AI is no exception. When the image-making technologies DALL-E 2, Midjourney, and Stable Diffusion opened to the public last year, we saw some very interesting—and a few questionable—things. For a while, it seemed as if everyone was playing around with different prompts (present company included) just to see what these tools could do.
But despite their amazing abilities, there’s an underlying sense of dread and many other issues surrounding generative AI. Copyright concerns. Income loss for artists. Coders, copywriters, and other creative folks being replaced. Students cheating on exams. They’re valid concerns.
I probably would’ve found AI helpful while pursuing both my BA and BS degrees in college. I think there’s potential for it to make a meaningful impact on collegiate coursework. But it’s not without its caveats. In a recent Nature article, the magazine’s parent company Springer Nature—one of the largest publishers of academic papers in the world—laid out some ground rules for how to use ChatGPT in a way that keeps science transparent. This is clearly a hot issue and it’ll remain that way for some time to come. As the conversation continues, it seems everyone has a take on this subject. Even Product Hunt—a site where many of us likely heard about AI apps in the first place—published an essay ruminating about the fate of creative work in the future.
I will be the first one to point out that I’m not an expert in this field, nor am I qualified to make judgments about the future of AI. As someone who might be affected by it, I can say the sense of dread is very real. But that dread also comes with a sense of hope and the belief that these tools could be extremely beneficial to enhance our lives (if used properly).
“Artificial Intelligence may be the next big thing, but it’ll never be able to beat the creativity and ingenuity of a human mind—unless of course the AI is programmed to be a total smartmouth, then it might get the upper hand!”
— YouChat, when asked to generate a quote about AI in the style of Tedium. I did add a few additional instructions to get it to create something funny, but we didn’t edit the quote in any way.
The rise of AI-infused hustle culture
At the moment, there’s a sprint for productivity, and hustle culture looms over AI. Just look at the first page of YouTube when you search for “ChatGPT.”
Following ChatGPT’s release, it didn’t take long for all the “optimization” and “productivity” gurus to come out of the woodwork. Then, the courses and YouTube tutorials began popping up.
“Here’s how to make your prompts 10x better!”
“Unlock the Magic of AI with these five simple tricks!”
“Magically create copy in seconds!”
Ok, cool. Cool. In practice, it takes significant tweaking and refining of your prompts to attain usable results.
What the ads often omit is that the “magic copy” requires careful prompting and some actual legwork to generate high-quality results. Larger companies are now experimenting with using AI in some capacity. Adobe is doing something with generative AI by trying to incorporate it into its overpriced apps. An author, Robert Solano, wrote a children’s book using AI. And guess what Buzzfeed is doing soon? Beyond that, generative AI has been all over the news cycle lately.
One man, Jomo K. Johnson, apparently wrote over 12,000 books with ChatGPT and published them in paperback. More recently, a few software engineers tried their hand at writing children’s books. Someone used ChatGPT to create malware. And everyone already knows about the guy from my home state (Colorado) who won the state fair art competition with something he created in Midjourney (and edited afterward).
AI is affecting stock photo services too. While Getty Images is involved in a lawsuit against Stability AI, OpenAI and Shutterstock are inking deals to make generative AI a part of the service. As we already mentioned, Microsoft hit the jackpot with its OpenAI integration deal. Even the awesome browser, Vivaldi, recently made You.com one of its default search providers.
To say AI is here to stay is stating the obvious, and hustle culture is still exhausting, no matter what the tools of the moment happen to be. But the form it will take—and how we ultimately integrate it into our lives—remains a vast uncertainty, leading us blindly into the future.
The reported amount of money Microsoft committed to its upcoming deal with OpenAI. Although they’re not providing specific amounts yet, this is a ballpark number for the massive investment. The company was already working with OpenAI—it invested $1 billion in OpenAI in 2019 and became its exclusive cloud provider that year. Most users probably won’t even notice when the tools are eventually incorporated into Microsoft’s products, but only time will tell how this will play out long-term (just to give you an idea of how rapidly things are evolving with AI, someone released a third-party add-on for Microsoft Word with GPT-3 integration literally a few days before publication).
A dip into the vast ocean of AI writing apps
It was probably inevitable: it wouldn’t make sense not to discuss specific AI apps in an article about AI. I’ve personally tried all of these, and I’m not being paid or sponsored in any way. (If somebody wants to pay me to use their products and write about them, I’m 100% open to it.)
None of the text in this issue of Tedium—except where explicitly noted—was generated by AI. But we thought it might be interesting to test some of the more prominent AI utilities on the market. Ok, here goes:
Simplified did a fantastic job helping me write an ad for Haunted Cheese Graters. The initial draft hit most of the key points, but still required a complete rewrite to be funny and cohesive. Here’s the ad:
This is another AI writing utility that uses “cure writer’s block” as its marketing message. It’s quite similar to most other current products and works in much the same way. For the brief time I experimented with Hyperwrite, I had some fun with some of its features. I ran out of tokens pretty fast, so I didn’t get much hands-on time with it. Hyperwrite offers GPT-3 integration, an image generator, and a sourcing tool. Ostensibly, the sourcing tool should provide at least one usable resource link relevant to your text. So, I tested out all three features. The writing portion was a bit lackluster. The output was stilted and boring. I tried to use it for sourcing on a tech-related piece … and every single source it provided was a keyword-stuffed SEO article containing little information relevant to the subject (often, the links were broken as well). The image generator, on the other hand, created a few interesting items with the prompt: “a dapper, anthropomorphic squirrel enjoys dinner in the style of Vincent Van Gogh.” Here are the two best ones (out of 30 examples it generated):
Here’s a writing sample:
Writesonic is another combination writing/image generating utility. I played with its AI writer a bit, but ran out of words extremely fast. It’s a lot like Copy AI, but with a slightly more intuitive interface. The marketer side of me loves its keyword generator. With the limited number of words the free preview allowed, I didn’t produce anything remarkable. I feel there’s definite potential if I had more to work with, but my budget is incredibly low (well, nonexistent, actually). I had tremendous fun with the image generator, making it create various anthropomorphic squirrels posing for magazine covers. It was hit and miss. Here are two of them, including Writesonic’s watermark in the lower-left corner:
There’s a reason companies like Simplified and Copy AI advertise their products as a way to improve workflows instead of as magic writing machines.
I wrote a paragraph explaining what I wanted to say in the piece, and the system essentially spit out what I wrote in the prompt word for word. On another attempt, it gave me a paragraph that seemed oddly familiar. I double-checked my archive and, sure enough, the output mirrored a marketing piece I wrote a few years ago. The site for which it was written was clearly part of the training data used for the LLM. To be clear: I’m not calling it out for plagiarism. The original piece was written for a general audience and contains information related to that subject from a reputable industry source.
Despite this, it was interesting to see how Copy AI handled prompts. If you provide it with specific, targeted prompts, it’ll give you a solid idea or draft. I can see it being useful in some cases. It almost does feel like magic—until it comes time to write a second draft, build a cohesive narrative, and add sources.
But that’s not what we’re doing here. We used Copy AI to write some silly song lyrics, because this is Tedium and that’s what we do. Specifically, I tried to get it to write a song about poltergeists who steal Ethernet cables. Here’s the prompt, and one of my favorite results. I didn’t make any adjustments. This is pure, unfiltered Copy AI:
In my experience, Copy AI works wonders for doing some of the more dull and repetitive work I do in my day job. But beyond that, it’s merely another tool, more on par with Grammarly for my purposes.
I love using You.com. As a search engine, it provides helpful, quality results from simple queries. In my experience, it’s much better than Google. Billing itself as an “AI Powered Search Engine,” it offers several apps to improve everything from research to coding. So how does it compare to the others? It does a pretty good job of summarizing web pages and looking past marketing fluff.
It has an image generator running on Stable Diffusion that works pretty well:
It also offers YouChat and YouWrite to take full advantage of GPT-3. Both work extremely well—I’ve had better luck doing research with YouChat than I have with ChatGPT—and YouWrite gives users a decent first draft. We’ve already provided two examples of what YouChat has to offer.
When prompted to write a bio about R. Stevie Moore, this is what it returned:
So its response wasn’t bad, but it leaves much to be desired. It isn’t dynamic and seems sort of generic. It also omits some of my original paragraph, which sort of defeats the purpose. I do like the way it didn’t know what to do with the link and just stuck it in at the end, though. Ultimately, YouWrite is a pretty interesting app and I can see it being quite useful for idea generation and collecting your thoughts.
What can I say about ChatGPT that hasn’t already been said? It’s a pretty useful tool, but it’s sort of hit and miss with its sourcing. You.com and Lex are much better at providing viable sources.
I did enjoy using ChatGPT to generate some ideas and do a little light research. My attempt at using it as a songwriting partner didn’t pan out (illustrated in the TMBG example above).
When I asked it to help me with a pitch for a Warren Zevon story, it recommended interviewing him. Uh…that isn’t going to happen (we miss you, Warren). So there are definitely some consistency and relevancy issues with the bot. It has some other problems, too.
On the plus side, ChatGPT did a wonderful job helping me with some deep-dive research. Broken links aside, it cut my research time in half. I’m not enthusiastic about its poetry or prose, though.
Most of the big AI tools tend to market themselves as assistants. ChatGPT certainly has that potential, but I feel people are going to actively misuse it even more than they already are.
Jasper is a lot like Copy AI, but a bit more intuitive. Output-wise, it didn’t offer anything more or less remarkable than its peers. The interface is easy to use and offers plenty of options, so I could definitely see this becoming an “assistant” type of software for some jobs.
Lex is a word processor with built-in Markdown features and a generative AI that can help writers with ideas on an as-needed basis. I have used this tool quite a bit for both creative and non-creative work. It’s genuine. It’s honest. And it’s intuitive.
I worked with Lex’s writing assistant to guide the intro to this piece. Here’s a screenshot (the blue text is what the AI came up with):
This is where I find AI helpful. I ran into a roadblock for the introduction and asked Lex to help. It offered something I didn’t care for, but I was able to use it to eventually guide the piece. This is how I feel people should use these tools—not to displace knowledge workers or save money by churning out lackluster, uninteresting garbage.
Lex is probably one of the best AI assistants I’ve used. It has some built-in Markdown features, is simple to navigate, and works well as an ideation system. When you’re stuck, Lex can be your best friend (it even told us as much in an “interview” last year).
We reached out to Lex’s creator to see if he knew any instances of people using it for weird stuff. Sadly, he didn’t have any interesting stories. But he did share some fantastic information about the program and how it came to be.
“There’s an opportunity right now to work in this space—either using or creating these tools—because they are not very well understood. When most writers think of LLMs, they think all they’re good for is generating a bunch of mediocre text. That’s one thing they can do, but clearly not the only thing. Few understand this. The stigma will fade over time, but in the meantime it presents a great opportunity for anyone who wants to get in early.”
— Nathan Baschez, the creator of Lex, an AI-powered writing assistant that is taking off among professionals in multiple industries. The novelty will fade as people come to understand AI tools and large language models, hopefully ushering in a future where these tools make the world better rather than disrupting everything around them.
Maybe the best way to use AI is as a helper
Writer’s block can be a real pain sometimes. Sometimes, having a writing partner or someone else to help you brainstorm can be a godsend. It was with that mindset that Nathan Baschez, the president of the newsletter platform Every, came up with the general idea for Lex.
After spending the past few years writing and editing in Google Docs, he grew frustrated. He was already interested in large language models (LLMs) and felt they could aid in the writing process. After trying some of the current tools and being unimpressed, he started building his own.
Spending several nights and weekends working on it, he managed to design a tool that works unlike other AI tools on the market. It truly feels like a writing assistant and multipurpose tool. And there’s a reason for that. We asked Nathan why it feels so intuitive, and it’s because he built it as a word processor first.
Lex grew out of his frustration with Google Docs, so he wanted to make a better word processor. And that’s precisely what he did. As he worked on it over the course of a month, he stumbled into AI. Since it didn’t begin life as an AI tool, it feels unique among its competitors.
Nathan also doesn’t believe the text that LLMs generate right now is very high quality.
“I’m more interested in how AI can help people become better writers than I am in using AI to replace writers,” he told Tedium. “Most forms of writing are not going to be entirely outsourced to AI, even in the long run, because the goal of most writing is to convey some human’s intent. The AI can help, but not replace this.”
Although all the magic of these tools can make it seem otherwise, Nathan has a point. He told us that he believes the tools will ultimately change what’s possible in the world of writing. He likens it to the transition from the typewriter to the computer (an apt analogy shared by Benj Edwards as well).
There’s a possibility not all writers will embrace it. But there’s no denying the fact that it’s happening. As Nathan puts it, “Some writers will embrace it, others won’t. But generally the world will move onto the new platform. I’m excited to see how it unfolds.”
After talking to Nathan and using Lex for some projects, I must say I share that excitement, too (at least to a point).
“There is a kernel of truth to the Clippy comparison. Clippy was not based on AI—or machine learning—but ChatGPT is a rather sophisticated auto-completion tool, and in that sense it is a much better version of Clippy.”
— David Lobina, an artificial intelligence analyst at ABI Research, commenting on Microsoft’s deal with OpenAI to offer AI tools in its products. Clippy—for all of his annoying suggestions—never truly transformed our workflows. But there’s a good chance AI tools will accomplish what Clippy never could, ushering in what might prove to be a very fascinating future.
Generative AI is nothing to fear. I’m endlessly fascinated by how it’s being used, what it means for the future of work, and how it’ll affect us in the long term. From what I’ve read, discussed with others, and experienced myself, it seems the general consensus is that AI is a tool to help increase productivity—not replace humans. It can’t do that. Yet. And why should it? Part of the human experience is creating and sharing art/music/poetry/prose with each other (and the world). If AI can supplement and support that process, that’s a good thing.
On the other hand, there’s also a wider debate about whether or not AI adversely affects knowledge workers, but I’m not going to touch that one. Some experts are recommending becoming familiar with it, but asinine headlines and fluff pieces like this one certainly don’t help matters. Let’s just put it this way: I am approaching it with caution. I feel conflicted about it because I work in marketing. Although I fall more into the “AI is a useful tool on occasion, but doesn’t surpass (and shouldn’t replace) high-quality content” boat, I’m fully aware of the fact that there are plenty of companies out there who see an opportunity to have an editor tweak or rewrite cheaply produced, dubiously sourced content generated with such tools (we’re looking at you, CNET and Buzzfeed).
In the end, generative AI technology isn’t going to stop improving and evolving. GPT-4 isn’t far away. That Microsoft/OpenAI deal is poised to change how people work. Even Canva is jumping on the image-generation bandwagon. AI is here for the long haul and it’ll affect everyone in some way. But the real question is: when the luster of generative AI wears off and the sheen of a cool new toy is gone, where will that leave the rest of us?
Maybe we should ask ChatGPT about that. (We did.)
Thanks David for another killer piece. Find this one an interesting read? Share it with a pal!
And thanks again to The Daily Upside for sponsoring.