It’s difficult to be alive right now.
With so much to worry about, it’s tempting to do the bare minimum: to do only the things we like and outsource the rest. From essays to human connection to art, people are turning to AI tools like ChatGPT for seemingly everything.
After all, why not? It’s fast, it’s easy, and it’s everywhere. It’s there for you to use, so you might as well use it, right? AI features are increasingly being integrated into Instagram, TikTok, Snapchat, and more. No matter what major site you visit, chances are it has already added AI to its platform or is in the process of doing so. The 2020s are shaping up to be the age of AI, and why not? Machines are faster: they can scan the entire internet in seconds, do what they’re told, and only bring up evidence that supports you. Machine learning models can’t argue with you or force you to reconsider your stance, so some people are opting to replace fallible, fickle, emotional humanity with cold, constant efficiency. Why not?
The 21st-century dream of having only the things you like is almost realized. In the olden days of internet browsing, content had to be sought out actively on forum boards or blogs; these days, algorithms bring what you want to see right to your fingertips. But mechanical labor is still labor. Sure, you end up with only the results, but the AI is doing “work” nonetheless. Something is happening while ChatGPT loads. It may not happen right in front of you, but the use of AI, though its footprint varies by form, has tangible effects on our environment.
Large-scale AI projects are usually housed in warehouses that take a staggering amount of materials to both build and maintain. Building a single 2 kg computer requires the extraction of over 800 kg of raw material. Included in that cost are the microchips that power computers, which depend on rare earth metals often mined in destructive ways. These “energy hog” warehouses contain dozens of massive computers and produce more waste than you might think. The electronic waste often contains toxic substances like lead and mercury, and the warehouses’ operations produce both large carbon emissions and the evaporation of an enormous amount of water. According to one source, the data centers Microsoft used for GPT-3 have already evaporated over 184,920 gallons of water. A single ChatGPT query evaporates three times as much water as a Google search without an integrated AI summary. With the Earth’s climate already in rapid decline, AI is just another factor exacerbating our problems. While some AI models are justified, the ones becoming popular right now are absolutely not.
Alright, maybe you don’t care about that. Maybe you’d let the world burn for the sake of ruthless progress, or for your own convenience, or for whatever other excuse you can come up with. Still, one must consider: generative AI isn’t even that good at its job. Because it generates whatever best matches your prompt rather than checking facts, it’s prone to inventing sources and quotations that don’t exist. Beyond pulling things out of thin air, tools like ChatGPT that get their data by scraping the entire internet can’t tell when people are making things up. If something is repeated often enough online, the AI will parrot it. And that doesn’t even touch a glaring issue: as more AI content is uploaded online, AI learning networks are getting caught in feedback loops because of the sheer amount of AI-generated media available. This is especially true of image-generating AIs: a model trying to produce a picture of, say, a cat ends up learning from cat images made by other AIs, and the results spiral into incomprehensibility. In short: the more generative AI is used, the more useless it becomes.
Considering that generative AI is, in essence, a plagiarism machine, one thing is very obvious: if you use AI for academics in any way, shape, or form, you’re screwed. For the past couple of years, America has been on the fast track to anti-intellectualism. AI is only furthering that inclination, as seen in Meta’s decision to end its fact-checking program. It’s not a stretch to say that the age of AI is becoming, or perhaps already is, synonymous with the age of misinformation.
Who cares?
Who cares if our planet falls apart? Or if everyone stops learning? Who cares if we stop teaching our children correctly, if people get scammed out of money, if they buy into lies, or if jobs don’t get done right? It seems a small price to pay in exchange for getting things done faster and cheaper.
It’s difficult to be alive right now, and that is the point of living.
What’s the point of writing if the words and ideas are not your own? What’s the point of art if the forms aren’t all chosen by your own hands? Writing is difficult. Art is difficult. School is difficult. We live in a world with an uncertain future, but when you use AI to replace painting, or writing a lab report, or texting your friends, what are you doing? Killing your own humanity in the name of saving time so you can waste more oxygen scrolling?
Everyone has things they dislike. We all want to hear, see, and do only the things we like. But the reality of that falls somewhere between impossible and actively detrimental to humanity. So what’s the point of avoiding the things you dislike? Are you really living if you never do anything unpleasant, challenging, or annoying? If you refuse to honestly try to learn, if you never put your all into even the things you dislike, ask yourself: what am I doing?
People hate having their personal convenience challenged, but I’ll ask something of you anyway: go do something you don’t want to do, and complete it with your own hands and brain. See it through. Once you’re done, you might find you actually enjoyed the task, or, at the very least, you’ll enjoy knowing that you’re better off for having done it yourself.