Thoughts On AI - June 2025
TL;DR
Writing this post clarified my thinking about generative AI coding tools. Setting the environmental impact aside for a moment, here's where my head's at:
- Generative AI is like cooking videos. If all you do is watch them, you're not cooking. You're consuming.
- Videos showing you how to deep fry a turkey can get you a nice meal with some effort on your part.
- They can also lead you to burning down your house.
Of course, we can't take things in isolation. You can be forgiven for not knowing about the environmental impacts of these systems. Once you do, it's on you to weigh the cost/benefit.
A Future Look Back
This post is largely for future me. A touchstone to check back in to see if/how my opinions change over time.
Lots of folks have written eloquently about AI and its interaction with the world. This ain't that. It's a brain dump. Something to get this shit outta my head and help me clarify my thinking through writing.
In no particular order:
-
Preface point here. When I say "AI" in this post I'm generally referring to "generative artificial intelligence".
I don't like that the word "intelligence" is in there. It implies an agency of thought that's not found in the systems. That's branding for ya.
I'm also thinking here mostly about coding/developer/programming tools. Some thoughts apply more broadly to topics like content generation. Some don't.
- It's weird to talk about "AI" in such a general way. It feels like talking about "computers" without narrowing down to a concrete subject.
-
I'm reminded of my days working at a camera store when digital cameras first came on the scene. Their images looked like shit.
All the old photographers said there was no way they'd switch. It was an easy position to take. When the next generation came around, it was way better. Still looked like shit, though. They said the same thing.
What I saw was the improvement and thought about a longer timeline. It seemed clear that it was only a matter of time until digital got better than film.
-
Years later, when digital was on the verge of becoming better than film, wedding photographers were writing blog posts about how it would never be as good.
They weren't saying "It's not good enough yet", which was true. They were saying it never would be. Such a clear fantasy at that point given just the briefest look at the trend of improvements.
-
The early jumps in improvement in AI feel like that when it comes to coding tools. It doesn't feel as inevitable, though.
Digital imaging improvements were about physics first. Improving sensors. The post processing stuff is important, but it's secondary to the hardware underneath it.
(This has been changing. Sensors have gotten so good that the post processing improvements are having more of an impact now. I'm speaking here about the early steps in the tech.)
- AI hardware is not the same. Making it faster only makes it faster. Given the same input to the same model with the same random seeds, etc... you get the same output.
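If I were gonna sketch that out in code, it'd be something like this. A toy, to be clear: the "model" here is nothing but a seeded sampler built on the rand crate, not a real LLM. But it shows the point. Same seed, same input, same output, no matter how fast the machine running it is.

```rust
// Toy "model": a seeded sampler standing in for the real thing.
// Assumes rand = "0.8" in Cargo.toml.
use rand::{rngs::StdRng, Rng, SeedableRng};

fn generate(prompt: &str, seed: u64) -> String {
    let mut rng = StdRng::seed_from_u64(seed);
    let words: Vec<&str> = prompt.split_whitespace().collect();
    // "Sample" five tokens by picking words from the prompt at random.
    (0..5)
        .map(|_| words[rng.gen_range(0..words.len())])
        .collect::<Vec<_>>()
        .join(" ")
}

fn main() {
    let a = generate("the quick brown fox jumps over the lazy dog", 42);
    let b = generate("the quick brown fox jumps over the lazy dog", 42);
    assert_eq!(a, b); // deterministic: a faster box changes the wait, not the answer
    println!("{a}");
}
```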
-
All that said, I'm not interested in it. At least, not the way I've seen it presented. The idea of generating a full app is just not a thing that's for me. I like putting in the work to make things work.
It makes my brain light up. It's also putting in the practice that makes me better at it.
-
This reminds me again of photography. I used to have at least one camera on me at all times. A little Konica point and shoot. This was before cell phones were a thing. Even longer before they all had cameras.
It was rare that anyone other than me had a camera. Even less frequent that they used it.
Cameras are invisible table stakes now. Getting a phone without one is a feature request.
That availability led to an explosion of photos.
It used to be that one of the biggest things that set a pro apart from an amateur was that pros shot more film on any given subject. That both made it more likely they'd get a good image and put in the reps of practice (aka the singular thing you do to get better at a thing).
-
That leads me back to AI. How do you practice it? I mean, it has to be through the prompt, since that's the interface.
That feels less like practice and more like... I don't know, asking random strangers for information.
If you ask the right question to the right person on the right day you get a right answer.
That's not practice, that's luck.
-
Back to photography. The photo industry was decimated by camera phones.
This may be one of the biggest points.
Most folks didn't give a shit that the picture quality was awful. They had photos they never would have had otherwise.
And, there's a thing with photography. When you take a photo you attach your emotions to it. The way you felt when you took it. It doesn't matter if it's "a good photo" or not.
This is why looking at someone else's kids and cats can be so boring. You're seeing a crappy photo. They're reliving how they felt when they shot it.
-
This reminds me of folks using MidJourney. They get to experience a high of making something. But, the more I think about it, the more it's like gambling.
Or, trying to use one of those rip-off crane arm machines with the stuffed animals and iPhones in them.
Only, you get something out of MidJourney every time. And, they generally look mostly good. Except when they don't, but there's no perceived cost for trying again.
-
I'm not gonna get into the environmental impact. Everything I've learned about it points to it being horrible. It's the primary reason I don't really play with gen AI tools.
It's tough to hold folks responsible for things they don't know about. Once you learn about it, the responsibility falls on you.
Of course it's so pervasive now that it's almost impossible to avoid.
Hopefully, the environmental impact will get better over time, since it's pretty clear that generative tools are here with us from now on in the same way camera phones are.
-
I had a shower thought the other day. A riff on Arthur C. Clarke's Third Law:
Any sufficiently advanced technology is indistinguishable from magic
Mine was:
Any sufficiently distributed technology is indistinguishable from nature
That's the path of AI. In some places, it's already there.
-
I prefer my Grimoire over AI. My little collection of 11,000 notes. Cobbled together over the past few decades with precise solutions to problems I ran into.
When I put something in there, I know it works. I can go back to it with confidence. When I have to go to the internet for help I have to shift gears from creative mode to analytical mode.
Increasingly, that means burning mental energy tracing code that ends up being broken.
-
With a technical blog post or Stack Overflow answer, there's someone who wrote some code that they actually got to work. It may not be the thing you're after, but it at least does the thing.
I've yet to hear of any AI whose output is run through tests to verify it actually does what it's purported to do.
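For the record, here's roughly what I mean. A rough sketch in Rust, with a made-up parse_duration function standing in for whatever a tool might spit out. The test is the part I care about: it pins down what the code is supposed to do before you trust it.

```rust
// Hypothetical example: parse_duration is a stand-in for generated code.
// The test below is the verification step I never hear about.
fn parse_duration(s: &str) -> Option<u64> {
    // (pretend this body came out of a code generator)
    let s = s.trim();
    if let Some(mins) = s.strip_suffix('m') {
        return mins.parse::<u64>().ok().map(|m| m * 60);
    }
    s.strip_suffix('s').and_then(|secs| secs.parse().ok())
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn seconds_and_minutes_parse() {
        assert_eq!(parse_duration("90s"), Some(90));
        assert_eq!(parse_duration("2m"), Some(120));
        assert_eq!(parse_duration("banana"), None);
    }
}
```

Run `cargo test` and you actually know something. Reading the output and nodding along tells you nothing.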
-
I trust the internet less now. That's true in general. Doubly so for coding topics.
I keep hitting sites that are clearly AI generated. It takes a bit to figure that out, though. So, I have to be on guard with everything.
That's exhausting.
The answers are also wrong a lot.
-
I expect those wrong answers will be self-reinforcing.
See also: The AI Ouroboros
-
I'm getting to the point where if it's not a personal blog I don't trust it by default.
The good news is this is pushing me more towards documentation, which I'm getting better and better at reading.
-
Makes me think we'll start seeing more folks posting answers on their own blogs and folks collecting lists of blogs that are known to be written by humans.
I'm already after a list of sites that are known AI slop sites so I can just nuke them from orbit in my search results.
- It's only a matter of time until personal sites are overwhelmed by AI generated sites designed to look like personal sites. (Or, even just folks using AI to generate slop)
- That's actually not a bad business idea if you could figure out how to pull it off. A search engine that filters out AI sites instead of pushing AI forward.
-
Fuck big business (writ large) for suing people into bankruptcy for using their intellectual property and AI companies for operating off metric shit-tons of copyright infringement.
Both sides of that coin can go fuck themselves.
-
I don't want the output from an AI. I want all the references it used to produce its answer.
Of course, what I'm describing is a search engine. If you can make a better one of those, I'm all for it.
The way AI is being added to search engines is the opposite of that.
-
I propose a new term, "Pollution Engineer". A person who can navigate documentation to fix the issues created by "Prompt Engineers"
Of course, "Pollution Engineer" makes it sound like they are making the pollution. So, maybe not the best term. Gotta workshop that.
Oh, wait, how about "Slop Sherpa"
-
I've been learning lots of new stuff in Rust these past few weeks. It's all because I've been working on new things.
I want that learning. That understanding.
I want to know how things work so I can make them work the way I want.
I love the feeling of making that happen.
-
Folks say things like, "it'll destroy jobs, but it'll create new ones" when talking about new tech.
I'll remind them that the photography industry didn't get replaced. It got wiped out. The same way the buggy whip industry did.
Even if something replaced it, that doesn't mean an individual can re-tool to move to the new thing.
-
Reading stuff from AI is no different than reading someone else's code in terms of the effect on my flow.
If I don't know how to do something and it's not in my grimoire, I have to shift mental gears. I want to do that less, not more.
- Programming is about making changes. Every change has a chance to introduce bugs. Bugs compound over time. If you can't understand the code that created them, you won't be able to fix them.
Prediction Time
I'm not big into Making Predictions™. These are more like, I Reckons™.
- I'll never use AI to generate a post for this site.
- AI will replace the word Algorithm as the general way to describe what social media feeds give us.
- AI tools will get nothing but more refined in the same way digital imaging did.
- AI coding tools will be able to get lots of folks what they want lots of the time as long as what they are after isn't too complicated and it doesn't need to change.
- AI coding tools will become as widespread as autocomplete. Or, more to the point, they'll become indistinguishable from autocomplete. They'll end up being nothing more than knobs that control the type of suggestions you get and how far you want them to go.
- I'll end up adding something that's now considered an AI tool as part of my overall toolkit. It'll be on the more limited side and I won't use it very often.
- It will get nothing but harder to find answers to coding questions through search engines. The AI slop ouroboros will feed back on itself. Eating its own tail until it's mostly noise.
- Folks who can navigate docs to figure things out will become like wizards who can do deep magic.
TBD
We'll see how all this shakes out. Frankly, I'm not that worried about the coding side of things. That'll sort itself out over time.
What has me freaked out is gen AI's ability to make absolutely real looking things that absolutely aren't.
Propaganda was poison before. The increase in potency is staggering.
For example: Show this to Uncle Fred before he shares another fake video.
The impact of introducing those capabilities to the world is fucking terrifying.
Endnotes
I've been thinking about writing this piece for a while now. Could never figure out a good thru-line for making it a single narrative. Hence, a list. Cause if I don't put this down now I never will.