The Paradox of AI: Using Machines to Be More Human
The best use of AI isn't replacing human moments — it's protecting them. How I use automation to free up mental space for the things that actually matter.

I put money under my daughter's pillow last night because Hunter reminded me.
That sentence probably sounds dystopian to some people. It sounds like the setup to a Black Mirror episode. But here's the thing: without that reminder, I would have forgotten. Not because I'm a bad dad. Because by 10 PM, after six kids' worth of homework, dinner, dishes, bedtime routines, and a few hours of client work squeezed in after, my brain is running on fumes. The tooth fairy was not going to survive my mental triage.
So yeah. A machine helped me be a better parent. That's the paradox I want to talk about.
In my first post, I wrote about why I built ARIA, my personal AI assistant. The short version: six kids, a family business, freelance clients, multiple calendars and email accounts and task systems, and a brain that's great at building things but terrible at follow-ups. I was drowning in inputs and dropping balls I didn't even know I was holding.
Since then, I've also started running OpenClaw, an open-source AI assistant framework. My OpenClaw agent goes by Hunter (named after Jan Levinson's assistant from The Office, obviously). Between ARIA handling the backend tooling and Hunter handling the conversational layer, I now have two systems working together to manage my life. Same philosophy, evolving toolset.
The second post covered the technical fix: consolidating 389 tools into 48 so the AI could actually function reliably.
This post isn't technical. This is about what happens after the system works. What it actually means to offload the busy work of life onto a machine. And why I think most people have the AI conversation backwards.
The busy work tax
Here's what my day looks like. Wake up at 6. Get four kids ready and out the door by 7:20. Back by 8, clean up, get my oldest set up for homeschool. Work from 9 to noon. Lunch. Work from 1 to 2:45. Leave for school pickup. Back by 3:20, work until 4:30. Then it's soccer practice, volleyball, church activities, dinner, homework, bedtime. After the house goes quiet, freelance work.
Somewhere in those fragmented blocks, I'm supposed to manage three email inboxes, a ClickUp board, a Todoist account, multiple calendars, notifications from teachers and coaches and church coordinators, and whatever my clients need.
Every one of those systems is a place where something important might be hiding. Every one requires me to remember to check it. And the logistics of getting six kids where they need to be on any given evening is a game of Twister that would break most scheduling software.
I used to spend 20 minutes deleting junk email just to find the one message from a kid's teacher. I had colleagues asking "did you see that email from the partner?" and I hadn't. Not because I didn't care. Because it was buried under 47 newsletters I never signed up for.
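The fix for that particular problem is mostly just rules. As a rough sketch (the domain names, message shape, and header check here are my own invented stand-ins, not my actual setup), the logic that surfaces a teacher's email above the newsletter pile can be as simple as:

```python
# Hypothetical triage rule: surface mail from people over bulk mail.
# Domains and the message dict shape are invented for illustration.

PRIORITY_DOMAINS = {"school.example.org", "clients.example.com"}

def triage(message: dict) -> str:
    """Return 'priority', 'bulk', or 'normal' for a parsed message.

    `message` is assumed to have 'from' and 'headers' keys, the shape
    most IMAP/Gmail wrapper libraries give you after parsing.
    """
    sender_domain = message["from"].rsplit("@", 1)[-1].lower()
    if sender_domain in PRIORITY_DOMAINS:
        return "priority"
    # Bulk senders almost always set a List-Unsubscribe header (RFC 2369),
    # which makes newsletters easy to sweep aside.
    if "List-Unsubscribe" in message.get("headers", {}):
        return "bulk"
    return "normal"

teacher = {"from": "ms.rivera@school.example.org", "headers": {}}
newsletter = {
    "from": "deals@shop.example.com",
    "headers": {"List-Unsubscribe": "<mailto:unsub@shop.example.com>"},
}
```

Twenty lines of rules like this won't read your mail for you, but they decide what deserves your attention before you ever open the inbox.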
That's the busy work tax. Not just the time it takes, but the mental space it occupies. You're carrying the weight of six inboxes and three task systems in your head all day, even when you're not looking at them. It's a background process running in your brain, constantly nagging you that something might be slipping through the cracks.
That's what was stealing my attention. Not the work itself.
What I actually mean by "more human"
When I say I use technology to make human experiences more human, here's what I mean: identify the places in your workflow where a human has always done the job, and find ways to make those tasks either automatic or so efficient they barely register. Then take the space you freed up and spend it on the things where humans are irreplaceable.
For me, the robot work was email triage, calendar coordination, habit tracking, reminders, notifications, capturing notes. None of that requires my creativity or judgment or presence. It just requires attention. And attention is finite.
The human work? That's being present in a meeting instead of half-listening while I take notes, because my system captures notes and sends them to me after. It's making sure my wife knows to grab an umbrella for the soccer game tonight, because the weather check happened automatically. It's spending an hour scoping out two features I can kick off with Claude Code, then closing my laptop and going to my daughter's volleyball game while the AI writes the code.
That last one is worth sitting with. The machine works while I'm in the bleachers. I didn't skip the game. I didn't miss the deadline. Both happened.
The investment, not the expense
People ask if this is overkill. Fair question. My answer depends on who's asking, but usually I just start listing everything I'm juggling on a given Tuesday and the pushback dies pretty fast.
The propensity to over-optimize is real, though. I see it everywhere. People who discover the perfect file organization system. The pantry arrangement. The notebook layout. The tool placement that saves three seconds. Some of us are wired to tinker.
I can fall into that. But my approach has been simple: try to have the system do something. If it works, great. If it doesn't, make a small change in minutes that will save hours in the long run. What the system can do grows gradually, which is the key. Incremental value. Small time investment.
I think about it like financial investing. A little bit now that compounds over time. Every automation I set up pays dividends every single day after that. The email rules I built last month have already saved me hours. The weather alerts will run every day for years. The follow-up tracker will catch dropped balls I don't even know about yet.
You're not spending time building these systems. You're investing it.
The fear conversation
There's a lot of anxiety around AI right now. Replacing jobs. Making the world less human. Robots coming for everything.
I get it. But this concern isn't new. You can read philosophers from hundreds of years ago worrying that their world was becoming less social, less intellectual. Every generation has this moment with new technology.
Here's how I think about it: if you're regularly asking yourself "how do I add value?" and "how can I add more value?", you'll embrace tools like this. They make you better at what you already do. They free you up to do more of what only you can do.
If you're trying to get away with being inefficient so you can collect a paycheck, then yeah, you should probably be worried.
I've told groups of developers this: AI isn't coming for your job. Not for a long time. But a developer who knows how to leverage AI? That person is coming for your job right now.
The tool is neutral. The user determines the outcome.
Where the line is
I don't think every human task should be automated. The point isn't to remove yourself from your own life. It's to remove the friction that keeps you from being fully in it.
I don't want AI to write my emails to clients. I want it to make sure I don't forget to write them. I don't want it to talk to my kids. I want it to free up the mental bandwidth so when I'm with my kids, I'm actually with them and not mentally scrolling through my inbox.
There's a version of this that goes too far. Where you optimize so aggressively that you've automated away the things that make life feel like yours. I'm not interested in that version.
I'm interested in the version where technology handles the logistics Twister so I can be a dad, a husband, a builder, and a professional without dropping balls in any of those roles.
The takeaway
Technology and AI aren't inherently good or bad. They're tools.
Find ways to use them for good. Use them to increase your quality time. Use them to protect human interaction instead of replacing it.
The paradox of AI is that the best use of a machine is to make you less like one.