
A collection of things.
By Chris James Martin

Listening: Paul Frazee on Bluesky and the AT Protocol

04 Apr 2025 — Truckee, CA

This Software Engineering Radio episode is one of the best overviews I’ve come across so far for how the AT Protocol (Authenticated Transfer Protocol) that Bluesky is built on actually works. It’s from January 2025 and features an in-depth conversation with Paul Frazee (@pfrazee.com), CTO of Bluesky.

I’d recommend it to anyone curious about decentralized social networks, especially if you’re wondering how Bluesky differs from protocols like ActivityPub (used by Mastodon).

One thing I really appreciate about this podcast is the format — it’s structured and well-moderated, with thoughtful, prepared questions that keep the conversation on track. It’s much clearer and more focused than the meandering tech-talk formats out there.

Here are the notes I collected while listening to the podcast. I do this mainly to educate myself, so if I got anything wrong, yell at me on Bluesky.

Bluesky/AT Protocol Origin

Paul describes the origin of Bluesky as a Twitter-funded project to explore alternative architectures for social media. He outlines three main categories of decentralized networking tech at the time:

  • Peer-to-peer (think BitTorrent or Secure Scuttlebutt),
  • Federation (Mastodon and ActivityPub), and
  • Blockchain-based systems.

Although blockchain is mentioned, it’s not a big part of the conversation. Paul’s experience comes primarily from the peer-to-peer world, including nearly a decade working on Secure Scuttlebutt. He gives a summary of what worked and what didn’t in that space — namely, the limitations around device syncing, key management, and especially scale. Nice quote: “It can’t be rocket science to do a comment section”.

What Is the AT Protocol?

ā€œATā€ stands for Authenticated Transfer. It’s built around a few core ideas:

  • DIDs (Decentralized Identifiers), based on a W3C spec — these allow users to have portable identities not tied to any single server.
  • PDS (Personal Data Servers), where each user’s data lives.
  • A relay-and-aggregation system that pulls in updates from across the network to power app-level features like timelines, threads, and search.

This setup enables a decentralized (though still server-based), yet scalable, architecture.

Frazee draws a comparison between ATProto and traditional web infrastructure: think of PDSs as websites, relays as search engine crawlers, and app views as search interfaces or timelines. They get into the architecture discussion around 11 minutes in.

Portability and DIDs

One of the big differentiators is account portability using DIDs (decentralized identifiers). DIDs provide stable, cryptographic identifiers — and they’re key to enabling server migration without breaking your social graph.

Paul explains this well around 31 minutes in: in Mastodon, if you move to a new server, your identity and history are fragmented. In ATProto, your DID doesn’t change — it just points to a new server. This eliminates the cascading breakage that happens with federated identifiers.
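To make that a bit more concrete, here’s roughly what identity resolution looks like with plain curl, as far as I understand it. The endpoints are the standard public ones (the Bluesky AppView and the PLC directory), and the did:plc value is a placeholder, not a real identifier:

# ask the public Bluesky AppView to resolve a handle to its DID
curl -s 'https://public.api.bsky.app/xrpc/com.atproto.identity.resolveHandle?handle=pfrazee.com'
# → {"did":"did:plc:..."}

# look that DID up in the PLC directory; the DID document lists the account's
# current PDS endpoint, so moving servers just means updating that document
curl -s 'https://plc.directory/did:plc:...'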

Domains, Handles, and Identity

At 36 minutes, Frazee talks about how handles work in ATProto. Your handle can be your own domain name, which adds an element of identity ownership. A fun example: Senator Ron Wyden uses @wyden.senate.gov as his handle — no blue check needed. It’s a trust signal in DNS itself.

Custom Feeds and Community Innovation

Around 39–41 minutes, Paul describes how community members have been building tools to create custom feeds. The official Bluesky app doesn’t offer a built-in feed editor yet, but others have already made composable UIs for combining hashtags, user lists, and post types into curated feeds. He mentions the “Quiet Posters” feed, which surfaces posts from people who don’t post often — a simple but clever way to highlight quieter voices.

Moderation: Labels

Another significant topic is moderation — both content moderation and safety/legal compliance. Around 42 minutes, Paul explains how labels serve as a metadata layer that anyone (including independent moderation services) can publish and subscribe to. This allows client apps to let users choose their own filters — a more flexible model than top-down moderation?

Building on ATProto

Towards the end of the episode (around 46 minutes), Paul lists a few projects already using the protocol.

He also describes how building apps works in practice. You authenticate users via OAuth, then write to their PDS and listen to the relay’s event stream to update your UI. It’s a little different from a traditional app stack but not dramatically so — and in some ways, it simplifies the developer experience (says Paul; I haven’t built anything on ATProto yet).
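I haven’t tried any of this myself yet, but the read side at least is easy to poke at with curl. com.atproto.repo.listRecords is the generic record-listing XRPC method; the PDS host and DID below are placeholders, and real writes (com.atproto.repo.createRecord) need an authenticated session:

# list a few of an account's posts straight from its PDS
curl -s 'https://PDS_HOST/xrpc/com.atproto.repo.listRecords?repo=DID&collection=app.bsky.feed.post&limit=5'
# the relay's event stream (com.atproto.sync.subscribeRepos) is a websocket,
# so curl won't get you far there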

Scale and Open Source

Paul mentions that Bluesky has scaled to over 11 million users (at that time, I believe they’re at 30M+ now) and 1.5–2 million daily actives — with no major architecture bottlenecks (1:05). That’s huge, especially for a new + decentralized protocol, but it sounds like they also have massive infrastructure and funding. I’m curious how much bootstrapping an app on ATProto would realistically cost, especially if it took off. I’m working my way through the talks from the recent ATmosphereConf now, and I hope there’s more insight from independent developers in there.

I was also surprised to learn that all the source code for Bluesky is open source and available on GitHub (1:07).

Final Thoughts

Paul is a clear communicator. He has extensive experience, and I’m impressed by his ability to convey an overview of both the AT Protocol and Bluesky in just over an hour.

After listening, I am significantly more interested in learning about the AT Protocol. It’s great to feel excited about a protocol again; it reminds me of the bygone days when REST APIs and RSS were novel new things to play with.

One idea that I might throw some “vibe coding” at: an RSS reader backed by ATProto, where you can follow what your Bluesky contacts are reading or listening to, as well as aggregate links shared by friends (and find new feeds) — kind of like the Breaker podcast app (RIP). Absolutely not something the world is crying out for, but it would be fun.

Things I Think are Great: "Context Window" by Matt Webb

28 Mar 2025 — Alpine Meadows, CA

Difficult things becoming easy is not the story here, because on the one hand, we have difficult things become easy; on the other hand, we have easy things become absolutely trivial — and that, for me, is the interesting part.

I’m a few weeks late sharing this. I’ve been sitting on it and spending too much time thinking about how I want to share links here. I’m still thinking about it…

I really enjoyed this talk, particularly the positive and fun perspective on LLMs and AI. I understand much of the negativity around AI, especially related to funding, business models, etc. — but there are also so many opportunities to build fun and useful things. I think Matt does a great job of shining a light on the positive side of the current AI tools.

Anon Kode: Local Claude Code?

19 Mar 2025 — Truckee, CA

Anon Kode looks to be Claude Code that you can use with local models:

Terminal-based AI coding tool that can use any model that supports the OpenAI-style API.

Fixes your spaghetti code
Explains wtf that function does
Runs tests, shell commands and stuff
Whatever else claude-code can do, depending on the model you use

Sounds amazing, especially since Claude Code (which is great) is currently quite expensive. It probably should be, but I can’t afford dollars-a-day costs for a coding assistant. Can a local version compare?

Here’s my experience setting up and using Anon Kode with Ollama and Qwen2.5-Coder 14B on my 36GB M3 Pro MacBook Pro.

TL;DR: I got it working with Ollama and Qwen2.5-Coder, but then couldn’t get it to do anything useful and don’t have time to debug. If you’ve had success with Anon Kode please message me on Bluesky and tell me about it!

Setup

Install Ollama and ollama run qwen2.5-coder:14b if you don’t have an existing local LLM set up.

Hiccup: I’ve been having issues connecting to the Ollama API via localhost because my machine prefers IPv6 for localhost (::1), and Ollama only binds to IPv4 (127.0.0.1). In my own code I just use 127.0.0.1 instead of localhost, but the default Ollama configuration for Kode uses localhost, so I “solved” the problem by commenting out the IPv6 entry for localhost in my /etc/hosts file. This is not a good solution, but that’s a future Chris problem. If anyone knows how to fix this properly, please let me know.
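For the curious, here’s roughly how the problem shows up and the blunt workaround I used. Port 11434 is Ollama’s default, /api/tags is its model-listing endpoint, and the sed line is the hosts-file hack described above (back up /etc/hosts before trying it):

# Ollama answers on IPv4...
curl -s http://127.0.0.1:11434/api/tags
# ...but not on IPv6, which is what localhost resolves to first on my machine
curl -sg 'http://[::1]:11434/api/tags'

# the blunt workaround: back up /etc/hosts, then comment out the IPv6 localhost entry
sudo cp /etc/hosts /etc/hosts.bak
sudo sed -i '' 's/^::1[[:space:]]*localhost/#&/' /etc/hosts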

Install and run Kode:

npm install -g anon-kode
cd your-project
kode

I walked through the config screens and picked Ollama as my API Provider:

Kode provider selection screen

I set the API KEY to “ollama”. This shouldn’t be necessary, but it doesn’t like it if you don’t set an API KEY value.

Kode provider API KEY screen

Pick your model; I chose the same model for “large” and “small”.

Kode model selection screen

I went with “Default” for tokens.

Kode tokens config screen

Looks good.

Kode model confirmation screen

Success! “Hello” gets a response from Ollama.

Kode response from Ollama

Use

Now that Kode is talking to Ollama, will it work?

I was halfway through writing this post, so I thought I would ask Kode to insert the remaining images above for me. It didn’t go well.

Kode failed prompt

Looks like an issue with understanding the project? Maybe running /init will help?

Kode failed init

🫠 That’s unfortunate.

It seems like there are a number of issues: file access, missing tools… unfortunately I don’t have time to dig further right now, so it’s back to Claude.

Result (for now)

I really want this to work, and I’ll try again when I have time. If you’ve had more success with Anon Kode please message me on Bluesky and tell me about it!

SF2000 Updates and Improvements

13 Feb 2025 — Truckee, CA

Today was a snow day, which means my kids got much more video game time than usual, and in our house video game time means “retro” games powered by the amazing ~$20 SF2000 (Super Mario Brothers 3 is still the pinnacle of game design, and I will die on this hill).

The SF2000 is a marvel, and being able to hook it up to the TV and play multiplayer NES and SNES games with my kids makes me so happy. But the screen tearing and other miscellaneous issues that it comes with out of the box motivated me to do some research and updating today.

I’m sharing what I did here, because there is a lot of information to be found about the SF2000, enough that it can be overwhelming if you’re just looking to make it better without watching a bunch of youtube videos and reading way more than you need to get the job done. I’ve now done both of those for you, and can share the basics.

Goals:

  • āœ… Update/fix the bootloader.
  • āœ… Install Multicore firmware, as it has the fixes and improvements we’re looking for and it’s based on the official firmware.
  • āœ… Fix screen tearing (it seems that Multicore has software fixes).
  • ☐ Change the default games for each system: make all the good Marios the defaults, maybe Sonic and Zelda, and try to remember the best games for each system. I didn’t get this one done, but Multicore has slightly better defaults than the stock firmware, so it’s not urgent anymore.

This guy has written many, many more words about the SF2000 than I ever will. If you want to get deep, go check it out: https://vonmillhausen.github.io/sf2000/

Update the Bootloader

Apparently it is critical to do this before doing anything else, or you risk bricking the device, so do it. More information here.

The process is quick and easy if you’re starting from a working SF2000. Here are the steps, copied from here:

  1. Ensure your SF2000 is in a state where it boots normally when turned on (displays a boot logo, proceeds to the stock firmware main menu)
  2. Ensure your SF2000’s battery is fully charged (having the device power off during the patching process will likely “brick” it, rendering it inoperable)
  3. Power off the SF2000, and remove the microSD card
  4. Connect the microSD card to your computer
  5. Download this zip file: SF2000_bootloader_bugfix.zip
  6. Extract the zip file; inside is a folder called UpdateFirmware, containing a single file called Firmware.upk
  7. Copy the UpdateFirmware folder to the root of the microSD card, so that the UpdateFirmware folder is in the same place as the bios and roms folders (i.e., you’ll have an sd:/UpdateFirmware/Firmware.upk file)
  8. Eject the microSD card from your computer, and put it back in the SF2000
  9. Turn the SF2000 on; you should see a message in the lower-left corner of the screen indicating that patching is taking place. The process will only last a few seconds. If you do not see this message, and instead just go to the main menu as normal, then either this means your SF2000 has previously had the fix applied already, or you should double-check you’ve placed the patch file in the right place
  10. When the patching is complete, you will be taken to the main menu as usual
  11. Power off the SF2000, and remove the microSD card
  12. Connect the microSD card to your computer
  13. Delete the UpdateFirmware folder (it’s no longer needed)

Install Multicore Firmware

It’s tough to track down the best sources of information about this. Apparently the real activity happens on Discord and Telegram, but I’m not committed enough to go there. There is some good information here, and it looks like the most ā€œofficialā€ builds live here, but this fairly recent youtube guide/review points to a more recent ā€œPurple Neoā€ build, which I’m going to use.

Firmware installation steps:

  1. Make sure you’ve fixed the bootloader, above.
  2. Download the 10GB zip file, found here.
  3. Either use a new Fat32 formatted microSD card, or back up and format the stock microSD card as Fat32.
  4. Insert the microSD card and determine its name. Mine is called ‘NO NAME’.
     % ls /Volumes
     Macintosh HD	NO NAME
    
  5. (Optional) Create a zip of the original contents; I’ll store it on the desktop for now.
     % zip -r ~/Desktop/SF2000_backup.zip /Volumes/NO\ NAME/
    
  6. Use Disk Utility to erase (format) the microSD card, and choose MS-DOS (FAT) as the format. (If you’d rather do this from the terminal, see the sketch after these steps.)
  7. Extract the contents of the 10GB zip you downloaded earlier to the microSD card.
     % unzip ~/Downloads/PurpleNeo_Multicore_0.10_23365b6_2024-07-11_b.zip -d /Volumes/NO\ NAME/
    
  8. Do the screen tearing fix below.
  9. Eject the microSD card, put it back in the SF2000, and boot.
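If you’d rather skip the Disk Utility GUI, something like this should do the same erase from the terminal. I haven’t tested this exact invocation, so treat it as a sketch: run diskutil list first and triple-check that the disk identifier (I’m assuming /dev/disk4 here) really is your microSD card before erasing anything.

# find your card's disk identifier first
% diskutil list
# erase it as FAT32 with an MBR partition table (which, as far as I can tell, is what the SF2000 expects)
% sudo diskutil eraseDisk FAT32 SF2000 MBRFormat /dev/disk4

Note that this names the volume SF2000 rather than the stock NO NAME, so adjust the /Volumes/ paths in the later steps if you go this route.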

Improve Screen Tearing

Some games that we really enjoy, like Super Mario Brothers, have pretty bad screen tearing with the default firmware. Once Multicore is installed we can set a config value to enable some software improvements.

This is a one-line config change, but if you want to see more, here’s a YouTube video about it.

  1. Install multicore following the steps above.
  2. With the microSD card still inserted in your computer, open the file cores/config/multicore.opt.
     nano /Volumes/NO\ NAME/cores/config/multicore.opt
    
  3. Set the value sf2000_tearing_fix on line 9 to fast and save the file (^X if using nano). If you’d rather not open an editor, there’s a one-liner sketched after these steps.
  4. Eject the microSD card, put it back in the SF2000, and boot.
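The editor-free version would be something like the sed below. It assumes the file uses the usual key = "value" core-options layout, which I haven’t verified, so peek at the file before trusting it:

% sed -i '' 's/^sf2000_tearing_fix.*/sf2000_tearing_fix = "fast"/' /Volumes/NO\ NAME/cores/config/multicore.opt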

That’s it, enjoy your improved SF2000!

I Skipped the Super Bowl, but This Ad Was Made for Me

11 Feb 2025 — Truckee, CA

I didn’t watch the Super Bowl this year. I had absolutely no connection to it and no preference for either team. While I generally enjoy football (college football, specifically) and spending time with friends—and yes, the commercials and halftime show are usually worth watching, if only for the water cooler conversations—I was elsewhere.

Instead, I was doing dad things: attending Ski Team with my kids, helping with homework, making dinner, and assisting my wife in unloading a mountain of Costco supplies she’d picked up earlier that day. (I bet Super Bowl Sunday is a fantastic day to shop at Costco.)

That’s just how it is in our family—we’re more of an Olympics-watching group, and that’s perfectly fine.

So, I didn’t watch the game, but apparently I was destined to see one of the ads…

This morning, in my groggy pre-coffee state, I was reading one of my favorite newsletters, Garbage Day, and chuckled at their critique of a Super Bowl ad:

Google’s Super Bowl ad last night, ā€œDream Job,ā€ depicted a dad getting ready for a job interview by talking out loud in his kitchen to an AI voice assistant, something I am very confident no one has done ever. But that doesn’t matter because Silicon Valley believes they are big enough now to create the future, rather than scale up to meet it.

I love Garbage Day—it makes me feel online and cool, with that warm fuzzy smugness we all occasionally need.

Then, not five minutes later, my dad texted me a link to the same commercial:

In case you didn’t see this… ❤️

I watched it, and damn it, my eyes started leaking. It’s beautiful.

I feel seen.

Will I ever walk around my kitchen talking to an AI assistant? I can’t say (but yeah, almost certainly). While I doubt it’ll be Google’s service, I’m firmly in the camp that believes GPT and LLMs are legitimately transformative technologies, with more useful tools being built on top of them every day.

I left my “professional” job to devote more time to my family. Currently, I’m exploring AI tools to improve my journaling, workflows, and time management. And yes, I’m about to jump back into searching for paid work that fits with a more balanced life. I am absolutely the target audience for this ad, and it hit the bullseye.

Congratulations to the Google creative team and everyone involved in making this. Even in our hyper-critical online world, the reception seems almost universally positive (even the YouTube comments!). And while Garbage Day may have rolled their eyes at it, I thought it was beautiful. I guess I can’t be smug all the time.

The Countdown to Rivian "Adventure Vans" has Begun

10 Feb 2025 — Truckee, CA

As of today, Rivian is opening up sales of its commercial van (the Amazon one) to anyone (with a business), with orders as low as a single van.

I’m excited to see businesses start picking these up, I don’t expect to see many where I live in Truckee right away; but as I think/type this I realize that maybe that’ll be proven wrong since our mountain-town gas prices make EVs a more compelling option. Too bad there isn’t a 4wd version… Yet?

EV “Adventure Vans”?

I’m even more excited about what the ā€œadventure vanā€ market will do with these. Do EV camper/adventure vans make sense? I’m not sure, I think so for a large chunk of what people actually use their ā€œadventureā€ vans for, but probably not for what people think they’ll use their adventure vans for.

My use would be 99% in trailheads/ski parking areas within 50 mi of my house, so electric would be awesome. Overlanding to Alaska, maybe not so much.

As for the van, it’s pretty boring, if cute in that quirky Rivian way. Can’t wait to see what some customization shops come up with, even if they’re just built to show off concepts at van shows. I’ll post what I see.

Yay Cameras Update: Two Weeks In

04 Feb 2025 — Alpine Meadows, CA

It’s been two-ish weeks since started my ā€œYay Camerasā€ project. It’s been getting a couple hours of my time most days, and while the site might still look a bit rough, I’m happy with the progress I’ve made. If you haven’t already, you can check out my initial plans for the site here.

Backend Focus

Over the past couple weeks, I’ve been following my preferred path and spending quite a bit of time playing around in backend/ops land. This has involved a lot of learning and experimentation with various technologies:

  • Next.js: Yes, the frontend, but… I’m focusing on server-side components to ensure pages are cacheable. I’m excited about the server-side features: I’m aiming to build a completely cached site that can be enhanced later with frontend interactivity. And since I’ve been working almost exclusively on a 99% client-side React app for the last 5 years, I want to stay in React for that stuff, for now.
  • Serverless Stack (SST) + OpenNext: SST for deploying the Next.js app on AWS using OpenNext. SST seems good, but I decided to manage other services with…
  • Terraform: To deploy DynamoDB and S3, because it’s so easy and the standard for managing this stuff. (There’s a rough sketch of the deploy flow after this list.)
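Day to day, that split looks roughly like this. The commands are SST v2-style and simplified, so treat them as a sketch rather than my exact setup:

# deploy the Next.js app (SST drives OpenNext under the hood)
npx sst deploy --stage prod

# the DynamoDB table and S3 bucket live in a separate Terraform config
terraform init
terraform apply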

While the frontend might still be a work in progress, the backend is up and running smoothly. I’ve been exploring the pros and cons of these various tools over the last few months (recovering from Amplify), and I’m generally happy with this setup.

Just Ship It

One of the key goals of this project is learning to ship imperfect things. It’s easy to get stuck in perfectionism, but putting something imperfect out there and then iterating on it is something I’ve been pushing myself to do. Learning in public (even though I don’t really think anyone is looking). I’m also trying to develop a habit of writing something every week, and this gives me something to talk about.

Current Features

Here’s a rundown of what I have so far:

  • Backend: A script that I run manually once a day to fetch new cameras and images from Flickr, along with product information from Amazon. All this data is stored in DynamoDB/S3. (There’s a rough sketch of the Flickr side after this list.)
  • Frontend: A Next.js app with an index page that picks a random manufacturer and displays a list of cameras we’ve found. Each camera has its own page displaying an image (if available) and photos taken with that camera.
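The Flickr side of that script boils down to something like these two public API calls (simplified; YOUR_KEY and PHOTO_ID are placeholders, and the real script does more bookkeeping than this):

# grab a page of Flickr Explore (interestingness) photos
curl -s 'https://api.flickr.com/services/rest/?method=flickr.interestingness.getList&api_key=YOUR_KEY&format=json&nojsoncallback=1'

# then ask for each photo's EXIF, which includes the camera make and model
curl -s 'https://api.flickr.com/services/rest/?method=flickr.photos.getExif&api_key=YOUR_KEY&photo_id=PHOTO_ID&format=json&nojsoncallback=1'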

All these pages are Next.js server components, so they’re cacheable and are just served directly from S3 (I think; I haven’t dug into exactly how OpenNext works). The index page is already cached, and I’ll cache the individual camera pages too, eventually.

Camera Pages + Flickr Photo Embeds

Flickr photos are marked up to display Flickr photo embeds. Each image includes metadata such as the photographer, license, title, etc., and links back to Flickr even if the embed doesn’t load. I’m particularly fond of these since I wrote the embed functionality for Flickr, and I really like how I did it: a progressively enhanced img tag -> sourceless iframe. Flickr went down at one point while I was working and the embeds kept chugging 🄰

Here’s an example page that showcases a nice collection of photos taken with the Sony ILCE-7: Yay Cameras - Sony ILCE-7.

What’s Next?

There’s still plenty of work to be done, especially on the design/features front. I’m taking a bit of a break for now to dig into another project, but I’ll be working on adding more camera and manufacturer information to fill in some empty spaces when I get back into it.

Overall, this project is more an exercise in getting stuff out and learning along the way. I’m not exactly embarrassed by what’s there, but I want to put stuff I’m not 100% comfortable with out there rather than just keeping it in a git repo to die. More to come, and I’m sure I’ll be embarrassed by a lot of it.

Building Something: Yay Cameras!

15 Jan 2025 — Olympic Valley, CA

I’m working on a new personal project from a very old idea, it’s called Yay Cameras!, and it will be a site about… Cameras!

I’ve always loved cameras. I’ve never been a particularly excellent photographer, but I’ve always been interested in technology, and cameras are the coolest technology that was available to normal people like me, even before computers and video games; cameras were (are) magic.

My first digital camera was a Toshiba PDR-2. It was a crazy little thing with a PCMCIA card interface that flipped out of the back to connect to a PC, and I’m shocked that I (read: my mom) even had a computer with a PCMCIA slot to plug it into. I remember taking that camera with me on my first international trip when I was ~15, but I have no idea what happened to any photos that I took. The internet wasn’t quite ready for them at the time.

Over the subsequent years I’ve had dozens of cameras. Before our phones took over, I would wander the camera section of the electronics store just to see what was new. The idea for Yay Cameras! is a website where cameras get “profile pages” with sample photos and videos, links to photographers who use them, suggested accessories, etc. I imagine it as a place where people interested in photography and cameras can go to see what’s new, research a new camera, or connect with others for tips and advice.

Will there be interest in this? I’m not sure, but that’s not really the point. I want to build it for my own interest.

“This is my Cam!”

This project has roots in a hack day app I built back in 2012 called This is my Cam!. The app let Flickr users generate profile pages for their cameras using their uploaded photos. It was fun, simple, and surprisingly popular.

Here are some screenshots of This is my Cam!:

This is my Cam!

Unfortunately, it was a victim of its own success. The app was built with Django/Python and ran on a tiny EC2 micro instance. It needed offline jobs to fetch and process Flickr photos, and the server couldn’t handle the load when ~1000 people signed up in the first few days. I wasn’t ready to scale it up (or pay for it), so it fizzled out. I’ve used the dream of rebuilding it to explore various technologies over the years, but I’ve never gotten it back out there for public consumption.

Why build this?

I’m not looking to build a ā€œbig thingā€ at the moment, but I want to put something out publicly that will give me a place to play with new tech. I’m constantly building little projects to scratch itches, but they rarely go further than satisfying my curiocity of the technology to justify a ā€œproductā€ or even a blog post. This is my Cam! is too big, I’m not really interested in having auth, user management, permissions, and everything that comes with an app with users. For whatever reason I do really want Yay Cameras! (and This is my Cam!) to exist (I’ve held on to the domains for over a decade…), so I’m going to build something and see if it’s worth maintaining.

“MVP”

  1. Core Features:
    • A script to discover and catalog cameras by analyzing photos from Flickr explore. (daily, manual execution)
    • A database of cameras with key details and example photos.
  2. Visual Design:
    • Borrowing from the playful style of This is my Cam! circa 2012.
  3. Tech Stack:
    • Built with Next.js, hosted serverlessly on AWS using SST and DynamoDB.
  4. This Week’s Goal:
    • Get a daily process running to update the camera database.
    • Deploy a basic frontend to yaycameras.com to browse the collection.

What’s Next

I expect the backend to come together quickly; I’ve done most of the groundwork in the little projects I mentioned. I imagine the frontend is where I’m most likely to fall into rabbit holes, with many new things to explore and learn. I’ll probably make it extremely simple and ugly in this pass.

This isn’t a startup or a grand vision. It’s just a site I want to build because cameras are cool, and I think others might think so too. If you’ve got ideas or suggestions, I’d love to hear them.

Useful LLM AI: First Day With an AI Assistant

10 Jan 2025 — Olympic Valley, CA

What am I talking about?

Watch this video

Earlier this week, I checked a months-overdue item off of my to-do list and set up a very basic “AI Personal Assistant” with Dan Catt’s Kitty AI. In its current form, it simply uses ChatGPT 4 to generate a set of three morning questions, then saves my answers and feeds those back into the prompt on subsequent days. This allows each day’s new questions to have some context of my previous answers, and therefore my emotional state, what I’m working on and hoping to achieve, if I’ve been setting aside time for exercise and rest, etc.
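Kitty is Dan Catt’s code, not mine, but the core loop is simple enough to sketch with curl: stuff yesterday’s saved answers into the prompt and ask for three new questions. The prompt wording and model name here are placeholders for illustration, and OPENAI_API_KEY is assumed to be set in your environment:

# ask for today's three questions, with yesterday's answers as context
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [
      {"role": "system", "content": "You are a gentle morning check-in assistant. Ask exactly three questions."},
      {"role": "user", "content": "Yesterday I wrote: <saved answers go here>. What are my three questions for today?"}
    ]
  }'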

So far, I think it is shockingly great, and I am incredibly excited to use this tool to track what I’m doing, give myself accountability, and develop a routine. Obviously, a script backed by an LLM isn’t going to magically make those things happen, but I think it just might be the right thing to help me make those things happen. It’s already made a huge impact on my productivity and what I’ve chosen to do with my time over the last two days (I know, two days… but I’m optimistic!).

Of course, this tool is in the context of what I’m trying to improve personally: mindfulness, organization, goal-setting, and work/personal life separation.

First Day Experience

On my first day using Kitty, it asked me a question about my emotional state, which I answered with something like:

ā€œI’m feeling anxious that I won’t accomplish everything I hope to today, and rushed because I need to get treats to Lucas’ school for his birthday.ā€

I was wrapping up getting Kitty running, really wanted to get it done that day, and was stressed that I wouldn’t. Another question was about what type of short break or activity I could include in my day to recharge, to which I said:

“It would be nice to get a few ski runs in but I don’t know if I’ll have the time.”

As I rushed to Lucas’ school to deliver his birthday treats, I turned these answers over in my head. I had initially planned to drop treats off at the school, then head back to a cafe and spend a few more hours on the computer before returning to pick Lucas and Isaac up from school. However, I realized that I would really get more value out of focusing on my son’s birthday and celebrating that, than whatever I might accomplish with two more hours staring at the computer.

Instead of rushing in and out for a quick birthday celebration, I signed Lucas and Isaac out for the remainder of the day (a totally normal thing to do; half the school leaves at noon for ski teams), and spent the afternoon skiing with them. I got the activity I needed and made my sons’ birthday week that much more special.

Would I have done the same without the questions, responses, and post-thought? I don’t know, but I really like the feeling of stopping for a few minutes each morning and being thoughtful, not just rushing into my to-do list.