<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://kachess.dev/feed.xml" rel="self" type="application/atom+xml" /><link href="https://kachess.dev/" rel="alternate" type="text/html" /><updated>2026-04-16T16:00:33+00:00</updated><id>https://kachess.dev/feed.xml</id><title type="html">Kachess</title><subtitle>This is my place where I post my thoughts to share with others.</subtitle><author><name>Christopher Alef</name></author><entry><title type="html">Don’t ignore what’s going on down there</title><link href="https://kachess.dev/health/personal/2026/03/16/dont-ignore-whats-going-on-down-there.html" rel="alternate" type="text/html" title="Don’t ignore what’s going on down there" /><published>2026-03-16T00:00:00+00:00</published><updated>2026-03-16T00:00:00+00:00</updated><id>https://kachess.dev/health/personal/2026/03/16/dont-ignore-whats-going-on-down-there</id><content type="html" xml:base="https://kachess.dev/health/personal/2026/03/16/dont-ignore-whats-going-on-down-there.html"><![CDATA[<p>Around 2pm yesterday I noticed some discomfort in my testicles. I walked it off. Things get uncomfortable down there on occasion — that’s just life. I went about my afternoon.</p>

<p>By 3:30 it was more than discomfort. A quick self-check turned up swelling and real tenderness. My working assumption was epididymitis — when I had my vasectomy a decade ago, my urologist flagged it as something to watch for, since vasectomies elevate the risk. So I had a plausible explanation in my back pocket. Not a crisis. Thirty minutes later, though, the pain had jumped to an 8. In ninety minutes it had gone from background noise to my wife Corinne driving me to the ER because I didn’t trust myself behind the wheel.</p>

<p>It wasn’t the pain that concerned me so much as the speed. That trajectory — that rate of escalation — is the thing worth paying attention to. Here’s why.</p>

<h2 id="the-first-thing-they-rule-out-torsion">The first thing they rule out: torsion</h2>

<p>Testicular torsion is a surgical emergency. The testicle twists, cutting off its own blood supply. Without surgery within roughly six hours, you lose the testicle. Permanently. Forever. Gone.</p>

<p>Now, I want to be clear: I have a vasectomy, I’m done having kids, and I fully appreciate that a missing testicle would make for an excellent nickname. “One Ball Chris” has a certain ring to it. Very on-brand.</p>

<p>But here’s the thing — testicles aren’t just decorative, and they’re not just for making babies. They produce testosterone. And testosterone, particularly as men age, is not optional equipment. It’s what keeps your muscle mass, your energy, your mood, your libido, your bone density, and your general will to get off the couch all functioning as intended. Lose a testicle and you cut into your testosterone production, a gap that matters more in your 40s and beyond, when levels are already trending the wrong direction.</p>

<p>So no. Not interested in the nickname. Went to the ER.</p>

<p>Rapid onset is a classic red flag for torsion. Epididymitis typically builds gradually over hours or days. Pain that escalates fast — the kind that sneaks up on you in under two hours — warrants ruling torsion out. That requires imaging, which means the ER.</p>

<blockquote>
  <p><strong>If you have sudden testicular pain, go to the ER — not urgent care.</strong> Ruling out torsion requires a scrotal ultrasound with Doppler imaging. Most urgent care centers don’t have that capability. If they send you to the ER anyway, you’ve just lost precious time. Go directly.</p>
</blockquote>

<h2 id="yes-the-conversation-is-awkward-go-anyway">Yes, the conversation is awkward. Go anyway.</h2>

<p>Let me be honest about what the intake conversation actually looks like. Within the first few minutes, a clinician — whom you have never met — is asking you: Who do you have sex with? How often? Do you have anal sex? How often do you masturbate? Any discharge? Any rash? Anything new or different you’ve noticed?</p>

<p>It’s a lot. There’s no warm-up, no small talk. You’re sitting in a hospital gown answering questions you’ve never said out loud to a stranger, about topics most men don’t discuss with their closest friends.</p>

<p>Here’s the thing: the clinician doesn’t care. Not in a cold way — in a professional way. They’ve asked these questions a thousand times. The answers are diagnostic information, nothing more. The awkwardness is entirely on your side of the conversation, and it’s over in five minutes.</p>

<p>What isn’t over in five minutes is the fallout from not going. Keep that in mind.</p>

<h2 id="what-i-actually-had-epididymitis">What I actually had: epididymitis</h2>

<p>After a urinalysis and a scrotal ultrasound with Doppler imaging — which measures blood flow and definitively rules out torsion — the diagnosis was epididymitis. My vasectomy urologist’s decade-old warning turned out to be worth remembering. The hunch was right; it was the speed of onset that made the ER the right call rather than a wait-and-see.</p>

<p>Boring, manageable, and fixable. Exactly the kind of diagnosis you want.</p>

<p>And here’s the thing about that ultrasound: urgent care couldn’t have done it. Most don’t have the equipment or the trained sonographers on staff. They would have examined me, made an educated guess, and either sent me home or sent me to the ER anyway — having burned an hour in the process. For something where torsion is on the table, that hour matters.</p>

<p>Left untreated, epididymitis can progress to abscess, spread to the testicle itself, become chronic, or in rare cases lead to sepsis. None of those outcomes are acceptable for something a course of antibiotics handles cleanly.</p>

<h2 id="the-ultrasound-found-other-things-worth-knowing">The ultrasound found other things worth knowing</h2>

<p>The imaging also turned up several incidental findings — things I had zero awareness of before yesterday, discovered only because I was there in the first place.</p>

<p>The most notable: bilateral hydroceles. A hydrocele is a collection of fluid around the testicle, inside the scrotum; bilateral means I have one on each side. Small ones, in my case, and almost certainly benign. Many men have them and never know it. They don’t always cause symptoms, they’re not inherently dangerous, and they often require no treatment at all.</p>

<p>So why does it matter? Because hydroceles can sometimes develop as a secondary response to an underlying condition — infection, injury, or in rarer cases, a tumor. The ultrasound ruled out anything concerning driving mine, but that’s exactly the point: you only know that if you get the imaging. Left unexamined, a hydrocele is just a mystery. Examined and baselined, it becomes something a urologist can monitor over time and catch if it ever changes.</p>

<p>The ER doctor recommended urology follow-up to properly document these findings and establish a baseline. Good advice I wouldn’t have received if I’d stayed on the couch Googling myself into a spiral.</p>

<h2 id="why-men-dont-go">Why men don’t go</h2>

<p>We’re conditioned to minimize. I did it myself — at 2pm I walked it off, because that’s usually the right call. The key is knowing when it isn’t. A slow build that stays mild? Maybe monitor it. A rapid escalation that has you reconsidering your ability to drive? That’s the ER.</p>

<p>The things that cost men organs, testosterone, and quality of life are overwhelmingly the things we decided weren’t worth a doctor’s time. Testicular torsion is usually reversible with prompt treatment. Testicular cancer has a very high survival rate when caught early. Epididymitis is straightforward to treat. Every one of these outcomes depends on whether you actually go — and whether you go to the right place.</p>

<h2 id="what-to-watch-for">What to watch for</h2>

<p>See a doctor same day — ER if needed — for any of the following: sudden or severe testicular pain, swelling or tenderness in the scrotum, a lump or change in size or texture of a testicle, or pain that radiates to the lower abdomen or back.</p>

<p>And do monthly self-exams. Two minutes. That’s it. The earlier you catch something, the better your options.</p>

<h2 id="the-bottom-line">The bottom line</h2>

<p>I’m fine. The vasectomy urologist’s decade-old heads-up proved useful, the diagnosis was epididymitis, antibiotics started, urology follow-up booked. And as a bonus, I now know about two hydroceles I didn’t know I had. Best possible outcome — and it only happened because I paid attention to the trajectory, and Corinne got me there.</p>

<p>The conversation was awkward. Go anyway. Keep both of yours.</p>]]></content><author><name>Christopher Alef</name></author><category term="health" /><category term="personal" /><category term="mens-health" /><category term="urology" /><category term="personal" /><summary type="html"><![CDATA[A personal account of a scrotal scare, an ER visit, and why 'One Ball Chris' is a nickname I'd prefer to avoid.]]></summary></entry><entry><title type="html">Breaking Up with TurboTax: How I Used Claude to File My Taxes for Free</title><link href="https://kachess.dev/taxes/ai/personal-finance/2026/02/27/breaking-up-with-turbotax.html" rel="alternate" type="text/html" title="Breaking Up with TurboTax: How I Used Claude to File My Taxes for Free" /><published>2026-02-27T00:00:00+00:00</published><updated>2026-02-27T00:00:00+00:00</updated><id>https://kachess.dev/taxes/ai/personal-finance/2026/02/27/breaking-up-with-turbotax</id><content type="html" xml:base="https://kachess.dev/taxes/ai/personal-finance/2026/02/27/breaking-up-with-turbotax.html"><![CDATA[<p><img src="/images/7CC3F67C-4E75-4CEE-89CD-E6996AE2FA5C_1_105_c.jpeg" alt="Luna the dog" /></p>

<p>After more than a decade of TurboTax, I filed my 2025 federal return for free using Claude. My tax situation isn’t simple: married filing jointly, two children, inherited IRA distributions across two brokerages, a Roth conversion, HSA contributions and distributions, capital loss carryforwards, foreign tax credits, and Section 199A REIT dividends. The completed return ran 42 pages—Form 1040 plus Schedules 1, 3, B, and D, plus Forms 8949, 8889, 8995, and 1116. Cost: $0.</p>

<p>I paired Claude with IRS Free File Fillable Forms, which was unnecessary friction. Skip FFFF. Have Claude fill out the PDFs directly and mail them in. The rest of this post explains why I left TurboTax, what worked, and what to do instead.</p>

<h2 id="why-i-left-intuit">Why I Left Intuit</h2>

<p>Intuit has spent over $45 million on federal lobbying since 1998, including $3.5 million in 2022. Their SEC filings list fighting “governmental encroachment” as an explicit corporate goal—their term for initiatives that would make tax filing easier or free for Americans. When the IRS launched Direct File, Intuit contributed to lawmakers pushing for its elimination and donated $1 million to Trump’s 2025 inaugural committee. Direct File is now gone.</p>

<p><a href="https://www.propublica.org/series/the-turbotax-trap">ProPublica documented</a> how Intuit deliberately hid free filing options from search engines while marketing paid products as “free,” resulting in a $141 million settlement with state attorneys general. The FTC has separately accused them of deceptive advertising.</p>

<p>I stopped paying them.</p>

<h2 id="the-free-filing-landscape-in-2026">The Free Filing Landscape in 2026</h2>

<p>The IRS offers two free filing paths. If your AGI is under $89,000, eight private partners offer guided software through the IRS Free File program. My income disqualified me.</p>

<p>The other option is Free File Fillable Forms (FFFF), available to everyone regardless of income. It’s the electronic equivalent of blank paper forms: you select forms, fill in values, do the math, and e-file. FFFF was added to the Free File Alliance in 2009—part of the agreement under which the IRS pledged not to build its own competing system. It offers no guidance, no interview questions, no imports. As the IRS warns: “If you are not comfortable with completing a paper return, using only the forms and instructions as a guide to file a correct return, this program is not for you.”</p>

<p>I used it anyway. I shouldn’t have—and I’ll explain why in the practical tips below.</p>

<h2 id="enter-claude">Enter Claude</h2>

<p>I created a dedicated Claude project and uploaded all my source documents: W-2s, 1099-Rs from multiple inherited IRAs, 1099-DIV and 1099-INT statements, HSA documents, and my prior year’s return.</p>

<p>I started with Claude Opus 4.5. From my prior year return, it generated a checklist of every form I should expect to receive—a useful starting point I updated as documents arrived.</p>

<p>Claude was my preparation assistant, not my preparer. FFFF doesn’t support any import or automation, so every value required manual entry into the web interface. Claude served as reference and verification; I did the data entry.</p>

<p>Here’s what Claude handled:</p>

<p><strong>Document Analysis and Organization</strong>: Claude extracted values from all uploaded documents, organized them by form type and source institution, and mapped them to the appropriate IRS forms. With six 1099-R forms carrying different distribution codes—inherited IRA distributions, a Roth conversion, and a rollover—Claude tracked each separately and identified which codes produced taxable income.</p>

<p><strong>Form Field Mapping</strong>: Claude generated worksheets showing which source document values mapped to which form lines. For Schedule B, this included every payer, their EIN, and exact amounts. For Form 8949 and Schedule D, it handled the capital loss carryforward.</p>

<p><strong>Calculation Verification</strong>: Claude walked through the Qualified Dividends and Capital Gain Tax Worksheet, the Credit Limit Worksheet for the Child Tax Credit, and the HSA contribution limits.</p>

<p><strong>Error Detection</strong>: Covered in the next section.</p>
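<p>To illustrate the distribution-code tracking above, here is a toy sketch. The payers, amounts, and code summaries are hypothetical and heavily simplified; actual taxability depends on the account type, basis, and other boxes on the 1099-R, not Box 7 alone.</p>

```python
# Toy lookup for a few 1099-R Box 7 distribution codes.
# Simplified for illustration; consult the 1099-R instructions for the
# authoritative meanings and the full code list.
DISTRIBUTION_CODES = {
    "4": "death distribution (inherited account; often taxable)",
    "7": "normal distribution (generally taxable)",
    "G": "direct rollover (generally not taxable)",
    "2": "early distribution, exception applies (can appear on Roth conversions)",
}

# Hypothetical forms, one dict per 1099-R received.
forms = [
    {"payer": "Brokerage A", "box_7": "4", "gross": 12000.00},
    {"payer": "Brokerage B", "box_7": "G", "gross": 50000.00},
]

for f in forms:
    meaning = DISTRIBUTION_CODES.get(f["box_7"], "unrecognized; check the instructions")
    print(f'{f["payer"]}: code {f["box_7"]} -> {meaning}')
```

<p>The point is the bookkeeping, not the tax law: with six 1099-Rs in play, keeping a per-form record of payer, code, and amount is what prevents a nontaxable rollover from being summed in with taxable distributions.</p>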

<p>One limitation: Claude’s knowledge of IRS forms is not current. The IRS updates forms annually, and Claude’s training data didn’t reflect the current year’s versions. Multiple times I had to explain changed line numbers, revised worksheets, or new requirements. The IRS also blocks Claude from downloading forms directly from irs.gov. The workaround: upload current form PDFs yourself.</p>

<h2 id="the-audit">The Audit</h2>

<p>After completing my FFFF entries, I downloaded the return as a PDF and uploaded it to a fresh Opus 4.6 instance—a new conversation on a newer model, to avoid confirmation bias from the preparation session.</p>

<p>Opus 4.6 found four categories of errors:</p>

<p><strong>Misread Source Documents</strong>: Opus 4.5 had confused which box on a 1099 contained federal withholding versus a different tax category, which would have misstated my refund.</p>

<p><strong>Incomplete Worksheets</strong>: FFFF doesn’t auto-calculate supporting worksheets. Several lines requiring manual worksheet calculations had been left blank.</p>

<p><strong>Reconciliation Mismatches</strong>: FFFF rounds each withholding entry before summing. This created discrepancies between my entries and Form 1040’s expected totals. Claude identified which entries to adjust.</p>

<p><strong>Form Version Confusion</strong>: Some preparation work used outdated line references or missed new requirements. The audit caught these against current form versions.</p>

<p>Each error required re-entry in FFFF, re-export, and re-verification. But the deeper problem with FFFF is structural: it requires a human to transcribe every value manually, which introduces a whole class of errors that wouldn’t exist if Claude were writing directly to the forms.</p>
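<p>The rounding mismatch is easy to reproduce. A minimal sketch with hypothetical withholding amounts (chosen to avoid the round-half-to-even edge case in Python's <code>round</code>, which differs from the IRS's round-half-up convention):</p>

```python
# Hypothetical withholding entries from several source documents.
entries = [100.49, 200.49, 300.49]

# FFFF-style: round each entry to whole dollars, then sum.
per_entry_total = sum(round(e) for e in entries)   # 100 + 200 + 300

# Summing the exact amounts first, then rounding once.
true_total = round(sum(entries))                   # round(601.47)

print(per_entry_total, true_total)  # 600 601 -- off by a dollar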

<h2 id="the-1041-contrast">The 1041 Contrast</h2>

<p>I also needed Form 1041 (Fiduciary Income Tax Return) for two irrevocable trusts. FFFF doesn’t support Form 1041, which forced a different workflow.</p>

<p>I provided Opus 4.6 with blank Form 1041 PDFs, trust documents, EIN assignment letters, and the trusts’ 1099-DIV statements. Claude filled out the forms directly, handling distributable net income rules, compressed trust tax brackets, and the allocation between tax-exempt and taxable income.</p>

<p>Opus 4.6 caught its own errors mid-process, double-checked calculations against source documents, and flagged uncertainties before I asked. The self-correction that required a separate audit session for my 1040 happened within the same 1041 session.</p>

<p>For the 1040: I entered data into FFFF while Claude assisted, then needed a separate audit session to verify. For the 1041s: Claude prepared complete returns that I reviewed, printed, signed, and mailed. (E-filing isn’t available for trust returns.)</p>

<p>The 1041 workflow is the right model for both.</p>

<h2 id="what-this-means">What This Means</h2>

<p>Claude isn’t a replacement for TurboTax—it’s a replacement for the expertise required to navigate tax forms without guidance.</p>

<p>TurboTax asks questions in plain English, handles form selection and calculations automatically, imports prior year data, and explains edge cases. If you want to file without understanding taxes, TurboTax works—provided you’re comfortable funding the reason it’s necessary in the first place.</p>

<p>The IRS already has your W-2s, your 1099s, your brokerage statements. It knows the tax rules. For most Americans, the government could calculate their tax bill, send them a pre-filled return, and let them approve or dispute the result. This is how it works in the UK, Japan, Germany, and dozens of other countries. The reason it doesn’t work that way here is that Intuit and H&amp;R Block have spent decades and hundreds of millions of dollars ensuring it doesn’t. Their business model depends on tax filing being a problem Americans need help solving. So they lobby against pre-filled returns, against Direct File, against anything that would make their products unnecessary. They have succeeded. Americans collectively spend around 6 billion hours per year on tax compliance. TurboTax is a solution to a problem that shouldn’t exist.</p>

<p>Claude explains why things work the way they do, catches errors through analysis rather than validation rules, and adapts to specific questions. It requires engaging with the process, not clicking through it.</p>

<p>The workflow I’d use now: upload all source documents to Claude, have it identify every required form, download current PDFs from irs.gov, and have Claude fill them out completely before auditing. Skip FFFF entirely. That replicates what worked for the 1041 returns.</p>

<h2 id="practical-tips">Practical Tips</h2>

<p>To make this workflow easier to replicate, I’ve packaged the prompting approach as a <a href="https://github.com/calef/us-federal-tax-assistant-skill">Claude skill</a>. Install it into your Claude project and it will guide you through document collection, form identification, and PDF preparation without starting from scratch each time.</p>

<p>If you prefer to set things up manually, here’s what I learned:</p>

<p><strong>1. Have Claude fill out the PDFs directly.</strong> Skip FFFF. Download current blank forms from irs.gov, upload them to your Claude project, and have Claude complete them. Print, sign, and mail. Every extra step FFFF adds—manual entry, rounding quirks, no import support—creates room for errors that cost time to correct.</p>

<p><strong>2. Start with your prior year return.</strong> Upload it and ask Claude to generate a checklist of all forms you should expect to receive. This gives you a document collection roadmap.</p>

<p><strong>3. Upload current year IRS forms.</strong> Claude’s knowledge of form layouts may be outdated. Have Claude list the forms needed, download current versions from irs.gov, and upload them to your project.</p>

<p><strong>4. Work form by form.</strong> Complete Schedule B and verify it before moving to Schedule D. Don’t try to prepare everything at once.</p>

<p><strong>5. Use a separate Claude conversation for audit.</strong> Upload your completed return to a new session on the latest available model.</p>

<p><strong>6. Verify visually.</strong> Have Claude examine images of source documents, not just extracted text. This catches transcription errors.</p>

<p><strong>7. Budget time for iteration.</strong> The audit process is where value gets created. Plan for multiple passes.</p>

<h2 id="the-bottom-line">The Bottom Line</h2>

<p>My 2025 federal return filed successfully. Total cost: ~150,000 tokens.</p>

<p>Time spent was comparable to past TurboTax years. The bottleneck wasn’t Claude—it was FFFF. The audit iterations added time, but each one found real errors.</p>

<p>I stopped funding a company that spends millions lobbying against free tax filing. I learned more about my own tax situation in one filing season than in a decade of clicking through TurboTax. AI-assisted preparation works for complex returns—42 pages, nine forms, multiple inherited IRAs, trust filings—without commercial software.</p>]]></content><author><name>Christopher Alef</name></author><category term="taxes" /><category term="ai" /><category term="personal-finance" /><category term="claude" /><category term="turbotax" /><category term="intuit" /><category term="irs" /><category term="free-file" /><category term="ai-assistant" /><summary type="html"><![CDATA[A journey from paid tax software to AI-assisted free filing using Claude and IRS Free File Fillable Forms.]]></summary></entry><entry><title type="html">Dots and Boxes</title><link href="https://kachess.dev/2026/02/14/dots-and-boxes.html" rel="alternate" type="text/html" title="Dots and Boxes" /><published>2026-02-14T20:00:00+00:00</published><updated>2026-02-14T20:00:00+00:00</updated><id>https://kachess.dev/2026/02/14/dots-and-boxes</id><content type="html" xml:base="https://kachess.dev/2026/02/14/dots-and-boxes.html"><![CDATA[<p>I love the game Dots and Boxes. Graeme and I play it while we're waiting to be served in restaurants and have for years. He usually wins.</p>

<style>
  .dab-container {
    display: flex;
    flex-direction: column;
    align-items: center;
    font-family: sans-serif;
    user-select: none;
  }
  .dab-info {
    margin-bottom: 12px;
    font-size: 1.1em;
  }
  .dab-score {
    display: flex;
    gap: 32px;
    margin-bottom: 12px;
    font-size: 1.1em;
    font-weight: bold;
  }
  .dab-score span {
    padding: 4px 12px;
    border-radius: 6px;
  }
  .dab-p1 { color: #2563eb; }
  .dab-p2 { color: #dc2626; }
  .dab-active { text-decoration: underline; }
  #dab-board {
    cursor: pointer;
    max-width: 100%;
    height: auto;
    touch-action: manipulation;
  }
  #dab-reset {
    margin-top: 16px;
    padding: 8px 24px;
    font-size: 1em;
    cursor: pointer;
    border: 1px solid #888;
    border-radius: 6px;
    background: #f5f5f5;
  }
  #dab-reset:hover { background: #e0e0e0; }
  #dab-message {
    margin-top: 8px;
    font-size: 1.2em;
    font-weight: bold;
    min-height: 1.5em;
  }
</style>

<div class="dab-container">
  <div class="dab-score">
    <span class="dab-p1" id="dab-s1">Player 1: 0</span>
    <span class="dab-p2" id="dab-s2">Player 2: 0</span>
  </div>
  <div class="dab-info" id="dab-turn">Player 1's turn</div>
  <canvas id="dab-board"></canvas>
  <div id="dab-message"></div>
  <button id="dab-reset">New Game</button>
</div>

<script>
(function() {
  const N = 8; // 8x8 grid of boxes means 9x9 dots
  const DOTS = N + 1;
  const DOT_R = 5;
  const CELL = 52;
  const PAD = 20;
  const LINE_W = 4;
  const HIT = 12;

  const P1_COLOR = '#2563eb';
  const P2_COLOR = '#dc2626';
  const P1_FILL = 'rgba(37,99,235,0.18)';
  const P2_FILL = 'rgba(220,38,38,0.18)';
  const LINE_DEFAULT = '#ccc';

  // hLines[row][col] = 0 (none), 1 (p1), 2 (p2)
  // hLines has DOTS rows, N cols
  let hLines, vLines, boxes, scores, turn, gameOver;

  const canvas = document.getElementById('dab-board');
  const ctx = canvas.getContext('2d');
  const size = PAD * 2 + (DOTS - 1) * CELL;
  canvas.width = size;
  canvas.height = size;

  function init() {
    hLines = Array.from({length: DOTS}, () => new Uint8Array(N));
    vLines = Array.from({length: N}, () => new Uint8Array(DOTS));
    boxes = Array.from({length: N}, () => new Uint8Array(N));
    scores = [0, 0];
    turn = 1;
    gameOver = false;
    updateUI();
    draw();
  }

  function dotX(c) { return PAD + c * CELL; }
  function dotY(r) { return PAD + r * CELL; }

  function draw() {
    ctx.clearRect(0, 0, size, size);

    // boxes
    for (let r = 0; r < N; r++) {
      for (let c = 0; c < N; c++) {
        if (boxes[r][c]) {
          ctx.fillStyle = boxes[r][c] === 1 ? P1_FILL : P2_FILL;
          ctx.fillRect(dotX(c), dotY(r), CELL, CELL);
        }
      }
    }

    // horizontal lines
    for (let r = 0; r < DOTS; r++) {
      for (let c = 0; c < N; c++) {
        ctx.strokeStyle = hLines[r][c] ? (hLines[r][c] === 1 ? P1_COLOR : P2_COLOR) : LINE_DEFAULT;
        ctx.lineWidth = hLines[r][c] ? LINE_W + 1 : LINE_W;
        ctx.beginPath();
        ctx.moveTo(dotX(c), dotY(r));
        ctx.lineTo(dotX(c + 1), dotY(r));
        ctx.stroke();
      }
    }

    // vertical lines
    for (let r = 0; r < N; r++) {
      for (let c = 0; c < DOTS; c++) {
        ctx.strokeStyle = vLines[r][c] ? (vLines[r][c] === 1 ? P1_COLOR : P2_COLOR) : LINE_DEFAULT;
        ctx.lineWidth = vLines[r][c] ? LINE_W + 1 : LINE_W;
        ctx.beginPath();
        ctx.moveTo(dotX(c), dotY(r));
        ctx.lineTo(dotX(c), dotY(r + 1));
        ctx.stroke();
      }
    }

    // dots
    for (let r = 0; r < DOTS; r++) {
      for (let c = 0; c < DOTS; c++) {
        ctx.beginPath();
        ctx.arc(dotX(c), dotY(r), DOT_R, 0, Math.PI * 2);
        ctx.fillStyle = '#333';
        ctx.fill();
      }
    }
  }

  function checkBoxes() {
    let claimed = 0;
    for (let r = 0; r < N; r++) {
      for (let c = 0; c < N; c++) {
        if (boxes[r][c]) continue;
        if (hLines[r][c] && hLines[r + 1][c] && vLines[r][c] && vLines[r][c + 1]) {
          boxes[r][c] = turn;
          scores[turn - 1]++;
          claimed++;
        }
      }
    }
    return claimed;
  }

  function updateUI() {
    document.getElementById('dab-s1').textContent = 'Player 1: ' + scores[0];
    document.getElementById('dab-s2').textContent = 'Player 2: ' + scores[1];
    document.getElementById('dab-s1').className = 'dab-p1' + (turn === 1 && !gameOver ? ' dab-active' : '');
    document.getElementById('dab-s2').className = 'dab-p2' + (turn === 2 && !gameOver ? ' dab-active' : '');
    const turnEl = document.getElementById('dab-turn');
    const msgEl = document.getElementById('dab-message');
    if (gameOver) {
      turnEl.textContent = '';
      if (scores[0] > scores[1]) msgEl.textContent = 'Player 1 wins!';
      else if (scores[1] > scores[0]) msgEl.textContent = 'Player 2 wins!';
      else msgEl.textContent = "It's a tie!";
    } else {
      turnEl.textContent = 'Player ' + turn + "'s turn";
      msgEl.textContent = '';
    }
  }

  canvas.addEventListener('click', function(e) {
    if (gameOver) return;
    const rect = canvas.getBoundingClientRect();
    const mx = (e.clientX - rect.left) * (canvas.width / rect.width);
    const my = (e.clientY - rect.top) * (canvas.height / rect.height);

    let best = null, bestDist = HIT;

    // check horizontal lines
    for (let r = 0; r < DOTS; r++) {
      for (let c = 0; c < N; c++) {
        if (hLines[r][c]) continue;
        const cx = (dotX(c) + dotX(c + 1)) / 2;
        const cy = dotY(r);
        const dx = Math.abs(mx - cx);
        const dy = Math.abs(my - cy);
        if (dx <= CELL / 2 && dy <= HIT) {
          const d = dy;
          if (d < bestDist) { bestDist = d; best = {type: 'h', r, c}; }
        }
      }
    }

    // check vertical lines
    for (let r = 0; r < N; r++) {
      for (let c = 0; c < DOTS; c++) {
        if (vLines[r][c]) continue;
        const cx = dotX(c);
        const cy = (dotY(r) + dotY(r + 1)) / 2;
        const dx = Math.abs(mx - cx);
        const dy = Math.abs(my - cy);
        if (dy <= CELL / 2 && dx <= HIT) {
          const d = dx;
          if (d < bestDist) { bestDist = d; best = {type: 'v', r, c}; }
        }
      }
    }

    if (!best) return;

    if (best.type === 'h') hLines[best.r][best.c] = turn;
    else vLines[best.r][best.c] = turn;

    const claimed = checkBoxes();
    const totalBoxes = scores[0] + scores[1];

    if (totalBoxes === N * N) {
      gameOver = true;
    } else if (claimed === 0) {
      turn = turn === 1 ? 2 : 1;
    }
    // if claimed > 0, same player goes again

    updateUI();
    draw();
  });

  // hover effect
  canvas.addEventListener('mousemove', function(e) {
    if (gameOver) { canvas.style.cursor = 'default'; return; }
    const rect = canvas.getBoundingClientRect();
    const mx = (e.clientX - rect.left) * (canvas.width / rect.width);
    const my = (e.clientY - rect.top) * (canvas.height / rect.height);

    let hovering = false;
    for (let r = 0; r < DOTS && !hovering; r++) {
      for (let c = 0; c < N && !hovering; c++) {
        if (hLines[r][c]) continue;
        const cx = (dotX(c) + dotX(c + 1)) / 2;
        const cy = dotY(r);
        if (Math.abs(mx - cx) <= CELL / 2 && Math.abs(my - cy) <= HIT) hovering = true;
      }
    }
    for (let r = 0; r < N && !hovering; r++) {
      for (let c = 0; c < DOTS && !hovering; c++) {
        if (vLines[r][c]) continue;
        const cx = dotX(c);
        const cy = (dotY(r) + dotY(r + 1)) / 2;
        if (Math.abs(my - cy) <= CELL / 2 && Math.abs(mx - cx) <= HIT) hovering = true;
      }
    }
    canvas.style.cursor = hovering ? 'pointer' : 'default';
  });

  document.getElementById('dab-reset').addEventListener('click', init);
  init();
})();
</script>]]></content><author><name>Christopher Alef</name></author><summary type="html"><![CDATA[I love the game Dots and Boxes. Graeme and I play it while we're waiting to be served in restaurants and have for years. He usually wins.]]></summary></entry><entry><title type="html">Language, Intelligence, and the Multimodal Convergence</title><link href="https://kachess.dev/2026/02/14/language-intelligence-and-multimodal-convergence.html" rel="alternate" type="text/html" title="Language, Intelligence, and the Multimodal Convergence" /><published>2026-02-14T00:00:00+00:00</published><updated>2026-02-14T00:00:00+00:00</updated><id>https://kachess.dev/2026/02/14/language-intelligence-and-multimodal-convergence</id><content type="html" xml:base="https://kachess.dev/2026/02/14/language-intelligence-and-multimodal-convergence.html"><![CDATA[<p><img src="/images/chio.jpg" alt="Chio the cat" /></p>

<p>Natural language understanding was long classified as “<a href="https://en.wikipedia.org/wiki/AI-complete">AI-complete</a>” — as hard as general intelligence itself. The assumption was that you’d need to solve reasoning first, and language would follow. Instead, the field discovered that training on language prediction alone produced systems with broad reasoning, coding, math, and planning abilities. Language wasn’t the destination. It was the vehicle.</p>

<p>The biological evidence tells a similar story.</p>

<h2 id="language-as-a-catalyst-for-intelligence">Language as a Catalyst for Intelligence</h2>

<p>Across species, more complex communication systems correlate with more flexible cognition. <a href="https://en.wikipedia.org/wiki/Cetacea">Cetaceans</a>, great apes, <a href="https://en.wikipedia.org/wiki/Corvidae">corvids</a>, and elephants all pass the <a href="https://en.wikipedia.org/wiki/Mirror_test">mirror self-recognition</a> test, use tools, and show social reasoning—and all have relatively sophisticated signaling systems. Bottlenose dolphins pass the mirror test — the first nonprimates shown to do so — while most primate species never do.<sup>[<a href="#ref1">1</a>]</sup></p>

<p>But correlation isn’t causation. The stronger evidence comes from intervention studies—experiments where researchers actively train subjects in new skills and measure the resulting cognitive changes. Kanzi the bonobo acquired roughly 348 <a href="https://en.wikipedia.org/wiki/Lexigram">lexigrams</a> (symbols representing words) and comprehended novel English sentences, demonstrating planning, categorical reasoning, and concept combination that untrained bonobos don’t display.<sup>[<a href="#ref2">2</a>]</sup> Language doesn’t just reflect intelligence. It scaffolds it.<sup>[<a href="#ref3">3</a>]</sup></p>

<p>Human development data reinforces this. Children’s cognitive growth tracks closely with language acquisition. Inner speech (self-directed language) is critical for executive function, working memory, and self-regulation. Deaf children without early language exposure show delays of roughly three years on <a href="https://en.wikipedia.org/wiki/False_belief_task">false-belief tasks</a>, even when general intelligence is unaffected.<sup>[<a href="#ref4">4</a>]</sup></p>

<p>The evolutionary picture suggests a feedback loop. <a href="https://en.wikipedia.org/wiki/Dunbar%27s_number">Dunbar’s social brain hypothesis</a><sup>[<a href="#ref5">5</a>]</sup> showed that primate neocortex size correlates with social group size across 38 primate genera — bigger groups require bigger brains. <a href="https://en.wikipedia.org/wiki/Shared_intentionality">Tomasello’s shared intentionality framework</a><sup>[<a href="#ref6">6</a>]</sup> argues that cooperative communication is the foundation of uniquely human cognition. Both point to co-evolution. Larger social groups demanded better communication. Better communication enabled cooperation and cultural transmission, selecting for still larger groups. A ratchet effect — and one that AI followed independently.</p>

<h2 id="the-ai-mirror">The AI Mirror</h2>

<p>Large language models learn to predict the next token in a sequence of text. That’s it. But language is a compressed representation of human knowledge, reasoning patterns, and world models. The text is a shadow of the world, and modeling the shadow requires modeling much of the world.</p>
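The objective can be pictured with a toy version. The sketch below is an illustration of the prediction task only, not of how real models work (they learn a neural network over billions of parameters, not a lookup table):

```python
from collections import Counter, defaultdict

# A toy "language model": count which word follows each word in a tiny corpus.
corpus = "the cat sat on the mat the cat ate the snack".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return bigrams[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" more often than any other word
```

A real LLM replaces the lookup table with a learned probability distribution over its whole vocabulary, conditioned on the entire preceding context, but the question it answers is still the same: what comes next?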

<p>Language-trained AI systems exhibit emergent abilities they weren’t directly optimized for: chain-of-thought reasoning, analogical thinking, in-context learning. Many of these appear abruptly around the 100-billion-parameter scale.<sup>[<a href="#ref7">7</a>]</sup> The symbolic structure of language scaffolds abstract reasoning in silicon just as it does in brains.</p>

<p>Having models “think out loud” step by step before answering improved PaLM 540B’s accuracy on the GSM8K math benchmark from 18% to 57%.<sup>[<a href="#ref8">8</a>]</sup> This parallels <a href="https://en.wikipedia.org/wiki/Inner_monologue">Vygotsky’s theory of inner speech</a>—the idea that externalized language gets internalized as a thinking tool.<sup>[<a href="#ref9">9</a>]</sup> AI models reason better when they use language to structure their thinking, just as children do.</p>
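The technique itself is simple to picture. A chain-of-thought prompt differs from a direct prompt only in that it includes a worked example whose answer spells out its reasoning, which the model then imitates. The sketch below is a hypothetical illustration in the style of the Wei et al. prompts, not their exact text:

```python
question = ("A store had 23 apples. It sold 9, then received a "
            "delivery of 14. How many apples does it have now?")

# Direct prompting: ask for the answer immediately.
direct_prompt = f"Q: {question}\nA:"

# Chain-of-thought prompting: prepend a worked example whose answer
# reasons step by step before stating the result.
cot_example = (
    "Q: Roger has 5 tennis balls. He buys 2 cans of 3 balls each. "
    "How many balls does he have now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 balls is 6 balls. "
    "5 + 6 = 11. The answer is 11.\n\n"
)
cot_prompt = cot_example + f"Q: {question}\nA:"

print(cot_prompt)
```

Both prompts contain the same question; the chain-of-thought version simply gives the model a template for reasoning out loud before committing to an answer.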

<p>The limits mirror biological ones. Current language models struggle with tasks that don’t map well to linguistic representation: fine motor planning, continuous spatial reasoning, real-time sensory processing. These are areas where embodied experience matters more than language, just as octopus intelligence suggests non-linguistic routes to cognition exist.<sup>[<a href="#ref10">10</a>]</sup></p>

<h2 id="the-multimodal-convergence">The Multimodal Convergence</h2>

<p>The frontier of AI research is increasingly multimodal—integrating language, vision, audio, and action into unified systems. The key finding isn’t just that these systems <em>can</em> handle multiple formats. It’s that each modality improves performance in the others. A model that can see images reasons better about spatial language. A model that processes code writes better natural language explanations. The modalities are synergistic, not merely additive.</p>

<p>This addresses a central critique of pure language models: the symbol grounding problem.<sup>[<a href="#ref11">11</a>]</sup> Words in a language-only system are patterns of tokens without real-world reference. Multimodal training partially solves this. A model that has seen millions of images of dogs alongside the word “dog” has something closer to a grounded concept than one that only knows “dog” from its textual relationships to other words.</p>

<p>The biological parallel is close. Human cognition is fundamentally multimodal. Our concepts aren’t stored as text—they’re distributed across sensory, motor, and linguistic representations.<sup>[<a href="#ref12">12</a>]</sup> The brain’s convergence zones integrate information across modalities into unified representations. Multimodal AI architectures are converging on the same design principle.</p>

<p>The next frontier is robotics. Teams at <a href="https://deepmind.google/">Google DeepMind</a>, <a href="https://www.figure.ai/">Figure</a>, and others are connecting language models to robotic bodies.<sup>[<a href="#ref13">13</a>]</sup> Language provides a powerful planning and abstraction layer; embodied experience provides the grounding and physics understanding that language alone struggles with. A robot that can be told “pick up the fragile thing carefully” needs language comprehension, visual recognition, and motor control integrated seamlessly.</p>

<p>Some capabilities emerge only when modalities combine: spatial reasoning that pure language models fail at, and generalization of instructions to novel visual scenarios never described in text.</p>

<p>The trajectory points toward unified world models — systems that maintain a shared internal representation updated by whatever modality is available. Your brain already works this way, integrating what you see, hear, feel, and know into a single coherent experience.<sup>[<a href="#ref14">14</a>]</sup> The pattern holds across biological and artificial intelligence: language is the most powerful single modality for abstract reasoning, but it reaches its full potential when grounded in other forms of experience.</p>

<p>Language isn’t intelligence. But it might be the closest thing to a universal catalyst for it.</p>

<hr />

<h2 id="references">References</h2>

<p><a id="ref1"></a>1. Reiss, D., &amp; Marino, L. (2001). “Mirror self-recognition in the bottlenose dolphin: A case of cognitive convergence.” <em>Proceedings of the National Academy of Sciences</em>, 98(10), 5937–5942. — First demonstration of mirror self-recognition in a nonprimate species.</p>

<p><a id="ref2"></a>2. Savage-Rumbaugh, S., &amp; Lewin, R. (1994). <em>Kanzi: The Ape at the Brink of the Human Mind</em>. Wiley. — Documents Kanzi’s language acquisition and the emergent cognitive abilities observed in language-trained bonobos.</p>

<p><a id="ref3"></a>3. Lupyan, G., &amp; Bergen, B. (2016). “How Language Programs the Mind.” <em>Topics in Cognitive Science</em>, 8(2), 408–424. — Reviews evidence for how language shapes perception, categorization, and memory across domains.</p>

<p><a id="ref4"></a>4. Peterson, C. C., &amp; Siegal, M. (2000). “Insights into Theory of Mind from Deafness and Autism.” <em>Mind &amp; Language</em>, 15(1), 123–145. doi:10.1111/1468-0017.00126 — Demonstrates that deaf children without early language access show delays in theory of mind development.</p>

<p><a id="ref5"></a>5. Dunbar, R. I. M. (1998). “The Social Brain Hypothesis.” <em>Evolutionary Anthropology</em>, 6(5), 178–190. — Proposes that primate brain size evolved primarily to manage complex social relationships, with language as a key enabler.</p>

<p><a id="ref6"></a>6. Tomasello, M. (2008). <em>Origins of Human Communication</em>. MIT Press. — Argues that shared intentionality and cooperative communication are the foundations of uniquely human cognition.</p>

<p><a id="ref7"></a>7. Wei, J., et al. (2022). “Emergent Abilities of Large Language Models.” <em>Transactions on Machine Learning Research</em>. — Documents cognitive capabilities that appear suddenly at scale in language models without being explicitly trained.</p>

<p><a id="ref8"></a>8. Wei, J., et al. (2022). “Chain-of-Thought Prompting Elicits Reasoning in Large Language Models.” <em>Advances in Neural Information Processing Systems</em>, 35, 24824–24837. — Demonstrates that step-by-step verbal reasoning dramatically improves LLM performance on complex tasks.</p>

<p><a id="ref9"></a>9. Vygotsky, L. S. (1934/1986). <em>Thought and Language</em>. MIT Press. — The foundational work on inner speech as a cognitive tool, arguing that language transforms thinking rather than merely expressing it.</p>

<p><a id="ref10"></a>10. Mather, J. A., &amp; Dickel, L. (2017). “Cephalopod Complex Cognition.” <em>Current Opinion in Behavioral Sciences</em>, 16, 131–137. — Reviews evidence for sophisticated intelligence in octopuses and cuttlefish despite minimal social communication.</p>

<p><a id="ref11"></a>11. Harnad, S. (1990). “The Symbol Grounding Problem.” <em>Physica D</em>, 42, 335–346. — Defines the problem of how symbols acquire meaning, a central challenge for both AI and cognitive science.</p>

<p><a id="ref12"></a>12. Barsalou, L. W. (1999). “Perceptual Symbol Systems.” <em>Behavioral and Brain Sciences</em>, 22(4), 577–660. — Proposes that cognition is grounded in simulated sensory-motor experience rather than amodal symbols.</p>

<p><a id="ref13"></a>13. Brohan, A., et al. (2023). “RT-2: Vision-Language-Action Models Transfer Web Knowledge to Robotic Control.” <em>arXiv preprint arXiv:2307.15818</em>. — Shows how multimodal language-vision models can be connected to robotic action for grounded intelligence.</p>

<p><a id="ref14"></a>14. Goyal, A., &amp; Bengio, Y. (2022). “Inductive Biases for Deep Learning of Higher-Level Cognition.” <em>Proceedings of the Royal Society A</em>, 478(2266). — Discusses architectural principles for building AI systems that develop abstract reasoning, including the role of multimodal integration.</p>]]></content><author><name>Christopher Alef</name></author><category term="artificial-intelligence" /><category term="cognitive-science" /><category term="language" /><category term="multimodal-ai" /><summary type="html"><![CDATA[From dolphins to deep learning, language is the most powerful single substrate for developing general reasoning. But it reaches its full potential only when grounded in other forms of experience.]]></summary></entry><entry><title type="html">Allan J Alef (June 17, 1950 - Aug 25, 2024)</title><link href="https://kachess.dev/2024/08/26/allan-j-alef.html" rel="alternate" type="text/html" title="Allan J Alef (June 17, 1950 - Aug 25, 2024)" /><published>2024-08-26T04:00:00+00:00</published><updated>2024-08-26T04:00:00+00:00</updated><id>https://kachess.dev/2024/08/26/allan-j-alef</id><content type="html" xml:base="https://kachess.dev/2024/08/26/allan-j-alef.html"><![CDATA[<p><img src="/images/allan-j-alef.jpg" alt="Allan J. Alef" /></p>

<p>Allan Jay Alef, 74, of Bellevue, WA, passed away on August 25, 2024, after years of battling the effects of primary progressive aphasia. Born on June 17, 1950, in Princeton, NJ, Allan was the beloved son of Gustave and Joan Alef.</p>

<p>Allan graduated from South Eugene High School in 1968 and earned his degree from the University of Oregon in 1972, followed by a master’s degree. He was proud of his participation in Junior ROTC and ROTC throughout his schooling. His early career included ten honorable years in the Army as a military police officer at Fort Hood, where he served with distinction. After leaving the Army, he spent the rest of his professional life as a Special Agent in the Bureau of Alcohol, Tobacco, and Firearms, from which he retired in 2002.</p>

<p>Allan was an avid photographer, capturing the beauty of the world around him. He loved walking his dogs and was keenly interested in investing, staying engaged with the markets, and enjoying their challenges.</p>

<p>To those who knew him, Allan was the most loyal person one could meet. His steadfastness and reliability were hallmarks of his character.</p>

<p>Allan is survived by his son, Chris Alef, and daughter-in-law, Corinne Alef; his grandchildren, Clay Alef and Graeme Alef; his brothers, Peter Alef and Eric Alef; his sister-in-law, Carol Alef; and his trusted friend, Barb Hammermeister. He was predeceased by his ex-wife, Gail Alef. Allan’s family and friends will always cherish the memories they shared with him.</p>

<p>Following Allan’s wishes, no formal funeral or memorial service will be held. Instead of donations or flowers, please honor Allan by taking yourself out to a nice meal and making a toast in his memory.</p>

<p>Rest in peace, Allan. Your loyalty, kindness, and spirit will forever remain in our hearts.</p>]]></content><author><name>Christopher Alef</name></author><summary type="html"><![CDATA[]]></summary></entry><entry><title type="html">Hospice</title><link href="https://kachess.dev/2024/07/03/hospice.html" rel="alternate" type="text/html" title="Hospice" /><published>2024-07-03T23:23:00+00:00</published><updated>2024-07-03T23:23:00+00:00</updated><id>https://kachess.dev/2024/07/03/hospice</id><content type="html" xml:base="https://kachess.dev/2024/07/03/hospice.html"><![CDATA[<p>This week, I made the difficult decision to enroll my father in hospice care. He is suffering from <a href="https://www.mayoclinic.org/diseases-conditions/primary-progressive-aphasia/symptoms-causes/syc-20350499">primary progressive aphasia</a> and is in the very late stages of the disease. For the past two months, he has been struggling with hallucinations and paranoia, making him miserable. He believes the staff around him are carrying weapons and have beaten him, which has increased his agitation and caused him to move around unpredictably. Standing and walking contributed to multiple injuries and trips to the emergency department. A few weeks ago, despite round-the-clock care and observation, he fell, hit his head, and suffered a <a href="https://my.clevelandclinic.org/health/diseases/14480-brain-bleed-hemorrhage-intracranial-hemorrhage">brain bleed</a>. He has been in the hospital since. In the hospital, his condition has deteriorated more rapidly. His speech is slurred and he is even more agitated, including assaulting staff and removing monitors. They have been forced to intermittently restrain and sedate him.</p>

<p>Due to being bedridden for so long, he can no longer stand or walk. He never wanted to be bedridden and hallucinating. So, we have shifted his <a href="https://doh.wa.gov/public-health-provider-resources/emergency-medical-services-ems-systems/portable-orders-life-sustaining-treatment-polst">POLST form</a> from selective treatment to comfort care, which allows him to return to his assisted living facility tomorrow and eliminates further trips to the emergency department, of which he’s had many in the last few months.</p>

<p>The next weeks or months will be difficult. At this point, we’re waiting for some condition to catch up with him, causing him to lose his ability to drink and eventually pass away. If we hadn’t given him IV fluids and antibiotics during this most recent hospital visit, he would likely have already passed. I fear that might have been the better outcome at this point. Both he and I hope that he simply goes to sleep and doesn’t wake up. Unfortunately, he is no longer capable of making the decision to end his life under <a href="https://doh.wa.gov/data-and-statistical-reports/health-statistics/death-dignity-act">our state’s death with dignity laws</a>. So, we wait.</p>]]></content><author><name>Christopher Alef</name></author><summary type="html"><![CDATA[This week, I made the difficult decision to enroll my father in hospice care. He is suffering from primary progressive aphasia and is in the very late stages of the disease. For the past two months, he has been struggling with hallucinations and paranoia, making him miserable. He believes the staff around him are carrying weapons and have beaten him, which has increased his agitation and caused him to move around unpredictably. Standing and walking contributed to multiple injuries and trips to the emergency department. A few weeks ago, despite round-the-clock care and observation, he fell, hit his head, and suffered a brain bleed. He has been in the hospital since. In the hospital, his condition has deteriorated more rapidly. His speech is slurred and he is even more agitated, including assaulting staff and removing monitors. They have been forced to intermittently restrain and sedate him.]]></summary></entry><entry><title type="html">I quit my job!</title><link href="https://kachess.dev/2024/06/20/i-quit-my-job.html" rel="alternate" type="text/html" title="I quit my job!" 
/><published>2024-06-20T01:52:12+00:00</published><updated>2024-06-20T01:52:12+00:00</updated><id>https://kachess.dev/2024/06/20/i-quit-my-job</id><content type="html" xml:base="https://kachess.dev/2024/06/20/i-quit-my-job.html"><![CDATA[<p>This week, I made the difficult decision to give my notice at <a href="https://www.nerdy.com/">Nerdy</a>. After 2.2 years, I realized that my leadership style no longer aligns with the company’s current needs. While I am confident leaving is the right choice, it hasn’t been an easy week.</p>

<p>At our best, we were the type of organization I always aspired to be part of. However, Nerdy has transitioned into a phase requiring highly tactical, directed work with centralized decision-making and frequent reprioritization. This contrasts sharply with my philosophy of creating highly autonomous teams with clear goals and strong ownership. Essentially, it became a case of me being a square peg in a round hole.</p>

<p>This realization dawned on me gradually over several months. Up until the weekend before I gave my notice, I was in denial, convincing myself it was just a temporary phase. I believed, somewhat arrogantly, that with my strong leadership, I could help guide us through it. However, a series of incidents, rather than a single event, ultimately convinced me that my vision was no longer aligned with reality. I clung to the dream of the organization I wanted to build, rather than accepting what it is.</p>

<p>Were it not for my father’s failing health, I might have persevered for several more months, hoping to realign the organization with my vision. However, as his condition deteriorates, I find myself in an impossible situation—juggling a demanding role while being the only person able to calm my father during his agitation. He is in stage 6 of 7 on the FAST scale of dementia, rapidly approaching the final stage. The decisions regarding his care are challenging and significantly impact my mental health. I cannot be there for him from 2am to 7am and then begin my workday at 7:30am.</p>

<p>So I chose him, because it’s the only decision that makes sense given my circumstances. I am privileged to be able to make this choice. No one else can fulfill my role in his care. Although stepping away from work with no plan is daunting at 50, especially in a challenging job market, it is the right decision.</p>

<p>However, attributing my decision solely to my father’s condition would be disingenuous. I know Nerdy would have supported me through these personal challenges. Ultimately, it still comes back to the fundamental mismatch: square peg, round hole.</p>

<p>In the end, life happens, and we support each other through its trials. I’m not the right fit, and unlike my younger self, I’m at peace with that. So we part ways, with me wishing Nerdy the best and mourning the dream I once had. I’ll forever be thankful to the people that I worked with and learned from. I’m a much better manager and human as a result of my time at Nerdy.</p>

<p>What’s next? Realistically, I’m in limbo until my father’s passing. I cannot start something new and I will need time to grieve. But I’ll take the time to learn my lessons from this experience and come back stronger.</p>

<p>Coincidentally, my best man was laid off on the same day I resigned. After work on Friday, we will head up to Lake Kachess, a place we’ve been visiting for 35 years, to relax, float, and ponder life. And so begins another chapter.</p>

<p><img src="/images/fast-scale.webp" alt="FAST scale" /></p>]]></content><author><name>Christopher Alef</name></author><summary type="html"><![CDATA[This week, I made the difficult decision to give my notice at Nerdy. After 2.2 years, I realized that my leadership style no longer aligns with the company’s current needs. While I am confident leaving is the right choice, it hasn’t been an easy week.]]></summary></entry></feed>