Sunday, June 15, 2025

Nostalgia for the Amiga

When I was young, my father owned a series of Commodore machines, the last one being an Amiga 500.  BASIC was for chumps, and our only other alternative was an assembler and a single misprint-riddled book for it.  We didn’t have the internet, or any local user group/mentoring.  We made the asm stuff work, by which I mean, mostly I stole the results of my older brother’s reverse engineering.

Years later, just before the Java phenomenon reached our little community college, I started learning C++ (officially) and C (from the internet), and some mysteries started falling into place.

  • Each large block of declare statements in assembly was really a C struct.
  • The weird parts of such a struct, like next_window: dc.l 0, were spaces for the OS to put a pointer to the next window, creating a singly-linked list of the windows on screen.
  • The difference between label and #label snapped into sharp relief when I reached Pointer Enlightenment. They were *p and p, respectively. (Somewhat; C doesn’t distinguish between absolute and PC-relative addressing.)
  • Loading the label at the start of a block of declare statements into a register before making a library call was passing a pointer to the OS.
  • The 68000’s data/address register split wasn’t arbitrary; only address registers could be used with an offset, so the pointer-to-struct went in an address register, for reading fields within the struct.
  • That weird WaitForEvent function in the OS library was the core of cooperative multitasking.  The Amiga also had preemptive multitasking, but the time slice was large enough to make the system visibly laggy if a process was uncooperative. Like all of my asm programs.

That experience of simultaneous, two-way enlightenment (asm to C and back) with pointers was incredible, but has also been a once-in-a-lifetime high.  It also gave me a lot of “if-only” feelings.  Had I understood all this stuff back then, I could have done so much more with the Amiga.

(Incidentally, that next-window pointer being in the window structure makes it invasive.  I guess this means my first encounter with invasive lists could have been before Linux was ever released.)

I also felt a sort of awe, a magic about the hardware design that was missing from the PC space.  The Amiga was half computer, half game console internally, with a bunch of custom chips to do really neat graphics. Although it had a limited number of colors in the palette, the co-processor could switch out the palette between scan lines, allowing quite a few colors per screen.  HAM mode [hold-and-modify] could also produce “thousands of colors” out of a 16-color palette, if one could accept some color-fringing artifacts.

We didn’t have a VCR or camcorder (the money was spent on the computer) but apparently, the Amiga was great at video, too.

But it was soon proven to be a dead end.  Once RAM gets fast enough, a boring linear RGB framebuffer is the best option, and bitplanes (or HAM mode) turn from “cool hacks” into mere workarounds for the era’s limitations.  The 68000 family couldn’t keep up with Intel’s immense fab budget and the clock speeds it bought, and Commodore died before a PowerPC transition could have happened.  And who even knows whether they would have been as successful with it as Apple was?

The Amiga’s ability to “just plug it into a TV” became a limitation over time, as the business world put all its money into PCs with dedicated monitors.  They were more expensive, but also better.  Our first PC would run in 800⨉600, although it could also produce a flickery 1024⨉768.  By comparison, Workbench on the Amiga ran in “medium” resolution, 640⨉200.  High resolution only doubled the line count, and flickered intensely on our monitor.  Ultimately, monitors came down in price and offered much better than TV quality; I went off to university (post-community-college) with a monitor capable of 1280⨉1024.

This entry was somewhat prompted by “Classical ‘Single user computers’ were a flawed or at least limited ideas”.

Sunday, May 11, 2025

Thoughts from Trying Generators in PHP

I am late to the party, but I have been playing with Generators in PHP more, and running into their limitations at module boundaries.

Some module might produce a Generator so that iteration can be performed in chunks, reducing peak RAM.  For example, producing results one store at a time, instead of loading all the stores into a giant array.  Code that processes an entire database table but wants to lower lock contention and memory use can also benefit; it can use a Generator to isolate the fetch-in-pages logic from the processing of individual records.  The consumer sees one stream of results, while the Generator fetches more as needed.
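
To make that concrete, here is a minimal sketch of the fetch-in-pages shape, with an entirely hypothetical “orders” table and function name; real code would differ, but the Generator’s role is the same.

    <?php

    // A minimal sketch: keyset pagination behind a Generator.  The table,
    // column, and page size are hypothetical stand-ins.
    function iterateOrders(PDO $db, int $pageSize = 500): Generator
    {
        // $pageSize is an int parameter, so interpolating it is safe here.
        $stmt = $db->prepare(
            "SELECT * FROM orders WHERE id > ? ORDER BY id LIMIT $pageSize"
        );

        $lastId = 0;
        do {
            $stmt->execute([$lastId]);
            $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
            foreach ($rows as $row) {
                $lastId = (int) $row['id'];
                yield $row;   // the consumer sees one continuous stream
            }
        } while (count($rows) === $pageSize);
    }

    // foreach (iterateOrders($db) as $order) { ... }  // paging stays invisible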

In short, there are plenty of use cases.

The problem comes when a caller wants to pass “the data” produced by the Generator to another function or method that specifically takes an array.  Once that happens, either the destination needs to be reworked to accept the broader iterable type, or the efforts toward efficiency are erased by an iterator_to_array() call.
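
Here is a small sketch of that boundary, with hypothetical function names, showing both of the unappealing options:

    <?php

    // The destination was written, and typed, for arrays.
    function renderReport(array $rows): string
    {
        return implode("\n", array_map('json_encode', $rows));
    }

    function streamRows(): Generator
    {
        yield ['id' => 1];
        yield ['id' => 2];
    }

    // Passing the Generator directly is a TypeError, so either the streaming
    // benefit is erased on the spot...
    echo renderReport(iterator_to_array(streamRows(), false)), "\n";

    // ...or the destination is reworked to the broader iterable type and made
    // to emit as it goes, which then ripples out to its other callers.
    function renderReportStreaming(iterable $rows): void
    {
        foreach ($rows as $row) {
            echo json_encode($row), "\n";
        }
    }

    renderReportStreaming(streamRows());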

(Of course, back when generators were introduced to PHP, I didn’t use type declarations, so I could have gotten away with throwing a generator at something that assumed it would receive an array or PDOStatement. Dealing with larger teams and beginning to use an IDE were both great reasons to add the type information, but an array type declaration forbids passing a Generator in its place.)

A separate issue is that anything consuming a Generator (thus, anything type-hinted iterable) needs to be aware of its once-only nature.  This only sometimes becomes a problem: for instance, when a template wants to iterate the data twice, once to show some aggregate statistics ahead of the main output, and again to render the data set itself.
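
A sketch of how the once-only nature bites, again with made-up names:

    <?php

    function numbers(): Generator
    {
        yield from [3, 1, 4, 1, 5];
    }

    $data = numbers();

    // First pass: an aggregate over the data consumes the Generator.
    echo "Count: ", iterator_count($data), "\n";

    // Second pass: throws an Exception, because the Generator is already
    // closed.  An array (or materializing once via iterator_to_array())
    // would survive both passes.
    foreach ($data as $n) {
        echo $n, "\n";
    }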

Generators can also produce “return” values, which can be fetched via getReturn(), but only by code that knows it is dealing with a Generator, and only after the regular values have been produced.  (I might change my mind later, with more experience, but it doesn’t pass the vibe check.  It feels a lot like requiring methods of a class to be called in a specific order, which is usually best to avoid.)  It implies that the entire system should lean into handling Generators in particular, and not allow them to mix with other iterable types.
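
For reference, a sketch of the feature with made-up names; getReturn() is the real method, everything else is illustrative:

    <?php

    function importRows(iterable $rows): Generator
    {
        $count = 0;
        foreach ($rows as $row) {
            yield $row;      // regular values stream out as usual
            $count++;
        }
        return $count;       // summary value, fetched later via getReturn()
    }

    $gen = importRows([['id' => 1], ['id' => 2]]);

    foreach ($gen as $row) {
        // ... process each row ...
    }

    // Only works because we know $gen is a Generator, and only after the
    // loop has finished; calling it earlier throws.
    echo $gen->getReturn(), " rows imported\n";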

These are (mostly) things I was vaguely aware of from reading about Python generators, but they weren’t on my mind while writing PHP.

Sunday, May 4, 2025

The fiserv Outage

Editor’s Note: this post was penned offline early Friday evening, before the author knew that the issue had been resolved and that fiserv was processing the backlog as of 16:45.  We have chosen simply to add some links, now that we are online to retrieve them.  The post follows.

As I write, on Friday, 2025-05-02, fiserv has been offline all day, or substantially all day.  This company acts as a third party to a number of banks, providing wire transfers, ACH, and/or direct deposit services, and possibly even online/mobile banking.  A number of large banks, including Ally Bank, Bank of America, Capital One, and Synchrony, have been affected in some way by this outage, as has my regional bank.

I don’t know anything about the root causes yet.  It would be irresponsible to speculate about those causes, so of course I am going to.

Sunday, April 20, 2025

Simplicity can be Imaginary

There’s a comic about simplicity: how an Apple product has one place to touch, a Google product has one search field, and “your company’s app” has dozens of fields with interrelated requirements, obscure codes, strange highlighting, and “…” buttons.

The thing is, for internal or even B2B apps, the user probably knows what kind of thing they have in hand and want to search on.  If they are trying to look up a customer ID, then matching against PO numbers is irrelevant; it will just take time and produce extraneous results.  If they can tell the computer directly, “Find customer #33448,” and jump straight to that customer’s record, it saves them an extra round trip through a search-results page they didn’t need.
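
As a rough sketch of the idea (hypothetical routes and field names): once the user says which kind of identifier they hold, the application can skip the generic search entirely.

    <?php

    // Route a field-specific lookup straight to the record; only fall back
    // to the generic search page when the user didn't say what they have.
    function dispatchSearch(string $field, string $value): string
    {
        return match ($field) {
            'customer_id' => '/customers/' . urlencode($value),
            'po_number'   => '/purchase-orders/' . urlencode($value),
            default       => '/search?q=' . urlencode($value),
        };
    }

    // dispatchSearch('customer_id', '33448')  =>  "/customers/33448"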

“Your company’s app” from the comic comes across as more of a data-entry page than the main point of interaction.  One might still organize the form along required/optional lines, and put auto-loaded fields next to the ones that update them.  However, to make the business happen, there’s a minimum amount of data that is genuinely required, and it shouldn’t be crammed into one textarea and parsed back out.

Sunday, April 13, 2025

The Enterprise’s Goals

When I was a n00b on the internets, I heard whisperings about awful, over-complex “rule-based systems” out there, somewhere.  Programmers scoffed at them for essentially being programs written by non-programmers; nebulous “managers” allegedly dreamed of replacing expensive programmers with cheap office staff.  I did not understand at the time where these systems came from, if everyone seemed to think they were so bad.

Part of that answer is simple.  “Programmers aren’t ‘everyone.’” Oops.

The other part of that answer turns out to be review and auditing. Anything that exists in code is opaque to the business staff; they largely have to trust the programmers on it, or demonstrate defective outcomes. (And at that point, they need to wait for the necessary programming and deployment for it to be fixed.  If it is a big enough problem that customers or clients are exploiting in the meantime, that delay can become costly.)

Functionality that is exposed as ‘configuration data’ to the office staff becomes reviewable by other office staff, such as managers, and errors can be corrected more quickly.  External auditors can use the same review capability for their own work.  The next problem is that this data might not be flexible enough, which pushes toward the development of conditions and actions, and the rule-based system is born.
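
As a rough illustration (with entirely made-up fields and actions), the end state tends to look like rows of conditions and actions that staff can review, plus a small evaluator in code:

    <?php

    // In a real system these rows would live in a database table maintained
    // through an admin screen, not in a PHP literal.
    $rules = [
        ['field' => 'order_total', 'op' => '>=', 'value' => 500,  'action' => 'require_manager_approval'],
        ['field' => 'country',     'op' => '==', 'value' => 'CA', 'action' => 'add_gst'],
    ];

    function evaluate(array $rules, array $record): array
    {
        $actions = [];
        foreach ($rules as $rule) {
            $actual  = $record[$rule['field']] ?? null;
            $matched = match ($rule['op']) {
                '==' => $actual == $rule['value'],
                '>=' => $actual >= $rule['value'],
                '<=' => $actual <= $rule['value'],
                default => false,
            };
            if ($matched) {
                $actions[] = $rule['action'];
            }
        }
        return $actions;   // the surrounding code decides what each action means
    }

    // evaluate($rules, ['order_total' => 700, 'country' => 'US'])
    //   => ['require_manager_approval']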

It was never about the programmers; it was about the business being able to view its own source code.

Sunday, April 6, 2025

Every Change Might Be Breaking

We originally had the “automatic minor version upgrade” option active on Amazon RDS.  This option simply does not work very well.  Sometimes, for no clear reason (and without notification), it would stop applying upgrades, and require manual updates to get moving again.  We mostly lived with it, and then we hit the worst-case scenario: it did perform the upgrade, and then one of our scripts stopped working.

Not only that, it managed to break while I was on vacation.

(Obligatory xkcd about spacebar heating.)

Since then, we don’t use that option.  When I’m good and ready, I peruse the changelogs, then schedule the update to happen when I will be in the office to handle unexpected issues.

For their part, AWS recommends testing the app against the new version of the database before performing any upgrades.  This is implicitly a recommendation against using automatic minor upgrades, because there is no automated process to test the upgrade first.

One knows an analysis tool is looking at AWS with a security-first paradigm when it recommends switching the automatic upgrade option back on for the database.  It is technically correct that new releases MAY contain security fixes, but upgrading to them MAY cause an automated denial of service.  It is not a simple, inconsequential task.

Sunday, March 23, 2025

Some Notes from Fixing my Server’s IPv6 / SLAAC

I had a hard time getting IPv6 to work properly on my VPS. It has a static address, which I published to DNS (ages ago), but IPv6 wasn’t fully operational. The problem wasn’t obvious, because the server could accept and respond to incoming IPv6 connections; it just could not originate outgoing ones. Thanks to Happy Eyeballs, the system cheerfully fell back to IPv4 and left me none the wiser.  Probably for years. (Since inbound traffic could be responded to, the IPv6 network-transfer graph looked plausible, too.)