20101125

Politics and Budgets

(I had a lot of time to think this week while sitting in airports, hence the posting blitz.)

While watching the recent economic turmoil such as the collapse (so to speak) of Greece, and the pending collapse of the state of California, I've come to the conclusion that Democracy is failing.  Now, I'm not much of a political science guy, so I really don't have a better system on the whole.  The really nice thing about Democracies is that they tend to be stable governments (in the sense that they resist violent revolution), since malcontents and prospective revolutionaries find it easier to work within the system than to overthrow it.  But they seem to be failing the "rational self-interest" test for short-sightedness across the board (i.e., "Will I later regret the costs of getting what I want now?").

There is a trade-off here between short-sightedness and the length of time you are willing to tolerate bad leaders.  Giving elected leaders longer terms will force them to plan farther ahead, since they will suffer more consequences when they are held responsible for the results.  So, to some extent, short-sightedness is the price you pay for avoiding being unhappily misgoverned for a while by a leader you cannot depose.  But I think there are more basic problems with the control systems that might be fixable without such a trade-off.  (Yeah, I'm a computer engineer, so ideal governments are just fancy control systems with lots of inputs.)

The most basic problem I see is the lamentable lack of an educated and self-disciplined populace.  This seems to be indirectly asserted by the prevalence of political advertising.  Everyone who puts up a political sign with "Yes on Blah" or "Vote for Me" seems to be implicitly admitting that they expect a large enough portion of the voting population to be swayed merely by repeated exposure to their signs that it is worth the expense and effort of placing them.  This, to me, is the antithesis of a well-educated and self-disciplined populace.  But short of disenfranchising such folk (which severely reduces the stability benefits mentioned above), or pumping lots of money into education (which has been shown to increase both education and discipline, but takes a good 20-40 years to kick in), I have no real solution, so I will limit my whining.

The solvable part of what I think is behind the economic troubles is that legislatures experience what Frederick Brooks calls the "committee effect."  When you get a group of people together with varying interests, there is a strong incentive to pass anything you do not strongly object to.  To use a game-theory explanation: if each committee member has an item he or she personally wants passed, and fighting against another's item will cause them to retaliate by trying to block yours, it is in your best interest to make friends and support everyone's agenda, so that they will support yours.  In engineering, this results in "design by committee," where the feature list is so bloated as to be unrealizable.  The problem here is that while each individual item may be doable, there is an inherent trade-off in constrained resources.  However, the game-theory dynamics of each individual actor in the standard committee promote neither the conservation of those resources, nor any consideration of the trade-offs.  The result is that while each actor succeeds at the committee game by getting their feature on the list, they ultimately lose the real-world game when the project fails (or random features get cut, if you have a particularly aggressive engineering department).  For legislative committees, the items are often government programs, and the constrained resource is budget dollars for their implementation.  When everyone's programs pass, the government overspends its revenue, and eventually the bottom falls out and someone loses the reality game (usually the voters).
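If it helps, here's a toy model of that dynamic.  It is purely illustrative: the five-member committee and all of the payoff numbers are made up, and the only point is that the individually rational strategy ignores the shared constraint.

# Toy model of the committee effect: each member values their own program,
# blocking someone else's invites retaliation, and nobody directly pays the
# shared budget cost.  All numbers here are invented for illustration.

REVENUE = 100          # total budget actually available
PROGRAM_COST = 30      # cost of each member's pet program
OWN_BENEFIT = 50       # payoff to a member when their own program passes
RETALIATION_COST = 40  # expected cost of having your item blocked in revenge

def member_payoff(supports_others, own_passes):
    """Payoff as seen by one member, ignoring the shared budget entirely."""
    payoff = OWN_BENEFIT if own_passes else 0
    if not supports_others:
        payoff -= RETALIATION_COST  # others retaliate by blocking your item
    return payoff

members = 5
# Individually, supporting everything dominates fighting anything...
print(member_payoff(True, True), ">", member_payoff(False, False))
# ...so every program passes, and the shared constraint is blown through.
print("spending:", members * PROGRAM_COST, "vs revenue:", REVENUE)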

The underlying economic problem, then, is how to require a balanced budget (or at least to require the budget to fit some fixed figure, determined by the financial-policy gurus who may calculate the optimal level of overspending).  There are a few ways to do this, and I think the best solution is a hybrid approach that uses all of them.  First, you could require the legislature to balance the budget itself.  This effectively moves the resource-constraint problem inside the committee.  Unfortunately, it also destabilizes the equilibrium of the game, which quickly results in deadlocks.  The California legislature's inability to pass a budget every year is a prime example of this.  So while I ultimately think that the solution is to move the resource constraints inside the control system, you have to be slightly more clever about it.

Another way to get around this is to add a second, post-committee control system that intelligently handles the overspending problem.  If you can excuse the computer-science reference, this would be a second pass over the output.  It is akin to the engineering department that intelligently drops what it considers to be the less important features in order to make the project fit into the schedule and budget.  You've now separated the decision-making from the people who should rightly be making the decisions, and lost some subtleties about the true priorities and rankings in the process, but at least the problem is solved somewhat intelligently by an interested party.  The improvement here is that otherwise the real world could impose constraints randomly, as whatever features fail to be finished on time are cut, or the project fails and everyone loses.  In the legislature, the place to vest this power seems to be with the governor.  Any budget that passes the legislature but overspends the expected revenue will be trimmed down to size by the governor, however he sees fit.

While I think this is an improvement, I see it as falling into one of two possible traps.  If the legislature is consistently unable to pass a controlled budget, then you have just given the governor basically arbitrary control over the budget and program funding (up to the amount the legislature goes over).  To me this seems like a lot of power to stick in one place, even if it's only there as a check when the legislature has "failed".  Alternatively, the legislature could realize that it's in its rational self-interest to balance the budget itself, and the dynamics reduce to the previous case where the legislature deadlocks.  So this system seems to oscillate between misrepresenting the true priorities (by concentrating the power in one man) and deadlocking.  But at least now we're only deadlocking some of the time.

The third approach is to change the game, so that the priorities of the actors promote balancing the budget more than (or at least appropriately alongside) their own personal (or constituent, if you are less cynical) agendas.  This is kind of tricky, especially given the term-length/responsibility trade-off I've mentioned above.  But I think one way to improve this is to give voters an incentive to care about the state budget, by raising taxes across the board to fund any budget overages.  This works especially well if it is done on a yearly basis, so that when taxes shoot up, or a new unpopular program gets started, it's fairly easy for an individual voter (or their favorite political media) to figure out why and to calculate their expected return for canceling the program.  Once this is well-understood enough that political candidates can use a rival's spending preferences against him, it should help align the legislative agendas with the true cost to society, while avoiding the long-term overspending trap.  It might even produce better-educated, engaged, and disciplined voters, but that's just a pipe dream of mine...  I don't see a particular downside to this (besides the difficulties of getting it implemented), but it's unclear to me how much of an effect this would have by itself on the voters.  If nothing else, it would reduce the debt-incurring costs of overspending, even if the government itself is not made more efficient with respect to society's true wishes.

But I think that you can combine the strengths of all of these into a hybrid system.  First, require tax increases to cover any spending deficits, putting an overall constraint on the system.  Then, allow the governor the power to cut funding at will from any program if an over-spending budget passes the legislature.  To avoid the power issues, give the legislature the power to override this "budget veto" with a super-majority.  Together, this would establish a hard cap on the overspending via the tax increases (with whatever accompanying changes in voter priorities that come out of it as a secondary benefit), allow intelligent decisions to be made about where to cut over-committed resources during legislative failure, move the constraint problem inside the committee deliberations (with the "out" of passing an over-spending budget to avoid deadlocks), and avoid concentrating too much power in the governor by allowing a legislative override.
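To make the moving parts concrete, here is a rough sketch of that hybrid rule as a procedure.  The names, the example numbers, and the two-thirds override threshold are just my illustrative choices, not a worked-out proposal.

# Rough sketch of the hybrid budget rule described above.  The two-thirds
# override threshold and all names/numbers are illustrative assumptions.

def resolve_budget(programs, revenue, governor_cuts, override_votes, seats):
    """programs: dict of program name -> cost, as passed by the legislature."""
    total = sum(programs.values())
    if total <= revenue:
        return programs, 0  # balanced budget: no veto, no tax increase

    # Over-spending budget: the governor may cut programs at will,
    # unless the legislature overrides with a super-majority.
    if override_votes < (2 * seats) // 3:
        for name in governor_cuts:
            programs.pop(name, None)
        total = sum(programs.values())

    # Whatever overage remains is covered by an across-the-board tax increase.
    return programs, max(total - revenue, 0)

# Example: three programs, revenue of 100, governor cuts "pork", no override.
print(resolve_budget({"schools": 60, "roads": 50, "pork": 30}, 100, ["pork"], 40, 80))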

I realize the current economic situation is much more complicated than government overspending, but overspending seems to be a recurring theme in modern democracies.  Then again, I'm approaching this from a control-systems angle with a little bit of game theory thrown in for the politics.  Anyone have a better idea?

20101124

Fun with Fedora

I don't want to be the guy who rants all the time, because life's too short to spend it complaining.  But I had such a negative experience that I'm going to share it anyway.  Thumper's Mother can scold me later...

I had to test some things for work on the new shiny Fedora 14 release, and decided that it was the most brain-dead and hard-to-set-up Linux distribution I've ever worked with.  (After some reflection, the only other serious contender was Topologilinux, whose built-in upgrade system used to leave the installation unbootable.  But that was back in 2002, and since they're essentially some fancy installation scripts around Slackware, I'll cut them some slack (no pun intended), since I don't expect them to maintain the entire Slackware repository.)  All I wanted to do was get VMware Tools installed properly, and to do that I needed to install gcc and some kernel headers, which sounded simple enough.  I'm willing to overlook (what I perceive as) the grave crimes of any distro that doesn't ship with gcc and the kernel headers included, because I realize that not everyone uses systems the same way I do.

Now, granted, I've never worked with Fedora or Red Hat before, and Gentoo's package-management system has pretty much spoiled me for life, so I was prepared for a bit of a learning curve to get everything set up properly.  The first task was networking.  I'm behind the firewall-proxy setup at work, so I didn't expect networking to work out of the box.  I'm even willing to overlook the fact that there is no happy graphical utility to set a system-wide proxy configuration.  Nobody else seems to do this either, despite the fact that it would just have to dump two environment variables to a config file.  But on a modern Linux distribution, and especially on standard hardware environments like a VMware virtual machine, I expect my network devices to come up under DHCP without any user intervention.  So I wasted several minutes trying to play with proxy settings before I realized that I had a more fundamental problem: eth0 does not start by default.
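For the record, the whole fix amounts to something like the following sketch.  The proxy URL is a placeholder, and the file paths are my assumptions about a stock Fedora/RHEL-style layout, so adjust for your own setup.

# Sketch of the two manual fixes.  The proxy URL is made up, and the paths
# are assumptions about a stock Fedora/RHEL-style layout.
PROXY = "http://proxy.example.com:3128"  # placeholder for the work proxy

# 1. The "system-wide proxy" really is just two environment variables.
with open("/etc/profile.d/proxy.sh", "w") as f:
    f.write('export http_proxy="' + PROXY + '"\n')
    f.write('export https_proxy="' + PROXY + '"\n')

# 2. Make eth0 actually come up at boot (the stock install leaves it off).
ifcfg = "/etc/sysconfig/network-scripts/ifcfg-eth0"
with open(ifcfg) as f:
    contents = f.read()
with open(ifcfg, "w") as f:
    f.write(contents.replace("ONBOOT=no", "ONBOOT=yes"))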

The next thing I expect from a software system is reasonably descriptive error messages.  If there's a network problem, I expect to see something to the effect of "Unable to connect to fedora.com," or even "Unable to download package database," or even "Plug in your network cable, you moron."  But the message yum gave me was something like "Error: Cannot retrieve repository metadata (repomd.xml)," which just doesn't help me at all.  Now, maybe they tried, since the word "retrieve" is in there, which sounds vaguely networky, but it's not terribly descriptive.  It sounds like the kind of generic error message you get when the programmers were either too lazy to implement proper error handling, or the errors are obscure enough that the programmers could not have reasonably predicted them.  So I went digging around with yum, trying to rebuild package databases and running obscure maintenance-looking commands, trying to figure out what could possibly be weirdly screwed up on my stock install.  After fixing my proxy settings (a.k.a. copying the environment variables to so many places that yum has to get it right), I was confronted with a different error message.  Now yum was listing websites and telling me it was failing to connect to them.  Somehow, after giving yum access to the Internet, it was finally giving me an error message that looked definitively network-related... go figure.  Clearly something else was wrong.  I managed to convince myself that the sites in question really did exist (at this point I was praying they weren't dumb enough to ship with dead mirrors), and that I really could connect to them through the proxy and the firewall at work.
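For what it's worth, one of those "so many places" is yum's own config file: yum understands a proxy= option in /etc/yum.conf, so something like the snippet below (with a placeholder proxy URL) covers that one.

# yum reads its own proxy setting from /etc/yum.conf (a "proxy=" option);
# the URL here is a placeholder for whatever your site actually uses.
with open("/etc/yum.conf", "a") as f:
    f.write("proxy=http://proxy.example.com:3128\n")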

After longer than I'd like to admit, and after double- and triple-checking the network connectivity and proxy settings, I gave up and started digging around on Google.  I finally found some other poor saps looking at the same error message who had managed to find a fix.  Apparently, and for reasons I cannot adequately explain, yum is unable to access HTTPS sites out of the box.  This is, as you might imagine, a severe limitation when all of the default mirrors include https:// in the URL.  So I went and hand-edited my repo list, converted all of the https:// URLs to http://, and prayed that nobody was running an HTTPS-only mirror.  That finally worked for me, and I was able to start the onerous process of matching up real-world package names like "gcc" and "kernel headers" to the cryptic, numbered formulas that seem to be the best non-Gentoo package managers can offer.  I'm really hoping that this was an issue with the Squid proxy at my work, and that the Fedora folks didn't ship a release that was unable to validate SSL certificates.  For that matter, I'm not completely sure what added security running over HTTPS really gives you.  I'm hoping they checksum the binaries, so the only advantage I see to HTTPS is that people upstream from me can't tell exactly which packages I'm downloading...
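The hand-edit itself is trivial to script; something like this sketch does it, assuming the stock repo files live in /etc/yum.repos.d/ and that the mirrors really are reachable over plain http.

# Sketch of the https -> http repo edit.  Assumes the stock repo files live
# in /etc/yum.repos.d/ and that the mirrors are reachable over plain http.
import glob

for path in glob.glob("/etc/yum.repos.d/*.repo"):
    with open(path) as f:
        text = f.read()
    with open(path, "w") as f:
        f.write(text.replace("https://", "http://"))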

Meanwhile, I had another problem: yum was deadlocking itself.  Apparently, yum will happily spin for over 10 minutes trying to acquire some internal yummy lock.  Furthermore, Unix file-locking being what it is, killing the competing processes doesn't release the lock.  So I had to go Google around for the lock files yum uses, so I could kill all the yums on my system, free the locks, and try again.  (I had by this point accumulated quite a number of hung yums.)  Somehow, this was unsuccessful.  So I logged out and restarted my X session.  No luck.  I rebooted.  No luck (after reworking the solutions for eth0 and the proxy above).  I finally figured out that whatever system-tray icon their default session launches to inform me of all the wonderful updates I'm missing was also running yum, and hanging in some new and interesting way.  I never solved that one.  But if I killed it, and all my hung yums, and cleaned up all the lock files, I could finally install packages.  However, mistyping a shell command would cause my shell to hang indefinitely... I can only assume it was asking yum what I meant to type, so it could offer to install it for me, but that yum was still horribly broken in some unusual way.
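For anyone else stuck in the same loop, the cleanup amounts to something like this sketch.  Treat /var/run/yum.pid as an assumption; it's just the lock file Google pointed me at, and the tray updater may well recreate it immediately.

# Sketch of the cleanup: kill whatever yum holds the lock, then remove the
# stale lock file.  /var/run/yum.pid is an assumption about where yum keeps
# its lock; the system-tray updater may recreate it right away.
import os, signal

LOCK = "/var/run/yum.pid"

if os.path.exists(LOCK):
    with open(LOCK) as f:
        pid = int(f.read().strip())
    try:
        os.kill(pid, signal.SIGKILL)  # kill the hung yum holding the lock
    except OSError:
        pass                          # process already gone; stale lock
    os.remove(LOCK)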

This is the Linux distribution that gets all the money?  I fear for our novices...