Saturday, October 20, 2007

Myth progress report

In the past month I picked up an 800MHz GX260 Optiplex for $75 from Campus Surplus and successfully built it into a slave backend for my Myth system. A slave backend is essentially a frontend box that also has its own tuner and runs a backend process under the master's control. If you only want to watch recordings and don't need the machine to contribute a tuner of its own, all you need to build is a frontend.
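
The main wiring job for a slave backend is pointing it at the master's MySQL database. A minimal sketch of ~/.mythtv/mysql.txt, with a made-up address standing in for the master's, looks something like this:
DBHostName=192.168.1.10
DBUserName=mythtv
DBPassword=mythtv
DBName=mythconverg
After that, running mythtv-setup on the slave registers its tuner in the shared database under the slave's hostname.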

Something perplexed me about this rig once it was built. It could watch recordings from the master backend without the irritating freeze every five seconds we had come to expect from playback on the MBE's own frontend. The chief difference between the two machines: the MBE was a 450MHz PIII Dell Dimension, while the SBE is an 800MHz Celeron. Not being a complete idiot, I concluded that a faster CPU in the MBE would resolve the freeze problem.

Yesterday, I picked up another identically configured Optiplex GX260 from the same place, this one for $50 (they weren't moving fast enough at $75, apparently), and today I performed the necessary surgery to put the MBE's guts into the GX260. Well, nearly all the guts. I figured the nVidia TNT2 from the Clinton administration wasn't worth moving to the new machine, which has integrated Intel graphics of its own. If those were good enough for playback on the SBE, they ought to be good enough for the MBE.

After rebooting, everything worked... almost. The graphics wouldn't load, even after commenting out the nVidia device in xorg.conf; I ended up copying the appropriate Device section from the SBE's xorg.conf and it was good to go. Unfortunately, the IR didn't work. This wasn't new: LIRC is often the most fragile part of the Myth ecosystem, and kernel updates have killed it before. In those cases, redoing the make/make install process has quickly fixed it. Not this time. Searching dmesg revealed an interesting error:
[ 48.302736] lirc_i2c: no version for "lirc_unregister_plugin" found: kernel tainted.
A taint usually means proprietary code in the kernel. The wifi card I'm using is Atheros-based, and its driver taints the kernel; so does the nVidia driver I neglected to remove.
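
For the record, the Device section borrowed from the SBE for the onboard Intel graphics looked roughly like the sketch below; the driver name is from memory, and newer X servers call it "intel" rather than "i810".
Section "Device"
    Identifier "Intel Onboard"
    Driver     "i810"
EndSection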

Update: that lirc_i2c message turns out to be in dmesg all the time; I just hadn't noticed it before. The real source of the problem was the nvidia-glx-legacy package, which no longer applied and needed to be removed. Once it was gone, LIRC started working again just fine.
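
Since nvidia-glx-legacy is a Debian/Ubuntu package name, getting rid of it should just be:
sudo apt-get remove --purge nvidia-glx-legacy
followed, if X complains afterward, by pointing xorg.conf back at the Intel driver as above.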

The only remaining problem has to do with the multiple tuners spread across the two backends. The system is vaguely aware of the SBE's tuner but never uses it: when the SBE is watching live TV, it's actually tuning with the MBE's tuner. Giving the tuners separate names doesn't seem to have fixed it.
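
If I remember right, live TV simply walks the capture cards in cardid order, so whichever card was defined first in mythtv-setup (the MBE's, presumably) always wins. One way to see how the cards are registered is to ask the mythconverg database directly, along these lines:
mysql -u mythtv -p mythconverg -e "SELECT cardid, videodevice, hostname FROM capturecard;"
If the SBE's card shows up under the wrong hostname, re-running mythtv-setup on the SBE should straighten out the association.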