New hardware for my server at home

After some stability problems and an unresolvable PCI IRQ conflict with my Debian server at home (Internet gateway, mail, PostgreSQL, LDAP and NFS server, and test system), I decided last week to get a new mainboard and a new wireless card. I did some research to find hardware that is properly supported and lets me reuse my existing components. I had

  • an AM2-socket AMD Athlon64 3500+ CPU
  • two 1 GB DDR2/667 DIMMs
  • a µATX chassis
  • an IDE Transcend 4GB SSD for my root partition
  • 2 750GB SATA hard disks (software RAID-1)
  • an Intel EtherExpress 100 Server PCI adapter (which I wanted to use to separate the internal and external networks, but which did not work due to the above-mentioned PCI trouble)

So I had to find a µATX mainboard that is well supported by the current Debian Squeeze Linux kernel, with at least two DDR2 DIMM sockets, an AM2-compatible CPU socket, SATA ports plus at least one IDE port, a Gigabit Ethernet port (the machine serves NFS home directories) and at least two PCI slots (one for the wireless card). As I had some issues with the previous board's NVIDIA SATA chipset, I wanted something else that is AHCI compatible. After some research I decided on a Gigabyte GA-MA74GM-S2H (rev. 3.0), which is available for a good price and meets all my criteria (2 DDR2 DIMM slots, 6 SATA ports, an RTL8169-based Gigabit Ethernet port, 2 PCI slots, ...). The board uses an AMD 740G + SB710 chipset. It also has an integrated graphics chipset, which I don't really need because I run the machine headless.

As I use the server as a wireless access point, I need a wireless network adapter that supports AP mode. Not many chipsets are properly supported in AP mode with good data rates, so I decided to get an Atheros ath9k based card: unlike Broadcom-based cards it needs no non-free firmware, and affordable hardware is available. The previous card was a Netgear WG311 (ath5k) that lost its connection under anything more than minimal load and could only be revived by a system reboot. I looked at the Linux Wireless pages and found that D-Link has a matching adapter, the DWL-G520. Unfortunately, D-Link (like almost all other vendors) does not document which chipset is used, and the product sheet warns that the hardware may change between revisions. To be sure, I looked at their Windows driver's .inf files and found nothing indicating that there are any non-Atheros variants of the card.
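For reference, AP mode with an ath9k card is typically driven by hostapd; a minimal configuration might look like the sketch below. The interface name, SSID and passphrase are placeholders, not my actual settings.

```
# /etc/hostapd/hostapd.conf -- minimal sketch for an ath9k card in AP mode
# (interface name, SSID and passphrase are placeholders)
interface=wlan0
driver=nl80211          # mac80211 drivers such as ath9k use the nl80211 interface
ssid=example-net
hw_mode=g
channel=6
wpa=2                   # WPA2 only
wpa_key_mgmt=WPA-PSK
wpa_passphrase=changeme
rsn_pairwise=CCMP
```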

I also ordered two Fantec MR-35 SATA mobile racks to make swapping a disk easier when one of the RAID-1 disks fails (I had two failures last year), and a 120 mm chassis fan to keep the system cool.

Today the package arrived, and after unpacking everything and putting it together I started the system (with keyboard and display attached) to see how it works. All hardware was detected automatically by the current Debian Squeeze system. I only had to remove the entries for the old onboard network adapter and the old wireless adapter from /etc/udev/rules.d/70-persistent-net.rules and fix the device names for the new network cards. I then added a stanza for the Intel Ethernet adapter to /etc/network/interfaces and set up a set of ferm rules to get a working firewall (thanks to Formorer for suggesting ferm). After that everything worked fine, so I put the machine back in its normal place, without keyboard and display.
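For illustration, a gateway ruleset in ferm might look roughly like this. The interface names and the particular rules are assumptions for the sketch, not my actual configuration:

```
# /etc/ferm/ferm.conf -- minimal gateway sketch
# (eth0 = external, eth1 = internal LAN, wlan0 = WLAN; all names are assumptions)
table filter {
    chain INPUT {
        policy DROP;
        mod state state (ESTABLISHED RELATED) ACCEPT;
        interface lo ACCEPT;
        interface (eth1 wlan0) ACCEPT;                 # trust the internal networks
        proto tcp dport ssh ACCEPT;
    }
    chain FORWARD {
        policy DROP;
        mod state state (ESTABLISHED RELATED) ACCEPT;
        interface (eth1 wlan0) outerface eth0 ACCEPT;  # LAN/WLAN -> Internet
    }
    chain OUTPUT policy ACCEPT;
}
```

ferm compiles this into iptables rules, which keeps the full ruleset readable in one file.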

To verify that the stability situation had really improved, I copied some huge files over the network (both wired and wireless, in both directions). Afterwards I ran a bonnie++ benchmark to stress-test the SATA controller and the hard drives. Everything went well, and I'm happy with my investment.


Hi Jan:

Do you speak English? Sorry, I do not speak German. I am a Windows admin, but I am moving into Linux since it is a change for me and I need to enhance my skills.

I am running Ubuntu, but I'd like to work more in Debian. Any suggestions would be appreciated.


David Clark

Bonnie++ and zcav

You might consider running zcav when you do Bonnie++ tests. On a system with two disks it is useful to compare the zcav performance of a single disk with the performance of two disks running at the same time. Graph the results, and in some situations you will see evidence of a motherboard I/O performance bottleneck.
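For example, the comparison could be run and plotted along these lines; the device and output file names are assumptions, and zcav ships with the bonnie++ package:

```shell
# Measure one disk alone, then both disks concurrently (device names assumed)
zcav /dev/sda > sda-alone.txt
zcav /dev/sda > sda-both.txt & zcav /dev/sdb > sdb-both.txt & wait

# Overlay the throughput curves with gnuplot; zcav prints position/throughput pairs
gnuplot <<'EOF'
set terminal png
set output "zcav.png"
set xlabel "Position on disk"
set ylabel "Throughput (MB/s)"
plot "sda-alone.txt" title "sda alone" with lines, \
     "sda-both.txt"  title "sda concurrent" with lines, \
     "sdb-both.txt"  title "sdb concurrent" with lines
EOF
```

If both concurrent curves drop well below half of the single-disk curve, the bottleneck is likely shared (controller or board) rather than the disks themselves.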