Using robocopy to backup files to a D-Link DNS-321


I use robocopy to back up my music collection to my D-Link DNS-321, with the /xo switch so that only new and modified files are copied – like so:

robocopy c:\music\music z:\music /E /xo

However, it was taking forever because every file was flagged as “Newer” than the existing backup copy. At first, I thought my music program (MediaMonkey on Windows) was updating the mp3 tags or otherwise touching the files. That turned out not to be the case.

Instead, the issue was a difference in precision between timestamps on my laptop (Windows XP) and the DNS-321 (Linux/Samba).

The solution? The /fft switch:

robocopy c:\music\music z:\music /E /xo /fft
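You can confirm the fix before committing to a full run: robocopy’s /L switch lists what would be copied without actually copying anything. With /fft in place, the spurious “Newer” entries should no longer appear (same paths as above):

```
robocopy c:\music\music z:\music /E /xo /fft /L
```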


Setting up a new Subversion repository and Trac project


Here are the steps I use each time I need to create a new Subversion repository:

Create Repository

> sudo svnadmin create /var/svn/MY_PROJECT_NAME

Create TRAC project

> sudo trac-admin /var/www/trac/MY_PROJECT_NAME initenv

During install, you will be prompted for the repository path created above.

Set up Apache to serve TRAC project

> cd /etc/apache2/sites-enabled
> sudo nano 000-default
# add each repository to apache
# also, we need to let the apache user read the directory:
# (the basic authentication isn't passed on to the user)
> cd /var
> sudo chown -R www-data.svn svn
# (alternative: add www-data to svn group)
# set perms correctly on new trac directory
> cd /var/www
> sudo chown -R www-data.svn trac
# update Apache settings to enable trac authentication; as described here:
# http://trac-server-hostname/trac/MY_PROJECT_NAME/wiki/TracModPython
> cd /etc/apache2/sites-enabled
> sudo nano 000-default
> sudo /etc/init.d/apache2 restart
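For reference, the 000-default additions end up looking roughly like this – a sketch from my setup, assuming mod_dav_svn and mod_python are enabled; paths and the password file location will vary:

```
# Subversion over WebDAV
<Location /svn/MY_PROJECT_NAME>
  DAV svn
  SVNPath /var/svn/MY_PROJECT_NAME
  AuthType Basic
  AuthName "Subversion"
  AuthUserFile /etc/apache2/dav_svn.passwd
  Require valid-user
</Location>

# Trac via mod_python
<Location /trac/MY_PROJECT_NAME>
  SetHandler mod_python
  PythonInterpreter main_interpreter
  PythonHandler trac.web.modpython_frontend
  PythonOption TracEnv /var/www/trac/MY_PROJECT_NAME
  PythonOption TracUriRoot /trac/MY_PROJECT_NAME
</Location>
```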

See the original article that inspired this post.

Using SQL Server to analyze IIS logs


Rather than using LogParser to analyze IIS logs, you can import your log files into an instance of SQL Server Express.

First, create a table with the same columns as your log file; the exact structure depends on which fields you were capturing.

For example, on my Windows XP IIS 5.1 machine, I had the following fields:

CREATE TABLE [dbo].[iis_logtable_1] (
[time] [datetime] NULL ,
[c-ip] [varchar] (50) NULL ,
[cs-method] [varchar] (50) NULL ,
[cs-uri-stem] [varchar] (255) NULL ,
[sc-status] [int] NULL
)

For an IIS7 server instance, I had these fields captured:

CREATE TABLE [dbo].[iis_logtable_2] (        
[date] [datetime] NULL,
[time] [datetime] NULL ,
[s-ip] [varchar] (50) NULL ,
[cs-method] [varchar] (50) NULL ,
[cs-uri-stem] [varchar] (255) NULL ,
[cs-uri-query] [varchar] (2048) NULL ,
[s-port] [varchar] (50) NULL ,
[cs-username] [varchar] (50) NULL ,
[c-ip] [varchar] (50) NULL ,
[cs(User-Agent)] [varchar] (2048) NULL ,
[sc-status] [int] NULL ,
[sc-substatus] [int] NULL ,
[sc-win32-status] [varchar] (255) NULL ,
[time-taken] [int] NULL
)

Adjust the columns to match whatever fields your web server is actually logging.

You could do a BULK INSERT now to pull all the data into SQL Server. However, your log file probably contains comment lines, which IIS typically adds each time the server restarts… these lines start with a hash (#) and cause BULK INSERT to choke. With enough of them in a log file, the insert will fail.

You can remove the comments from your log file with the PrepWebLog utility. The command will look something like this:

preplog.exe c:\temp\iislogs\mylogfile.log > c:\temp\iislogs\mylogfile_new.log
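If PrepWebLog isn’t available, anything that drops lines beginning with a hash will do. For example, with Unix-style tools (e.g. via Cygwin – this is my own substitution, not part of the original article):

```shell
# a tiny sample log containing the IIS comment lines that trip up BULK INSERT
printf '#Software: Microsoft Internet Information Services 7.0\n#Fields: date time cs-method cs-uri-stem\n2010-01-01 00:00:01 GET /index.htm\n' > mylogfile.log

# keep only the data rows (drop every line starting with '#')
grep -v '^#' mylogfile.log > mylogfile_new.log
```

On Windows proper, findstr /v /b "#" mylogfile.log does the same job.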

Finally, SQL Server can read the comment-free file:

BULK INSERT [dbo].[iis_logtable_2]
FROM 'C:\temp\iislogs\mylogfile_new.log'
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n')
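Once the data is loaded, standard T-SQL takes over. For instance, a couple of quick queries against the table above (illustrations only – adjust column names to your schema):

```
-- top 10 most-requested pages
SELECT TOP 10 [cs-uri-stem], COUNT(*) AS hits
FROM [dbo].[iis_logtable_2]
GROUP BY [cs-uri-stem]
ORDER BY hits DESC

-- breakdown of error responses
SELECT [sc-status], COUNT(*) AS occurrences
FROM [dbo].[iis_logtable_2]
WHERE [sc-status] >= 400
GROUP BY [sc-status]
```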

For more info, see How To Use SQL Server to Analyze Web Logs.

Installing an Ubuntu 9.10 VMware image on ESXi host


Here’s my process for creating Ubuntu VMs on an ESXi 4.0 host.

I downloaded a pre-built Ubuntu 9.10 VMware image from thoughtpolice. I chose the amd64 version and downloaded it via bittorrent.

This image won’t run out of the box on ESXi 4.0 – it must first be converted. For this, I needed to download the VMware vCenter Converter Standalone.

Convert the .vmx file and deploy it to the ESXi host using VMware vCenter Converter Standalone:

  • Click the “Convert Machine” button.
  • Select source type “VMware Workstation or other VMware virtual machine”.
  • Browse to the .vmx file.
  • For the destination, choose “VMware Infrastructure virtual machine” and enter the appropriate login credentials.
  • Enter the VM name.

That copies the virtual machine to the ESXi host. On my setup, it took about 15 minutes over my LAN.

Once that was all settled, I performed the following standard updates in the VMware console:

# start VM in vSphere – go to console
sudo aptitude update
sudo aptitude upgrade
sudo nano /etc/hostname # change hostname as needed
sudo /sbin/shutdown -h -P now
# set up router so VM’s MAC address is linked to a single IP
# start VM again in vSphere – go to console
sudo aptitude install openssh-server


Also update the hosts file and time zone:

sudo nano /etc/hosts
sudo dpkg-reconfigure tzdata

Furthermore, I created a user as follows:

# create user
sudo useradd -d /home/myusername -m myusername
sudo passwd myusername
# show user's shell
getent passwd myusername
# change user's shell
sudo chsh -s /bin/bash myusername

To ease file transfers, I installed samba and set up a samba share.
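The samba piece boils down to installing the package and defining a share. A minimal sketch, assuming the user created above (share name and path are examples, not from the original setup):

```
# sudo aptitude install samba
# then append a share definition to /etc/samba/smb.conf:

[music]
   path = /home/myusername/music
   read only = no
   valid users = myusername

# give the user a samba password and restart the daemon:
# sudo smbpasswd -a myusername
# sudo /etc/init.d/samba restart
```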

Creating a VMware 4.0 host server with ESXi


I procured a PowerEdge T105 server with the following specs:

  • Dual Core 4450B Processor 2x512K Cache, 2.3GHz Athlon for PowerEdge T105
  • 8GB, DDR2, 800MHz, 4x2GB,Dual Ranked DIMMs
  • 160GB 7.2K RPM SATA 3Gbps 3.5-in Cabled Hard Drive-Entry
  • On board Network Adapter


Rather than installing VMware on the (smallish) hard drive, I grabbed a 4GB thumb drive I had lying around to host the ESXi hypervisor.

This video was a decent intro to the install options for ESXi.

Steps to install:

  1. Burned the ESXi 4.0 installer ISO.
  2. Followed these instructions to install to the flash drive. In my installer, the flash drive came up as “Disk0 JetFlash Transcend 4GB”.
  3. Restarted, entered the BIOS, and selected the flash drive as the boot device.
  4. Hit F2 to set the root password.
  5. Connected an ethernet cable to the server, and configured the router to grant a static IP address to the hypervisor’s MAC address.
  6. Installed the VMware vSphere client on my laptop.
  7. Followed these instructions to set up my ESXi license key.


All set. Next, I’ll create some virtual machines. Some project ideas:

  • Source control server
  • FreeNAS file server for photos and music
  • Web farm
  • Test for alternative web servers: NGINX, Squid, etc.

Scalable Internet Architectures: Static Content


In some of my previous posts, I discussed ways to instrument static content references so their domain is configurable. This small aspect of application-level design makes it possible to offload static content delivery to a separate infrastructure from the application pages.

Chapter 6 of Scalable Internet Architectures focuses on static content, an aspect of web applications that is sometimes an afterthought for web developers.

Before reading this chapter, I thought of web application design in two steps: a single web farm delivering all static and dynamic content, and, to scale beyond that, a [potentially expensive] CDN with dynamic capabilities such as Akamai Dynamic Site Accelerator. It turns out there are many interesting solutions in between these two.

The first lightbulb to go off in my head was this: Even though a web application may rely on ASP.NET and therefore Windows, you can scale its ability to serve static content using commodity hardware and free operating systems.

Secondly, an Apache-based solution is not necessarily required either. There are many other free – and simpler – web servers that can be leveraged for serving static content, including thttpd and Squid.

Rewriting references to static CSS, JS, and image content, part 2


Following up on the previous post, here are a few other options to consider:

  1. Use a custom Expression Builder to generate the script and CSS references. The <%$ %> syntax works even in No-Compile pages.
  2. Use Combres. I tried the demo and really liked it. It automatically renames scripts as they are updated, plus includes minification and compression. Unfortunately, I don’t think it will work with No-Compile pages… need to investigate further.
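For context, option 1 looks roughly like this in markup – StaticUrl is a hypothetical expression prefix you would register under <compilation><expressionBuilders> in web.config; the builder prepends the configured static-content domain at render time:

```
<%-- expression syntax must be the full value of a server-control property --%>
<asp:Image runat="server" ImageUrl="<%$ StaticUrl:images/logo.png %>" />
```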