Thursday, May 27, 2010

Will The iPad Kill The Netbook?


Ever since Apple announced the iPad, there have been countless stories in the press about the iPad's effect on the netbook market.  I'm a big fan of netbooks, and I agree that the netbook market is in trouble, but it's not because of the iPad.

It's because of Windows.

Now, I don't mean this as a piece of simple-minded anti-MS snark (though I am fully capable ;-).  I'm serious.  Windows is the problem with netbooks.  Installing Windows on a netbook changes it from a small, effective, portable Internet interaction device into a tiny, underpowered laptop computer.

In the beginning of the netbook revolution, hardware makers chose Linux.  The first generation of netbooks featured small screens (7-9 inch) and solid-state disks.  To make use of this platform, they pretty much had to use Linux because of its small footprint and easy customization, which gave manufacturers the freedom to create user interfaces appropriate to the device.  The fact that the OS license was free didn't hurt either, given the price points that netbooks originally held.

So what went wrong?

First, Microsoft was able to respond to the threat to its consumer OS monopoly by releasing a version of Windows XP with ultra-cheap licensing, provided that the computer was suitably underpowered.  Asus, for example, sold both Linux and Windows versions of its netbooks for a time.  Both models cost the same, but the Linux model had a larger drive.  Why?  Because the Windows license placed a cap on the size of the drive that would qualify the computer for the low-cost license.

Second, the Linux distributions supplied by the netbook makers were not very good.  I can personally attest to that.  My editor's eeePC 901 (pictured above with my own HP Mini 1116NR) came with the Asus version of Xandros and frankly, it sucked.  After struggling with it for several months, I replaced the Xandros with Ubuntu 9.04 Netbook Remix and now the machine is a delight.

Finally, in response to the inappropriate user interface Windows provides for small-screen devices, netbook makers made netbooks larger, with 10-12 inch screens, and they gave up on solid-state drives.  Almost all netbooks today come with slow 160 GB hard disks.  So now you have a slow 12 inch laptop that costs about the same as a "real" laptop and isn't really that portable anymore.  No wonder nearly one-third of netbook shoppers are buying iPads instead.

Interface, Interface, Interface.

But the iPad should not be directly competitive with netbooks at the conceptual level.  In many ways, the iPad is a remarkable device for content consumption.  Unlike a Windows computer, it requires virtually no system administration.  This makes the device a perfect "television of the future" that one just uses to passively consume content.  However, its lack of a real keyboard and its limited connectivity options make it a poor choice as a portable Internet interaction device, a role in which the netbook hardware platform excels.

Clearly, Apple devised a near-perfect user interface for a tablet, something Microsoft was never able to do.  It is possible that the next generation of netbooks will do better.  There have been a number of announcements of upcoming models based on ARM chips and running operating systems, such as Android, that are better suited to mobile devices.  As much as I like Ubuntu's netbook remix, it's still a crude hack to shoehorn a desktop OS onto a small-screen computer.

Thanks for listening!  See you again soon.

Monday, May 24, 2010

Site News: 20,000 Downloads And New Series Navigation


A few updates on the state of the site:
  • The book reached the 20,000 download mark over the weekend.  Thanks, everybody!  This figure counts the downloads performed from the Sourceforge site; the true total is probably higher, since the book is mirrored at a variety of other sites throughout the world.
  • If you have been thinking about purchasing a printed copy of The Linux Command Line, now may be the time.  Lulu (my on-demand publisher) has a limited-time free shipping offer!  See the lulu.com home page for details.  You can order your very own copy directly from here.
  • Last week I updated many of my previous blog posts to add handy navigation links for the multi-part series posts.  It is now much easier to move from installment to installment in my popular series.  Included are Building An All-Text Linux Workstation, New Features In Bash Version 4.x, and Getting Ready For Ubuntu 10.04.
Enjoy!

Tuesday, May 18, 2010

Project: Getting Ready For Ubuntu 10.04 - Part 5

For our final installment, we're going to install our new Ubuntu 10.04 system and perform some basic configuration on it.

Downloading The Install Image And Burning A Disk

We covered the process of getting the CD image and creating the install media in installment 3; the process for 10.04 is the same.  You can download the CD image here.  Remember to verify the MD5 checksum of the disk you burn.  We don't want a failed installation because of a bad disk.  Also, be sure to read the 10.04 release notes to avoid any last-minute surprises.
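
For example, assuming you downloaded the 32-bit desktop image (the file name below is an assumption; use the name of the image you actually downloaded), you can compute its checksum and compare the result against the MD5SUMS file published on the download site:

me@linuxbox ~ $ md5sum ubuntu-10.04-desktop-i386.iso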

Last Minute Details

There may be a few files that we will want to transfer to the new system immediately, such as the package_list.old.txt file we created in installment 4 and each user's .bashrc file.  Copy these files to a flash drive (or use Ubuntu One, if you're feeling adventuresome).
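
Assuming your flash drive is mounted as a volume named FLASHDRIVE (substitute the name of your own drive), copying the files for the current user is a one-liner:

me@linuxbox ~ $ cp ~/package_list.old.txt ~/.bashrc /media/FLASHDRIVE/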

Install!

We're finally ready for the big moment.  Insert the install disk and reboot.  The install process is similar to previous Ubuntu releases.

Apply Updates

After the installation is finished and we have rebooted into our new system, the first thing we should do is apply all the available updates.  When I installed last week, there were already 65 updates.  Assuming that we have a working Internet connection, we can apply the updates with the following command:

me@linuxbox ~ $ sudo apt-get update; sudo apt-get upgrade

Since the updates include a kernel update, reboot the system after the updates are applied.

Install Additional Packages

The next step is to install any additional software we want on the system.  To help with this task, we created a list in installment 4 that contained the names of all of the packages on the old system.  We can compare this list with the new system using the following script:

#!/bin/bash

# compare_packages - compare lists of packages

OLD_PACKAGES=~/package_list.old.txt
NEW_PACKAGES=~/package_list.new.txt

if [[ -r $OLD_PACKAGES ]]; then
    dpkg --list | awk '$1 == "ii" {print $2}' > $NEW_PACKAGES
    diff -y $OLD_PACKAGES $NEW_PACKAGES | awk '$2 == "<" {print $1}'
else
    echo "compare_packages: $OLD_PACKAGES not found." >&2
    exit 1
fi

This script produces a list of the packages that were present on the old system but are not yet installed on the new one.  You will probably want to capture the output of this script and store it in a file:

me@linuxbox ~ $ compare_packages > missing_packages.txt

You should review the output and apply some editorial judgement, as the list will likely contain many packages you no longer need in addition to the packages that you do want to install.  As you review the list, you can use the following command to get a description of a package:

apt-cache show package_name
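
If you would like the short description of every package on the list at once, a small loop will do the job.  This is just a sketch; it assumes the missing_packages.txt file created above:

while read -r pkg; do
    echo -n "$pkg: "
    apt-cache show "$pkg" | grep -m 1 '^Description:'
done < missing_packages.txt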

Once you determine the final list of packages to be installed, you can install each package using the command:

sudo apt-get install package_name

or, if you are feeling especially brave, you can create a text file containing the list of desired packages to install and do them all at once:

me@linuxbox ~ $ sudo xargs apt-get install < package_list.txt

Create User Accounts

If your old system had multiple user accounts, you will want to recreate them before restoring the user home directories.  You can create accounts with this command:

sudo adduser user

This command will create the user and group accounts for the specified user and create the user's home directory.
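
If you have several accounts to recreate, you can wrap the command in a loop.  This is only a sketch (the user names are placeholders), and adduser will still prompt for each user's password and details:

for u in alice bob carol; do
    sudo adduser "$u"
done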

Restore The Backup

If you created your backup using the usb_backup script from installment 4, you can use this script to restore the /usr/local and /home directories:

#!/bin/bash

# usb_restore - restore directories from backup drive with rsync

BACKUP_DIR=/media/BigDisk/backup
ADDL_DIRS=".ssh"

# Note: rsync stores a source directory under its final path component,
# so the backup of /usr/local lives in $BACKUP_DIR/local
sudo rsync -a $BACKUP_DIR/local /usr

for h in /home/* ; do
    user=${h##*/}
    for d in $BACKUP_DIR$h/*; do
        if [[ -d $d ]]; then
            if [[ $d != $BACKUP_DIR$h/Examples ]]; then
                echo "Restoring $d to $h"
                sudo rsync -a "$d" $h
            fi
        fi
    done
    for d in $ADDL_DIRS; do
        d=$BACKUP_DIR$h/$d
        if [[ -d $d ]]; then
            echo "Restoring $d to $h"
            sudo rsync -a "$d" $h
        fi
    done
    # Uncomment the following line if you need to correct file ownerships
    #sudo chown -R $user:$user $h
done

This script does not restore any directory whose name begins with a period; this prevents the old configuration files and directories from coming back.  If there are hidden directories that you do want restored, adjust the value of the ADDL_DIRS constant to include them.
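
For example, to also restore each user's GnuPG keyrings along with their SSH directories (the .gnupg entry is just an illustration; list whatever hidden directories matter to you):

ADDL_DIRS=".ssh .gnupg"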

Another issue you will probably encounter is the ownership of user files.  Unless the user ids on the old system match the user ids on the new system, rsync will restore the files with the old system's user ids.  To correct this, uncomment the chown line near the end of the script.
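
A quick way to spot the problem is to look for restored files whose owner does not exist on the new system.  This won't catch an old user id that happens to match a different user on the new system, but it covers the common case:

me@linuxbox ~ $ sudo find /home -nouser -ls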

If you made your backup using the usb_backup_ntfs script, use this script to restore the /usr/local and /home directories:

#!/bin/bash

# usb_restore_ntfs - restore directories from backup drive with tar

BACKUP_DIR=/media/BigDisk_NTFS/backup

cd /
sudo tar -xvf $BACKUP_DIR/usrlocal.tar

for h in /home/* ; do
    user=${h##*/}
    sudo tar    -xv \
            --seek \
            --wildcards \
            --exclude="home/$user/Examples" \
            -f $BACKUP_DIR/home.tar \
            "home/$user/[[:alnum:]]*" \
            "home/$user/.ssh"
done

To append additional directories to the list to be restored, add more lines to the tar command, using the "home/$user/.ssh" line as a template.  Since tar restores files by user name rather than by numeric user id (as rsync does), the ownership of the restored files is not a problem here.
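
If you are unsure what the archive contains before restoring, you can list its contents first.  Keep in mind that, for the reasons discussed in installment 4, this reads through the entire archive and may take a while on a large backup:

me@linuxbox ~ $ sudo tar -tf /media/BigDisk_NTFS/backup/home.tar | less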

Enjoy!

Once the home directories are restored, each user should reconfigure their desktop and applications to their personal taste.  Other than that, the system should be pretty much ready-to-go.  Both of the backup methods provide the /etc directory from the old system for reference in case it's needed.

Further Reading

Man pages for the following commands:
  • apt-cache
  • apt-get
  • adduser
  • xargs
Other installments in this series: 1 2 3 4 4a 5

Saturday, May 15, 2010

Project: Getting Ready For Ubuntu 10.04 - Part 4a

After some experiments and benchmarking, I have modified the usb_backup_ntfs script presented in the last installment to remove compression.  This cuts the time needed to perform the backup using this script by roughly half.  The previous script works, but this one is better:

#!/bin/bash

# usb_backup_ntfs - backup system to external disk drive

SOURCE="/etc /usr/local /home"
NTFS_DESTINATION=/media/BigDisk_NTFS/backup

if [[ -d $NTFS_DESTINATION ]]; then
    for i in $SOURCE ; do
        fn=${i//\/}
        sudo tar -cv \
            --exclude '/home/*/.gvfs' \
            -f $NTFS_DESTINATION/$fn.tar $i
    done
fi
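
If you want to benchmark the two versions on your own system, run each script under the time command (this assumes the script is in the current directory and marked executable) and compare the results:

me@linuxbox ~ $ time ./usb_backup_ntfs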

Further Reading
Other installments in this series: 1 2 3 4 4a 5

Tuesday, May 11, 2010

Project: Getting Ready For Ubuntu 10.04 - Part 4

Despite my trepidations, I'm going to proceed with the upgrade to Ubuntu 10.04.  I've already upgraded my laptop, and with Sunday's release of an improved totem movie player, the one "show stopper" bug has been addressed.  I can live with, or work around, the rest.

The laptop does not contain much permanent data (I use it to write and to collect images from my cameras when I travel), so wiping the hard drive and installing a new OS is not such a big deal.  My desktop system is another matter.  I store a lot of stuff on it and have a lot of software installed, too.

I've completed my testing using one of my test computers, verifying that all of the important apps on the system can be set up and used in a satisfactory manner, so in this installment we will look at preparing the desktop system for installation of the new version of Ubuntu.
  
Creating A Package List

In order to get a grip on the extra software I have installed on my desktop, I started out just writing a list of everything I saw in the desktop menus that did not appear on my 10.04 test systems.  This is all the obvious stuff like Thunderbird, Gimp, Gthumb, etc., but what about the stuff that's not on the menu?  I know I have installed many command line programs too.  To get a complete list of the software installed on the system, we'll have to employ some command line magic:

me@twin7$ dpkg --list | awk '$1 == "ii" {print $2}' > ~/package_list.old.txt

This creates a list of all of the installed packages on the system and stores it in a file.  We'll use this file to compare the package set with that of the new OS installation.
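
To get a sense of the size of the job ahead, count the lines in the file:

me@twin7$ wc -l ~/package_list.old.txt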

Making A Backup

The most important task we need to accomplish before we install the new OS is backing up the important data on the system for later restoration after the upgrade.  For me, the files I need to preserve are located in /etc (the system's configuration files; I don't restore these, but I keep them for reference), /usr/local (locally installed software and administration scripts), and /home (the files belonging to the users).  If you are running a web server on your system, you will probably need to back up portions of the /var directory as well.

There are many ways to perform backups.  My systems normally backup every night to a local file server on my network, but for this exercise we'll use an external USB hard drive.  We'll look at two popular methods: rsync and tar.

The choice of method depends on your needs and on how your external hard drive is formatted.  The key feature afforded by both methods is that they preserve the attributes (permissions, ownerships, modification times, etc.) of the files being backed up.  Both also offer the ability to exclude files from the backup, since there are a few things we won't want to carry over.

The rsync program copies files from one place to another.  The source or destination may be a network drive, but for our purposes we will use a local (though external) volume.  The great advantage of rsync is that once an initial copy is performed, subsequent updates can be made very rapidly as rsync only copies the changes made since the previous copy.  The disadvantage of rsync is that the destination volume has to have a Unix-like file system since it relies on it to store the file attributes.

Here we have a script that will perform the backup using rsync.  It assumes that we have an ext3 formatted file system on a volume named BigDisk and that the volume has a backup directory:

#!/bin/bash

# usb_backup - Backup system to external disk drive using rsync

SOURCE="/etc /usr/local /home"
EXT3_DESTINATION=/media/BigDisk/backup

if [[ -d $EXT3_DESTINATION ]]; then
    sudo rsync -av \
        --delete \
        --exclude '/home/*/.gvfs' \
        $SOURCE $EXT3_DESTINATION
fi

The script first checks that the destination directory exists and then performs the rsync.  The --delete option removes files on the destination that do not exist on the source; this way, a perfect mirror of the source is maintained.  We also exclude any .gvfs directories we encounter, since these runtime mount points cannot be read (even by root) and cause rsync to report errors.  This script can be used as a routine backup procedure.  Once the initial backup is performed, later backups will be very fast, since rsync identifies and copies only the files that have changed between backups.
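
If you want to preview what the script will do before committing to it, add the -n (dry run) option to rsync.  This lists the changes that would be made without actually making any of them:

sudo rsync -avn --delete --exclude '/home/*/.gvfs' /etc /usr/local /home /media/BigDisk/backup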

Our second approach uses the tar program.  tar (short for tape archive) is a traditional Unix tool used for backups.  While its original use was for writing files on magnetic tape, it can also write ordinary files.  tar works by recording all of the source files into a single archive file called a tar file.  Within the tar file all of the source file attributes are recorded along with the file contents.  Since tar does not rely on the native file system of the backup device to store the source file attributes, it can use any Linux-supported file system to store the archive.  This makes tar the logical choice if you are using an off-the-shelf USB hard drive formatted as NTFS.  However, tar has a significant disadvantage compared to rsync.  It is extremely cumbersome to restore single files from an archive if the archive is large.

Since tar writes its archives as though it were writing to magnetic tape, the archives are a sequential access medium.  This means to find something in the archive, tar must read through the entire archive starting from the beginning to retrieve the information.  This is opposed to a direct access medium such as a hard disk where the system can rapidly locate and retrieve a file directly.  It's like the difference between a DVD and a VHS tape.  With a DVD you can immediately jump to a scene whereas with a VHS tape you have to scan down the entire length of the tape until you get to the desired spot.

Another disadvantage compared to rsync is that each time you perform a backup, you have to copy every file again.  This is not a problem for a one time backup like the one we are performing here but would be very time consuming if used as a routine procedure.

By the way, don't attempt a tar based backup on a VFAT (MS-DOS) formatted drive.  VFAT has a maximum file size limit of 4GB and unless you have a very small set of home directories, you'll exceed the limit.
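
If you are not sure how your external drive is formatted, df can tell you.  This assumes the drive is mounted; the mount point here matches the one used in the script below:

me@linuxbox ~ $ df -T /media/BigDisk_NTFS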

Here is our tar backup script:

#!/bin/bash

# usb_backup_ntfs - Backup system to external disk drive using tar

SOURCE="/etc /usr/local /home"
NTFS_DESTINATION=/media/BigDisk_NTFS/backup

if [[ -d $NTFS_DESTINATION ]]; then
    for i in $SOURCE ; do
        fn=${i//\/}
        sudo tar -czv \
            --exclude '/home/*/.gvfs' \
            -f $NTFS_DESTINATION/$fn.tgz $i
    done
fi

This script assumes a destination volume named BigDisk_NTFS containing a directory named backup.  While we have implied that the volume is formatted as NTFS, this script will work on any Linux-compatible file system that allows large files.  The script creates one tar file for each of the source directories.  It constructs the destination file names by removing the slashes from the source directory names and appending the extension ".tgz".  Our invocation of tar includes the z option, which applies gzip compression to the files contained within the archive.  This slows things down a little but saves some space on the backup device.
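
For example, with the SOURCE list above, the backup volume ends up containing three archives:

me@linuxbox ~ $ ls /media/BigDisk_NTFS/backup
etc.tgz  home.tgz  usrlocal.tgz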

Other Details To Check

Since one of the goals of our new installation is to utilize new versions of our favorite apps starting with their native default configurations, we won't be restoring many of the configuration files from our existing system.  This means that we need to manually record a variety of configuration settings.  This information is good to have written down anyway.  Record (or export to a file) the following:
  • Email Configuration
  • Bookmarks
  • Address Books
  • Passwords
  • Names Of Firefox Extensions
  • Others As Needed

Ready, Set, Go!

That about does it.  Once our backups are made and our settings are recorded, the next thing to do is insert the install CD and reboot.  I'll see you on the other side!

Further Reading

The following chapters in The Linux Command Line
  • Chapter 16 - Storage Media (covers formatting external drives)
  • Chapter 19 - Archiving And Backup (covers rsync, tar, gzip)
Man pages:
  • rsync
  • tar
An article describing how to add NTFS support to Ubuntu 8.04
Other installments in this series: 1 2 3 4 4a 5