Archive for category Linux

Install s3fs on Amazon Clouds

s3fs is a FUSE filesystem that allows you to mount an Amazon S3 bucket as a local filesystem. It stores files natively and transparently in S3 (i.e., you can use other programs to access the same files).

The following instructions detail the steps to install s3fs on an Amazon EC2 instance running Debian 5.0.5.

  1. Install libfuse
    First, you need to install libfuse manually, as the package provided via apt-get is too old (s3fs requires version 2.8.4 or later).

    wget http://downloads.sourceforge.net/project/fuse/fuse-2.X/2.8.7/fuse-2.8.7.tar.gz
    tar xzf fuse-2.8.7.tar.gz
    cd fuse-2.8.7
    ./configure --prefix=/usr
    make install
    
  2. Install libxml
    You can simply install the package provided by apt-get:

    apt-get install libxml2-dev
    
  3. Upgrade mount
    Because of an incompatibility between fuse and older versions of mount, you need to upgrade mount (part of util-linux):

    wget http://www.kernel.org/pub/linux/utils/util-linux/v2.21/util-linux-2.21-rc1.tar.gz
    tar xzf util-linux-2.21-rc1.tar.gz
    cd util-linux-2.21-rc1
    ./configure --prefix=/usr --without-ncurses
    make install
    

    For more information about this issue, please go to the following page: http://code.google.com/p/s3fs/issues/detail?id=228

  4. Install s3fs
    You can now install s3fs using the following commands:

    wget http://s3fs.googlecode.com/files/s3fs-1.61.tar.gz
    tar xvzf s3fs-1.61.tar.gz
    cd s3fs-1.61/
    ./configure --prefix=/usr
    make install
    

Note that I wanted to use s3fs to create incremental snapshot-style backups with rsync. Unfortunately, as mentioned on the following page, it didn’t work because s3fs doesn’t support hard links: http://code.google.com/p/s3fs/issues/detail?id=46
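
That said, mounting a bucket for day-to-day use is straightforward. Here is a minimal sketch (the bucket name, mount point and credential values are placeholders, not values from this setup):

# store the AWS credentials in the format accessKeyId:secretAccessKey
echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > /etc/passwd-s3fs
chmod 600 /etc/passwd-s3fs

# create a mount point and mount the bucket
mkdir -p /mnt/s3
s3fs your-bucket-name /mnt/s3 -o passwd_file=/etc/passwd-s3fs

# unmount when finished
umount /mnt/s3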


Samba access problem with Mac OS X 10.6+

This is a problem I encountered when I upgraded Mac OS X from version 10.5 (Leopard) to 10.6 (Snow Leopard). One of my friends also had a similar problem when she upgraded to version 10.7 (Lion).
This issue affected access to the network shares set up with Samba (version 3.0.24) on my D-Link DNS-323. For some reason, I wasn’t able to authenticate on the shares as soon as I upgraded to Snow Leopard!

When trying to connect to a share, I kept getting an authentication error.

After browsing a few forums on the web, I finally found a solution. 🙂
I simply had to change the security mode in the Samba configuration file (smb.conf) to read:

security = USER

Note that this property can be found under the [global] section.
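
For context, the relevant part of smb.conf ends up looking like this (an excerpt only; the rest of your configuration will differ):

[global]
    security = USER

After saving the change, restart the Samba service on the NAS (or simply reboot the device) so it takes effect.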

For more information about the Samba security mode, please read the following article by Jack Wallen:
Understanding Samba security modes
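
To check from the Mac side that the shares are reachable again, you can list them from a Terminal window (replace the user name and host with your own):

smbutil view //username@nas-hostname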


S3 command failed if the time is not synced

This is already the second post about the s3sync Ruby program. The first article focused on monitoring s3sync with Zabbix.

In this one, I will talk about an error I got when running the S3 synchronisation:

S3 command failed:
list_bucket prefix /data max-keys 200 delimiter /
With result 403 Forbidden
S3 ERROR: #
s3sync.rb:290:in `+': can't convert nil into Array (TypeError)
	from s3sync.rb:290:in `s3TreeRecurse'
	from s3sync.rb:346:in `main'
	from ./thread_generator.rb:79:in `call'
	from ./thread_generator.rb:79:in `initialize'
	from ./thread_generator.rb:76:in `new'
	from ./thread_generator.rb:76:in `initialize'
	from s3sync.rb:267:in `new'
	from s3sync.rb:267:in `main'
	from s3sync.rb:735

As you can see, this error is not very human-friendly! 😮 The only thing we know is that the S3 command failed because of the error “can't convert nil into Array”. It looks like an internal error within s3sync…

But after some investigation, it turned out the cause was simply that the system date on the server was not correct: S3 requests are signed with a timestamp, and Amazon rejects them with a 403 Forbidden when the client clock is too far off. I cannot tell you how much time I spent on this one!  😯

Anyway, if you are doing automatic backups as described on John Eberly’s blog, you need to add the following code at the top of your upload.sh script:

# update the system date
/usr/sbin/ntpdate 3.uk.pool.ntp.org 2.uk.pool.ntp.org 1.uk.pool.ntp.org 0.uk.pool.ntp.org
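
If you just want to see how far the clock has drifted without changing it, ntpdate can be run in query-only mode (the pool server below is just an example):

/usr/sbin/ntpdate -q 0.uk.pool.ntp.org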

NB: below are the commands I use to install ntpdate and configure the timezone on a Debian server:

apt-get install ntpdate
dpkg-reconfigure tzdata


Monitor s3sync with Zabbix

s3sync is a Ruby program that easily transfers directories between a local directory and an S3 bucket:prefix. It behaves somewhat, but not precisely, like the rsync program.

I am using this tool to automatically back up the important data from Debian servers to Amazon S3. I am not going to explain here how to install s3sync, as that is not the purpose of this article. However, you can read this very useful article from John Eberly’s blog: How I automated my backups to Amazon S3 using s3sync.

If you followed the steps from John Eberly’s post, you should have an upload.sh script and a crontab job which executes this script periodically.

From this point, here is what you need to do to monitor the success of the synchronisation with Zabbix:

  1. Add the following code at the end of your upload.sh script, immediately after the s3sync call (so that $? still holds its exit code):
    # log a human-readable result based on the exit code of the s3sync command
    RETVAL=$?
    [ $RETVAL -eq 0 ] && echo "Synchronization succeeded"
    [ $RETVAL -ne 0 ] && echo "Synchronization failed"
    
  2. Log the output of the cron job as follows:
    30 2 * * sun /path/to/upload.sh > /var/log/s3sync.log 2>&1
    
  3. On Zabbix, create a new item which checks for the presence of the string “Synchronization failed” in the file /var/log/s3sync.log:

    Item key: vfs.file.regmatch[/var/log/s3sync.log,Synchronization failed]
  4. Still on Zabbix, define a new trigger for the previously created item:

    Trigger expression: {Template_AmazonCloud_Debian:vfs.file.regmatch[/var/log/s3sync.log,Synchronization failed].last(0)}=1

With these few steps, you should now receive Zabbix alerts when a backup on S3 fails. 🙂
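
For reference, a minimal upload.sh putting these pieces together might look like the sketch below (the paths, pool server, bucket name and s3sync options are placeholders, not the exact values from my setup):

#!/bin/sh

# keep the clock in sync so the S3 request signatures are accepted
/usr/sbin/ntpdate 0.uk.pool.ntp.org

# credentials expected by s3sync
export AWS_ACCESS_KEY_ID=your-access-key-id
export AWS_SECRET_ACCESS_KEY=your-secret-access-key

# synchronise the local directory with the S3 bucket
ruby /path/to/s3sync.rb -r --delete /data/ your-bucket:backup/

# log a human-readable result for the Zabbix check
RETVAL=$?
[ $RETVAL -eq 0 ] && echo "Synchronization succeeded"
[ $RETVAL -ne 0 ] && echo "Synchronization failed"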


Refresh GeoIP automatically

GeoIP is a very useful tool provided by MaxMind. It can determine in real time which country, region, city, postal code, and area code a visitor is coming from. For more information, visit the MaxMind website.

This tool also comes with an Apache module that allows you to redirect users depending on their location. For example, we could redirect all users from France to the French home page of a multi-language website, or block traffic from a specific country.

To install this module on a Debian server, you simply need to run the following command:

apt-get install libapache2-mod-geoip
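
Once the module is installed, a country-based redirect could look roughly like the following (a sketch only; the configuration file location, country code and paths are examples, not part of the package's default setup):

# enable GeoIP lookups (e.g. in /etc/apache2/mods-available/geoip.conf)
<IfModule mod_geoip.c>
    GeoIPEnable On
    GeoIPDBFile /usr/share/GeoIP/GeoIP.dat
</IfModule>

# redirect visitors coming from France to the French home page (requires mod_rewrite)
RewriteEngine On
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^FR$
RewriteRule ^/$ /fr/ [R=302,L]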

But, how does this module work? How does it know where the user comes from?  😯
It is actually quite simple: GeoIP uses a file that maps IP address ranges to countries. On Debian, this file is stored in the folder /usr/share/GeoIP and is named GeoIP.dat.

However, IP address allocations change all the time, so this file gets out of date very quickly. This is why MaxMind provides an updated file for free at the beginning of each month. To read the installation instructions, please click the following link: http://www.maxmind.com/app/installation

This is all well and good, but who will remember, or even have the time, to update this file every month? And imagine having to do this on hundreds of servers!
The solution is to use a shell script which will download, extract and install the updated GeoIP file automatically once a month:

#!/bin/sh

# Go into the GeoIP folder
cd /usr/share/GeoIP || exit 1

# Remove the previous GeoIP archive (if present)
rm -f GeoIP.dat.gz

# Download the new GeoIP file; stop here if the download fails
wget http://geolite.maxmind.com/download/geoip/database/GeoLiteCountry/GeoIP.dat.gz || exit 1

# Remove the previous GeoIP backup file (if present)
rm -f GeoIP.dat.bak

# Back up the existing GeoIP file
mv GeoIP.dat GeoIP.dat.bak

# Extract the new GeoIP file
gunzip GeoIP.dat.gz

# Fix the permissions of the GeoIP file
chmod 644 GeoIP.dat

# Reload Apache so the module picks up the new database
service apache2 reload

You can save this script as /root/update_geoip.sh and set up the following crontab job:

0 0 3 * * /root/update_geoip.sh

This will execute the script automatically at midnight on the third day of every month.
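
Don't forget to make the script executable before the first run:

chmod +x /root/update_geoip.sh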
