Ubuntu can mount ISO files, and IMG files after converting them to ISO

Today I needed to create an OEM Microsoft Office 2007 CD and found that I could download the disks directly from the Microsoft site. However, the files I downloaded were in IMG format. At first I was puzzled, but a quick Google search revealed that they are essentially ISO files. Still, I did not find anything in Ubuntu that would burn an IMG to disk.
Diligent searching finally revealed that while there was no obvious way to burn an IMG to disk or mount an IMG file directly, there is a tool called ccd2iso that converts an IMG to ISO format.

First I had to install the ccd2iso package via the Synaptic package manager, or I could have used ‘sudo apt install ccd2iso’.

After installing this I could simply run the following command from terminal:

ccd2iso myfile.img myfile.iso
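
A quick way to sanity-check the result is the file utility, which should report an ISO 9660 filesystem on the converted image (just a suggestion, not part of the original steps):

file myfile.iso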

The same method works for other image file types, each with its own converter:
mdf2iso myfile.mdf myfile.iso
nrg2iso myfile.nrg myfile.iso
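
Both converters are in the Ubuntu repositories; assuming the package names match the tool names, they install the same way as ccd2iso:

sudo apt install mdf2iso nrg2iso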

Now I have a regular ISO file that serves our purposes, either burned to disk or mounted:

sudo mount -o loop myfile.iso mountname
 
or
 
sudo mount -o loop -t iso9660 myfile.iso mountname
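
Note that mountname must be an existing directory to use as the mount point. A minimal end-to-end example, where the /mnt/iso path is just a placeholder:

sudo mkdir -p /mnt/iso
sudo mount -o loop myfile.iso /mnt/iso
ls /mnt/iso
sudo umount /mnt/iso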

The .nrg files can also be mounted in this manner without converting to ISO by using:

sudo mount -o loop,offset=307200 myfile.nrg mountname

NOTE: if this doesn’t work and you get an error like “Unrecognized sector mode (0) at sector 0!”, it may be due to limitations of ccd2iso. In my case the MS Office disk had multiple sessions, and I could not convert it to ISO.

Another post I found on Ubuntuforums said to try the following:

growisofs -dvd-compat -Z /dev/dvdrw=dvd.img

Where /dev/dvdrw is your dvd/cd burner.
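
For completeness (this is not from the forum post), a plain ISO can also be burned with wodim, the cdrecord replacement that ships with Ubuntu. The device path below is an assumption; wodim --devices will list what is actually available:

wodim --devices
wodim -v dev=/dev/dvdrw -data myfile.iso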

FOLLOWUP:
The IMG file I had from Microsoft was a multi-session disk, so I was not able to use the steps above. However, when I simply changed the file extension to ‘.iso’, it worked fine. There seems to be very little difference between the IMG and ISO formats.

Finding a text string inside a file on a Linux server

It never fails that I find myself hunting for a way to search for a particular text string in files.  Usually I know the file, but often I am completely unsure which file contains the string.  Or, while I am writing some code, I need to find how many files use a certain function.

I know that using grep is the best way to search on a Linux server, so I start there.  Here is the command syntax:

grep "text string to search for" /path/to/search

Examples
To search for the string “myFunction” in all PHP files matching /var/www/html/*.php use:

grep "myFunction" /var/www/html/*.php

To search recursively in all sub-directories you would alter the command by adding the -r option:

grep -r "myFunction" /var/www/html

Now you have probably noticed that grep prints out the matching lines containing your string, but you may also want to know which files contain the string. You can use the -H option to make sure each line of output shows the filename followed by the line containing your search string, like so:

grep -H -r "myFunction" /var/www/html

This would output something like:

...
your_file.php: line containing myFunction
...

To print out just the filename, you can pipe the output to the cut command to clean it up further (note that -f1 uses the number one after the f, not a lowercase L):

grep -H -r "myFunction" /var/www/html | cut -d: -f1

The new, cleaner output would look like:

...
your_file.php
...
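
As an aside, grep can do this on its own as well: the -l option (a lowercase L this time) prints only the names of files that contain a match, so the pipe to cut is not strictly needed:

grep -rl "myFunction" /var/www/html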

Backup files from Linux to a Windows server

Ok, this may be my last disaster recovery and backup blog for a long time. As you can probably tell from the title, this blog entry is all about keeping backup strategies as cheap as possible.

My strategy is to back up all of my Windows and Linux servers to one central Windows server that is running a Tivoli backup agent. All of my servers are hosted elsewhere, and since it costs $99.00 per server to back up, I am getting the most for my money by only backing up a single server to tape/SAN. However, that single server carries all of the files that need to be remotely backed up to tape/SAN.

My earlier posts show how to backup the Windows servers:
Windows backup bat script using xcopy

Also, how to backup the Windows Domain Controller:
Backup Windows Domain Controller using NTBACKUP via cmd

And I also showed how to backup a Linux server to a local file:
Linux backup using CRON to local directory

Now I will show how I moved the files backed up on the Linux servers to the Windows server prior to the tape/SAN backup. I have decided to use Samba and mount a directory pointing to a shared folder on the Windows server, roughly as sketched below. Let’s begin:
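
The core of that approach is a CIFS mount of the Windows share. A minimal sketch, in which the server name, share name, credentials, mount point, and local backup path are all placeholders:

sudo mkdir -p /mnt/winbackup
sudo mount -t cifs //winserver/backups /mnt/winbackup -o username=backupuser,password=secret
cp /path/to/local/backups/*.tar.gz /mnt/winbackup/
sudo umount /mnt/winbackup
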
Continue reading Backup files from Linux to a Windows server

Linux backup using CRON to local directory

As many have pointed out, I am on a backup and disaster recovery kick lately. Some would say that it is about time; others are simply glad to see that data is now being backed up. I have found that it is easiest to zip up files on the local machine prior to moving them to their final destination. So let’s get started:

I have multiple Linux servers with many websites on each, as well as databases. So I created a script that simply tars the files, then gzips them with the date in the filename for archiving.

Here is the file named ‘backupall.sh’ that I save in a place reachable by the user I will use to schedule this cronjob:

#!/bin/sh
date
echo "############### Backing up files on the system... ###############"
 
backupfilename=server_file_backup_`date '+%Y-%m-%d'`
 
echo "----- First do the sql by deleting the old file and dumping the current data -----"
rm -f /tmp/backup.sql
mysqldump --user=mysqluser --password=password --all-databases --add-drop-table > /tmp/backup.sql
 
echo "----- Now tar, then zip up all files to be saved -----"
tar cvf /directory/to/store/file/${backupfilename}.tar /home/* /var/www/html/* /usr/local/svn/* /etc/php.ini /etc/httpd/conf/httpd.conf /tmp/backup.sql /var/trac/*
# gzip replaces the .tar with a .tar.gz, so there is no separate .tar left to delete
gzip /directory/to/store/file/${backupfilename}.tar
chmod 666 /directory/to/store/file/${backupfilename}.tar.gz
 
echo "############### Completed backing up system... ###############"
date
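
To actually schedule this with cron, an entry along these lines (added via crontab -e for the backup user) would run the script nightly at 2 a.m.; the script path and log file are just examples, and the user must be able to write to the log location:

0 2 * * * /path/to/backupall.sh >> /var/log/backupall.log 2>&1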

Continue reading Linux backup using CRON to local directory

UltraEdit on Linux and MAC…finally ! ! !

I received an email that just made my day. It was the announcement that UltraEdit will finally be available on Linux! The screenshots show it on Ubuntu, and they say there will also be a version for Mac. (Initially it will only be packaged for Ubuntu, with tarballs for the others, but soon there will also be packages for SUSE and Red Hat.) And it is very close to release; the alpha is supposedly due in April 2009.

[Screenshot: UltraEdit on Ubuntu]

You can find out more on the Blog Post, or you can see the Formal Product Page.