Disk defragmenting (was "Re: No Sound :(")

Mark mhullrich at gmail.com
Mon Nov 15 22:02:35 UTC 2010


On Mon, Nov 15, 2010 at 1:39 PM, Douglas Pollard <dougpol1 at verizon.net> wrote:
>
> Mark, that is likely right, but I have to use Premiere video editor in
> XP for some of my work, and I would likely make the new drive a USB
> drive to use for Ubuntu and XP.  The problem is I would have to defrag
> it using Windows.  I have no idea how long it would take to defrag a
> 1TB drive, but I will look into it.  Using Premiere I usually take
> video clips from one drive and save to a different one, so that all
> the putting and taking is not on the same drive.  If you don't do
> that, Premiere crashes on a regular basis.  Cinelerra crashes fairly
> regularly too, but it remembers everything you have done, so that when
> you restart you are right back where you left off.  You have to save
> Premiere manually, and I often forget for 20 or 30 minutes at a time,
> and it is really aggravating when you lose 20 minutes of work.
>
Fair enough, but you already have 860GB that would need the same
treatment, so saving defrag time isn't really the issue.  Even some of
the more recent offerings on Buy.com and elsewhere have 1TB USB drives
available for around $20 less than a 2TB USB drive (Fantom drives, in
this case).

You can carve up any disk drive into more manageable portions if
that's of concern to you.

I'm going to go out on a limb here and posit that there is no
significant advantage to defragmenting a file system (other than FAT
file systems), because unless you are extremely organized, never, ever
change a file's location, and have everything laid out on a whole slew
of (small) separate disk drives, you're going to encounter I/O
elevator collisions whenever you run a modern OS like Ubuntu, which
keeps over 100 processes running at all times.  (I'm currently at 207
with almost nothing going on at the moment.)
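For what it's worth, you can check your own process count with a plain
`ps` invocation, nothing exotic:

```shell
# Count the currently running processes, one per line.
# --no-headers drops the PID/TTY/TIME/CMD header line so that
# wc -l counts only the processes themselves.
ps -e --no-headers | wc -l
```

The number varies from moment to moment, but on a stock desktop
install you will typically see well over a hundred even when "idle."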

I understand that there are many times when you want to take advantage
of the slight speed increase defragmenting gives you in repeated mass
sequential accesses to large files, but you still have to weigh that
against all the other I/O going on to the same disk, every bit of
which will interfere to some extent with that smooth an operation.  I
recently worked at a company that builds massively parallel processing
database appliances, where the general mode of operation is sequential
writing of 30GB chunks at a time.  That's a strong case for preempting
fragmentation, but I'm pretty sure that your video editing doesn't fit
that model very well.
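If you want a rough feel for what sequential throughput your drive
actually delivers, a crude test along these lines will do.  The /tmp
path and the 100MB size are arbitrary choices of mine for illustration;
point it at the drive you actually care about:

```shell
# Crude sequential-read test: write a 100MB scratch file, then time
# reading it straight back in 1MB blocks.  Path and size are just
# placeholders for illustration.
dd if=/dev/zero of=/tmp/seqtest.bin bs=1M count=100 2>/dev/null
time dd if=/tmp/seqtest.bin of=/dev/null bs=1M 2>/dev/null
rm -f /tmp/seqtest.bin
```

One caveat: the read-back may be served from the page cache, so the
observed speed can overstate what the disk itself delivers; using a
file larger than your RAM gives a more honest number.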

Either way, when you deal with that much data, you're going to have to
spend (waste?) time either on the hip-hop random accesses or on
frequent defragmenting.  Unless you are sure the speed increase is
worth more than the time you lose to the defrag in the first place,
what's the point?

I see we've strayed from the topic - perhaps a new thread would be
wise (hence the new thread...).



