disk defragmentor


Gary S. Terhune

What a total crock. You haven't made any tests worth speaking of. You've
made some observations of one drive. Well, I've made hundreds of similar
observations and they contradict yours 99.99% of the time.

In the case of defragmenting wearing out the drive, you didn't even do any
tests. You just repeated some gossip.

--
Gary S. Terhune
MS-MVP Shell/User
www.grystmill.com

"Lord Turkey Cough" <spamdump@invalid.com> wrote in message
news:64Atj.541$%W6.439@newsfe2-gui.ntli.net...
> The proof of the puffing is in the eating.
> All the detailed explanations in the world will not change
> test results which contradict them.
>
> It's a case of once bitten twice shy, and I have been bitten
> 2 or 3 times by the defragging myth. I don't intend getting bitten
> anymore.
>
>
>
> "MEB" <meb@not here@hotmail.com> wrote in message
> news:%23Q8LId4bIHA.4196@TK2MSFTNGP04.phx.gbl...
>>
>> Debated upon whether I would get into this ridiculous argument, but:
>> I'll just add this comment, which happens to coincide with most of the
>> material on the subject. [Oh boy another web page]
>>
>> Fragmentation has a significant impact upon hard drives, particularly in
>> the NT/XP environment, as files are not stored in the fashion one would
>> [uninformed] generally think of.
>>
>> Ponder upon these overly simplified explanations:
>>
>> Many files are created or modified each time their application is run or
>> accessed, and the system does not necessarily use the next available hard
>> drive space to place those file segments/additions; it may place them in
>> any unused space on the disk. This creates files which might extend from
>> the base address [FAT address or MFT] to anywhere else on the partition,
>> with pointers indicating the location of the next segment needed. Each
>> segment may in turn point to a next segment that may actually be at the
>> opposite end of the disk/partition. Picture that happening several dozen
>> times during the access of that one file. During this time, the hard
>> drive controller, the OS, and the algorithms used may place other
>> segments elsewhere on the disk, either temporarily or permanently.
>> Think of a large file and picture the number of additional head movements
>> needed to access JUST that one file, and the extra time [additional
>> milliseconds] needed. Then consider that there are likely a dozen or more
>> additional files [DLLs and other EXEs, etc.] needed for that one
>> application which are also fragmented, subjecting the head to that same
>> whipping motion, picking up a fragment here and there...
>> Now picture that the application has a database of information; new
>> information is added to that base but is stored wherever it was created.
>> After running that same application and saving those new bits of data,
>> that database now exists in several thousand non-contiguous sectors of
>> the hard drive. To view or access that database ALL those segments must
>> be found and brought together for display, so these scattered bits are
>> temporarily collected in the swap file and/or memory.
>> All of this, of course, takes more head movement and time than if the
>> files were contiguous and the application's other needed files were also
>> closer together.
>>
>> A good indication is when intermittent Windows errors begin to show up
>> for no apparent reason, or hard drive access times become excessive. If
>> one boots to Safe Mode, shuts off Windows' handling of virtual memory
>> [swap], deletes the win386.swp file after a restart in DOS, restarts to
>> Safe Mode and defrags, then turns Windows' management back ON when done,
>> on restarting to Windows Normal Mode behavior will be noticeably
>> improved. Part of the reason is that the SWAP file is no longer scattered
>> all over the disk, and is contiguous [FAT systems]. NT's defragmentation
>> is of course different, as are the results.
>>
>> Regarding new installations and defragging:
>>
>> A major misconception is that a newly installed OS is defragmented and
>> arranged closely on the disk. As the files are expanded, various areas of
>> the disk are used to hold temporary copies of those files in any
>> available area. Each file may first be copied, then expanded, then added
>> to the proper directory, or may be placed in temporary storage pending
>> installation order and then placed with some directory [listed as part
>> of it].
>> Each time a file is written it takes up space on the drive, which may or
>> may not be the next contiguous area, and may be scattered areas of the
>> disk [other segments of files may already be using an area which might
>> otherwise have been used].
>> The directories themselves [via the table] assign the "base" area, then
>> list the various temporary and permanent locations of the files under the
>> various directories. Nothing at this stage requires that these files
>> actually be assigned an area of the disk in which all the directory's
>> files are located within a specific segment of the disk, e.g., one file
>> after the other or one sector after the other. Continuing to use a newly
>> installed disk without ever defragmenting it will eventually cause errors
>> and, at minimum, slower loading times noticeable after extended usage.
>>
>> The first defragmentation done on a disk attempts to align the various
>> individual segments of the files into contiguous areas/segments. If one
>> has used something like "Align for faster Windows loading" [MS Defrag -
>> not a recommended setting], then the files are arranged according to the
>> monitored access, space required while running, and other factors held
>> [created by taskmon] in C:\WINDOWS\APPLOG\ [using the Logitech mouse as
>> an example, LOGI_MWX.LGC], to supposedly place the file in an area
>> conducive to its loading and any additional space it might require while
>> loading/running [some EXE files temporarily expand on disk and become
>> fragmented in the process]. IF this is a fairly new system, or taskmon
>> has been disabled, then the files will NOT be arranged properly, as there
>> are not enough saved details.
>> Successive defragmentations generally take less time and decrease file
>> movement. Watch a defragment tool: it checks the FATs, then folders, then
>> files, and only adjusts what is fragmented [unless one uses "Align for
>> faster Windows loading", which WILL constantly move files around based
>> upon the logic files {which is why it's not recommended}].
>>
>> Now, should any wish to complain this provides no conclusive proof, then
>> they should get off their dead behinds and actually look at a fragmented
>> disk and a defragmented disk with a hex/disk editor. THEN come back and
>> post; maybe someone will listen, though I doubt it...
>> Or if you like visual displays, then run MS Defrag, look at the details,
>> and watch it move files around trying to place all the file segments
>> together.
>>
>> The short answer is: defragmentation can decrease OS access times and
>> reduce wear and tear on your hard disk. Load times ARE NOT a definitive
>> indication of problems with defragmentation itself, but with the routines
>> used, INCLUDING a fragmented swap file.
>>
>> --
>>
>> MEB
>> http://peoplescounsel.orgfree.com
>> _________
>>
>>
>>
>>

>
>
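MEB's head-movement argument above can be made concrete with a toy model: sum the head travel needed to read one file laid out contiguously versus scattered across the platter. This is only a sketch under stated assumptions: the cluster numbers are hypothetical, seek cost is taken as proportional to distance, and rotational latency and caching are ignored.

```python
import random

def total_seek(clusters):
    """Total head travel (in clusters) to read the given clusters in file order."""
    pos, dist = 0, 0
    for c in clusters:
        dist += abs(c - pos)  # head moves from its current position to the next cluster
        pos = c
    return dist

random.seed(1)
contiguous = list(range(1000, 1100))             # 100 clusters in one run
fragmented = random.sample(range(100_000), 100)  # the same 100 clusters, scattered

print(total_seek(contiguous))  # one long seek, then single-cluster steps
print(total_seek(fragmented))  # vastly more head travel for the same data
```

The exact numbers are meaningless; the point is the ratio between the two, which is what the "whipping head motion" in the post describes.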
 

MEB

"Lord Turkey Cough" <spamdump@invalid.com> wrote in message
news:64Atj.541$%W6.439@newsfe2-gui.ntli.net...
| The proof of the puffing is in the eating.
| All the detailed explanations in the world will not change
| test results which contradict them.
|
| It's a case of once bitten twice shy, and I have been bitten
| 2 or 3 times by the defragging myth. I don't intend getting bitten
| anymore.
|
|
|


Really??? And where are those test results... got a link or three so we can
verify ...

Did you bother to read the post concerning ONE of the likely reasons for
YOUR supposed problems with defrag [fragmented swap], or did you just blow
them off?
If the swap is in several areas of the drive, then after defragging [without
doing as I showed] you will notice a load delay as Windows adjusts to the
new free segments on the disk; the secondary segments may be well back on
the disk, hence longer loads. You will NOT achieve maximum startup speed
unless you follow the delete-the-swap > defrag > restart sequence, because
then the swap will be in one area of the disk [contiguous], rather than two
or more... In fact, to achieve maximum performance the command line should
be "defrag.exe /P", which even moves some of those files which supposedly
can not be moved. Here's the catch: guess what, I ACTUALLY do physically
test these types of things, AND use disk editors and other tools to verify
.....
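The reason the swap file comes out contiguous after that sequence can be sketched with a toy compaction pass, the basic move a defragmenter makes. The disk map and file ids below are hypothetical, and real tools (defrag.exe included) must also work around unmovable system areas, which is what the /P switch mentioned above relaxes.

```python
def defragment(disk):
    """Compact a toy disk map: each slot holds a file id or None (free).
    Files become contiguous runs, in order of first appearance, with all
    free space consolidated at the end of the disk."""
    order = []
    for slot in disk:
        if slot is not None and slot not in order:
            order.append(slot)
    # Gather every cluster of each file into one contiguous run.
    packed = [fid for fid in order for slot in disk if slot == fid]
    return packed + [None] * (len(disk) - len(packed))

# 'S' = swap-file clusters scattered among application file 'A'.
frag = ['A', 'S', None, 'A', 'S', None, 'S', 'A']
print(defragment(frag))  # ['A', 'A', 'A', 'S', 'S', 'S', None, None]
```

A swap file deleted before the defrag simply isn't in the map at all; when Windows recreates it afterwards, it lands in the consolidated free space in one piece.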

However, I think this group already realizes that you really aren't
concerned with what actually does or does not occur, you're apparently
primarily interested in spouting uninformed and unproven ideas. You're a
conservative/Republican, right?

--

MEB
http://peoplescounsel.orgfree.com
_________


 

Bill in Co.

MEB wrote:
> "Lord Turkey Cough" <spamdump@invalid.com> wrote in message
> news:64Atj.541$%W6.439@newsfe2-gui.ntli.net...
>> The proof of the puffing is in the eating.
>> All the detailed explanations in the world will not change
>> test results which contradict them.
>>
>> It's a case of once bitten twice shy, and I have been bitten
>> 2 or 3 times by the defragging myth. I don't intend getting bitten
>> anymore.

>
> Really??? And where are those test results... got a link or three so we
> can verify ...


He has them in his back pocket! (Didn't you check that?) LOL.

> Did you bother to read the post concerning ONE of the likely reasons for
> YOUR supposed problems with defrag [fragmented swap], or did you just blow
> them off?


Was that a rhetorical question? :)
 

Lord Turkey Cough

"Bill in Co." <not_really_here@earthlink.net> wrote in message
news:utpndFOcIHA.4936@TK2MSFTNGP03.phx.gbl...
> Lord Turkey Cough wrote:
>> "Bill in Co." <not_really_here@earthlink.net> wrote in message
>> news:OxEwnvNcIHA.6060@TK2MSFTNGP04.phx.gbl...
>>> Lord Turkey Cough wrote:
>>>> The proof of the puffing is in the eating.
>>>> All the detailed explanations in the world will not change
>>>> test results which contradict them.
>>>
>>> You haven't presented any, because there aren't any.

>>
>> I have.
>> Time your PC running normally, then defrag and time it again.
>> Present your results here.
>>
>> I am not doing it myself because I know it is a waste of time.

>
> Same here. Touche.


Ah, so you agree defragging is a waste of time.
Anyone who thinks it is a good thing would be happy to do it.
Still, I am glad you accept defragging is a waste of time.


>
>
 

Lord Turkey Cough

"Gary S. Terhune" <none> wrote in message
news:OjRqBNOcIHA.5988@TK2MSFTNGP06.phx.gbl...
> What a total crock. You haven't made any tests worth speaking of. You've
> made some observations of one drive. Well, I've made hundreds of similar
> observations and they contradict yours 99.99% of the time.


I have heard lots of stories about disk failures, but I have never had one,
and I have some very old drives.
Of course I didn't abuse them by defragging.

So not only do I save on buying defragging software, I save on doing backups
and backup software, and on trying to recover data from a failed disk.
A big saving in time and money all round.

>
> In the case of defragmenting wearing out the drive, you didn't even do any
> tests. You just repeated some gossip.
>
> --
> Gary S. Terhune
> MS-MVP Shell/User
> www.grystmill.com
>
> "Lord Turkey Cough" <spamdump@invalid.com> wrote in message
> news:64Atj.541$%W6.439@newsfe2-gui.ntli.net...
>> The proof of the puffing is in the eating.
>> All the detailed explanations in the world will not change
>> test results which contradict them.
>>
>> It's a case of once bitten twice shy, and I have been bitten
>> 2 or 3 times by the defragging myth. I don't intend getting bitten
>> anymore.
>>
>>
>>

>
 

Bill in Co.

Lord Turkey Cough wrote:
> "Bill in Co." <not_really_here@earthlink.net> wrote in message
> news:utpndFOcIHA.4936@TK2MSFTNGP03.phx.gbl...
>> Lord Turkey Cough wrote:
>>> "Bill in Co." <not_really_here@earthlink.net> wrote in message
>>> news:OxEwnvNcIHA.6060@TK2MSFTNGP04.phx.gbl...
>>>> Lord Turkey Cough wrote:
>>>>> The proof of the puffing is in the eating.
>>>>> All the detailed explanations in the world will not change
>>>>> test results which contradict them.
>>>>
>>>> You haven't presented any, because there aren't any.
>>>
>>> I have.
>>> Time your PC running normally then defrag and time it again.
>>> Present your results here.
>>>
>>> I am not doing it myself because I know it is a waste of time.

>>
>> Same here. Touche.

>
> Ah so you agree defragging is a waste of time.


Whoooosh.........! No big surprise there though.
 

Gary S. Terhune

"Lord Turkey Cough" <spamdump@invalid.com> wrote in message
news:xDKtj.635$%W6.612@newsfe2-gui.ntli.net...
>
> "Gary S. Terhune" <none> wrote in message
> news:OjRqBNOcIHA.5988@TK2MSFTNGP06.phx.gbl...
>> What a total crock. You haven't made any tests worth speaking of. You've
>> made some observations of one drive. Well, I've made hundreds of similar
>> observations and they contradict yours 99.99% of the time.

>
> I have heard lots of stories about disk failures, but I have never had
> one, and I have some very old drives.
> Of course I didn't abuse them by defragging.


I have SEEN, HANDLED, WORKED WITH all these drives, not just heard stories.
I've retired plenty of drives, 99.99% due to manufacturers' defects, not
wearing out. I have seen very old drives (low-GB or smaller) with years of
hard duty, and seen the same disks with hardly any use. Believe me, the
heavy-use ones were the ones that ALSO got regularly defragged. There was
hardly any correlation between use and wearing out, and *NO* correlation
with defragging.

> So not only do I save on buying defragging software, I save on doing
> backups and backup software, and on trying to recover data from a failed
> disk.
> A big saving in time and money all round.


For personal needs, no money need be spent. For large commercial purposes,
it is definitely worth the money, the time, the... whatever it takes to
maintain a well-defragged disk.

Again, more guessing, more baseless gossip. Only reason I answer is because
it's 5am and I'm bored.

--
Gary S. Terhune
MS-MVP Shell/User
www.grystmill.com
 

DaffyD®

Don't you mean disk defragmenter?
--
{ : [|]=( DaffyD®

If I knew where I was I'd be there now.


"Lord Turkey Cough" <spamdump@invalid.com> wrote in message
news:rYZrj.9104$zg.8159@newsfe5-win.ntli.net...
>
> "Pepperoni" <Pepperoni@discussions.microsoft.com> wrote in message
> news:CE574FD5-CC58-4D0C-99B0-EECFFA1C7BC0@microsoft.com...
> > My disk defragmetor keeps running in a loop and I have tried everything I
> > can, and nothing seems to be working. So, if you have any help to offer,
> > please tell me.

>
> Defragging is a waste of time.
>
>
 