problems with fish:/ (garbage collection)
Gabriel Dragffy
dragffy at yandex.ru
Sun Jun 11 02:28:09 UTC 2006
Browsing with fish:// creates a new process on the remote machine; you should
browse with the sftp:// protocol instead. The fish: protocol was developed
specifically for SSH servers that don't have sftp enabled: it uploads a small
Perl script to the remote host and executes it to act as a kind of ad-hoc sftp
server, which means you are starting extra processes on the remote machine. If
you can use sftp, it talks to the sftp server already running on the remote
host, so no extra services are executed. My hypothesis is therefore that the
problem is not the fish:// protocol itself but the way in which it is being
used, and I'd expect that switching to sftp:// (if the server supports it)
will eliminate your problems.
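To verify the hosting company's report, something like the following can be run on the remote host. This is just a sketch using standard ps/awk; the exact column set and ordering can differ between systems:

```shell
# List defunct ("zombie") processes: children that have exited but whose
# parent has not yet reaped them with wait(). Each one holds only a
# process-table slot, but hundreds of them mean the parent never reaps.
ps -eo pid,ppid,stat,comm | awk '$3 ~ /Z/ {print}'

# Count them:
ps -eo stat | awk '$1 ~ /Z/ {n++} END {print n+0}'
```

Before switching Konqueror over, running `sftp user@host` from a shell (with your own user and host substituted) will confirm whether the server's sftp subsystem is actually enabled.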
Gabe
On Sunday 11 June 2006 09:16, cliff1976 wrote:
> Hi all,
>
> I love how easy it is to browse/read/edit remote files using Konqueror and
> fish. However, my web hosting company just told me that it's causing
> hundreds of zombie processes and thus a drain on their resources.
>
> This is the first I've heard of this. I've been using fish:/ to
> browse/read/write remote files for almost a year (starting with Hoary, then
> Breezy, and now Dapper). I would love to hear any others' experiences on
> this topic.
>
> For now, my web host has suggested I report this behavior as a bug to the
> developers; I've asked them for some specifics, but have nothing (yet) to
> report.
>
> Can anyone recommend any other remote development tools I can run within
> KDE? I'd hate to fall back to using vim for all my coding and scp for my
> individual file transfers. I'll miss Kate terribly. :-(
>
> Thanks in advance for any advice.
> Cliff
More information about the kubuntu-users mailing list