BaseSeed

Daniel Stone daniel at fooishbar.org
Thu Sep 9 08:02:27 CDT 2004


On Thu, Sep 09, 2004 at 08:39:20PM +0800, John wrote:
> James Gregory wrote:
> >On Wed, 2004-09-08 at 14:04 -0700, Matt Zimmerman wrote:
> >>On Thu, Sep 09, 2004 at 04:54:34AM +0800, John wrote:
> >>>How else do you do this?
> >>>summer at Dolphin:~$ time lynx -dump  http://www.x.com/ | tail
> >>>30. http://www.ebay.com/
> >>>31. http://www.paypal.com/cgi-bin/webscr
> >>>32. http://www.paypal.com/cgi-bin/webscr?cmd=p/gen/fdic-outside
> >>>33. http://www.paypal.com/cgi-bin/webscr?cmd=p/gen/privacy-outside
> >>>34. http://www.bbbonline.org/cks.asp?id=20111061155818568
> >>>
> >>>I regularly want a list of URLs for some reason, often to get a list of 
> >>>files to download with wget or (sometimes) with curl.
> >>>
> >>You don't need a browser at all if you only want to extract URLs.
> >>
> >>wget -O- http://www.x.com/ | urlview
> >
> >You can also go to mozilla and click 'page info'. There's a links tab
> >there with all the links for the page. But if you want to download
> >everything on a page, wget -r will work.
> 
> I can't do either of those in a script. You're missing the point.

I think another point has been missed: our standard desktop users wouldn't
want to be doing this at all (and if they're scripting stuff, I'm sure
they're more than capable of using apt).
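For what it's worth, the scriptable case John describes doesn't need a
browser or an interactive tool either. A minimal sketch with grep/sed
(the `extract_urls` name is made up here, and the naive `href="..."`
pattern assumes simple markup; a proper HTML parser is more robust):

```shell
#!/bin/sh
# Pull href targets out of HTML on stdin, one URL per line.
# Assumes double-quoted href attributes; won't handle every page.
extract_urls() {
  grep -o 'href="[^"]*"' | sed 's/^href="//; s/"$//'
}

# Example: pipe a fetched page through it, e.g.
#   wget -O- http://www.x.com/ | extract_urls
# Here, a canned snippet stands in for the fetched page:
printf '<a href="http://www.ebay.com/">eBay</a>\n' | extract_urls
```

The output is plain text, so it composes with tail, xargs, wget and
friends in a script, which is exactly what urlview (interactive) and
Mozilla's page-info dialog can't do.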

-- 
Daniel Stone                                              <daniel at fooishbar.org>
"The programs are documented fully by _The Rise and Fall of a Fooish Bar_,
available by the Info system." -- debian/manpage.sgml.ex, dh_make template