Extract all links of all pages of a website



Del
New User

Jan 19, 2004, 3:14 PM



Hi there,

I need a piece of software that I can point at a site, for example "www.domain.com", and have it extract all the links on every page of that website. I have tried many programs, but they only extract the links on the main page. If I keep going this way, I will have to visit the website page by page...

Can you help me with this? Do you know of any software that could do that?

Regards,

Daniel

P.S. I'd appreciate it if you could send a copy to dbpalma@msn.com


davorg
Thaumaturge

Jan 28, 2004, 1:52 PM


Re: [Del] Extract all links of all pages of a website

Looks like you need wget.
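Its recursive spider mode (`wget -r --spider http://www.domain.com/`) will walk every page it can reach on the site without saving them. If you would rather script the crawl yourself, the sketch below shows the idea in Python; a Perl version would use LWP::UserAgent plus HTML::LinkExtor in the same shape. Note the site contents and URLs here are made up, and the in-memory `SITE` dict stands in for real HTTP fetches: fetch a page, collect its links, and queue every same-site link you have not seen yet.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

# Hypothetical in-memory "site" standing in for HTTP fetches.
SITE = {
    "http://www.example.com/": '<a href="/a.html">A</a> <a href="http://other.org/">out</a>',
    "http://www.example.com/a.html": '<a href="/">home</a> <a href="/b.html">B</a>',
    "http://www.example.com/b.html": "no links here",
}

class LinkCollector(HTMLParser):
    """Collect the href attribute of every <a> tag seen."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start):
    """Breadth-first walk of every reachable page on the start site;
    returns the set of all links found anywhere on it."""
    seen, queue, found = set(), [start], set()
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = SITE.get(url)  # real code would fetch the page over HTTP here
        if html is None:
            continue
        parser = LinkCollector()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            found.add(absolute)
            if absolute.startswith(start):  # only recurse into same-site pages
                queue.append(absolute)
    return found

links = crawl("http://www.example.com/")
```

The `seen` set is what the programs Daniel tried were missing: it lets the crawler follow links into sub-pages without looping forever, so the final `links` set covers the whole site rather than just the front page.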

--
Dave Cross, Perl Hacker, Trainer and Writer
http://www.dave.org.uk/
Get more help at Perl Monks