Download And Search Multiple Pages Using PHP

August 2, 2011

Here is a simple way to loop through a list of URLs, download the page and search the contents for a string.

The script starts by checking whether the POST variable $_POST['urls'] is set. If it is, its contents are split on line breaks into the array $urls, which a foreach loop then walks one URL at a time.

Before processing each URL, I call set_time_limit() to reset the script's time limit to 5 minutes, so the script does not time out if a slow page is accessed or the user enters a long list of URLs.

The script then calls the function isonline(), defined below, which returns the HTTP status code (or 'OK') and places the contents of the page in the variable $content, which is passed by reference. If isonline() returns 'OK', $content is searched; otherwise the returned HTTP status code is printed out.

The PHP function flush() pushes the output to the browser at the end of each pass through the loop, instead of making the user wait for the whole script to finish before anything is displayed.
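The steps above can be sketched as follows. The form field name "search" and the helper process_urls() are assumptions for illustration, and isonline() here is a minimal file_get_contents() stand-in for the article's cURL-based version (described next):

```php
<?php
// Minimal stand-in for the article's cURL-based isonline(),
// so this sketch runs on its own.
function isonline($url, &$content)
{
    $content = @file_get_contents($url);
    if ($content !== false) {
        return 'OK';
    }
    $content = '';
    return 404; // the real version returns the actual HTTP status code
}

// Split the submitted list, check each URL, and report whether
// $needle appears in the downloaded page.
function process_urls($urlsText, $needle)
{
    // One URL per line of the textarea.
    $urls    = preg_split('/\r\n|\r|\n/', trim($urlsText));
    $results = array();

    foreach ($urls as $url) {
        // Reset the limit to 5 minutes before each URL so a slow page
        // or a long list does not hit PHP's execution timeout.
        set_time_limit(300);

        $content = '';
        $status  = isonline($url, $content);

        if ($status === 'OK') {
            $found     = (strpos($content, $needle) !== false);
            $results[] = $url . ': ' . ($found ? 'found' : 'not found');
        } else {
            $results[] = $url . ': HTTP ' . $status;
        }
    }
    return $results;
}

if (isset($_POST['urls'])) {
    $needle = isset($_POST['search']) ? $_POST['search'] : '';
    foreach (process_urls($_POST['urls'], $needle) as $line) {
        echo $line . "<br>\n";
        flush(); // push each result to the browser immediately
    }
}
```

Note that the original script echoes and flushes inside the per-URL loop itself; the results array here is only a convenience for this sketch.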

The isonline() function uses cURL to download the contents of the page into $content and returns 'OK' if the request was successful, or the HTTP status code if it wasn't. The cURL options set the target URL, request the body of the page, return the response instead of printing it, set the connection timeout to 5 minutes, and follow redirects.
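A sketch of isonline() as described; the exact option values and the success test are assumptions, not the article's verbatim code:

```php
<?php
// Fetch $url with cURL, place the body in $content (by reference),
// and return 'OK' on success or the HTTP status code on failure.
function isonline($url, &$content)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);            // target URL
    curl_setopt($ch, CURLOPT_NOBODY, false);        // download the page body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 300);  // 5-minute connect timeout
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects

    $content = curl_exec($ch);
    $status  = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($content !== false && $status < 400) {
        return 'OK';
    }
    $content = '';
    return $status;
}
```

Passing $content by reference lets one call hand back both the page body and the status; CURLOPT_RETURNTRANSFER is what makes curl_exec() return the body instead of printing it.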

 

Download Source Code

 

Filed in: PHP
