How to use GatherProxy scraper
Gather proxies from. By default, the program uses website sources from gatherproxy.com to harvest proxies. You can also use your own custom URL sources by selecting the My URLs option. With this option, GatherProxy provides several expressions that let the grabber collect proxies in much smarter ways.
Custom URL source expressions
- #date# - Replaced by the current date on your PC.
Example: if today is 2013-07-24 and you have this link:
http://google.com/search?q=proxy+server+list+#date#
then the program will scrape:
http://google.com/search?q=proxy+server+list+2013-07-24
- #date-1# - Replaced by yesterday's date (the current date on your PC minus one day).
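As an illustration, here is a minimal Python sketch of how these date tokens could be expanded. The function name `expand_date` and the exact substitution logic are assumptions based on the example above, not GatherProxy's actual implementation:

```python
from datetime import date, timedelta

def expand_date(url: str, today: date) -> str:
    # Hypothetical helper: replace #date-1# and #date# tokens with
    # YYYY-MM-DD dates, matching the format shown in the example above.
    url = url.replace("#date-1#", (today - timedelta(days=1)).isoformat())
    return url.replace("#date#", today.isoformat())

print(expand_date("http://google.com/search?q=proxy+server+list+#date#",
                  date(2013, 7, 24)))
# http://google.com/search?q=proxy+server+list+2013-07-24
```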
- #loop(startval,endval,counter)# - This expression lets you loop over pages. It is useful and saves time when you want to grab every page of a topic.
Example: take the link http://example.com/forum/threads/topic-name/1.html - this is page 1 of "topic name". The topic has 15 pages, and you want to grab them all without copying each URL manually. The URL
http://example.com/forum/threads/topic-name/#loop(1,15,1)#.html
will expand into 15 URLs when the program scrapes.
Parameters:
startval - starting value
endval - ending value
counter - increment added on each iteration
Ex: #loop(10,30,5)# returns: 10 - 15 - 20 - 25 - 30
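The expansion described above can be sketched in a few lines of Python. This is a hypothetical reimplementation for illustration only; the real program's parsing rules may differ:

```python
import re

def expand_loop(url: str) -> list[str]:
    # Hypothetical expansion of a #loop(start,end,step)# token into
    # one URL per value, with the end value included.
    m = re.search(r"#loop\((\d+),(\d+),(\d+)\)#", url)
    if not m:
        return [url]
    start, end, step = (int(g) for g in m.groups())
    return [url[:m.start()] + str(v) + url[m.end():]
            for v in range(start, end + 1, step)]

urls = expand_loop("http://example.com/forum/threads/topic-name/#loop(1,15,1)#.html")
print(len(urls))  # 15
print(urls[0])    # http://example.com/forum/threads/topic-name/1.html
```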
- #deep(1)# - A powerful feature: use it to deep-scan a URL. (This expression must be at the end of the link.)
http://google.com/search?q=proxy+server+list+#date##deep(1)# will become
http://google.com/search?q=proxy+server+list+2013-07-24#deep(1)# and the program will then deep-scan that URL. What is a deep scan?
The program visits the URL, grabs all links from that page, and harvests proxies from the grabbed links. Please watch the video demo for clarification. Another example:
This URL lets you scrape 3 pages of Google search results.
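Conceptually, a deep scan has two halves: collect every link on the page, then pull ip:port pairs out of each linked page. The sketch below shows only the parsing half; the regexes and function names are assumptions for illustration, and GatherProxy's real extraction rules may be different:

```python
import re
from urllib.parse import urljoin

LINK_RE = re.compile(r'href="([^"]+)"')                 # naive href extractor
PROXY_RE = re.compile(r"\b(\d{1,3}(?:\.\d{1,3}){3}):(\d{1,5})\b")

def extract_links(html: str, base_url: str) -> list[str]:
    # Collect absolute URLs from every href on the page.
    return [urljoin(base_url, h) for h in LINK_RE.findall(html)]

def extract_proxies(text: str) -> list[str]:
    # Pull ip:port pairs out of arbitrary page text.
    return [f"{ip}:{port}" for ip, port in PROXY_RE.findall(text)]

html = '<a href="/list/page2.html">next</a>'
print(extract_links(html, "http://example.com/list/"))
# ['http://example.com/list/page2.html']
print(extract_proxies("free proxy 10.0.0.1:8080 tested"))
# ['10.0.0.1:8080']
```

A deep scan would then fetch each extracted link and run the proxy extractor over the response body.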
- #dfil(keyword)# - This expression is available only when you are using the #deep(1)# function. It lets you filter the scraped URLs.
Ex: to scrape every thread on an example.com forum, but only threads whose URL contains the keyword "scrapebox":
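The filter itself amounts to a simple substring match over the URLs collected by the deep scan. A minimal sketch (the function name `dfil` mirrors the expression but is otherwise an assumption):

```python
def dfil(urls: list[str], keyword: str) -> list[str]:
    # Keep only the scraped URLs whose address contains the keyword.
    return [u for u in urls if keyword in u]

pages = [
    "http://example.com/threads/scrapebox-tips.html",
    "http://example.com/threads/other-topic.html",
]
print(dfil(pages, "scrapebox"))
# ['http://example.com/threads/scrapebox-tips.html']
```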
Filter Proxy option. If this option is checked, the program automatically filters the gathered proxies using the criteria set on the Proxy Filter tab once gathering completes.
Start proxy checker after gathering completed option. If this option is checked, the program automatically starts the proxy checker once gathering completes.
Quick Menus. Paste (import from clipboard): this feature lets you quickly import your proxy list, or a text file containing proxy servers. Example: go to the Hidemyass proxy list page, press Ctrl + A, right-click and select Copy, then go to GatherProxy and use the "Paste (...)" menu.
On this tab, you can filter your proxies by several criteria. Notes:
For the port filter, each port must be on its own line.
The format for an IP address range is [lower ip] [upper ip], e.g.: 192.168.1.22 18.104.22.168
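An inclusive range check like the one this filter describes can be expressed with Python's ipaddress module. This is an illustrative sketch with made-up range values, not the program's own logic:

```python
from ipaddress import ip_address

def in_range(ip: str, lower: str, upper: str) -> bool:
    # True when ip falls inside the inclusive [lower, upper] range.
    return ip_address(lower) <= ip_address(ip) <= ip_address(upper)

print(in_range("192.168.1.50", "192.168.1.22", "192.168.1.200"))  # True
print(in_range("192.168.2.1", "192.168.1.22", "192.168.1.200"))   # False
```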
This feature is similar to http://whatismyipaddress.com/blacklist-check
To get the highest speed, select the correct "checking for" option. If you want to check proxies only, select only the "Proxy checker" option. If you want to check SOCKS proxies, select "Socks 4" or "Socks 5" and uncheck "Proxy checker".
Last updated version: V7.7 (2013/12/17)
How to get 1000+ Google-passed proxies? Please check here
New features demo: