Best Blackhat Forum

Full Version: [GET] 5 million links for ScrapeBox
This list is filtered from my scraping runs. I scraped millions of URLs and ran them through ScrapeBox to see which would come back as postable, and every URL here returned a success in ScrapeBox. The only filter I applied was dumping every URL that ScrapeBox reported as failed.

There are auto-approve URLs in this list, but the larger portion is moderated. By crafting some clever comments you can get links on URLs that other people are not getting links on.

Cutting to the chase:

List contains 4,922,910 URLs from 353,303 domains.

All are compatible with Scrapebox Fast Poster.

This list is broken down into smaller files so you can easily post to the smaller chunks; ScrapeBox would likely crash if you tried to post to all 4.9 million URLs in one run.

This is the first list in a series; more will follow.

All of these work in ScrapeBox Fast Poster, but the list can be used with any blog commenter that supports these platforms.

Specifically these are from:

Wordpress
Movable Type
Blog Engine
B2Evolution



Download (107.4 MB):
http://www.datafilehost.com/download-c3711263.html
http://www.datafilehost.com/download-360d4e20.html
Thanks, I'll test it now.
Is ScrapeBox Fast Poster different from ScrapeBox?
ScrapeBox has three posting modes: 1. Fast Poster, 2. Slow Poster, and 3. Manual Poster.

This list is intended for the Fast Poster, but you can also use it with the Slow and Manual Posters. You'll find those options in ScrapeBox.
Most of the links are dead and the vast majority (90%+) are nofollow; this list is just useless. Kindly test any list at your own end before posting it.
Quote: List contains 4,922,910 urls from 353,303 domains.
You should filter your lists better, man. That is 13.9 URLs per domain. You don't want to post many times to the same site! If an admin sees 10 comments waiting from you, he will deny them all.

There is a tool for ScrapeBox that will combine many lists together, drop the root domains themselves (you can't post on the homepage of a WP blog, so site.com is useless!), and only accept X number of URLs from the same site.

I combined all my posted URLs together and told it to accept no more than 5 URLs per site, and ended up with 2.2 million.
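The tool isn't named here, but the operation it describes (merge lists, drop bare homepages, cap URLs per domain) is easy to picture. A rough Python sketch, assuming plain one-URL-per-line lists; `cap_urls_per_domain` and the 5-per-site default are my own illustration, not the actual tool:

```python
from collections import defaultdict
from urllib.parse import urlparse

def cap_urls_per_domain(urls, max_per_domain=5):
    """Keep at most max_per_domain URLs per domain, dropping bare
    homepages (there is no postable comment form on a WP root page)."""
    kept = defaultdict(list)
    for url in urls:
        parsed = urlparse(url)
        # drop root domains like http://site.com/ -- nothing postable there
        if parsed.path in ("", "/") and not parsed.query:
            continue
        domain = parsed.netloc.lower()
        if len(kept[domain]) < max_per_domain:
            kept[domain].append(url)
    return [u for bucket in kept.values() for u in bucket]
```

Feed it the merged contents of all your lists and you get the same kind of capped output the poster describes.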

Split your list up into manageable smaller lists. Check the start and end of each list to ensure you don't have URLs from one site in two separate lists (meaning the splitter didn't keep them together). Then grab a huge list of good proxies (I run over 1k proxies) and PR check the entire lists. When done with each list, remove URLs that have no PageRank, or PR0. Export the URLs with PR so you don't have to PR check again in the future! Then check all the URLs to see if they are indexed, and remove any that aren't. When you're done, you have only indexed URLs with PageRank.
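The "check the start and end of each list" step exists because a naive split can cut one site's URLs across two files. A small Python sketch of a domain-aware splitter that avoids the problem by never breaking a domain's group across chunks (the function name and chunk size are my own assumptions):

```python
from itertools import groupby
from urllib.parse import urlparse

def split_by_domain(urls, chunk_size=100_000):
    """Split a URL list into chunks of roughly chunk_size URLs,
    never splitting one domain's URLs across two chunks.
    A single domain larger than chunk_size still stays together."""
    key = lambda u: urlparse(u).netloc.lower()
    urls = sorted(urls, key=key)          # groupby needs sorted input
    chunks, current = [], []
    for _, group in groupby(urls, key=key):
        group = list(group)
        if current and len(current) + len(group) > chunk_size:
            chunks.append(current)        # close the chunk before this domain
            current = []
        current.extend(group)
    if current:
        chunks.append(current)
    return chunks
```

With chunks built this way, the manual start/end check becomes unnecessary: no domain ever straddles two files.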

I think the tool also keeps the highest PageRank URLs when cleaning. So if you told it to only accept 5 URLs from the same domain, it wouldn't remove a PR5 URL in favor of a lower PageRank URL from the same site, so you don't lose the highest PR pages! IMPORTANT!

When you're done, you have the highest PageRank pages of each site, not 15 URLs from the same site!
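That keep-the-highest-PR behavior is just a per-domain sort before the cap. A sketch of how it might work, assuming you already have (url, PageRank) pairs from a PR check; `cap_keep_highest_pr` is an illustrative name, not the tool's:

```python
from collections import defaultdict
from urllib.parse import urlparse

def cap_keep_highest_pr(url_pr_pairs, max_per_domain=5):
    """From (url, pagerank) pairs, keep the top max_per_domain URLs
    per domain ranked by PageRank, so a PR5 page is never dropped
    in favor of a PR1 page from the same site."""
    by_domain = defaultdict(list)
    for url, pr in url_pr_pairs:
        by_domain[urlparse(url).netloc.lower()].append((url, pr))
    result = []
    for pairs in by_domain.values():
        pairs.sort(key=lambda p: p[1], reverse=True)   # highest PR first
        result.extend(u for u, _ in pairs[:max_per_domain])
    return result
```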

I then split my lists into better ones, like a PR3-and-higher list and a PR5-and-higher list. I also keep a mixed list of no-PR and PR0-PR2 URLs. I run that list too, so Google sees a diverse set of links, not just PR3 and higher, and it looks natural.

The key to ScrapeBox is managing your lists (URLs). I have many, many folders and lists, and it's all cleaned and ordered! That is the key: good housekeeping! ScrapeBox is still really, really powerful if you know how to use it to its full potential.
Hi all, good luck to you.