

01-04-2013, 06:24 PM
Post: #11
RE:
Tried this method... found a couple of Tumblr blogs whose owners had deleted their accounts, but when I tried to create a blog with the same name it said "name already taken by another user."
It seems this trick doesn't work anymore... Too bad.
01-05-2013, 04:39 AM (This post was last modified: 01-05-2013 04:42 AM by TrafficSource.)
Post: #12
RE:
No, what you found were unconfirmed accounts. In other words, the name was registered but the signup was never completed. You say you found "a couple," which clearly shows you spent almost no time on this. I found 50,000 PR1+ sites, all open for registration, and I haven't even come close to getting through my entire lists to check them all.

Scraped 3.7 billion URLs to do it, but hey. When you put the effort in, you get the results.

Your post should read:
Made my usual half-assed attempt at something... found nothing... couldn't figure this method out, like every other one I tried... gave up after 5 minutes... seems this trick works... it's just beyond my ability... Too bad.


Dude, I found so many high-PR blogs with this method that I was wiping my ass with the PR2s and giving them away in the hundreds. So many thousands of open blogs were found that PR2 didn't make the cut.
02-01-2013, 11:13 PM
Post: #13
RE:
What is the format we should use in Scrapebox? I tried ".tumblr.com/post/" {1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0} and other variations, but I only get a few hundred sites at a time (and very few of them open for registration).

I also noticed that it scrapes for a lot of sites that do not have ".tumblr.com/post" in the url.

Thanks
02-03-2013, 04:15 AM
Post: #14
RE:
(02-01-2013 11:13 PM)divatz Wrote:  What is the format we should use in Scrapebox? I tried ".tumblr.com/post/" {1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0} and other variations, but I only get a few hundred sites at a time (and very few of them open for registration).

I also noticed that it scrapes for a lot of sites that do not have ".tumblr.com/post" in the url.

Thanks
Yeah, there's your mistake: you should have no space between /post/" and {1 — the closing quote and the first spin group need to run together.
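If you'd rather not type the footprint by hand, a short script can spin it out with no stray spaces. A minimal sketch in Python; the nine-digit spin length and the quoting style are taken from the posts above, the function name is my own:

```python
# Build a ScrapeBox-style spintax footprint for Tumblr post URLs.
# Each {1|2|...|0} group spins one digit of the numeric post ID.
# Note there is NO space between the quoted prefix and the first group.
def tumblr_footprint(digits: int = 9) -> str:
    group = "{" + "|".join("1234567890") + "}"
    return '".tumblr.com/post/"' + group * digits

print(tumblr_footprint())
```

Paste the printed line into ScrapeBox's keyword box as a single keyword so the spinner expands the digit groups itself.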
02-05-2013, 08:05 PM (This post was last modified: 02-05-2013 08:06 PM by regme07.)
Post: #15
RE:
Bro, thanks for the post; so much bandwidth went into this work..
I tried to simplify it with the query: site:tumblr.com "We couldn't find the page you were looking for." -inurl:post|tagged|page . But b****** Google still mixes in a lot of pages that just 404 =( Can you share a few PR2-3 links with me?
(02-03-2013 04:15 AM)TrafficSource Wrote:  
(02-01-2013 11:13 PM)divatz Wrote:  What is the format we should use in Scrapebox? I tried ".tumblr.com/post/" {1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0} and other variations, but I only get a few hundred sites at a time (and very few of them open for registration).

I also noticed that it scrapes for a lot of sites that do not have ".tumblr.com/post" in the url.

Thanks
Yeah, there's your mistake: you should have no space between /post/" and {1 — the closing quote and the first spin group need to run together.
02-05-2013, 08:30 PM
Post: #16
RE:
There's your problem, m8: -inurl:
That's gonna get you cockblocked by G str8 away. And with over 5 billion URLs to scrape, you really need to use that many exact post footprints; if not, you'll just end up with a load of duplicates.
02-11-2013, 10:02 AM (This post was last modified: 02-12-2013 05:23 AM by kronosss.)
Post: #17
RE:
Hi all, I want to try this method but I'm having some problems with the footprint for SB.
I tried this:
".tumblr.com/post/{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}{1|2|3|4|5|6|7|8|9|0}"

and this
Code:
".tumblr.com/post/813423302"
".tumblr.com/post/667734138"
".tumblr.com/post/612864516"
".tumblr.com/post/714378694"
".tumblr.com/post/226666908"
".tumblr.com/post/630672416"
".tumblr.com/post/480823222"
".tumblr.com/post/001509130"
".tumblr.com/post/676736486"
".tumblr.com/post/823717354"
".tumblr.com/post/945889359"
".tumblr.com/post/087960524"
".tumblr.com/post/574565073"
".tumblr.com/post/194791221"
".tumblr.com/post/881169923"
".tumblr.com/post/021600769"
".tumblr.com/post/948305072"
".tumblr.com/post/013495464"
".tumblr.com/post/878487219"
".tumblr.com/post/878363381"
".tumblr.com/post/627727237"
".tumblr.com/post/373461931"
".tumblr.com/post/116842939"
".tumblr.com/post/816741312"
".tumblr.com/post/512514826"
... more spun variations.

and got poor results... I mean only about 100 pages.

The best results came from this footprint:

Code:
".tumblr.com/post" "j"
".tumblr.com/post" "p"
".tumblr.com/post" "8"
".tumblr.com/post" "n"
".tumblr.com/post" "r"
".tumblr.com/post" "7"
".tumblr.com/post" "c"
".tumblr.com/post" "6"
".tumblr.com/post" "q"
".tumblr.com/post" "f"
".tumblr.com/post" "0"
".tumblr.com/post" "v"
".tumblr.com/post" "e"
".tumblr.com/post" "t"
".tumblr.com/post" "j"
".tumblr.com/post" "8"
".tumblr.com/post" "p"
".tumblr.com/post" "l"
".tumblr.com/post" "b"
".tumblr.com/post" "9"
".tumblr.com/post" "a"
".tumblr.com/post" "u"
".tumblr.com/post" "h"
".tumblr.com/post" "p"
".tumblr.com/post" "9"
".tumblr.com/post" "z"
".tumblr.com/post" "g"
".tumblr.com/post" "q"
".tumblr.com/post" "j"
".tumblr.com/post" "b"
".tumblr.com/post" "t"
".tumblr.com/post" "a"
".tumblr.com/post" "g"
".tumblr.com/post" "w"
".tumblr.com/post" "1"
".tumblr.com/post" "w"
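The exact-post-ID list from the first Code block above can also be generated rather than typed by hand. A minimal sketch in Python; the nine-digit IDs and the quoting style follow that list, while the function name and seed are my own:

```python
import random

# Generate a batch of exact-post footprints like the list above:
# one random nine-digit Tumblr post ID per query, no duplicates.
def post_id_footprints(count: int = 25, seed: int = 1) -> list:
    rng = random.Random(seed)  # seeded so the batch is reproducible
    ids = set()
    while len(ids) < count:
        ids.add("".join(rng.choice("0123456789") for _ in range(9)))
    return ['".tumblr.com/post/%s"' % i for i in sorted(ids)]

for line in post_id_footprints(5):
    print(line)
```

Each footprint matches a different slice of the post-URL space, which is what avoids the duplicate-heavy results the single-letter footprints give.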
07-05-2013, 02:22 AM
Post: #18
RE:
thanks for sharing
07-05-2013, 06:15 PM
Post: #19
RE:
Thanks for the guide TrafficSource.

Appreciate the work and tips.

+Rep you deserve it. Lol
07-05-2013, 07:36 PM
Post: #20
RE:
Thanks for the method OP, but I didn't understand one thing: after scraping and filtering the 404s, how do you find which domains can be registered again and aren't currently in use?
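One rough pre-filter, with the caveat from post #12 that a 404 name can still be held by an unconfirmed account: request each blog's root URL and keep the ones that return 404, then confirm by actually attempting registration on tumblr.com. A minimal standard-library sketch; the URL pattern is Tumblr's, the function names are my own:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def blog_status(subdomain, timeout=10):
    """Return the HTTP status code of http://<subdomain>.tumblr.com/."""
    req = Request("http://%s.tumblr.com/" % subdomain,
                  headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code  # urlopen raises on 404, so catch and report it

def is_registration_candidate(status):
    # 404 = blog gone, worth trying to register; anything else = alive/taken.
    # A 404 name can still be reserved by an unconfirmed signup (post #12),
    # so the final check is always the signup form itself.
    return status == 404
```

Run `is_registration_candidate(blog_status(name))` over the subdomains pulled from your scraped URLs, throttled through proxies so Tumblr doesn't rate-limit you.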