(06-14-2014 07:12 AM)roukousnel Wrote: [ -> ]Hello, thx for the share...
I have something to ask; for the scrapebox to work with google do I have to use only Google passed proxies?
I'm a newcomer to all this stuff and i really can't make scrapebox to work with big G
Hello roukousnel,
Over the last few months I have noticed that Google has changed the way it detects proxies; even if you use Google-passed proxies you will still run into problems.
After a lot of trial and error I discovered a trick, and I would like to share it with BBHF members.
To stop Google from detecting your use of proxies, you need to parse the same proxy twice: once with a fake port, then again with the actual port. This bypasses the way Google marks the proxy address, and it also keeps Google's proxy detection from blacklisting the NOD of your connection.
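A minimal sketch of the "fake port first" trick described above, as I understand it. The function name, the placeholder port number, and the list format are my own assumptions for illustration, not part of any real tool:

```python
FAKE_PORT = 9999  # placeholder decoy port; any unused port number would do


def double_entry(proxies):
    """Expand each host:port pair into a fake-port line followed by the real line."""
    out = []
    for entry in proxies:
        host, port = entry.rsplit(":", 1)
        out.append(f"{host}:{FAKE_PORT}")  # decoy entry with the fake port
        out.append(f"{host}:{port}")       # real entry with the actual port
    return out


print(double_entry(["203.0.113.7:8080"]))
```

The doubled list is what you would feed to your SEO tool in place of the original proxy list.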
The days of just loading proxies and going are over, or at least ending soon, so we need a better way around Google's proxy detection.
I am working with friends at my university on an application that loads all proxies dynamically and then acts as a single NOD with a multi-proxy feed. That way your SEO application no longer needs to load a proxy list; instead it loads a single address, your local PC, which is 127.0.0.1 (localhost) for short.
In the process, the application performs the FAKE switching of ports for the same proxy, then moves on to the next proxy, and so on. It does this fast enough that your SEO application never stalls or reports a bad connection.
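To give an idea of the rotation part only (the real application also forwards the traffic, which is not shown here), here is a rough sketch. The class name and structure are assumptions of mine, not the actual project code:

```python
import itertools


class ProxyRotator:
    """Cycles endlessly through a pool of upstream proxies."""

    def __init__(self, proxies):
        self._cycle = itertools.cycle(proxies)

    def next_upstream(self):
        # An SEO tool pointed at 127.0.0.1 would have each of its
        # connections forwarded to whichever proxy this returns next.
        return next(self._cycle)


rotator = ProxyRotator(["198.51.100.1:3128", "198.51.100.2:3128"])
print(rotator.next_upstream())
```

Each call to `next_upstream()` hands back the next proxy in the pool, wrapping around when the pool is exhausted.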
To answer your question about ScrapeBox: SB was a very good tool in past years, but its power is coming to an end now that Google is improving the whole process.
The proxy list I will provide and update is not just another list of proxies; these are permanent, fast, elite, and stable proxies. As I mentioned in the first post of this thread, proxies are harvested every minute, tested, and moved to a database, where they are tested again for a full 7 days. The application then looks at the ratios, picks the most stable, fast, and secure proxies, marks them as permanent, and adds them to the list provided in this thread for everyone to download.
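The 7-day vetting step could look roughly like this. The pass-ratio threshold, the function name, and the record layout are all my own assumptions for illustration:

```python
def mark_permanent(results, threshold=0.95):
    """results maps each proxy to a list of True/False check outcomes
    collected over the 7-day test window; keep proxies whose pass
    ratio clears the threshold."""
    permanent = []
    for proxy, checks in results.items():
        ratio = sum(checks) / len(checks)
        if ratio >= threshold:
            permanent.append(proxy)
    return permanent
```

Proxies that survive the cutoff are the ones that would be published to the thread's list.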
The same process is then repeated with those proxies plus newly discovered ones, and so on.
The goal of this thread is to give everyone a solid list of proxies to use. That doesn't stop you from harvesting proxies locally, but you will notice the difference when you use the provided list.
Remember, the list contains thousands of proxies and will stay at around 1500. You will never need more than 500 proxies to rotate in any SEO application; anything over 500 is just a waste of resources, because you would never submit an article to more than 500 article sites anyway.
Let us go for quality instead of quantity; that is the key to any SEO project.
Cheers
Sandra
