Tutorial: Super Sneaky Ad Planner Trick

I hinted in my post about creating your own scraper that I would come back and explain why the hell one would want to scrape oDigger.


There are two reasons why I scrape oDigger – and this post sheds some light on the first one.

In the video below, I'm gonna show you how I use Ad Planner to tell me exactly which offers are most relevant to a target domain, without manually looking up demographics!

For example:

Let's say I want to target with PPV and promote email submits from my favourite email/zip submit network.

Instead of manually choosing the offers, or looking at the demographics of and guessing which offers work…

…I let Google Ad Planner tell me which ones are most related!

One damaging admission though.

I made a mistake in the video. See if you can spot it.

The Video

Cool, don't you think?

oDigger Scraper

I've updated the oDigger scraper with an extra column that reduces all domains to top domains only; subdomain URLs don't do very well in Ad Planner.
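For anyone curious how that "top domains only" column works, here is a minimal Java sketch of the same idea. The class and method names are mine, and the two-label heuristic is an assumption: multi-part TLDs like .co.uk would need the Public Suffix List to handle correctly.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class TopDomain {
    // Reduce a full URL to its top (registered) domain by keeping only the
    // last two labels of the host. Naive heuristic: it mishandles multi-part
    // TLDs such as .co.uk, which need the Public Suffix List.
    static String topDomain(String url) {
        try {
            String host = new URI(url).getHost();
            if (host == null) return null;
            String[] labels = host.split("\\.");
            int n = labels.length;
            return n <= 2 ? host : labels[n - 2] + "." + labels[n - 1];
        } catch (URISyntaxException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(topDomain("http://offers.example.com/lp/1")); // example.com
    }
}
```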

Here's the link again:

Google Spreadsheet oDigger Scraper

Have fun playing around with this.


User Comment:
can’t see the video?

User Comment:
should be there now.

User Comment:
ok thanks.. it’s ok now

User Comment:
tijn, I get this message when I try to view the spreadsheet – do I need a US proxy to view it?

User Comment:
^ same here.. and i’m in US

User Comment:
…=en_GB&newcopy

remove &newcopy

…FQU0E&hl=en_GB

User Comment:
I keep getting this error when trying to watch the video:

"Sorry, you do not have permission to watch this private video."

I have a Vimeo account, what else am I missing?

User Comment:
tijn, you're sick man. Love it!

User Comment:
Can’t modify the cell to paste my link in there. It says the cell cannot be edited.

User Comment:
Never mind. Made my own real quick. Awesome tutorial to do this. Thanks. 🙂

User Comment:
if the link above is not working to load a copy of the spreadsheet, try clearing your cookies and re-logging in to your Google Docs account. This has worked for some.

Removing &newcopy means you will access the master spreadsheet – but in read-only mode. Therefore you won't be able to update the URL.

Let me know if this sorts it out.

If not, I'll post the formulas here so you can create your own copy.


User Comment:
Any way to get the formula so we can make our own? I can't get the paste function to work either, even after logging in again…

User Comment:
Thanks for the helpful spreadsheet! Had a little trouble with opening the Google Doc, but the following worked for me:

1. Login to Google Docs
2. Copy tijn's Google Spreadsheet link
3. In the Google Docs homepage, paste the link and hit enter

For some reason, the link wouldn’t open straight from the forum.

User Comment:
You can get up to 200 results per page in oDigger. Just write 200 in the &num variable in the URL: …co_id=&pt_id=&num=200

It seems that’s the real limit. I tried with 300 and 500 and the results were still 200 per page.

Hope this helps.
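As a sketch of that tip, here is a small Java helper that sets the &num parameter on a search URL. The class and method names are mine; the 200-result cap is as reported above.

```java
public class NumParam {
    // Set the &num= results-per-page parameter on an oDigger-style search URL.
    static String withNum(String url, int num) {
        if (url.matches(".*[?&]num=\\d*.*")) {
            // Replace an existing num value (possibly empty)
            return url.replaceFirst("([?&])num=\\d*", "$1num=" + num);
        }
        // Otherwise append the parameter
        return url + (url.contains("?") ? "&" : "?") + "num=" + num;
    }

    public static void main(String[] args) {
        System.out.println(withNum("http://odigger.com/search?co_id=&pt_id=&num=", 200));
        // http://odigger.com/search?co_id=&pt_id=&num=200
    }
}
```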


User Comment:
This is all I get back from the spreadsheet when I input the URL from oDigger. Any suggestions?

Offer URL | Domain | Unique Full Domains | No Subdomains

User Comment:
holy shit… thanks a million!

I wish Google wouldn't use such crappy JavaScript interfaces. It would be much easier to fully automate this.

User Comment:
Yes, when using PeerFly the spreadsheet just has


etc. for all 100 offers; it is not pulling the full domain. I'm guessing the full URL is no longer there in oDigger? When you click the preview button on oDigger the URL is

User Comment:
Yeah, sorry guys. PeerFly have changed the data they send to oDigger, and oDigger have changed some stuff as well, so this trick is no longer working! ;(

User Comment:
Hey tijn,

Whilst this doesn't work with oDigger anymore, … it will still work with our affiliate networks' own sites, right?
E.g. if I log into NeverBlue and click on 'Find Campaigns', I can then select 'Show All Campaigns' at the bottom and see all 1281 of their campaigns.
In their system, they show the actual URL of the offer, which I can grab from the HTML code in the following tag:

<td class="preview"><a href="" target="_BLANK" ><img src="img/icon-test-preview-lp.gif" alt="Preview Offer" title="Preview Offer" /></a></td>

However, I have nooooooooooo idea how to alter your spreadsheet to rip these out.

Edit: Or perhaps there is a way to do it ‘in browser’ using GreaseMonkey or some other equally foreign plugin.

Can you help us out? I’m sure this would be applicable to quite a few affiliate networks, assuming they list their previews of offers as the actual offer urls.


User Comment:
1. In the spreadsheet, change cell A7 to:
=IMPORTXML(A4,"//td[@class='result-preview-link']/a/@href")

2. Then insert a new column between A and B.

3. In the new B7, put this:

4. Copy that down the rest of column B.

Et voila.

I would share an updated spreadsheet, but I don’t use google docs and have no idea how to do that – sorry!
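If you would rather do this outside Google Docs, the same preview-link extraction can be sketched in Java with a regex. This is fragile by nature: the markup shape is assumed from the NeverBlue snippet quoted earlier in the thread, and the class name is mine.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PreviewLinks {
    // Pull every href out of <td class="preview"><a href="..."> cells.
    // Regex scraping of HTML is brittle; this only works for markup shaped
    // like the snippet quoted in the thread.
    static List<String> extract(String html) {
        List<String> urls = new ArrayList<>();
        Matcher m = Pattern.compile("<td class=\"preview\"><a href=\"([^\"]+)\"").matcher(html);
        while (m.find()) urls.add(m.group(1));
        return urls;
    }

    public static void main(String[] args) {
        String html = "<td class=\"preview\"><a href=\"http://example.com/offer1\" target=\"_BLANK\"></a></td>";
        System.out.println(extract(html)); // [http://example.com/offer1]
    }
}
```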


User Comment:
@extremesg -> the problem is that the NeverBlue data is behind a login, and therefore the Google spreadsheet can't get at it ;(. There is a way, I think, to log in to sites with Google Apps Script, but that's getting very advanced.

User Comment:
Hey tijn, you can export all NeverBlue data from their API URL.
It's really cool because you get access to all the creatives for each campaign.

You might need to ask for access, I don't remember.

Don't know if you use Java, but here is how I do it…

import java.io.*;
import java.net.*;
import java.util.Base64;

String urlString = "hxxps://";

protected byte[] downloadContentFromUrlInBytes(String urlString, String username, String password)
        throws ConnectException {

    byte[] byteContent = null;

    try {
        URL url = new URL(urlString);
        String userPassword = username + ":" + password;

        // Need to work with URLConnection to set the request property
        URLConnection uc = url.openConnection();
        if (username != null && username.trim().length() > 0) {
            // Encode the credentials for HTTP Basic auth
            // (the original used sun.misc.BASE64Encoder, which newer JDKs no longer expose)
            String encoding = Base64.getEncoder().encodeToString(userPassword.getBytes());
            uc.setRequestProperty("Authorization", "Basic " + encoding);
        }

        InputStream is = uc.getInputStream();
        BufferedInputStream bufis = new BufferedInputStream(is);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int len;
        while ((len = bufis.read(buf)) > 0) {
            bos.write(buf, 0, len);
        }
        byteContent = bos.toByteArray();
    } catch (MalformedURLException e) {
        LOG.error("Invalid URL", e); // LOG is the class's logger
    } catch (ConnectException e) {
        throw e;
    } catch (IOException e) {
        LOG.error("URL read error. Unavailable or username/password incorrect.", e);
    }
    return byteContent;
}

User Comment:
About the PeerFly issue, what I did was install the 'Copy Links' Firefox plugin,
then displayed all offers in PeerFly, selected them all, right-click -> Copy Links -> Copy Selected Links,
then pasted them into Excel and removed the duplicate entries (each entry had http and https URLs) using the following Excel macro:

Sub DeleteHttpsRows()
    Dim c As Range
    Dim SrchRng As Range
    Set SrchRng = ActiveSheet.Range("A1", ActiveSheet.Range("A65536").End(xlUp))
    Do
        Set c = SrchRng.Find("https", LookIn:=xlValues)
        If Not c Is Nothing Then c.EntireRow.Delete
    Loop While Not c Is Nothing
End Sub

The list of URLs you end up with can then be put into tijn's spreadsheet to let it do its magic.


User Comment:
Awesome. Didn't know that.

User Comment:
This forum is filled with hardcore stuff.

User Comment:
Thanks guys….

So, being a total coding fcukwit, … how do I take sean's Java code and test this out on a Mac with my NeverBlue account?
(Let's assume you don't need to ask for permission to use their API – I've no idea if that's true or not.)

User Comment:
Lemme package that 'NeverBlue downloader' Java code into an exe file for you… I'll post it later today.


User Comment:
Hi Seank,

Thanks for your kind offer, … but alas, I'm a Mac dude.
Is there another way to do it that will work on both PC and Mac?

I thought I might be able to use GreaseMonkey to do it in Firefox, … but apparently I'm too r3tarded to work it out!

User Comment:
Get VMware Fusion and install Windows XP in a virtual machine. Works perfectly for most software.

User Comment:
@tijn, I use VMware, … I just don't have it fired up 24/7. For the purposes of what we are doing here, there should be a way to get this to work platform-independently, right?
I was thinking along the lines of a script I could run in GreaseMonkey, or something similar.

User Comment:
Yeah, I'm sure there is.
And I think you can run Java locally on a Mac with the Java VM, but I'm not too familiar with this stuff.

User Comment:
Is this spreadsheet now not working, or what?

User Comment:
Nope, not worky anymore due to changes at oDigger.

User Comment:
tijn, thank you so much for this awesome tutorial.
I tried it and it worked like a charm.
40% conversion rate choosing the best matched offer for my traffic!

Thanks to you!

With Neverblue, I did it the dumbest way ever:

– First, I launched the offers search I wanted.
– Then, I simply displayed the source code.

==> Selected all the offers, copying the HTML code starting at:

<table cellspacing="0" class="offers">

and ending at:


And pasted it into an HTML file.

I saved it as 'temp.html'.

– After that, I replaced all of the following strings (with nothing):
<td class="preview"><a href="
<td class="preview">


" target="_BLANK" ><img src="img/icon-test-preview-lp.gif" alt="Preview Offer" title="Preview Offer" /></a></td>

– Saved the .html file.

– Displayed temp.html in my browser.

– Very often Excel freezes when copying tables from web browsers, so I copied and pasted the table into Notepad, then copied and pasted it again into Excel.
(I told you that it was the dumbest tactic ever.)

And that’s all, it worked perfectly well.

At the very end, I filtered out the parameters after the /.

(I hope you guys understand my English.)

Anyway, it’s just a newbie method that took me no more than 2 minutes.
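julien's find-and-replace steps could also be automated in a few lines of Java. This is only a sketch using the literal strings he lists above: the class name is mine, and the markup must match those strings exactly for the replaces to fire.

```java
public class StripPreviewTags {
    // Delete the markup before and after each preview URL so only the URLs
    // remain, mirroring the manual find-and-replace described in the thread.
    static String strip(String html) {
        return html
            .replace("<td class=\"preview\"><a href=\"", "")
            .replace("\" target=\"_BLANK\" ><img src=\"img/icon-test-preview-lp.gif\" alt=\"Preview Offer\" title=\"Preview Offer\" /></a></td>", "")
            .replace("<td class=\"preview\">", "");
    }

    public static void main(String[] args) {
        String row = "<td class=\"preview\"><a href=\"http://example.com/lp\" target=\"_BLANK\" ><img src=\"img/icon-test-preview-lp.gif\" alt=\"Preview Offer\" title=\"Preview Offer\" /></a></td>";
        System.out.println(strip(row)); // http://example.com/lp
    }
}
```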

User Comment:
This is simply epic

User Comment:

Originally Posted by julien

tijn, thank you so much for this awesome tutorial.
I tried it and it worked like a charm.
40% conversion rate choosing the best matched offer for my traffic!

Awesome result! Better than I have had with this method.

User Comment:
It seems Ad Planner changed their format and now I can't figure out how to do this. I was trying it with a list of sites from SiteScout without any luck.

User Comment:
Create your list.

Add your domains and save your list.

Start your research.

Then click start search.

Enter the domain you're targeting.

Restrict the sites shown to your defined list.

Select your list in the 'Limit To' row and submit.

And sort by Comp Index.

User Comment:
Awesome tutorial, thanks tijn! This method might not work anymore, but we could still manually enter offers that are actually related to the domain that is making $$$ into Ad Planner, to see which ones are most related to the domain that has the traffic for the already chosen offer, can't we? I mean, we could hand-pick a whole bunch of them? I know it might sound obvious, but just wanna make sure.


User Comment:
Does this not work anymore? I haven't tried it recently.

tony -> re your point: yes, you can do that, but in the past on PPV it has actually not had a dramatic result for me in most cases.

User Comment:
I was looking at this today. Tijn, I think your scraper could still work if you could program it to go to the preview page on oDigger and then pull out the iframe src > <iframe scrolling="yes" marginheight="0" marginwidth="0" border="0" frameborder="0" src=" ampaign=cpl&utm_term=a89vb&utm_content=live_room" ></iframe>

That's one I was looking at today. I have no clue how to do this, but it'd be freaking sweet if someone else could take a look. I may kick around and see what I can come up with.
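For what it's worth, pulling the src out of that preview-page iframe could be sketched like this in Java. It is regex-based and fragile; the class and method names are mine, and it assumes a double-quoted src attribute.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class IframeSrc {
    // Pull the src attribute out of the first <iframe> in a page.
    // No real HTML parsing; only a sketch for simple, well-formed markup.
    static String iframeSrc(String html) {
        Matcher m = Pattern.compile("<iframe[^>]*\\bsrc=\"([^\"]*)\"").matcher(html);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String html = "<iframe scrolling=\"yes\" frameborder=\"0\" src=\"http://example.com/lp?utm_term=a89vb\"></iframe>";
        System.out.println(iframeSrc(html)); // http://example.com/lp?utm_term=a89vb
    }
}
```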

User Comment:
I wrote a quick scraper for this a few days back after reading this thread. It now takes all offers from certain search parameters, fetches each offer one by one, and follows the offer URLs to get the final LP URL. However, it's not very good, because many of the networks use different kinds of redirects (meta, JS, etc.), and if the redirect is anything other than a normal server-side 302/301, the script has to know how to get the next URL in the chain. And that means a lot of work.
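The server-side 301/302 part of that redirect chain is the easy bit. Here is a minimal Java sketch of just that case; the class and method names are mine, and meta-refresh and JavaScript redirects are exactly the hard cases this does not handle.

```java
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;

public class RedirectChain {
    // Resolve a Location header (possibly relative) against the current URL.
    static String resolve(String base, String location) {
        try {
            return new URL(new URL(base), location).toString();
        } catch (MalformedURLException e) {
            return location;
        }
    }

    // Follow plain server-side 3xx redirects by hand until a non-redirect
    // response (or maxHops) is reached.
    static String finalUrl(String start, int maxHops) throws Exception {
        String current = start;
        for (int i = 0; i < maxHops; i++) {
            HttpURLConnection conn = (HttpURLConnection) new URL(current).openConnection();
            conn.setInstanceFollowRedirects(false);
            conn.setRequestMethod("HEAD");
            int code = conn.getResponseCode();
            String location = conn.getHeaderField("Location");
            conn.disconnect();
            if (code < 300 || code >= 400 || location == null) break;
            current = resolve(current, location);
        }
        return current;
    }

    public static void main(String[] args) {
        System.out.println(resolve("http://a.com/path/page", "/next")); // http://a.com/next
    }
}
```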

Another thing is that the whole Ad Planner trick doesn't actually work that well anymore, because Ad Planner now only shows GDN data.

Article published 08-06-2011, 02:53 PM
