

Gold How to Use Footprints to Build Super Relevant Backlinks!

The first thread on AffiliateFix to reach over 1 MILLION views!

Jay, you the man! :ninja:
 
Thanks for posting this, Jay. This technique is also very useful when you're trying to find specific content on any site. For example, if you need to cite information from a government website but you're not sure if or where they cover the topic. I do this often on the USCIS website when I'm writing content for one of my clients who is an immigration lawyer. I hadn't thought about using it for forums/backlinking, though. Thanks for the tip!
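
If you ever want to script this kind of query building instead of typing each one by hand, here's a rough Python sketch; the domain and topics below are just placeholder examples for illustration, not part of anyone's actual workflow:

```python
# Rough sketch: build "site:" search queries for finding citable pages on a
# specific domain. The domain and topics are placeholder examples only.

def build_site_queries(domain, topics):
    """Combine a site: restriction with each topic into a Google query string."""
    return [f'site:{domain} "{topic}"' for topic in topics]

if __name__ == "__main__":
    queries = build_site_queries("uscis.gov", [
        "naturalization eligibility",
        "green card processing times",
    ])
    for query in queries:
        print(query)  # e.g. site:uscis.gov "naturalization eligibility"
```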
 
I definitely don't think a footprint is "too broad" for finding links... they're specific enough to find the exact pages that you can build links on and broad enough to allow you to find a heap of them.

The operators you mentioned are definitely useful for other stuff but really aren't that great for finding pages to build links on. Limiting yourself to just one URL or whatever isn't going to help you find a diverse range of places to build backlinks at all.

You are absolutely wrong about that; it seems you just didn't understand the operators properly. :)

With "inurl:" you are not searching one URL only, you are matching a string inside the URL. So say, for example, you want to find sites running a certain script you can leave backlinks on; you can search: inurl:/aska.cgi

You then only get results that have "aska.cgi" in the URL, not in the page text. It's similar with the "intitle:" operator: it gives you the ability to create better-targeted footprints. Just have a look at most scraping tools that use footprints, such as Hrefer; they use a combination of all these methods.

Also keep in mind that "powered by" and "inurl:" searches have been abused by scrapers for more than a decade now, so if you run many such queries, Google will make you fill in a captcha and return fewer results. That's why you usually use scraping tools for this task, which can rotate hundreds of proxies to avoid it. Just for the record, I've coded a few scrapers myself over the years, so I'm very familiar with this topic.
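
To make that concrete, here's a rough Python sketch of the footprint-plus-keyword query generation those scraping tools do before sending anything out; the footprints and keywords below are illustrative placeholders, not code from any real scraper:

```python
from itertools import product

# Illustrative only: pair each footprint with each niche keyword, the way
# footprint-based scrapers generate their search queries.

FOOTPRINTS = [
    'inurl:/aska.cgi',                            # match a string in the URL, not the text
    '"powered by vBulletin" inurl:register.php',  # classic "powered by" + inurl combo
    'intitle:"leave a comment"',                  # title-based footprint
]

KEYWORDS = ["dog training", "credit repair"]      # placeholder niche keywords

def generate_queries(footprints, keywords):
    """Return every footprint/keyword combination as a search query string."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

if __name__ == "__main__":
    for q in generate_queries(FOOTPRINTS, KEYWORDS):
        print(q)
    # A real tool would send each query through a pool of rotating proxies and
    # throttle its requests to avoid the captcha limits mentioned above.
```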
 
Thanks for helping others with the use of footprints.

A really good way is also to make web profiles. You can often give them good juice with internal linking.

My present for you:
Go to PureVolume™ | We're Listening To You
Register; you can make dofollow links there.

Then listen, comment, like, etc.
Your profile page will get good juice and your rankings will increase.

You can check; the TF-CF is really good.

Enjoy it
 