Google Comments on Bad Links and a 50% Traffic Drop


A post on Search Engine Journal discussed comments from Google's John Mueller about bad referral links and a correlation with a traffic drop. A publisher said that Google Search Console (GSC) reported more than five hundred pages of referring links from two domains, and that the appearance of the links correlated with a 50% decrease in traffic. Google's John Mueller commented on the correlation between the referral links and the traffic reduction.

According to the publisher, GSC showed the two domains linking up to four times to every page of their site. When the publisher visited the linking pages, they were blank: there was no content on them.

They stated that the appearance of these referring links correlated with a 50% reduction in traffic.

The publisher asked:

“… if this is a scenario where the disavow tool makes sense, or will Google detect them as unnatural and ignore them as a ranking signal?”

Google’s John Mueller commented on the mystery of the “empty” pages and what they might be:


“It’s really hard to say what you’re seeing here. It’s certainly possible there are pages out there that show a blank page to the user and then show a full page to Googlebot.”

He is referring to a page that displays one version to Google and a different version to everyone else. This practice is called cloaking.

Mueller was clear that cloaking is only a possibility. It is an explanation of what the publisher was probably seeing, but it does not address the ranking issue itself.

Mueller went on to treat the blank pages as a technical error rather than a malicious attempt to sabotage the publisher's rankings.

He said: “From my point of view, I would simply ignore those pages.”

He then suggested checking the pages with Google's Mobile-Friendly Test to see what they look like when Googlebot renders them. This works as a check for cloaking: it shows whether a page serves different content to Googlebot than to non-Googlebot visitors.
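As a rough do-it-yourself check for user-agent cloaking, you can fetch a page twice with different User-Agent strings and compare the responses. The sketch below is a minimal illustration, not how Google's test works: `looks_cloaked` and its size-difference threshold are invented for this example, and sophisticated cloakers detect Googlebot by IP address rather than User-Agent, so a check like this can miss them.

```python
import urllib.request

# User-Agent strings: one imitating Googlebot, one imitating a browser.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Fetch a URL with the given User-Agent and return the body as text."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(body_for_bot, body_for_browser, threshold=0.5):
    """Crude heuristic (assumption, not a standard): flag the page if the
    two response bodies differ in size by more than `threshold`, e.g. a
    blank page for browsers versus a full page for Googlebot."""
    longer = max(len(body_for_bot), len(body_for_browser), 1)
    shorter = min(len(body_for_bot), len(body_for_browser))
    return (longer - shorter) / longer > threshold

# Usage (hypothetical URL):
# suspicious = looks_cloaked(fetch("https://example.com/page", GOOGLEBOT_UA),
#                            fetch("https://example.com/page", BROWSER_UA))
```

A large size difference only suggests cloaking; legitimate sites also vary markup by device or A/B test, so any hit would need manual review.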

Mueller then commented on the correlation between the referral links and the 50% traffic reduction:

“I do not think this is something that you need to worry about. It might look strange in the links report, but I really would not worry about it. With regard to the traffic drop you are seeing, from my point of view that is probably not associated with these links. There is no real situation … where I could imagine that essentially empty pages would cause problems with regard to links. So I would just ignore it.

If you decide to put them in a disavow file anyway … just keep in mind that this will not affect how we show the data in Search Console. So the report will continue to show those links.

I do not think there is any reason to use the disavow file in this case. So I would just leave them be.”

What Do Publishers See?

What the publisher saw is an old and common phenomenon called referrer spam. Referrer spam originated because certain free analytics programs, beginning in the early 2000s, published each site's list of referrers as links on a public statistics page.

That created an opportunity to hit a site with fake referrals from a spam site in order to obtain a link from the public analytics page.

The analytics page was not linked from other pages of the site. It existed only at an automatically generated URL.

Most sites no longer have public analytics pages. But the practice continues, perhaps in the hope that if enough publishers click the link, Google will see it as a form of popularity that will help the spammers' rankings.

What the publisher in this hangout was probably looking at is a manufactured referral.

The referral is not real. In fact, the link does not exist. That is typical of referrer spam.
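Because the referring link never actually exists, this kind of noise can usually be cleaned out of analytics or log data by filtering on the referrer's hostname against a blocklist. A minimal sketch, assuming a hand-maintained blocklist (the domains and the `is_spam_referral` helper below are placeholders invented for this example):

```python
from urllib.parse import urlparse

# Assumed hand-maintained blocklist of known referrer-spam domains.
SPAM_REFERRERS = {"spam-example.com", "buy-links.example"}

def is_spam_referral(referrer_url):
    """Return True if the referrer's hostname, or any parent domain of it,
    appears on the blocklist. Matching parent domains catches subdomain
    variants like www.spam-example.com."""
    host = urlparse(referrer_url).hostname or ""
    parts = host.split(".")
    # Check "www.spam-example.com", "spam-example.com", "com" in turn.
    return any(".".join(parts[i:]) in SPAM_REFERRERS
               for i in range(len(parts)))

# Usage: drop blocklisted referrers before counting referral traffic.
# clean = [r for r in referrers if not is_spam_referral(r)]
```

A blocklist like this only hides the noise from reports; as Mueller's comments suggest, the fake referrals themselves do not need to be disavowed or otherwise acted on.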