The New Challenges In Search Engine Security

In online publishing we see every kind of attempt to hack and scam websites. For publishers, this has become time-consuming and costly. The bad guys once hijacked websites mainly for database and banking information. Now hijacked sites have become distribution points for viruses and other malicious infections that are picked up by search engine crawlers.

I discovered this through an expert who introduced me to Google Search Console when I set out to find out why my company continued to drop in the search rankings. My team found the mistakes within our coding. But that wasn't all we found: Google Search Console also provides tools to find and disavow bad URL links that can hurt your overall search ranking.

Some websites are hit harder by the bad guys through search engines than through publisher or other sites. These search engine attacks are dangerous because they hijack sites with infected links, and, more importantly, because they exploit keywords, key phrases, and well-known brand names.

For example, they might say “XYZ Company has a new sweepstakes today.” From this sentence, you have five keywords they can use in any combination to spread malware across the net.

Most of these spam pages contain hundreds of keywords made up of brand names and competitors. When multiple companies or websites are mentioned alongside the same keywords in a search query result, the result is thousands of bad search links that get picked up by the search engines.

To make matters worse, my Google Search Console holds more than 1,500 disavowed links in just one file for the HTTPS version of a domain name. Then I have a file for the older, insecure HTTP version, which contains another 1,000 bad links.
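For readers who have not used the disavow tool, the file Google Search Console accepts is just plain text: one URL or one `domain:` entry per line, with `#` marking comments. A minimal sketch (the domains here are illustrative, not from my actual file) looks like this:

```
# Disavow file uploaded through Google Search Console.
# Lines beginning with "#" are comments and are ignored.

# Disavow every link coming from an entire domain:
domain:spam-sweepstakes.example
domain:fake-brand.example

# Or disavow specific URLs one at a time:
https://hacked-site.example/cheap-offers.html
```

With more than 2,500 bad links across the HTTPS and HTTP versions of one domain, the `domain:` form is the only practical way to keep these files manageable.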

Why do this? Every version of a domain name you create -- HTTPS, HTTP, with www or without -- can serve different search results. In mere seconds, one bad search engine link can spread and connect to thousands of brand-name keywords across the internet. That is how the new search attacks work.

Once I accidentally clicked on the wrong link and my antivirus program lit up. That really focused my attention on what the search companies are doing or not doing in protecting the public.

The problem of bad URL links in the search engines is not an easy fix. It requires all search companies to run antivirus software on their servers. An antivirus scan can be programmed in several ways to sweep a server during off-peak hours.

Doing this daily won’t be practical, but once every three days or once a week would be. The search engines could share the bad-link data with each other for the benefit of all. This step alone would solve much of the problem.

Lastly, there are many live domains on the net that are not being used. I believe the Internet Corporation for Assigned Names and Numbers (ICANN), the nonprofit that coordinates the internet's domain name system and accredits registrars, could "park" names that are not being used. Taking them offline would keep them separate from hacked content, and that alone would severely cripple domain name hijacking.

This problem of search engines and bad URL links has hurt many individuals who unknowingly clicked on them. Solving it would improve search engine rankings and revenue for businesses small and large. If stronger measures are not taken to stop spam, hacking, and intrusions, then everyone will be hurt in one way or another, including the search engines.
