08 Oct

Search engine optimization and mesothelioma

What does a malignant cancer caused by asbestos have to do with search engine optimization?

In 2006, the United States Senate passed a resolution recognizing mesothelioma as an occupational disease, and compensation was guaranteed to all patients who took their case to court and won a judgment.

The combination of potential multimillion-dollar lawsuits and compensation guaranteed by the State was as close to paradise as it gets for law firms.
The immediate consequence was that many firms set off to "hunt" for patients with mesothelioma, in order to represent them in court.
So much so that law firms advertising their websites on Google's content network came to pay up to $65 every time a user clicked on a sponsored link tied to the search term "mesothelioma", just to take the spot away from the competition.

This also led many webmasters to devote themselves to writing articles, pages, and blogs about "mesothelioma", trying to position their pages at the top of Google's search results. Because if a user landed on one of those pages and clicked on a banner or text link of a lawyer advertising on the content network, the webmaster could earn up to $40 per click (amazing, isn't it?).

That opened the hunting season: many webmasters started using black hat SEO techniques (search engine optimization through unethical tricks) to position their websites at all costs for the keyword "mesothelioma". They used various techniques, such as repeating the word "mesothelioma" several hundred times in the text of the page, set in a very small font whose color matched the page background. This makes the text invisible to the human eye but deceives Google's robot.
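To make the trick concrete, here is a minimal Python sketch of the kind of check a crawler could run against a page's inline styles to flag invisible text. This is purely illustrative: the function name and the two heuristics (tiny font size, text color equal to the page background) are assumptions for this example, not a description of how Google's robot actually works.

```python
import re

def looks_hidden(style: str, page_background: str = "#ffffff") -> bool:
    """Flag an inline CSS style that would make text invisible to readers.

    Illustrative heuristic only; real crawlers are far more sophisticated.
    """
    style = style.lower().replace(" ", "")

    # Trick 1: a font so small the text is effectively invisible.
    size = re.search(r"font-size:(\d+)px", style)
    if size and int(size.group(1)) <= 2:
        return True

    # Trick 2: text color identical to the page background color.
    # The lookbehind avoids matching "background-color".
    color = re.search(r"(?<!-)color:(#[0-9a-f]{6})", style)
    if color and color.group(1) == page_background.lower():
        return True

    return False

# The kind of keyword-stuffed span a black hat page might contain:
print(looks_hidden("color: #ffffff; font-size: 1px"))   # True: both tricks
print(looks_hidden("color: #000000; font-size: 14px"))  # False: normal text
```

A detector like this is trivially easy to write, which is one reason the hidden-text trick stopped working as soon as Google decided to look for it.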

When Google's engineers realized what was going on, they improved the robot's programming and took "retaliation". The new version of the robot detected the pages of the webmasters who had outsmarted it and put them on a "blacklist", so they disappeared from Google's search results altogether.

One thing black hat SEO techniques have in common is that they "cheat" Google's robot for a while, but every time Google launches a new, improved version of its algorithm, it severely penalizes all the pages that tried (and managed) to trick it in the past. The effectiveness of these techniques is well summarized by the phrase "bread today, hunger tomorrow".

So trying to trick the Google robot is something that works, but only temporarily. The only way to position a page sustainably is NOT to cheat, and to publish quality content.