Black hat SEO is a set of activities designed to push websites up search engine results pages (SERPs) in ways that breach the terms and conditions of search engines, most notably Google’s Webmaster Guidelines.
Black hat SEO is risky because, when caught, a website can be hit with search engine penalties, which may be algorithmic or manual. In the worst-case scenario, a site can be banned from Google or Bing outright, wiping out all of its search traffic.
Which SEO activities are against the rules?
As search algorithms continuously evolve, the tactics considered black hat can change at any time. For a detailed, up-to-date rundown, the best place to look is the search engines’ own documentation.
Automatically generating content
Websites that auto-generate content in an attempt to influence and manipulate search engine rankings risk breaking the rules and getting a penalty. Black hat SEO tactics related to auto-generated content include: using software to pump out keyword-stuffed content that doesn’t make sense, translating text automatically without any human input, and automatically scraping and publishing content from RSS feeds.
Taking part in dodgy link schemes
Inbound and outbound links are an important factor for search engine rankings, but there are right ways and wrong ways to go about building them. Black hat link building schemes include automating inbound link building, buying or selling links, and forcing people to link to your website as part of any terms and conditions. Taking part in dodgy link building schemes to manipulate search engine results risks breaking search engine rules and facing the consequences.
Hiding text or links from people
Deliberately hiding text or links from people while showing them to search engines may go against search engine rules. Sneaky tactics include setting font size to zero, linking from tiny or barely visible anchor text, placing white text on a white background, and including text in the HTML but hiding it with CSS. Google acknowledges that not all hidden text is deceptive: some, such as content in accordions and tabs or image ALT attributes, is important for accessibility.
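As a rough illustration of the inline-style tricks listed above, here is a minimal sketch of how a checker might flag suspicious styles in a page. This is a hypothetical example using only Python’s standard library; real search engine detection is far more sophisticated, and the list of style patterns here is illustrative, not exhaustive.

```python
from html.parser import HTMLParser

# Inline-style patterns commonly associated with hiding text from
# visitors while leaving it readable by crawlers. Illustrative only.
SUSPICIOUS_STYLES = (
    "font-size:0",       # zero-size text
    "display:none",      # text in the HTML, hidden with CSS
    "visibility:hidden",
)

class HiddenTextChecker(HTMLParser):
    """Flags tags whose inline style matches a suspicious pattern."""

    def __init__(self):
        super().__init__()
        self.flags = []

    def handle_starttag(self, tag, attrs):
        # Normalise the style attribute (strip spaces, lowercase)
        # so "font-size: 0" matches "font-size:0".
        style = dict(attrs).get("style", "").replace(" ", "").lower()
        for pattern in SUSPICIOUS_STYLES:
            if pattern in style:
                self.flags.append((tag, pattern))

checker = HiddenTextChecker()
checker.feed('<p style="font-size: 0">stuffed keywords</p><div>visible text</div>')
print(checker.flags)  # the <p> tag is flagged; the <div> is not
```

White-on-white text is harder to catch this way, since the background colour often comes from a stylesheet rather than an inline style, which is one reason hidden-text detection in practice looks at rendered pages rather than raw HTML.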
Copying content from other websites
Filling a website with somebody else’s content is a bad idea for many reasons. Search engines are very good at spotting stolen or duplicate content, so trying to improve your rankings on the back of somebody else’s hard work could damage your rankings and land you in trouble.
Attempting to boost the amount of high-quality content on your website by scraping it from other sites is not a good way to improve search engine rankings. Google frequently highlights good-quality original content as a ranking factor. In an article about scraping, Google makes this clear when it comments that ‘it’s worthwhile to take the time to create original content that sets your site apart’ and suggests that scraping should be avoided.
Cloaking
Cloaking is showing one thing to search engines but something different to the people who visit your website. It breaks search engine rules because search engines risk sending visitors to a page that is different from the one they were expecting.
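The cloaking described above usually comes down to branching on who is asking for the page. The sketch below shows the pattern in its simplest form, as a cautionary illustration of the behaviour search engines penalise; the function names, crawler tokens, and page content are all hypothetical.

```python
# Hypothetical sketch of server-side cloaking: serving different
# content depending on whether the visitor looks like a search
# engine crawler. This is the behaviour that breaks the rules.
CRAWLER_TOKENS = ("googlebot", "bingbot")

def is_crawler(user_agent: str) -> bool:
    """Crude crawler check based on the User-Agent header."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def serve_page(user_agent: str) -> str:
    if is_crawler(user_agent):
        # Keyword-rich page shown only to crawlers.
        return "<h1>Cheap flights, hotels and insurance deals</h1>"
    # Different page shown to real visitors.
    return "<h1>Welcome</h1>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))
```

The rule of thumb is simple: if the two return values above ever differ for the same URL, the site is cloaking.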
Misusing structured data markup
Structured data markup is a way of telling Google what the different sections of your pages contain. You do this by adding code that can help parts of your pages stand out on SERPs as rich results, which look different from normal search results. Using structured data in a way that doesn’t reflect the actual content of a page can land you in trouble with search engines, as can marking up parts of a page that are hidden from visitors.
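To make the contrast concrete, here is a minimal legitimate example built around schema.org’s Article type. The values are placeholders; the point is that every field must describe content actually visible on the page, since marking up hidden or non-existent content is what gets sites penalised.

```python
import json

# A minimal JSON-LD object using schema.org's Article type.
# All values are illustrative placeholders; on a real page they
# must match the visible headline, author and publication date.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A headline that matches the page's visible <h1>",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2024-01-01",
}

# This JSON would be embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(article_markup, indent=2))
```

If the `headline` or `datePublished` values here described content that visitors never see, that would be exactly the kind of misuse this section warns against.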