Old SEO Practices that no longer work

Google changes its algorithms every month, and every SEO needs to know what works and what doesn't. Many older SEOs learned the craft before the Panda and Penguin updates, so there is a lot of outdated misinformation out there.

Heavy use of internal site link anchor text

You no longer need to stuff your target keywords into internal link anchor text. This used to carry real weight and have a positive impact on rankings. Google no longer treats it as a meaningful ranking signal and even penalizes it when it looks like it's being done in a spammy fashion. Wherever a link appears on the page, make sure its anchor text flows naturally with the rest of your content.

Don't make links look sketchy or hide them in small print where nobody can see them. Everything on your website should be for searchers, not search engines. Google is very smart now: it crawls pages much the way a user reads them and can spot spammy content easily.

Multiple pages for every keyword variant you're trying to rank for

This old SEO tactic was meant to feed Google your exact-match keywords and direct searchers to a dedicated page on your site for each specific keyword.

Google is much smarter now and is shifting to an intent-and-topic model for ranking search results. You now want one page targeting as many related keywords as you can. "That's why the guys at Moz and Backlinko are always telling you to write great 2,000+ word content," says Ron Spinabella. Search engines love in-depth, relevant content that can rank for a bunch of different search terms.

Using directories, paid links, blog comments, forum comments, forum signatures, etc.

Directories are a thing of the past, even those with a high Domain Authority or Trust Score. Google can tell these sites apart from the rest because they are thin on content and consist almost entirely of outbound links. Press releases also had their link juice discounted, because Google saw them as a pay-for-play, unnatural link-building technique.

Comment links were among the first to get penalized. Google saw them as highly spammy because programs like ScrapeBox could automatically scour the internet for these sites and use a bot to post comments and create links. Private link networks and reciprocal links also got the boot for their manipulative role in the link-building process.

By Ron Spinabella
