5 SEO Hacks for Dynamic Websites

In today’s online world, few websites can escape the rapidly changing Google algorithm. With paid links, and now content farms, on the spam list, it is imperative that you work with a proven white-hat SEO and web design company that won’t get into trouble with the search engines. If you can’t afford a big agency, start by reading this on-site SEO guide to get moving in the right direction.

Information Architecture

Creating an SEO-friendly hierarchy is very important to ensure that your website is positioned internally for the best possible results in the search engines. This is especially useful for larger e-commerce or resource sites with thousands or hundreds of thousands of pages where the majority of them may not be crawled on a regular basis.

The best way to illustrate proper information architecture is through an example. Let’s say you run an animal website where you discuss everything related to animals. At the category level, your content might be separated by the type of animal you talk about, such as dogs, cats, birds, ligers, etc. Sub-categories break each category down into more specific groups. For example, the dog category might include feeding, training and exercising. Once these two information layers are identified, you should begin crafting content tailored to your categories and sub-categories. Some content ideas could be “Top 5 Food Brands for Dogs” and “How to Train Your Dog in 1 Week.”
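The resulting hierarchy might translate into a URL structure like the following (the paths are illustrative, not prescriptive):

```
/dogs/                                            (category)
/dogs/feeding/                                    (sub-category)
/dogs/feeding/top-5-food-brands-for-dogs/         (content page)
/dogs/training/how-to-train-your-dog-in-1-week/   (content page)
```

Each level of the URL reflects one layer of the information architecture, which makes the site’s structure legible to both crawlers and visitors.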

Interlinking with Silos

Interlinking within and between keyword silos helps tell search engine crawlers which pages should rank for which keywords. There are many tricks that can be used for interlinking.

The most common form of interlinking is placing a link on a target keyword within a new article or post and pointing it to another page or article on your website. This is a great way to direct crawlers to older pages that have not been indexed in a while. To use the example from the previous section, let’s say you post your “How to Train Your Dog in 1 Week” article and you mention something about dog treats in it. That would be a great opportunity to link the term to a previous article written about dog treats. If you haven’t written anything about it, you might consider writing a new article that discusses dog treats and interlinking the two.
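As a minimal sketch (the URL and anchor text here are hypothetical), such an in-content link is just a standard anchor placed on the target keyword:

```
<p>Reward each success with <a href="/dogs/feeding/dog-treats/">dog treats</a>
to keep training sessions positive.</p>
```

The keyword in the anchor text tells crawlers what the linked page is about, while the link itself leads them back to the older article.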

Another great technique for interlinking is implementing a related-articles plug-in. If your platform does not support plug-ins, you may want to add a Related Articles section manually at the bottom of each post. This is far more time consuming but provides results. Sharing related articles with your visitors is also a great usability practice.

Keyword Hierarchy

Keyword hierarchy is a little more challenging but will result in an even more structured website that will be clear and organized for the search engines. Keyword hierarchy is the process of matching the proper keyword themes with the proper content silos.

When matching keywords with specific categories, sub-categories and content, it is important to keep in mind that shorter tail keywords should be placed higher in the hierarchy while longer tail keywords should be placed lower in the hierarchy. The strongest pages tend to be the homepage or top-level category pages, which means the most competitive keywords should be placed on the pages that will be the strongest.
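A hypothetical mapping for the animal site above might look like this, with keyword competitiveness decreasing as you move down the hierarchy:

```
Homepage       →  "dogs"                             (short tail, most competitive)
Category       →  "dog training"
Sub-category   →  "dog training tips"
Content page   →  "how to train your dog in 1 week"  (long tail, least competitive)
```

The strongest pages carry the hardest keywords; the long-tail phrases are left to the deeper content pages that can realistically rank for them.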

Duplicate Content

Duplicate content is when two different URLs display identical page content. This can be caused by multiple factors and often occurs when a website has a custom search bar or URL parameters. Here are some basic tags and commands that can be used to remove the duplicate content and ensure that your site is not penalized in the search results.

Robots.txt:

Adding rules to robots.txt allows entire folders to be blocked from the search engines. For example, let’s say you changed a path on your website from /dog/ to /dogs/; you may consider blocking /dog/ from the search engines to remove the duplicate content. Of course, you have to be absolutely sure that you do not block your entire site in robots.txt, as that would cause a huge loss of traffic and indexed content. Be sure that your content is in place in the new folder or that proper 301 redirects are implemented.
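For the /dog/ example above, a minimal robots.txt rule blocking the old folder for all crawlers might look like this:

```
User-agent: *
Disallow: /dog/
```

Note that Disallow: /dog/ only blocks that one folder; a lone Disallow: / would block the entire site, which is exactly the mistake to avoid.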

Noindex:

The noindex tag is a great, quick fix for any duplicate content issue. It is placed in a meta tag in the head of the page you want to block. The code is as follows:

<meta name="robots" content="noindex">

Canonical:

Canonical tags are sometimes picked up more quickly by Google’s crawlers (in my experience) and are placed in a link tag in the page’s head. The code for adding a canonical tag is as follows:

<link rel="canonical" href="…">

URL Rewrites

URL rewrites are done on the server side and are fairly technical. However, in some extreme cases, “dirty” URLs can be the death of your website: URLs may not be indexed by Google and other search engines if the crawlers cannot read them. To solve this, URLs should be reduced to a reasonable length and should contain keywords if possible. The best way to go about it is to have the URL mirror the hierarchy of the site. For example, http://…/dogs/breeds/huskies/ is a clean URL that informs both search engines and users what the page is about.
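As a sketch, assuming an Apache server with mod_rewrite enabled (the script name and query parameters are hypothetical), a clean URL can be mapped internally to the dynamic one in an .htaccess file:

```
# .htaccess — serve /dogs/breeds/huskies/ from the underlying dynamic script
RewriteEngine On
RewriteRule ^dogs/breeds/([a-z-]+)/?$ index.php?category=dogs&topic=breeds&item=$1 [L,QSA]
```

Visitors and crawlers only ever see the clean hierarchical URL; the rewrite happens server-side, so no redirect is issued.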
