Ensuring That Your Web Page Is Search Engine Friendly

Whether you create your own web pages or contract this job out to a professional, it is best to learn and share the most effective ways to ensure that your website is search engine friendly. Search engines such as Google, Yahoo!, and Bing, to name a few, have some limitations when it comes to crawling the web and deciphering the content they find. To put it simply, the way that you develop your website has a major effect on whether or not that website will rank favorably within the search engines. In this article, we will break down each component of SEO individually to get a clear understanding of how to correctly optimize your website.

Laying The Foundation

Since HTML is the standard language of the web, proper HTML coding should be a major factor when laying the foundation for your website, as it determines how successfully the search engines will be able to crawl your website and read its contents. This becomes a non-issue when using a CMS (content management system) such as WordPress, Joomla, or Drupal, but it is important to remember if you are coding the website yourself or hiring a third party. Whatever information you consider the most important within your website’s content should be laid out using this standard language’s text format. Search engines have a much harder time reading JavaScript or images. You can read more about creating a website using HTML at W3Schools.com.

HTML Is Too Basic

It is easy to throw images or JavaScript items onto your website, but they won’t be helpful unless done correctly. Unfortunately, Flash and image files have no real value when it comes to the search engines, as they can only read HTML and text. If you feel that HTML is too basic and isn’t as visually appealing as the effects of a Flash-based website, below are a few simple options that are easy to implement.

– Video/Audio Content – Take the time to incorporate a closed-captioned version of your video or audio. This simple form of a transcript will help the engines crawl your text-based content.

– Flash and Java plug-ins – Use text to go along with these items so the content can still be crawled.

– Images – By adding the “alt” attribute to your jpg, png, and gif files, you are giving the engines a nice description in text form of what your images are trying to convey.

Making A List And Checking It Twice

So how will you know if your techniques are working? You can use simple SEO tools such as SEO Browser (http://www.seo-browser.com) or CachedView (http://cachedview.com/) to see your website as the search engines do. Thousands upon thousands of websites fail to rank successfully because of the simple mistake of not using crawlable text within the content of the website. Envision a Flash website that you have seen in the past and put its URL into one of the tools mentioned above. You may notice that there is no rich content available to be seen. The content embedded in these types of websites cannot be seen or interpreted by the engines, resulting in low or no ranking.

Sitemaps

You can think of a sitemap as a failsafe device. A sitemap is a list of every page on your website that is then submitted to Google. This ensures that even if one of your pages isn’t linked to, it will still be found by Google.
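A minimal XML sitemap looks like the sketch below; the URLs and dates are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page on the site -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/phoenix-wedding-photographers</loc>
    <lastmod>2016-01-10</lastmod>
  </url>
</urlset>
```

Save the file as sitemap.xml in your site’s root directory, then submit its URL to Google through Search Console.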

Link Arrangement

Let’s not forget about the simplest part of how search engines find content on your website. Proper link arrangement is another important factor when developing your website. A crawlable link structure creates a hiking trail or easy-access path through your entire website. If your website has 10 pages but you are only linking to 5 of those pages on your homepage, then the rest of your content is more than likely not being seen. You can have the best content and keywords on one of those other pages, but guess what? The search engines may not even know it exists because there is no direct link to it. In short, each of your web pages must be linked to from somewhere. Since a search engine crawls from one page to the next via links, if there is a break in the chain, the next page will not be seen.

Page one must link to page two, but page three can be linked to by either page one or two.

Common Causes Of Unreachable Web Page Content

Too Many Links

Search engines will only crawl a limited amount of content on a page, in part to prevent spam. So if your web page is cluttered with too many links, some (if not most) of them will not have the opportunity to be indexed. To avoid this, simply avoid having an abundance of links on a single page. It was once believed that a single page should have no more than 100 links, but the rule has since been relaxed to allow more links as long as they are of high quality and relevant. It now seems to apply mainly to overly spammy links.

Frames

Although frames and iframes have the ability to be crawled, using them is best left to advanced users who know the ins and outs of how the engines follow links placed in frames.

Online Forms

Pages protected behind a submission form may be at risk of not being indexed. Whether there is a login requirement or just a lead-capture form before the content is revealed, the engines may not attempt to go further than the form itself.

Robots.txt

The robots.txt file and meta robots tags allow the webmaster to stop crawler access to certain pages. While this is a great way to stop random bots, it can also deter the search engines from crawling a page if it is not set up correctly.
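A simple robots.txt, placed at the root of your domain, might look like this; the /private/ path is just an illustration:

```text
# Applies to all crawlers
User-agent: *
# Block a directory you do not want crawled
Disallow: /private/
# Point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

A stray `Disallow: /` here would block your entire site, which is exactly the kind of misconfiguration described above, so double-check this file after editing it.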

Links and JavaScript

As mentioned earlier, if you must use JavaScript links, it is best to pair them with HTML text; otherwise, the search engines will not see much value in the link itself.

Link Attributes

Link attributes are often-forgotten components that describe a link. It is sad to report that search engines disregard almost all of these, except for a select few. Let’s look at the rel=”nofollow” attribute.

Example: <a href="https://www.google.com/" rel="nofollow">Search Engine</a>

By using the attribute in the above way, you are telling the search engines not to follow the link included on your website. As we said earlier, when one site gives a link to another site, it can count as a vote. That vote says: look, I am linking to this site because I find its contents valuable. The “nofollow” attribute says: disregard that. This attribute is often used on affiliate links, or when linking to an outside product you do not wish to include in a “vote of value”.

Keywords, Keywords, Keywords

Everything a website ranks for is based on the content found on that site. Understanding how keywords work is an essential part of SEO. It is always in the webmaster’s best interest to include their keywords and keyword phrases in the crawlable content of anything they create. The engines keep tabs on all of the content and pages around the world wide web in keyword-based indexes. When a person does a web search, the engine works its magic by matching up valuable pages based on that keyword or phrase. No stone is left unturned, either. Things that may seem simple are just as important as the keyword itself: when you are creating keyword content, take spelling, grammar, punctuation, and even capitalization into consideration. These seemingly simple items can make a world of difference between your website and your competitor’s. Remember: the search engine’s goal is to match the web searcher with the website that is most relevant to what the user searched for. To get your page to the top, always consider using keywords in important places like metadata, text, and titles, which we will go over in greater detail later.

Be Specific With Keywords

While a word is a word, you have to consider broad versus specific, and single words versus phrases. If you are in the online gambling industry and are promoting your casino site, it would make more sense to include a specific country in your keyword phrase rather than using just “online casino,” because it gives the searcher more relevant results based on the country they are able to play from. If your keyword phrase is just “online casino” and your competitor uses “USA online casinos,” whose website would show up first for a web user looking specifically for USA-friendly online casinos? Be specific with keywords and be creative, because keyword strength will make a huge difference in search results.

Stuffing Keywords

When it comes to using keywords, the endgame should not be ranking for all of your keywords. Instead, you want your keywords to match what you have to offer your visitors. You should strive for a balance between user experience and SEO, with the user always coming first. It is a misconception that stuffing keywords throughout your website will help it rank higher; in reality, keyword stuffing is considered abuse. It is also careless and lessens the value of your website. Keywords, along with key phrases, should be placed naturally throughout the article. This is an area where less is more, as each article should flow naturally without overusing keywords. Always be sure that the keywords you are using match the content you want to be seen.

Optimizing With Your Keyword Phrase

The easiest way to help speed up the process of creating a fast-ranking web page is to be sure that you are optimizing with your keyword phrase. Here are the best locations for your keyword phrase when creating your website:

Use your keyword phrase in the title of your page

The title tag provides a headline that is viewable in the search results and at the top of the browser while the user is on your website. You never get a second chance to make a first impression, and that is exactly what your title tag is for. Think of the title as a description of what your website is all about. It is also a good idea to brand yourself here. Keep in mind that search engines usually show 75 characters at most in search results. Also consider that the title is what people will share when posting your link on social media. Bring your keyword phrase to the front of the title tag if you can, and remember that this title will be bolded when it shows up in search results. Since this is what people see first, what you use here will determine whether or not someone wants to click further.
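Using the wedding-photography example that appears later in this article, a title tag with the keyword phrase up front and the brand at the end might look like this:

```html
<head>
  <!-- Keyword phrase first, brand name last, within the ~75-character limit -->
  <title>Phoenix Wedding Photographers | Example.com</title>
</head>
```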

Use your keyword phrase naturally at least once at the top of the page
Use your keyword phrase at least once in the meta description section

While search engines will not use the information in this section to rank your page, the meta description provides a more detailed look at your content. How? The meta description appears underneath your title in the search results visible to your potential web visitor. Take the time to write this section; otherwise, the search engines will use other elements from your website to fill in the blanks.
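A meta description goes in the page’s head section. Reusing the example wording from the list below:

```html
<head>
  <!-- Shown under the title in search results; include the keyword phrase once -->
  <meta name="description"
        content="We rate the top Phoenix wedding photographers by experience, quality, and price.">
</head>
```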

Use your keyword phrase naturally at least three times throughout the page: “Welcome to example.com where we rate the top Phoenix Wedding Photographers by experience, quality, and price.”
Use a variation of your keyword phrase at least once in the ‘alt’ attribute of an image link: <img src="wedding-logo.png" alt="Phoenix Wedding Photographer">
Use your keyword phrase in the URL: example.com/phoenix-wedding-photographers
Use a mix of your keyword phrase at least once within your content: “wedding photographers in phoenix”
Obtain a few high-quality links from relatable websites using your keyword phrase within the anchor text: <a href="http://example.com/phoenix-wedding-photographers">Phoenix Wedding Photographers</a>

Using Meta Tags

Using meta tags is helpful in increasing rank, as it gives the search engines a quick glimpse of what content you have to offer. These tag descriptions can also appear in the search results, as we explained above.

Robots.txt And Meta Robots

Use the <a href="https://support.google.com/webmasters/answer/6062608?hl=en">robots.txt</a> file to dictate how the search engines crawl your website. This is a simple text file placed in the root directory of your web server, and it gives the search engines direction on how to treat your site. The meta robots tag works at the page level and accepts the values below.

index

This is the default, so it does not need to be added; it tells the search engine robots that it is OK to index the web page.

noindex

This item tells the search engines not to index this page in the web results.

none

This is a shortcut for “noindex, nofollow” and tells the search engines to disregard this page completely.

nosnippet

This prevents search engines from showing a snippet of the page in the search results.

follow

Regardless of whether or not the page can be indexed, this item tells the search engines to follow the links on the page anyway.

nofollow

This tells the search engines to not follow any links on the page at all.

noarchive

This restricts the search engines from showing any cached copy of this page.
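These values go in a meta tag in the page’s head section. For example, to keep a page out of the index while still letting crawlers follow its links:

```html
<head>
  <!-- Page-level crawler directives; combine values with commas -->
  <meta name="robots" content="noindex, follow">
</head>
```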

It Pays To Be Creative

There are topics and niches that multiple people write about and build web pages around. However, the search engines are very hip to the fact that stolen and/or duplicated content is out there. As search engines have grown more intelligent over the years, they now assign a lower rank to web pages with duplicated content. There are also situations where content is not actually stolen but appears more than once across a single website, or even multiple websites, and the search engines do not like that either. The search engines want to provide clear and relevant results to users when they search. So, for example, if you are a blogger with more than one blog site, it is best to make each post unique, even if you are promoting the same item within those posts on both of your websites.

There are tools that many webmasters use to check whether content is unique. By simply putting a link into <a href="http://copyscape.com/">http://copyscape.com/</a>, a webmaster will know if their content is unique. If your content is not unique, it will not rank highly, as the credit will always go to the original source.

All About The URL

– Use strong keywords within the URL strategically but don’t overdo it.

– Use K.I.S.S.(Keep It Simple Stupid) – If possible, keep your URL short and easy to remember.

– Use hyphens to separate words within your URL (pages) because depending on the applications that the visitor is using, other symbols (plus signs, underscores) may not be translated to appear correctly.

– Try to stay away from dashes and numbers when choosing your domain name. This makes it easier on the viewer and looks less spammy.

– Use absolute paths in place of relative paths when linking within your website. There are web-scraping tools that others use to extract or steal your content. When your internal links use absolute paths, copies of your pages still point back to your domain, which signals to the search engines that you are the originator, so a site that stole from you is less likely to outrank yours.

Canonical Tag

If you are the blogger with duplicated content across more than one website, or a webmaster who uses a content management system (which often shows the same content in different locations within the same website), there are options available to let the search engine credit the original source. Say you have 5 pages known to be duplicated content; Google will only rank one of them as the original. The canonical tag points the search engine to the master URL, the one that you want to rank.
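The canonical tag is a single line in the head section of each duplicate page, pointing at the master URL (the address here is a placeholder):

```html
<head>
  <!-- On every duplicate page, point search engines to the master URL -->
  <link rel="canonical" href="https://example.com/phoenix-wedding-photographers">
</head>
```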

301-Redirects

If you have content that you wish to move to a new location, you can add a 301 redirect to the old pages. This will redirect the user, as well as the search engine, from where the old content once was to where the content is now. The 301 status code tells the search engine that the redirect is permanent, so it can replace the old URL with the new URL in its search results.
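How you set up a 301 redirect depends on your web server. On an Apache server, for example, one line in the site’s .htaccess file does the job (the paths below are placeholders):

```apache
# Permanently redirect the old page to its new location
Redirect 301 /old-page.html https://example.com/new-page.html
```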

Above all else, remember that as a webmaster, search engine optimization should always come second to user experience, and Google agrees. A website built for the customer will be seen by the search engine as the ultimate form of SEO, and as user metrics become more advanced, this will continue to ring true. Remember that bounce rate, social shares, natural links, and page views all play a part in the algorithm. Build a quality site for your visitors, one that is unique and full of useful information, and the rest will come easily.
