
Basic ASP.NET SEO

SEO is a popular topic nowadays, and as ASP.NET keeps growing in popularity as a platform for public-facing systems, SEO is becoming a more important topic for ASP.NET developers too. The world of SEO changes rapidly – can you imagine that last year Google published 450 updates to its search engine algorithms? Although the field moves fast, there are some key points that have stayed fundamental and unchanged so far. Let's see the minimum you should do and how you can make your ASP.NET sites more SEO friendly.

First I will give you some ideas about what you can do to make your site better. After this list (which hopefully makes you think) I will add one little point – what SEO is really all about.

NB! This blog entry describes only the technical side of SEO that you may want to consider when building your public sites. There are many other topics about how to make your site more visible and how to get more traffic, but I will leave those for future blog entries.

1. Page titles

One of the most important things is the page title (the text between <title></title> tags). When somebody searches for something, these titles are shown as the links in search results. One of the most common mistakes is using the same title for all pages. Imagine how worthless the following search results are for the user.

TheVeryBigCorporation
Some fragments from page that may not be useful for user.

TheVeryBigCorporation
Some other fragments from page as description.

TheVeryBigCorporation
Almost no content, let’s show what we have. Date Modified 01/19/2008 Titles by Marc and Lewis Custom services.

TheVeryBigCorporation
Some fragments from page that may not be useful for user.

TheVeryBigCorporation
Some fragments from page that may not be useful for user.

When you use page titles carefully, you give searchers a reason to visit your site – assuming your site offers something they are looking for. They can easily see from the results whether a page may contain something they are interested in.

Adding titles to your pages is not complex. If you built a CMS, you already have a title stored for each page. If you built a product catalog, use product names as page titles. If you think further, you will discover that your database already holds plenty of data you can use in page titles.
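
Here is a minimal sketch of how a product page might set its title from data you already have. Page.Title is the standard ASP.NET property that ends up in the <title> tag; ProductRepository and the "name" query string parameter are hypothetical placeholders for your own data access code.

    using System;
    using System.Web.UI;

    public partial class ProductPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // ProductRepository is a hypothetical data access class -
            // replace it with whatever you use to load your data.
            var product = ProductRepository.GetByName(Request.QueryString["name"]);

            // Page.Title is rendered into the <title> tag of the page header.
            Page.Title = (product != null)
                ? product.Name + " - TheVeryBigCorporation"
                : "Product not found - TheVeryBigCorporation";
        }
    }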

2. Use meaningful URLs

"Nice" URLs are the other important point. Instead of long URLs that contain many query string parameters, you should use URLs that look like URLs of static pages. Some experts say that this is enough; I don't agree – it is not enough. Look at the following two URLs.

somesite.com/index.php?op=content&act=view&id=100.html

somesite.com/content/view/100.html

The first one looks like some alien language. The second one is better in one respect – it is shown correctly by almost every mail client in the world (except the buggy ones, of course). But for the visitor this URL still has no value. The visitor may get some idea that he or she is looking at content, and then there is a hundred. What does this hundred stand for? Who knows.

Now let's go one step further and use meaningful URLs instead of merely nice ones. A meaningful URL is a nice URL that carries some meaning for the visitor: if I give you a meaningful URL, you can figure out what you may find on the page behind it. Here is the meaningful version of the previous URL.

somesite.com/printers/laserjet-1200.html

Now you have a URL that tells you what you will find when you follow it. In this case it hopefully opens the page that introduces the HP LaserJet 1200.

You can use components like UrlRewritingNet.UrlRewrite for nice URLs. Also note that IIS 7 offers URL rewriting support that gives you mod_rewrite-like features (mod_rewrite is a very popular Apache HTTP Server module).
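
If you don't want to take a dependency on a rewriting component, a small HttpModule can do the mapping itself. The sketch below is only an illustration of the idea: it assumes a hypothetical printers.aspx page with a model query string parameter and maps the meaningful URL shown above onto it with HttpContext.RewritePath.

    using System;
    using System.Text.RegularExpressions;
    using System.Web;

    // Maps /printers/laserjet-1200.html to /printers.aspx?model=laserjet-1200.
    // Register the module in web.config (<httpModules> in the classic pipeline,
    // <system.webServer>/<modules> on IIS 7).
    public class MeaningfulUrlModule : IHttpModule
    {
        private static readonly Regex PrinterUrl =
            new Regex(@"^/printers/(?<model>[a-z0-9\-]+)\.html$", RegexOptions.IgnoreCase);

        public void Init(HttpApplication application)
        {
            application.BeginRequest += (sender, e) =>
            {
                var context = ((HttpApplication)sender).Context;
                var match = PrinterUrl.Match(context.Request.Path);

                if (match.Success)
                {
                    // The browser keeps the nice URL while the real page runs on the server.
                    context.RewritePath("~/printers.aspx?model=" + match.Groups["model"].Value);
                }
            };
        }

        public void Dispose() { }
    }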

3. Tell robots about the structure of your content

Every page has some structure: a title, maybe a teaser, paragraphs, headings and so on. Maybe also some citations and quotes, and why not a few important points you want to emphasize. Most robots are not very strong at analyzing CSS, so use markup to express the structure. To make your pages semantically correct, follow these steps.

  1. Use headings to divide longer stories into parts that make sense to readers. You can use the <h1>…<h6> tags, for example – they are made for exactly this.
  2. If you need to emphasize a sentence in your text, put it between <strong>, <em> or <u> tags.
  3. Use <cite> for citations, <blockquote> or <q> for quotes and so on. Let robots know what kind of parts your pages contain.

This way you also make your pages easier to read for your visitors. I am sure they will be happy if you provide correctly structured content that is easy to read.
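
For illustration only, here is how a code-behind could emit structured content through a Literal control. ContentLiteral is an assumed <asp:Literal> declared in the page markup and the text is made up; the point is simply that the output uses semantic tags instead of styled <div> and <span> soup.

    using System;
    using System.Web.UI;

    public partial class ArticlePage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // ContentLiteral is assumed to be declared in the .aspx markup as
            // <asp:Literal ID="ContentLiteral" runat="server" />.
            ContentLiteral.Text =
                "<h2>HP LaserJet 1200</h2>" +
                "<p>A <strong>compact workgroup printer</strong> for small offices.</p>" +
                "<blockquote>A solid choice when desk space is limited.</blockquote>" +
                "<p>Source: <cite>example review</cite></p>";
        }
    }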

You can make your users' life easy – use a WYSIWYG editor like FCKeditor so your users can create structured content by themselves. FCKeditor also supports ASP.NET and offers a free ASP.NET control you can use on your pages. Integrating these editors into your site is nothing complex.

5. Test your site under extreme conditions

What happens when your site is slow and requests often time out? Well, some of those requests are made by robots, and if they repeatedly cannot reach your site, they will drop it from their indexes. Make sure your site responds acceptably fast even under heavy load.

As visitors we don't like slow sites. We want pages to open fast, and while a page is loading, every second feels like a decade.

You can use tools to run automatic stress tests against your site. Just record your paths, configure the load and run the tests. If you also watch performance counters, you can find almost all of the weak parts of your site before it goes to production.
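
If you have no load testing tool at hand, even a quick-and-dirty console sketch like the one below can show how response times behave under concurrent requests. The URL and request count are placeholders, and this is of course no substitute for a proper stress testing tool with recorded paths and performance counters.

    using System;
    using System.Diagnostics;
    using System.Net;
    using System.Threading;

    class QuickLoadTest
    {
        static void Main()
        {
            const string url = "http://somesite.com/printers/laserjet-1200.html"; // placeholder
            const int concurrentRequests = 50;
            var threads = new Thread[concurrentRequests];

            for (int i = 0; i < concurrentRequests; i++)
            {
                threads[i] = new Thread(() =>
                {
                    var watch = Stopwatch.StartNew();
                    try
                    {
                        using (var client = new WebClient())
                        {
                            client.DownloadString(url);
                        }
                        Console.WriteLine("OK in {0} ms", watch.ElapsedMilliseconds);
                    }
                    catch (WebException ex)
                    {
                        Console.WriteLine("Failed after {0} ms: {1}", watch.ElapsedMilliseconds, ex.Status);
                    }
                });
                threads[i].Start();
            }

            foreach (var thread in threads)
            {
                thread.Join();
            }
        }
    }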

6. Test your AJAX site in terms of SEO

When you use AJAX on your site, you can easily get into trouble – search engine spiders will see only a few parts of your site because they don't run JavaScript. Yes, they may analyze the scripts, but they don't execute them. So most content of AJAX-intensive sites is invisible to robots and will never get indexed.

To make your AJAX site search engine friendly, avoid loading the initial content over JavaScript – at least for the parts of your page that you want to be indexed. You also have to make it easy for robots to navigate through your site.
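
One common pattern, sketched below with an assumed Repeater named ProductsRepeater and a hypothetical ProductRepository, is to bind the initial content server-side in Page_Load so robots (and visitors without JavaScript) get real markup, and to use AJAX only for the interactions that follow.

    using System;
    using System.Web.UI;

    public partial class ProductsPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                // Render the first batch of content as plain HTML on the server.
                // ProductsRepeater and ProductRepository are hypothetical names.
                ProductsRepeater.DataSource = ProductRepository.GetFirstPage();
                ProductsRepeater.DataBind();
            }
            // "Load more" style features can then be layered on top with AJAX.
        }
    }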

If you want to see your AJAX site the way robots see it, use a simple trick: turn off JavaScript support in your browser and visit your site. Now you will see roughly what really gets indexed when robots crawl your site.

The End – The Point

The fundamental point of SEO is not to technically optimize your sites for robots and indexing algorithms. Search engines are developed with a different goal in mind – they try to determine more and more accurately how well a page fits human needs.

Each of the previous points – if you noticed – mentions visitors and how things get better for them. Exactly – better for visitors, not robots.

If you develop your pages with visitors in mind, you are the best SEO expert ever seen!

View Comments (11)

  • I'm not sure that item #2 falls into the category of SEO. Yes, it's nice for the human user to see the meaningful URL, but the search engine spidering your site doesn't really care what the URL is; as long as the content is readable and well structured it's happy.

  • ca8msm, I wouldn't say it doesn't fall into the SEO category. Many search engines do take the URL into consideration although it is calculated with a much lesser weight than other items.

    To illustrate my point, type "product laserjet 1200" into Google. You'll see that within the first page about half the URLs contain the search keywords. Try a similar query on Yahoo and you'll get a handful of pages with the keywords in the URL as well.

    Of course it takes a number of optimizations to get good SEO results, not just friendly URLs.

  • The URL is a pretty important part, I have to say. So far I have noticed that the file name part is very important while the path part of the URL is less important or not counted at all. There are many fine-tuning tips and tricks that people have found to correlate with search results.

    The king of tips and tricks is this one: produce content that is useful to people and earn links from other sites - that is the gold of the SEO field.

  • "To illustrate my point, type "product laserjet 1200" into Google. You'll see that within the first page about half the URLs contain the search keywords. Try a similar query on Yahoo and you'll get a handful of pages with the keywords in the URL as well."

    I'm not sure that does illustrate the point. Just because the filename appears in the URL for those products, it doesn't mean that using a non-descript URL would have achieved a lower ranking. It is proven that search engines are perfectly capable of reading URLs such as ...?categoryid=1 and indexing the contents of that page. On the other hand, no-one here knows exactly what algorithms any of the major search engines use, so you cannot say with certainty that using the filename has a more positive effect than not using one; whereas I can show you results where it hasn't had a negative effect (google for "ASP.NET ajax forums" as an example).

    Don't get me wrong, I'm not disagreeing with using friendly URLs (and I use them myself), but I think it's more for the users' benefit than any search engine (hence why I don't think it's really an SEO issue).

  • The article was very good. The 7 points explained about Basic ASP.NET SEO by DigiMortal were fantastic. I also wanted to know about Web Analytics.
