All In a Week’s Work in SEO: Sitemaps, DNS, CSS & Internet Explorer
As an SEO, I am continually looking for ways for my clients to improve their overall visibility as well as increase usability – two very important issues for an effective online marketing strategy. In this post I would like to speak briefly about three topics: sitemaps – not Google Sitemaps, but real old-fashioned sitemaps meant to be a helpful resource for users as well as search engines to find all of the pages on a site; how a DNS setup really can have an effect on search engine positioning and indexing; and finally CSS – a somewhat new and wondrous language to me – and the issues that arise from its use in Internet Explorer 6.
Sitemaps
Simply put, a sitemap is a way to provide users and engines a path to all of the pages on your site, and keeping this page updated, clean, concise and organized can benefit a site. For example, you might link to the 5-10 most recent news articles and archive the rest on pages organized by year; that provides links to all news articles without listing every one of them or just linking to an umbrella news page. And because the list of most recent articles changes, new and fresh content is added to the sitemap as frequently as your company creates news articles. If you can keep your sitemap to around 100 links, you are golden. According to Google’s Webmaster Guidelines:
“Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.”
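As a rough sketch of that structure (the headings, page names and URLs here are made up for illustration), a sitemap section like this keeps the most recent articles visible while pushing the rest down to yearly archive pages:

```html
<!-- Hypothetical sitemap fragment: recent articles listed directly,
     older articles grouped onto yearly archive pages -->
<h2>News</h2>
<ul>
  <li><a href="/news/widget-launch.html">Widget Launch Announced</a></li>
  <li><a href="/news/q3-results.html">Q3 Results Posted</a></li>
  <!-- up to the 5-10 most recent articles -->
</ul>
<h3>News Archives</h3>
<ul>
  <li><a href="/news/2007/">2007 Articles</a></li>
  <li><a href="/news/2006/">2006 Articles</a></li>
</ul>
```

Each archive page then carries its own short, organized list, so every article stays reachable without any single page ballooning past that 100-link guideline.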
DNS / Server Setup
I definitely am not claiming to be any sort of expert on server setups, but the way a server generates URLs – or, as I learned earlier this week, the way a server allows subdomains with a CNAME – can greatly affect search engine positions and indexing. So be sure only one version of your site resolves and that your URLs are consistent and somewhat clean. Google’s Webmaster Guidelines urge:
“Don’t create multiple pages, subdomains, or domains with substantially duplicate content.”
“If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.”
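One common way to make sure only one version of a site resolves (this sketch assumes an Apache server with mod_rewrite enabled, and the domain is a placeholder) is a 301 redirect that permanently sends the non-www hostname to the www version:

```apache
# Hypothetical .htaccess rules: 301-redirect example.com to www.example.com
# so engines and users only ever see one version of each URL
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A 301 tells the engines the move is permanent, so indexing and link credit consolidate on the one remaining version instead of splitting across duplicates.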
CSS
We all know that the engines can’t read JavaScript, so recently I suggested a client use a CSS dropdown in place of his current JavaScript-powered menu. Because I use Firefox for a majority of my testing, I was perplexed when he emailed me later in the week and said he couldn’t get it to work. I tested it in Internet Explorer just to see what happened and, sure enough, it didn’t work. A member of my team tested it in IE on his computer and it worked…still perplexed. We worked with the code for a bit until it dawned on us: I still have IE 6 on my computer and he has the updated IE 7 version on his. And here lies the problem: IE 6 only supports the :hover pseudo-class on anchor (a) elements, not on the li elements that pure CSS dropdowns rely on. Just something to keep in mind.
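A minimal sketch of the kind of menu in question (the class names and URLs are made up): the submenu stays hidden until the pointer is over its parent li, which works in Firefox and IE 7 but fails in IE 6 because the li:hover rule is simply ignored there:

```html
<!-- Hypothetical pure-CSS dropdown; the li:hover rule is what IE 6 ignores -->
<style>
  ul.nav li ul { display: none; }          /* hide submenus by default */
  ul.nav li:hover ul { display: block; }   /* show on hover: fails in IE 6 */
</style>
<ul class="nav">
  <li><a href="/products/">Products</a>
    <ul>
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>
  </li>
</ul>
```

The usual workaround for IE 6 is a small script that adds a class to the li on mouseover so the stylesheet can target it without :hover, but that does reintroduce a JavaScript dependency for that one browser.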
In closing, I would just like to mention that a quality SEO looks at much more than just title and meta tags. There are many, many reasons a site can have trouble positioning in the SERPs, and a good SEO will dig into these issues and suggest quality fixes. A good SEO also focuses on usability, working with the client to improve both the optimization of the site and the experience of its users.