Search engine optimization is one of the most important parts of your digital marketing strategy. However, it is also one of the most difficult channels to explain. If someone isn’t spending their days knee-deep in Google Analytics and Google Search Console, it’s highly likely that they don’t understand much about the intricacies of SEO.
Therefore, when sending over SEO deliverables to clients, explaining what they are getting, why they are getting it, and how we did it is extremely important and can be tricky at times.
Luckily, there are SEO experts out there, like our very own Sean Dilger, to explain the what, why, and how of various SEO deliverables to clients (and fellow marketers). A few weeks back, Sean conducted an amazing Lunch and Learn for us in which he thoroughly explained many of our SEO deliverables, why we do them, and how we do them. So, without further ado, here is a complete rundown on some of our SEO marketing services and deliverables:
In-Depth Keyword Ranking Analysis
The very first thing that Sean and the rest of the SEO team do when we begin an SEO campaign is conduct an in-depth keyword ranking analysis. Here, the SEO team pulls clients’ existing rankings and compares them to their competitors’ rankings using a handy tool called SEMrush.
We perform a current keyword ranking analysis for several reasons. The main objective is to identify how the client is performing relative to their competitors and to spot any low-hanging-fruit keyword opportunities. Once we have this information, we are equipped to carry out on-page SEO optimizations.
Google Analytics Benchmarking & Search Console Configuration
Once we have identified keyword opportunities, the next step is to ensure that the client’s Google Analytics account is set up properly. We begin by benchmarking current website performance metrics to ensure that we have numbers to measure against and compare to once our SEO campaigns have been running for a few months. We typically benchmark YoY, MoM, and QoQ metrics to make sure we are being as thorough as possible.
A few other things we do when setting up our clients’ Google Analytics accounts include setting up executive dashboards, goals, and e-commerce conversion tracking. We also record the website’s current Domain Authority.
This deliverable also includes configuring Google Search Console, a free Google service that helps companies monitor and maintain their site’s presence in the SERPs. We configure Search Console to track the overall health of a website. During configuration we look at a variety of metrics that help us understand the health and performance of your website, identify any issues that exist, and determine how to fix them.
Website Crawl And Analysis
Making sure that your website is crawlable is extremely important if you want to rank higher in the SERPs. Why? Because pages that can’t be crawled and indexed will never rank at all, which means that you are missing out on the opportunity to attract new customers organically.
This is one of the many reasons why we perform a website crawl. Other reasons include: identifying any issues that could be holding back SEO efforts, identifying the total number of pages on the website, and ensuring that your site is healthy (from a technical SEO perspective).
When we perform a website crawl analysis, we look at:
- The site’s HTML pages and architecture
- All meta data and current URLs
- Response status codes (200, 301, 302, 404, 501, etc.)
- Duplicate URLs
- Page titles, meta tags, and H1s (flagging any duplicates)
- Image sizes
Duplicate Content Analysis
Duplicate content refers to substantial blocks of content within or across domains that either completely match other content or are appreciably similar. Google filters duplicate content out of its search results, which can keep your pages from moving up in the SERPs (this is not good).
Therefore, we run a duplicate content analysis to identify any internal or external duplicate content that may be hurting your ranking opportunities. If we identify any duplicate content, we will work with our content team to write new, unique content that speaks to our target consumer.
You’re probably thinking: what if another site steals MY content? What other websites publish is out of your control.
In order to protect your content, use a self-referencing canonical tag on your pages to ensure that you aren’t punished for somebody else’s plagiarism.
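For example, a page can point its canonical tag at its own URL. A self-referencing canonical might look like this (the URL below is a placeholder):

```html
<!-- In the <head> of https://www.example.com/blog/seo-deliverables/ -->
<!-- The self-referencing canonical declares this URL as the original source -->
<link rel="canonical" href="https://www.example.com/blog/seo-deliverables/" />
```

If a scraper copies the page but leaves the markup intact, the canonical still points back to the original.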
Review & Configure Robots.txt File
The robots.txt file tells search engine crawlers which pages and directories they may (or may not) crawl. It is important that all of the pages you are optimizing are crawlable if you want them to rank. The goal of this deliverable is to identify the pages that are currently being blocked by robots.txt and take the necessary steps to get them crawled and indexed.
In addition to identifying pages that are blocked by robots, we also go through and block certain pages ourselves.
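As an illustration, a simple robots.txt might allow everything except low-value pages (the paths below are placeholders for whatever a given site needs to block):

```text
# robots.txt — lives at the root of the domain
User-agent: *
# Block low-value pages we don't want crawled (placeholder paths)
Disallow: /wp-admin/
Disallow: /search/
# Point crawlers to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Everything not explicitly disallowed remains open to crawlers.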
XML Sitemap Configuration
An XML sitemap tells search engines about your website’s structure by listing the URLs you want crawled and indexed, and it can be updated automatically as pages are added or removed. We configure clients’ XML sitemaps for several reasons.
One important reason why we evaluate and configure the current sitemap and structure is to ensure that the website is getting indexed properly. We also make sure that only the most important pages are included in your XML sitemap. This is because every time Google crawls your website, there’s a finite number of URLs they will crawl. Therefore, we want to prioritize the most important pages over less valuable pages.
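A minimal XML sitemap looks like this (URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/seo/</loc>
    <lastmod>2019-05-15</lastmod>
  </url>
</urlset>
```

Keeping this list to your most valuable pages is what focuses Google’s crawl budget where it matters.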
Link Cleanup
Google launched the Penguin update back in 2012 and has continued to release new versions throughout the years. Penguin looks at aggressive link-building campaigns and unnatural links pointing to your website and penalizes sites with spammy or suspicious links. To stay on Google’s good side, every link that points to your website should be natural.
This is why we conduct a link cleanup. During this process, we identify bad links, request their removal, and disavow any that remain.
Keyword Research & Mapping For On-Page SEO
One of an SEO team’s biggest deliverables, keyword research and mapping, is done to drive qualified traffic to your most important pages. During the keyword research and mapping process, the SEO team looks at the current rankings and revenue for each page, then researches each keyword’s search volume and competition. Once we’ve vetted each keyword, we apply the keywords to title tags, H1s, and meta descriptions.
Schema Markup Implementation
Schema markup is code that you add to your website to help search engines return more informative results for users. Providing users with more information in the SERPs can increase click-through rates. Oh yeah, did we mention that it can help clients rank better too?
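Structured data is commonly added as a JSON-LD script in the page’s head. A sketch for a hypothetical organization (all names and URLs below are placeholders):

```html
<!-- JSON-LD structured data describing a hypothetical organization -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Agency",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": ["https://twitter.com/example"]
}
</script>
```

Search engines read this block to build richer listings, such as showing the logo and profile links alongside the result.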
CRO Analysis
CRO, or conversion rate optimization, is a digital marketing strategy designed to increase conversion rates among website visitors through continuous testing and optimization. During the CRO analysis, the SEO team tracks how users engage with your website pages, identifies areas of opportunity, and provides data for A/B testing environments.
CRO is a great strategy if you are looking to increase conversion rates on any given page. Who wouldn’t want that?!
Explaining and understanding SEO deliverables is no walk in the park, I’ll be the first to admit it. However, I hope that Sean’s insights and explanations help you sift through your next round of SEO deliverables! We also conduct additional SEO deliverables for clients, including link building for off-page SEO. Contact us today for an SEO audit and learn about our SEO packages!