Starting a site and introducing your brand on the web can be hectic, on top of the research needed to arrive at an SEO-friendly, business-centric domain name that people can relate to. That name is one of the most important decisions contributing to how much traffic your site attracts.
SEO tacticians have devised many practices to help rank a brand, and these have evolved greatly over the years as Google changes its algorithm and issues new rules to showcase only the most relevant content at the top.
All of this is done to help anyone browsing arrive at the most informative and credible result.
Besides that, sticking to a single domain, keeping the URL readable, and matching its keywords to your page are equally important. Make your URL structure easy to read and exclude special characters; use keywords relevant to the content and avoid symbols and other jargon.
But beyond all this, another concern for many brands is traffic that gets split between several versions of the same page, and how failing to settle on one definitive version can lower the traffic to your webstore.
So, which URL version is more effective for SEO, why the URL matters in SEO, and what the URL best practices are is what we’ll explore in this write-up.
Composition of a URL
URLs are unique, human-readable internet addresses of websites. The parts of a URL include a top-level domain, also referred to as an extension (.com, .org, .gov and so on), an optional subdomain, the actual domain name (which resolves to the server’s IP address), and a protocol that precedes all of the above.
The top-level domain (TLD), which forms the suffix, and the domain name, which is the second level of the domain hierarchy and the unique address of a site, together make up the root domain.
The first part, the protocol, can be FTP, HTTP, or HTTPS. The “S” indicates the use of a Secure Sockets Layer (SSL) certificate to encrypt the connection between the end user and the server, which is essential for protecting sensitive information from unauthorized access.
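To make that anatomy concrete, here is a minimal Python sketch (standard library only; the URL itself is a hypothetical example) that splits an address into the parts described above.

```python
from urllib.parse import urlparse

# Hypothetical URL used purely for illustration.
url = "https://shop.example.com/collections/shoes?color=blue"

parts = urlparse(url)

print(parts.scheme)  # 'https' -> the protocol (FTP, HTTP, HTTPS, ...)
print(parts.netloc)  # 'shop.example.com' -> subdomain + domain name + TLD
print(parts.path)    # '/collections/shoes' -> the page being requested
print(parts.query)   # 'color=blue' -> optional URL parameters
```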
We will build this write-up around that point, since web crawlers increasingly push the less secure version of a page down in favor of the secure one.
Different Site Versions
Consolidate the separate versions of the site in the eyes of Google. www.Virtina.com and Virtina.com are two different sites, viewed and processed differently. This is further complicated by HTTP and HTTPS, with preference going to the latter.
Because of that difference, failing to use the right one can diminish the website’s SEO health and performance; not all URLs are created equal.
Most SEOs use a 301 redirect to forward a visitor from one version of the site to another. Over time, this tells the search engines that a URL has permanently moved to another destination. Some also suggest using canonical tags to resolve the issue.
You also get the option to set the preferred version in Google Webmaster Tools (now Google Search Console), though this only addresses the problem for Google, which is one of its limitations. Without any of these measures, you may end up splitting the SEO value between all the different versions that exist.
Specify the preferred domain; this helps webmasters and SEOs manage and track organic traffic. Since Google detects the www and non-www versions as separate entities/properties, you need to treat them that way too.
Beyond this, it’s important to add and verify every one of your website’s URL versions to get a better picture, ensuring the crawlers and your tracking give all the attention to the best version.
Google Search Console
Add all versions of your domain name as properties in Google Search Console, and from there mark the one that will be set as the primary version.
This tool lets every admin set up and clean up the web pages correctly, directing more visitors to the intended version and thereby improving your website’s organic visibility and performance.
A URL redirect is therefore essential: give weight to the HTTPS:// version by forwarding every arrival on a variant page to the principal one. Add the URLs to the console and manage them with ease; each one also provides valuable signals to Google on which version to go with.
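That redirect can live at several layers (the web server, a CDN, or the application itself). As one illustration only, the sketch below assumes a Python/Flask application and a hypothetical example.com domain, and sends a 301 from every HTTP or www arrival to the preferred https:// non-www version.

```python
# Minimal sketch, assuming a Flask-based store and a preferred non-www HTTPS domain.
# "example.com" is a placeholder, not a recommendation of a specific stack or hostname.
from flask import Flask, redirect, request

app = Flask(__name__)

PREFERRED_HOST = "example.com"

@app.before_request
def force_preferred_version():
    """Send a 301 from any variant (http:// or www.) to the preferred HTTPS host."""
    host = request.host.split(":")[0]
    if request.scheme != "https" or host != PREFERRED_HOST:
        # full_path keeps the query string; the trailing "?" is dropped when empty.
        target = f"https://{PREFERRED_HOST}{request.full_path}".rstrip("?")
        return redirect(target, code=301)
    # Note: behind a proxy or load balancer, request.scheme may need werkzeug's
    # ProxyFix middleware to reflect the original protocol.
```

Whichever layer you choose, the important part is that the redirect is permanent (301), so the search engines consolidate their signals on the preferred version.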
If the website has any subdomains, or language-, country-, or content-specific subdirectories, then those additional subdomains (blog, news), language subdirectories (en, de, au, etc.), content-specific subdirectories, and staging subdomains must all use the HTTPS protocol.
This will help Google establish which version gets higher importance.
This also includes the subdomains that are not meant for indexing and pages that don’t hold any useful content, such as an admin login page and other pages in staging. Once done, you’ll be able to review valuable insights on how Google “sees” the website.
301 Redirects & the URL Parameters Tool
“301 redirects” are often cited as the best method to get rid of unwanted duplicates, but you’ll need to ensure a smooth transition before dumping the old URLs. Pick the one that best serves as the canonical URL, then send traffic from the identical pages to that preferred URL.
Some sites use URL parameters for page variations that are not very significant, so you essentially end up displaying the same content under different URLs, which means Google is crawling your site inefficiently. A 301 redirect tells Googlebot that the destination URL is a better version than the one it requested.
If you happen to have more than 1,000 pages and the duplicate pages vary only by parameters, you can make use of the URL Parameters tool. It helps prevent Google from crawling URLs that contain specific parameters, or parameters with specific values.
The default setting is “Let Googlebot decide”, in which case Google will analyze the site to determine the best way to handle the parameters.
So, try to specify how Google’s crawlers should read and process the pages of your site that carry specific parameters. Take up these measures only if you are an expert-level SEO, as incorrect settings can cause Google to ignore important pages.
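Outside Google’s tool, the same principle can be applied in your own code. The sketch below is a minimal, standard-library Python example; the parameter names it ignores (utm_source, sessionid, sort and so on) are hypothetical stand-ins for whatever parameters don’t change the content on your site.

```python
from urllib.parse import urlparse, urlencode, urlunparse, parse_qsl

# Hypothetical list of parameters that never change the page content.
IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "sort"}

def canonicalize(url: str) -> str:
    """Drop non-significant parameters so duplicate variants collapse into one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Both variants below show the same content, so both map to one canonical URL.
print(canonicalize("https://example.com/shoes?color=blue&utm_source=mail"))
print(canonicalize("https://example.com/shoes?sort=price&color=blue"))
# -> https://example.com/shoes?color=blue   (in both cases)
```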
Canonical Tags and Duplicate Content
When should you use canonical tags? When you have two URLs (for example, an HTTP and an HTTPS version) that contain the same content, it gets confusing for Google’s crawlers, and you end up diluting your SEO value. Avoid duplicate content.
Thus, a master page must be established so that traffic isn’t driven to the other versions, each of which exists as a separate page in the eyes of the search engines. Canonical tags are a useful piece of code when you have multiple versions of what is essentially the same page; they tell Google which one is your preferred version.
So, what are canonical tags, and how important are they? A canonical tag tells Google that a specific URL is the master copy, the one you want to appear in search engine results pages (SERPs).
Canonicalization is a way to navigate this duplicate-page problem: it tells search engines which pages on your site are the master pages, in other words which URL should appear in the SERPs, and so optimizes your site for higher traffic and improved rankings.
It’s simple: once Google identifies one of the pages as the canonical version, every other version of that page is tagged as a duplicate. The crawlers will visit duplicates less often, reducing the traffic to those pages.
From your side, if you don’t explicitly tell Google which URL is canonical, Google will decide for you, or might consider them both of equal weight, which can lead to unwanted behavior.
As such, it is vital that we integrate canonical tags for the following reasons:
- You want your visitors to arrive at a specific URL, which is also the one you want your audience to see in search results.
- Helps search engines to consolidate link signals for similar or duplicate pages into a single preferred URL.
- To simplify tracking metrics for a specific piece of content.
- To manage syndicated content to your preferred URL.
- You want to keep Googlebot from spending “crawling time” on duplicate pages.
- When using a sitemap, don’t include non-canonical pages (see the sketch after this list).
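On that last point, here is a minimal standard-library Python sketch of a sitemap that lists only canonical URLs; the URL list, domain, and output file name are hypothetical.

```python
# Minimal sketch: write a sitemap containing only canonical URLs.
from xml.etree.ElementTree import Element, SubElement, tostring

CANONICAL_URLS = [
    "https://example.com/",
    "https://example.com/shoes",
    "https://example.com/blog/url-best-practices",
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in CANONICAL_URLS:
    entry = SubElement(urlset, "url")
    SubElement(entry, "loc").text = page

with open("sitemap.xml", "wb") as sitemap:
    sitemap.write(tostring(urlset, encoding="utf-8"))
```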
Use canonical tags to self-reference a page, so that the several variations of the page are tied to one preferred URL, which then serves as the most important version. We recommend having a canonical element on every page.
Apply a self-referencing canonical to avoid any potential SEO and duplicate-content risks. You don’t want Googlebot wandering around the less secure page and tagging it as canonical, which would mean the main version gets crawled less frequently.
Google will look at several signals, such as HTTP vs. HTTPS, page quality, presence of the URL in a sitemap, and any rel=canonical labeling. Google uses the canonical pages as the main sources for evaluating content and quality. So, how do you implement canonical tags? That’s what we’ll look at next (the first two methods are illustrated in the sketch after this list):
- Add a rel=canonical <link> tag to all duplicate pages, pointing to the canonical page. This is useful on eCommerce sites when a product has several different URLs.
- Send a rel=canonical HTTP header in your page response. Especially useful with PDF files.
- A somewhat weaker signal for Googlebot is specifying the canonical pages in the sitemap.
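As an illustration of the first two methods, the sketch below again assumes a Flask-based site; the routes, product URLs, and file names are hypothetical. It adds a rel=canonical <link> tag to an HTML page served under two URLs, and a rel=canonical Link HTTP header to a PDF download.

```python
# Minimal sketch of the first two canonical signals, assuming Flask.
# "example.com", the routes, and the file names are placeholders.
from flask import Flask, send_file

app = Flask(__name__)

CANONICAL_PRODUCT_URL = "https://example.com/shoes/blue-runner"
CANONICAL_PDF_URL = "https://example.com/downloads/size-guide.pdf"

# 1. A rel="canonical" <link> tag in the <head> of every duplicate page.
PRODUCT_PAGE = f"""<!doctype html>
<html>
  <head>
    <link rel="canonical" href="{CANONICAL_PRODUCT_URL}">
    <title>Blue Runner</title>
  </head>
  <body>Product page content goes here.</body>
</html>"""

@app.route("/shoes/blue-runner")
@app.route("/category/runners/blue-runner")  # duplicate URL for the same product
def product_page():
    return PRODUCT_PAGE

# 2. A rel="canonical" HTTP header, which is handy for non-HTML files such as PDFs.
@app.route("/downloads/size-guide.pdf")
def size_guide():
    response = send_file("size-guide.pdf")  # path relative to the app's working directory
    response.headers["Link"] = f'<{CANONICAL_PDF_URL}>; rel="canonical"'
    return response
```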
In the absence of all this, Google will freely decide which version or URL is best. Also, per Google, don’t use the robots.txt file or the URL removal tool for canonicalization, and don’t specify different URLs as canonical for the same page.
For hreflang tags, make each hreflang link point to the canonical version of that URL. Every language version should have a rel=canonical link pointing to itself, not just to the en-gb page. Finally, when linking within your site, use the canonical URL; linking consistently to the same URL helps Google understand your preference.
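To tie that hreflang advice together, the short Python sketch below (hypothetical locales and URLs) builds the head links for one language version: a self-referencing canonical plus hreflang alternates that each point to the canonical URL of their own language.

```python
# Sketch with hypothetical locales and URLs: each language version self-references
# its own canonical URL and lists every variant as an hreflang alternate.
LOCALE_VERSIONS = {
    "en-gb": "https://example.com/en-gb/shoes",
    "de-de": "https://example.com/de-de/schuhe",
    "x-default": "https://example.com/shoes",
}

def head_links(current_locale: str) -> str:
    """Build the <head> link tags for one language version of the page."""
    links = [f'<link rel="canonical" href="{LOCALE_VERSIONS[current_locale]}">']
    links += [
        f'<link rel="alternate" hreflang="{locale}" href="{url}">'
        for locale, url in LOCALE_VERSIONS.items()
    ]
    return "\n".join(links)

print(head_links("de-de"))
```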
HTTPS & Why You Should Migrate to Secure HTTP
Online security is a hot topic these days. Google recently updated its core search ranking algorithm and, as a result, added a few more parameters to weigh the quality of a web page. That quality establishes the page’s position in the search results.
These parameters are Expertise, Authoritativeness, and Trustworthiness (E-A-T).
Trust, then, is an important factor when optimizing content for search, and one of the best ways to signal it is to use a URL structure that exudes trustworthiness. So, migrate your eCommerce site to secure HTTP, or HTTPS, if you haven’t done so already.
Google favors site versions that are more equipped to provide a secure experience. HTTPS is a ranking signal for Google; having an SSL certificate on the site & redirecting traffic to the HTTPS:// version causes Google to recognize that as the best version of your website.
Encrypt all communication between users and your website and safeguard it against malicious attacks. The presence of HTTPS in your URL will make visitors more comfortable, and they will be more likely to check out your site.
Businesses that collect sensitive information or process payments are advised to acquire an SSL certificate. Enable HTTPS:// and present a safe website. With cybercrime and identity theft on the rise, any brand needs to show its ability to provide an encrypted connection.
Apply a canonical tag that points to the “cleanest version” of all the URLs in the group. This helps funnel the SEO value, which the variant URLs would otherwise divide among themselves, into one single canonical URL.
Users are more likely to trust sites with HTTPS when sensitive information is being captured, which in turn improves conversions and leads to more orders. Set up your HTTPS redirect now.
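Once the redirect is in place, it is worth checking that every common variant actually answers with a single 301 pointing at the preferred version. Here is a minimal sketch, assuming the third-party requests package and a hypothetical example.com domain.

```python
# Checks that each variant returns a 301 pointing straight at the preferred URL.
# Sites that redirect in two hops (http -> https -> non-www) will show up as "CHECK".
import requests

PREFERRED = "https://example.com/"
VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://www.example.com/",
]

for url in VARIANTS:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location == PREFERRED
    print(f"{'OK   ' if ok else 'CHECK'} {url} -> {response.status_code} {location}")
```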
Conclusion
Follow URL best practices so that your URLs assist your website’s SEO in ranking higher. There is more than one way to access a domain, but your IT team should bring all the versions together into a single best page to ensure SEO health is never compromised.
URL structure is a crucial element of your on-page SEO, so optimize it holistically for both on-page and off-page SEO. Canonical tags affect SEO too, so use them properly.
When a “copy page” exists on the site, crawlers can be tripped up trying to identify which one is the original. Make sure to merge and redirect all the traffic to the preferred version, and address all the variables needed to optimize the best version of a page. Your URLs and your SEO go hand in hand.
Building a unified backlink profile or increasing the authority of the website becomes a problem if many versions of the same site exist. When creating your backlink profile, how you index the site on Google is crucial, because your preferred site version affects every reference to your URL.
Use one consistent, definitive URL when you add internal links, link to your website from other sources, and build your backlink profile. Deviating from this will scatter data across the Search Analytics section and other metrics. Choose your preferred site version and track your online performance.
Now, if this is not already in place on your valuable eCommerce store and your brand hasn’t been following URL best practices over the years, it’s high time you consulted an expert SEO curator at Virtina. We will analyze your site’s URLs for the best viewership, giving you short, easy, and fluent URLs.
The various versions of your URL will be merged so the SEO value is not divided across several entities, and you will become more mindful of the site version, so you can be consistent in using the correct URL. Virtina brings all the versions under one and streamlines your backlink profile.
Virtina can help you to increase your revenue, improve profit and enhance customer experience.