Build a Contextual Link for Google and Microsoft - Link Best Practices

February 3, 2024
Marketing Two Cents


This Article Covers the Best Practices of Building Links Internally and Externally

As Google crawls a site, it looks for effective links that allow it to discover new content and establish connections between two or more sites. When used correctly, they should also help the crawler better understand the topic and intent of a website, which, in turn, improves the site’s SEO ranking. However, this only applies if the links were implemented with SEO in mind.

To help my fellow web and content developers improve their SEO rankings, I'll quickly run through some of the best practices for creating links and then go over what Google is looking for. I recommend applying these strategies as you build your platform, and remember that Google can demote your site for using links improperly.


What Makes a Good Link – Technical Considerations


From a technical standpoint, Google searches for specific formats that allow it to crawl and extract information from a link. These are straightforward segments of code that are easy to integrate:


Crawlable Links - HTML <a> Tag


Google will only be able to crawl a link if it includes an <a> HTML anchor element with an href attribute. If either is missing, or if the link is coded in a different format, Google will likely be unable to parse and extract the information. The same format applies to Microsoft Bing:


  • Crawlable:
  • <a href="https://linkedwebsite.com">
  • <a href="products/cheeses">
  • Uncrawlable:
  • <a "https://linkedwebsite.com">
  • <span href="https://linkedsite.com">

‍


Additionally, the URL in the <a> element must resolve to an actual address that the crawler can contact and receive information from. In other words, the href value must look like a real URL:

‍

  • Crawlable:
  • <a href="https://linkedwebsite.com/linkedproductcheeses">
  • <a href="products/cheeses">
  • Uncrawlable:
  • <a "https://linkedwebsite.com">
  • <a "javascript:goTo('linkedproductcheeses')">


Note that links inserted with JavaScript are acceptable, as long as they render into the crawlable <a href> format with a proper URL, as shown above.
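To illustrate the distinction, here is a minimal Python sketch, using only the standard library's html.parser, that separates anchors a crawler can follow from ones it likely cannot. The LinkAudit class and its names are my own illustration, not part of any Google or Bing tooling.

```python
# Minimal crawlability check for links in an HTML snippet (standard library only).
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collects <a> tags and flags hrefs a crawler is unlikely to follow."""
    def __init__(self):
        super().__init__()
        self.crawlable, self.uncrawlable = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return                                   # <span>, <div>, etc. are not anchors
        href = dict(attrs).get("href")
        if not href or href.startswith("javascript:"):
            self.uncrawlable.append(href)            # missing href or script pseudo-URL
        else:
            self.crawlable.append(href)              # absolute or relative URL

audit = LinkAudit()
audit.feed('<a href="https://linkedwebsite.com/products/cheeses">cheeses</a>'
           '<a href="javascript:goTo(\'cheeses\')">cheeses</a>')
print(audit.crawlable)    # ['https://linkedwebsite.com/products/cheeses']
print(audit.uncrawlable)  # ["javascript:goTo('cheeses')"]
```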


Well-Defined Anchor Text - Wrapped by the HTML <a> Tag


Anchor text is the text that’s visibly shown as a hyperlink. This information should be added within the <a> element, as shown here:


  • <a href="https://www.deanlong.io/blog/best-practices-for-youtube-video-seo">best practices for boosting Youtube video visibility</a>

‍

If the <a> is left empty, Google may use the title attribute as a fallback. If you are building links directly into images, use the alt attribute of the img element to build the anchor text:

‍

  • <a href="/blog/google-crawl-and-index-iframe-seo"><img src="https://assets-global.website-files.com/5ea82056771c9dd4b6f61bdc/6235459ae9fbf8664e54e955_DL_optimising-iframe-for-SEO_Main_Dean_20220319.png" alt="two hands holding a display panel with text How Does Google Crawl and Index iFrames with black background and white font"/></a>


This renders the image as a clickable link:

[Image: two hands holding a display panel with the text "How Does Google Crawl and Index iFrames", black background, white font]


If you are inserting the anchor text using JavaScript, use Google's URL Inspection tool to confirm that it is visible in the rendered HTML.
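The fallback order described above (visible anchor text, then the title attribute, then a nested image's alt text) can be mimicked with a short Python sketch; the AnchorText helper below is a simplified, hypothetical illustration rather than how Google actually extracts anchors.

```python
# Simplified illustration of the anchor-text fallback order: visible text first,
# then the title attribute, then a nested image's alt text.
from html.parser import HTMLParser

class AnchorText(HTMLParser):
    def __init__(self):
        super().__init__()
        self.results = []        # (href, effective anchor text) pairs
        self._current = None     # state for the <a> currently being parsed

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self._current = {"href": attrs["href"], "text": "",
                             "title": attrs.get("title", ""), "alt": ""}
        elif tag == "img" and self._current is not None:
            self._current["alt"] = attrs.get("alt", "")

    def handle_data(self, data):
        if self._current is not None:
            self._current["text"] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            c = self._current
            anchor = c["text"].strip() or c["title"] or c["alt"]
            self.results.append((c["href"], anchor))
            self._current = None

parser = AnchorText()
parser.feed('<a href="/blog/google-crawl-and-index-iframe-seo">'
            '<img src="iframe-seo.png" alt="How Does Google Crawl and Index iFrames"/></a>')
print(parser.results)
# [('/blog/google-crawl-and-index-iframe-seo', 'How Does Google Crawl and Index iFrames')]
```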


‍

What Makes a Good Link – Contextual Perspectives


Beyond their basic format, Google will also judge links based on their qualitative features. The linked sites should be well-researched and trusted pages that offer a quality user experience. In general, it's best practice to link to authoritative sources, such as .org, .edu, and .gov domains, rather than citing lower-quality blogs or webpages. Additionally, Google will look for the following:


High-Quality Anchor Text


The exact wording of the anchor text and the link's title attribute should be descriptive, to the point, kept in context, and related to the page it links to. A reader should have a basic understanding of where the link will lead them before they click. If not, Google will consider the link lower quality:


  • Good Anchor Text: We proudly stock a wide <a href="https://linkedwebsite.com">selection of chocolate truffles</a>
  • Bad Anchor Text: Find our full selection of chocolate truffles <a href="https://linkedwebsite.com">here</a>


Keep anchor text relatively short and make it as natural as possible. The crawler also considers the text before and after the link, so if the surrounding sentence reads awkwardly, the link won't rate as highly. Google will demote your links, and may classify your site as spam, if you try cramming keywords into them.
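As a quick self-check, here is a small, hypothetical Python heuristic for auditing anchor text. The generic-phrase list and word-count threshold are my own illustrative choices, not rules published by Google or Bing.

```python
# Hypothetical heuristic: flag anchors that are empty, generic, or so long they
# read like keyword stuffing. Word list and threshold are illustrative only.
GENERIC_ANCHORS = {"here", "click here", "read more", "this page", "link", "more info"}

def audit_anchor(anchor_text: str, max_words: int = 8) -> list[str]:
    issues = []
    text = anchor_text.strip().lower()
    if not text:
        issues.append("empty anchor text")
    elif text in GENERIC_ANCHORS:
        issues.append("generic anchor text that does not describe the target page")
    if len(text.split()) > max_words:
        issues.append("unusually long anchor text; may look like keyword stuffing")
    return issues

print(audit_anchor("here"))                             # flagged as generic
print(audit_anchor("selection of chocolate truffles"))  # [] -> reads fine
```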

‍

On February 3, 2024, Roger Montti published a compelling article on Search Engine Journal, "Sentence-Level Semantic Internal Links For SEO," which makes a thorough and persuasive argument for using descriptive anchor text in internal and external linking strategies.

‍

Enhanced Descriptive Example:

  • Before: "Explore our Sydney patio furniture collection."
  • After: "Plan your outdoor oasis with our Sydney patio furniture collection."

This revised example ensures the surrounding text implies a browsing or shopping intent, making it clear that clicking the link will lead to a collection ideal for those looking to furnish their outdoor spaces.

‍

External Link Examples:

When linking externally, maintaining a descriptive and contextually relevant approach is essential, considering the tone and intent. Here are refined examples for external links:

‍

  • Bad example: "Click here for more information." → Improved approach: "Discover the history of ceramic art on ArtHistory.net."
  • Bad example: "Visit this page for details." → Improved approach: "Read the latest outdoor design trends on HomeDecorBlog.com."

‍

Refined Anchor Text for Clear Intent:

  • Before: "Learn more about our handmade ceramic pots."
  • After: "Discover the craftsmanship behind our handmade ceramic pots making process."

‍

This adjustment clarifies the anchor text's intent, indicating that the linked content focuses on the process and artistry behind the products, which is more specific and informative for the reader.

Testing Method - A Foolproof Approach:

‍

To ensure your anchor text is both contextual and descriptive, ask yourself:

  • "If I encountered this anchor text out of context, would I have a clear understanding of what to expect on the linked page?"
  • "Does this anchor text align with the intent or action I want the reader to take?"

For example, consider the sentence: "Our blog offers a deep dive into the latest gardening trends." If the anchor text is "latest gardening trends," ensure the linked content indeed provides a comprehensive exploration of current trends in gardening, matching the reader's search intent for up-to-date information.

‍

For further reading on the foundational principles behind these strategies, refer to Google's guidelines on crafting effective anchor text and creating crawlable links, listed in the References section at the end of this post.

‍

Cross-Referenced Internal Links


Links don't have to send readers to an outside source; they can also guide them to other areas of your website. By cross-referencing your own pages according to internal-link best practices, you help Google understand the layout of your website, and your readers will find it easier to navigate. This factors into your SEO ranking and can improve the overall user experience.


Use External Links to Cite Sources


External links should be used to cite evidence for the claims on your website. Linking to an authoritative source signals that your site is trustworthy.


However, be careful not to overdo it. If you suspect a source may not be reliable, or you don't want to endorse it, add rel="nofollow" to the link to distance yourself from it.


Likewise, if you’ve been paid to include a link, it’s considered best practice to add either rel="nofollow" or rel="sponsored" to the link.


‍

How to Acquire Links


Both Google and Microsoft Bing prefer organic links that naturally drive traffic toward high-quality and respected sites. The more links your site receives, the more respected it becomes. Think of it as a flow of value: the more sites that connect to yours, the more valuable it becomes. Therefore, you should work to acquire new links to improve your site's ranking.


You can do so by utilising the following strategies:

  • Guest Blogging – Generate high-quality content for another website.
  • Skyscraper Technique – Improve your competitor’s content and then reference the same backlinks.
  • Link Inserts – Ask a site to link to your content where it offers more information about something they’ve already referenced.
  • Ego Bait – Shine a positive light on another platform and they’ll reciprocate in time.
  • Testimonials and Case Studies – Give positive feedback about their products or services.
  • Exchange Links – Offer to link back to them if they agree to link to you.
  • Resource Page Link Building – Collaborate and share a good reference that fits their existing content.
  • Broken Link Building – Help them fix a “dead” link on their page by providing a replacement.
  • Image Link Building – Ask to get credit for using your image.
  • Unlinked Mentions – Ask to make any mention of your brand “clickable.”
  • Link Moves – Ask to make changes to an existing link pointing at your website.
  • HARO and Journalist Requests – Give an “expert quote” for their article.
  • PR – Give them a killer story to cover.


Applying these strategies will allow you to rise in the rankings far faster than simply relying on good content.

‍

However, Understanding Link Spam Detection Research Helps You Build a Research-Based Link-Building Strategy

‍

In their web spam link detection study, Becchetti et al. (2007) present a spam detection technique that uses the structure of web links to identify spam. It operates independently of page content by analyzing how pages link to each other. The method involves statistical analysis and machine learning algorithms that consider the link patterns and behaviours characteristic of spam pages, such as abnormal linking practices or the creation of link farms. The structure-focused approach allows spam to be identified without needing to analyze the text on a webpage, making it a valuable tool in the fight against web spam.

‍

In this study, "Link Analysis for Web Spam Detection" (the same study mentioned above), abnormal linking practices and the creation of link farms are defined as follows:

‍

Abnormal Linking Practices: These practices are manipulative in nature, aimed at distorting the natural link ecosystem of the web to favour certain pages. Examples of such manipulative techniques include:

  1. Link Buying: Purchasing links from other websites to artificially boost the number of inbound links.
  2. Excessive Link Exchanges: Engaging in disproportionate reciprocal linking schemes that lack relevance and serve no real user purpose.
  3. Hidden Links: Placing links on a webpage that are invisible to the user but are crawled by search engines.
  4. Automated Link Creation: Using automated tools to generate links at scale without regard for content relevance or quality.

‍

Creation of Link Farms: A link farm is described as a densely connected network of webpages that are created for the sole purpose of manipulating link-based ranking algorithms. This form of collusion involves a group of users who manipulate the link structure with the intent to improve the ranking of one or more pages within the group. The study specifies that pages in a link farm may have a high number of inbound links (high in-degree) but maintain little meaningful relationship with the rest of the web graph. Additionally, link farms can expand their reach by acquiring links from non-spam sites through advertising or purchasing expired domains that were once used for legitimate purposes.

‍

[Image: The difference between a link farm and natural link referencing. Source: Becchetti, L. et al., 2007]

‍

Degree-Based Measures: These compute statistics such as the number of in-links and out-links of a page. A very high number of either could suggest spam, as normal pages typically do not have an extremely unbalanced link ratio.
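To make this concrete, here is a toy Python sketch of degree-based auditing over a small edge list; the graph, names, and ratio threshold are illustrative assumptions, not values from the study.

```python
# Toy degree-based measure: count in-links and out-links per page from an edge
# list and flag extreme ratios. The threshold is illustrative, not a published cut-off.
from collections import defaultdict

edges = [("A", "B"), ("A", "C"), ("C", "B"), ("D", "B"), ("E", "B"), ("F", "B")]

in_deg, out_deg = defaultdict(int), defaultdict(int)
for src, dst in edges:
    out_deg[src] += 1
    in_deg[dst] += 1

for page in sorted(set(in_deg) | set(out_deg)):
    i, o = in_deg[page], out_deg[page]
    ratio = i / max(o, 1)                    # avoid division by zero for sink pages
    flag = "suspicious" if ratio > 4 else "ok"
    print(f"{page}: in={i} out={o} ratio={ratio:.1f} -> {flag}")
```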

‍

[Figure: Distribution of the fraction of new supporters found at varying distances (normalized), obtained by backward breadth-first visits from a sample of nodes, in four large Web graphs (Baeza-Yates et al., 2006)]

[Figure: Distribution of the number of new supporters at different distances, for pages in different PageRank buckets; the higher the page authority, the shorter the distance to its supporters]

PageRank: This Google algorithm measures the importance of a webpage based on the number and quality of links pointing to it. If a page has an unnaturally high PageRank without quality content or links from reputable external sites, it could indicate spam.
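For reference, here is a compact sketch of the textbook PageRank power iteration on a tiny illustrative graph. This is the classic formulation, not Google's production system; the damping factor and iteration count are conventional defaults.

```python
# Textbook PageRank power iteration on a tiny link graph.
def pagerank(graph, damping=0.85, iterations=50):
    """graph maps each page to the list of pages it links out to."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            targets = outlinks or pages              # dangling page: spread evenly
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print({p: round(r, 3) for p, r in pagerank(graph).items()})
```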

‍

TrustRank (Gyöngyi et al., 2004): It assigns a level of trust to each page, with lower scores indicating a higher likelihood of spam. TrustRank relies on the concept that good pages rarely link to spam pages.

‍‍

Truncated PageRank: Similar to PageRank, but it ignores the contribution of links at the first few levels. This can be useful to combat spam, as spam pages often have many close connections in link farms.
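Below is a rough Python sketch of the truncated idea: it accumulates contributions only from paths longer than a cut-off, so tightly knit neighbourhoods (like link farms) count for less. The normalisation is simplified relative to Becchetti et al.'s formulation, and the graph and parameter names are illustrative.

```python
# Rough sketch of Truncated PageRank: contributions from the first `truncate`
# levels of links are ignored; normalisation is simplified for readability.
def truncated_pagerank(graph, damping=0.85, truncate=2, max_len=20):
    pages = list(graph)
    n = len(pages)
    walk = {p: 1.0 / n for p in pages}        # probability mass after t hops
    score = {p: 0.0 for p in pages}
    for t in range(1, max_len + 1):
        nxt = {p: 0.0 for p in pages}
        for page, outlinks in graph.items():
            targets = outlinks or pages       # dangling page: spread evenly
            share = walk[page] / len(targets)
            for target in targets:
                nxt[target] += share
        walk = nxt
        if t > truncate:                      # skip the first `truncate` levels
            for p in pages:
                score[p] += (1 - damping) * damping ** t * walk[p]
    return score

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print({p: round(s, 4) for p, s in truncated_pagerank(graph).items()})
```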

‍

Supporter Estimation: This approach involves looking at the neighborhood of a page and estimating the number of 'supporters' or genuine links a page has. A spam page might have a high in-degree (many links) but little real connection to the rest of the web graph.

‍

Supporter definition: Link analysis algorithms assume that every link represents an endorsement, in the sense that if there is a link from page x to page y, then the author of page x is recommending page y. We call x a supporter of page y at distance d, if the shortest path from x to y formed by links in E has length d. The set of supporters of a page are all the other pages that contribute towards its link-based ranking.
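Using that definition, here is a small Python sketch that counts supporters exactly with a backward breadth-first search on a toy graph. This brute-force version is only feasible on small graphs, which is why the study approximates it at web scale; the graph and function names are illustrative.

```python
# Exact supporter counting by backward breadth-first search: x supports y at
# distance d if the shortest link path from x to y has length d.
from collections import deque

def supporters_within(graph, target, max_dist):
    """Count pages that can reach `target` within `max_dist` hops."""
    reverse = {p: [] for p in graph}                  # invert the edges
    for src, outlinks in graph.items():
        for dst in outlinks:
            reverse[dst].append(src)
    seen, queue = {target: 0}, deque([target])
    while queue:
        node = queue.popleft()
        if seen[node] == max_dist:
            continue
        for supporter in reverse[node]:
            if supporter not in seen:
                seen[supporter] = seen[node] + 1
                queue.append(supporter)
    return len(seen) - 1                              # exclude the target itself

graph = {"A": ["B"], "B": ["C"], "C": [], "D": ["C"], "E": ["D"]}
print(supporters_within(graph, "C", max_dist=2))      # 4 -> A, B, D and E reach C in <= 2 hops
```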

‍

Bit-Propagation Algorithms: These algorithms propagate compact bit vectors through the link structure to estimate properties such as supporter counts. Anomalies in the propagation patterns could indicate spammy behaviours.

‍

Probabilistic Counting: It uses probabilistic methods to count the number of distinct elements (like links or supporters) in large datasets. This can be used to detect spam by identifying unnatural patterns in the link structure.
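To give a flavour of how such counting works, here is a tiny Flajolet-Martin-style sketch in Python that estimates the number of distinct items (for example, unique supporter hosts) without storing them all. The hash choice and correction constant follow the classic technique, but this single-hash version is illustrative and high-variance, not production-ready.

```python
# Flajolet-Martin-style probabilistic counting: estimate distinct elements from
# hashed bit patterns instead of storing every element.
import hashlib

def trailing_zeros(n: int) -> int:
    return (n & -n).bit_length() - 1 if n else 32

def estimate_distinct(items) -> float:
    bitmap = 0
    for item in items:
        h = int(hashlib.md5(str(item).encode()).hexdigest(), 16)
        bitmap |= 1 << trailing_zeros(h)      # remember the rarest pattern observed
    r = trailing_zeros(~bitmap)               # position of the first unset bit
    return (2 ** r) / 0.77351                 # classic correction factor

supporters = [f"host{i}.example.com" for i in range(1000)]
print(round(estimate_distinct(supporters)))   # rough, high-variance estimate of ~1000
```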

‍

These techniques collectively aim to detect spam by analyzing the web graph's topology and identifying patterns that are not typical of regular, high-quality pages. For example, link farms, which are groups of interconnected pages created specifically to manipulate search engine rankings, have abnormal link structures that these algorithms can identify. Because many of the terms above are technical and hard to digest, I've broken down each detection technique below (definition, example, and what to do about it), with suggestions on how to improve our link acquisition accordingly:

‍

Truncated PageRank
  • Definition: Ignores direct contributions from the first levels of links to reduce the impact of close-knit spam networks.
  • Example: A page receives most of its PageRank from a tightly connected group of links.
  • Now what: Strengthen link diversity by gaining organic links from a variety of sources and distances.

Estimation of Supporters
  • Definition: Calculates the number of unique domains contributing to the ranking of a host.
  • Example: A website sees a sudden, artificial increase in supporter hosts.
  • Now what: Ensure gradual and natural growth in backlinks from reputable and diverse domains.

Supporter Metrics
  • Definition: Analyzes the rate of change in the number of supporter hosts at various distances.
  • Example: A disproportionate number of supporters at a distance compared to closer levels.
  • Now what: Aim for a balanced link profile that gradually builds authority without sudden spikes.

Bit-Propagation Algorithms
  • Definition: Tracks binary values through the link structure to detect irregular patterns.
  • Example: An unexpected pattern of link formation that deviates from the norm.
  • Now what: Monitor link acquisition patterns and avoid practices that lead to unnatural bit propagation.

Probabilistic Counting
  • Definition: Estimates the number of distinct elements (like unique linking domains) without storing all link information.
  • Example: A low number of unique domains despite a high volume of incoming links.
  • Now what: Focus on acquiring high-quality links from a wide range of unique and authoritative sources.

Statistical Techniques
  • Definition: Analyzes the web graph's statistical properties to identify outliers or anomalous behaviours.
  • Example: A page has an unnaturally high in-degree compared to its neighbourhood.
  • Now what: Regularly audit backlink profiles and disavow links that could be interpreted as spam to maintain a healthy link ecosystem.

‍

Why do we care?

‍

Because it helps us build a practical link acquisition plan that navigates these detection techniques and maintains a website's integrity in the face of potential spam identification. This study, plus my two cents, provides a concise guide for webmasters and SEO professionals to understand spam detection mechanisms and ensure their sites are optimized against these measures.

‍

Google Prioritises Your First Text and Image Links (Source: Zyppy)

‍

Zyppy's article on Google's Selective Link Priority discusses how it impacts SEO. It explains that when a page has multiple links pointing to the same URL with different anchor texts, Google may selectively use only certain anchor texts for ranking purposes. This is a shift from the earlier understanding of 'First Link Priority.' The article includes tests that explore how Google's link priority rules work, revealing that Google might count the first text link and the first image link, but not necessarily all anchor texts on a page. To optimize in light of these findings, be mindful of how Google may interpret multiple links to the same URL on a page.

‍

[Image: Prioritise your first text and image links for better Google ranking. Source: Zyppy]

‍

‍

Conclusion


The basic principles are simple—format your links correctly, structure them around natural text, avoid keyword stuffing, and link to authority sites. As you build your platform, you can collaborate with other websites to increase the flow of value and rise through the SEO rankings.



References


  • https://developers.google.com/search/docs/crawling-indexing/links-crawlable
  • https://www.bing.com/webmasters/help/link-building-7a3f99b7
  • https://www.bing.com/webmasters/help/webmaster-guidelines-30fba23a
  • https://www.searchenginejournal.com/google-link-guidance-takeaways/480173/
  • https://www.searchenginejournal.com/seo-internal-links-best-practices/214886/
  • https://ahrefs.com/blog/link-building
  • https://searchengineland.com/google-publishes-new-link-best-practices-393169
  • https://www.deanlong.io/blog/image-seo-best-practices-for-visibility
  • https://www.searchenginejournal.com/how-to-use-haro-link-building-pr/462126/
  • https://zyppy.com/seo/internal-links/selective-link-priority/
  • Becchetti, L. et al. (2007). Link Analysis for Web Spam Detection. [online] Available at: https://chato.cl/papers/becchetti_2007_link_analysis_web_spam_detection.pdf

‍

Best Practices
Google
SEO
Technical SEO
Industry Update
Hongxin (Dean) Long

Dean Long is a Sydney-based performance marketing and communication professional with expertise in paid search, paid social, affiliate, and digital advertising. He holds a Bachelor's degree in Information Systems and Management and is also a distinguished MBA graduate from Western Sydney University.
