Andrea Ranieri Palma
Since 2016, the number of websites has almost doubled, reaching a total of 1.8 billion (up from slightly less than 1 billion in 2016). As a consequence, the number of irrelevant web pages has grown rapidly, harming users' ability to find accurate, trustworthy information quickly.
For this reason, SEO practices have become increasingly popular in recent years, establishing the acronym as one of the internet's most-used buzzwords, featured in countless blog and social media posts by self-styled internet gurus.
SEO stands for "Search Engine Optimization". The term covers all the best practices and tools meant to improve a website's position in search results, and its overall visibility and exposure, in response to a given "search query".
When a user submits a search query, the result is a SERP (Search Engine Results Page), which lists all the websites relevant to the keywords used in the query.
As things currently stand, Google holds a de facto monopoly in the search engine market, making it the technological standard everybody refers to when tweaking SEO strategies and best practices.
Like all technical aspects of website development and maintenance, SEO practices can seem wrapped in a mysterious mist that elicits awe and fear in those observing. This is usually due to their technical nature, which requires a certain ability to write and read a website's source code, where the foundations of SEO best practices live.
As a consequence of this limited understanding of SEO best practices, the internet is flooded with poorly indexed websites, with no setup to improve their exposure and positioning for the search queries that matter to them. This phenomenon drastically lowers the quality of the traffic those websites receive, as a result of poor positioning in search results pages and poor adherence to the best practices and standards set by Google's community.
Thus, here at Fontoso we decided to write a short guide with 5 easy-to-implement best practices that will immediately improve any website's SEO performance:
Content is king! Every properly indexed website is built on strong textual content.
Indeed, search engines crawl each website's HTML markup and score it according to how well its textual content matches the main keywords used in it. These "web crawling" algorithms are responsible for scoring a given website against a user-generated search query, thus determining the website's final position within a search results page.
Therefore, identifying and choosing the right set of keywords, ones that describe well what a website is about, is instrumental to achieving the commercial and exposure objectives that make a website successful on the internet.
Our main advice to clients is to monitor the best-performing competitor websites for a given search query and analyse their textual content, to get a better grasp of which keywords perform best for that query. This leaves little room for mistakes and noticeably increases the chances of writing relevant content, leading to better performance from the early stages of a website's life on the web.
That said, it is worth remembering that copying textual content without properly citing its source is highly discouraged and may incur penalties from web crawlers when they examine the content featured on a new page, let alone that it constitutes misuse of intellectual property and clear copyright infringement, which can backfire far more than it benefits a company. We therefore strongly discourage this practice to our clients and readers.
In every website's HTML markup there's a <head> section, which contains important information about the site's functional and technical characteristics. Among this information, a set of meta tags should be featured.
Here are the most important:
The title tag signals to web crawlers what kind of content to expect within the webpage. Our advice is to follow the structure shown in the picture: two main keywords followed by the brand or business name. Each webpage should have a unique title, between 50 and 60 characters long.
The description helps the search engine crawler understand the nature of the page's content in more depth. A good description should contain the main keywords, ideally without repeating them, and be between 50 and 160 characters long.
In a world where people increasingly browse from their smartphones, this tag has become crucial for optimising web views on mobile devices. For more information, you can visit this page, where Google's developers offer useful advice on best practices for the most popular viewports.
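Putting the three tags together, a minimal <head> section following the advice above might look like this (the title, description and keywords below are purely illustrative placeholders):

```html
<head>
  <!-- Unique per page: two main keywords, then the brand name (50–60 characters) -->
  <title>Web Design, SEO Consulting | Fontoso</title>

  <!-- Main keywords, no repetition, 50–160 characters -->
  <meta name="description" content="Web design and SEO consulting services to improve your website's visibility in search results.">

  <!-- Standard viewport tag for mobile-friendly rendering -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```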
In line with the evolution and renewed strategic importance of social media, the usual meta tags have been joined by Open Graph tags, a standard protocol used by platforms such as Twitter, Facebook and Whatsapp to populate link previews when sharing.
As with their classic counterparts, we list the most-used Open Graph tags:
##### - og:site_name
The "site_name" property indicates the site name, often reflecting the keyword used in the website's domain.
##### - og:title
The "title" property indicates the unique title of the page in which it is inserted.
##### - og:description
The "description" property briefly describes (in 50 to 160 characters) the content of the page in which it is featured.
##### - og:url
The "url" property indicates the destination URL used when a link preview is shared on a social media platform. This can be very useful for masking overly long URLs.
##### - og:image
The "image" property tells the platform the path of the graphic resource to be used for the sharing preview. Our advice is to use images of at least 1200 × 630 pixels whose file size does not exceed 8 MB.
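As a sketch, an Open Graph block for a services page might look like this (all values, including the image path, are illustrative):

```html
<head>
  <meta property="og:site_name" content="Fontoso">
  <meta property="og:title" content="Web Design, SEO Consulting | Fontoso">
  <meta property="og:description" content="Web design and SEO consulting services to improve your website's visibility in search results.">
  <meta property="og:url" content="https://www.fontoso.it/servizi">
  <!-- At least 1200 × 630 px, under 8 MB -->
  <meta property="og:image" content="https://www.fontoso.it/images/preview.jpg">
</head>
```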
To check that the sharing preview of your link complies with the conditions set out in the meta tags, you can use the very useful debugger tool offered by Facebook.
The HTML markup language provides a wide variety of tags, some of which achieve the same purpose in different ways. Here are five easy rules to follow when writing HTML that lets web crawlers correctly index your webpage:
Each page must have one and only one main header, in <h1>, preferably using the same keywords as the "title" meta tag featured within the same webpage.
Textual content longer than 60 characters must be inserted in <p> tags.
Sections, and sub-sections, within a page must be titled using the <h2> to <h4> tags, depending on their level of subordination within the webpage.
For the opening and closing sections of a webpage, use the <header> and <footer> tags.
Complete your <img> tags with descriptions in the "alt" attribute. This way, search engines can index that content and make it accessible to those who cannot see it, increasing the quality score of the webpage.
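A bare-bones page skeleton respecting the five rules above might look like this (all headings, text and file names are illustrative):

```html
<body>
  <header>
    <!-- One and only one <h1>, echoing the keywords of the "title" meta tag -->
    <h1>Web Design and SEO Consulting</h1>
  </header>

  <h2>Our Services</h2>
  <p>Longer textual content belongs inside paragraph tags so that
     crawlers can classify it correctly.</p>

  <h3>SEO Audits</h3>
  <p>Sub-sections use lower-level headings according to their depth.</p>

  <!-- The alt attribute makes the image indexable and accessible -->
  <img src="team.jpg" alt="The Fontoso team at work">

  <footer>
    <p>© Fontoso</p>
  </footer>
</body>
```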
The URL structure of a correctly indexed site will consistently reflect the structure of its navigation.
Here is a practical example, taken from our website:
As you can see in the screenshot, the services section of our website is being visited, and the URL structure reflects this accordingly.
In this case: https://www.fontoso.it/servizi.
As written in point 1 of this guide, Google is the undisputed leader of the search engine market, and luckily for us it offers a variety of very useful tools to optimise website performance, increasing in return the so-called "Quality Score", a qualitative score assigned by Google's indexing algorithm according to various criteria, some of which are not publicly specified.
Below, we list 5 tools that every SEO practitioner should have within their technical background:
This tool offered by Google is very useful for conducting targeted research into the keywords to include in your website's content. Just enter a few seed keywords, and the algorithm will suggest new related ones, along with an estimate of their search volume.
It is also possible to geo-localise these suggestions, so that the suggested keywords stay coherent with your target market and geographical area.
This platform is free and available to every webmaster for analysing the organic searches related to the keywords used on their websites. Especially noteworthy is the section dedicated to submitting your sitemap.xml, an indispensable tool for suggesting, and forcing requests for, the pages you want Google's crawlers to analyse.
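For reference, a sitemap.xml follows the standard Sitemaps XML protocol; a minimal sketch for a site like ours might look like this (URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.fontoso.it/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.fontoso.it/servizi</loc>
  </url>
</urlset>
```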
On this platform, it is possible to monitor the main vital indicators of your website, such as number of visits, bounce rate, number and type of events (to be set), conversions etc.
Within the Administrator Menu, you can copy the global tag linked to your Google account, to be inserted into your HTML’s head section, as indicated here.
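At the time of writing, the global tag you copy from your account looks roughly like this (G-XXXXXXXXXX stands in for your own measurement ID; check the snippet Google shows you, as the exact format may vary):

```html
<!-- Google global tag (gtag.js), pasted inside <head> -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```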
Furthermore, this platform can also be used to verify ownership of your site's domain, which is required when accessing the Google Search Console.
You have surely searched for the name of a business on Google, and seen this box appear at the top of the search page (on mobile devices) or beside the results (on desktop devices).
That box contains the information from the Google My Business profile tied to the business you searched for.
Here are some of its main features:
Management of videos and photos relating to your business, including those posted by visiting users.
It allows you to publish information related to opening hours, telephone numbers and directions to reach your business, allowing the geo-location of your business on Google Maps.
It allows businesses to publish posts, which appear in the Google feed, about promotions, events and/or other initiatives of a commercial and/or informative nature.
It allows webmasters to monitor reviews posted by customers, and possibly respond to them (a practice we strongly recommend to improve your brand’s image and tone of voice).
It allows webmasters to identify the best-performing keywords in the search queries of customers who have reached your website. This is very useful in the embryonic phase of your business, to discover long-tail keywords that are usually overlooked at the initial stage but still profitable and relevant when customers submit low-intent search queries.
This is certainly the most technical tool among those listed, and certainly among the most valuable for the information we can draw from it.
By entering your website's link in the search bar, you will get a detailed report on your website's performance according to a list of indicators set by Google itself.
Much of this information is purely technical, and therefore not immediately actionable at a glance. However, we recommend everyone use it to get a holistic view of your webpages' main health indicators, and, why not, maybe start a restyling process to improve your website's performance on the indicators in that report.
Although these 5 tips do not exhaust every topic related to your website's SEO performance, they are certainly an excellent starting point for a process that will have to last over time, but that will surely bring concrete advantages and benefits to your business.
For any clarification or further information regarding the content of this article, our team is happy to answer any inquiry.
If you like, contact us using the appropriate form in the Contact section.