How to make Google like and index my website.

A well-built website can generate significant traffic from Internet search engines such as Live Search, Google or Yahoo! This traffic potential is determined, on the one hand, by the ranking that the site's different contents achieve for the related searches users may make.

And, on the other hand, by the total number of contents or internal pages the website contains. It seems clear that the more content there is, the greater the traffic potential should be, and therefore the more pages should end up indexed...

...Or might the site not be indexed correctly?

Search engines and web directories are the main source of traffic for a portal, a corporate website, a blog, an online community...

A website that ranks well for popular search terms and has a large amount of content can receive far more search-engine traffic than a site with little content, focused on topics of limited interest, or poorly positioned for those searches. This point is vital: the more interest a site's pages and contents attract, the more traffic they generate; pages with very niche content appeal to a smaller audience and therefore receive fewer visits.

In other articles on search engine optimization we have explained that a website's ranking depends essentially on two aspects: on-page relevance, the relevance of the page's own content - essentially its text, title and meta tags - and off-page relevance, the relevance conveyed by links from other websites, characterized by the quantity and quality of those links.

However, before a search engine can even calculate the relevance of a web page, there is a prerequisite the site must fulfil: the page has to be indexable.

The indexability of a website is how easily it can be found by search engines, how completely all of its content can be crawled, and how accurately the searches for which it should appear as a result can be identified.

From that point on, the page's greater or lesser relevance, calculated from multiple parameters, will determine the final position it occupies in the results the search engine shows users.

There are many changes we can make to our website to make it more attractive to search engines:

1. See your website as a search engine robot does.

One of the first steps to understanding what a search engine likes and does not like is to look at your website the way the robot that has to index it does.

There are various spider or robot simulators you can use: online tools that show you which parts of a page can be crawled and used to calculate relevance.

In them we will see how images, animations, multimedia content, Flash and so on disappear, and only the text and links remain. Search engines rely fundamentally on the text content of a page to calculate relevance, so the most important part of a page is its content, mainly text.

For this purpose, the "cache" view you can consult in the results of some search engines is also very useful: it is the copy of your page that they have stored on their servers.

From that cached copy you can even isolate the text-only view of your page, arriving at something very similar to what the spider simulators of the previous paragraph produce.

If, in the cached version of your page or after using a spider simulator, you find that there is no visible text, you have a serious problem: the page is not going to be indexed correctly.

Most likely your website is built in Flash, or all the text is embedded in an image that combines the design and the copy. In both cases the solution involves changing the original programming of your website or creating an alternative HTML version that does contain text the search engine can read.
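
As a rough illustration (the file names and texts below are invented), this fragment shows the kind of content a robot can read and the kind it cannot:

    <!-- Readable by the robot: plain text and a normal link -->
    <h1>Handmade oak tables</h1>
    <p>Descriptive text about the product range, materials and prices.</p>
    <a href="/catalogue/">See the full catalogue</a>

    <!-- Invisible to the robot: the slogan drawn inside the image and the
         text rendered by the Flash movie are not indexable text -->
    <img src="header-with-slogan.jpg" alt="">
    <object data="intro.swf" type="application/x-shockwave-flash"></object>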

2. Each page a unique URL, a unique address.

Unique URLs for each page or, in other words, each page must have its own unique address (URL) so that it can be easily found.

On search engine results pages, each result is identified by a value that must be unique: the URL of the page. It is the same string of text, numbers and symbols that, typed into the browser's address bar, takes you to that page.

That address works like the page's ID: it identifies that content and no other page can have exactly the same one. Two different pages cannot be reached through the same address in the browser.

If you browse your website and notice that the URL in your browser never changes, you have a problem.

Your website may have a great deal of content, but search engines will not be able to file each page under its own address. You can check whether this is your case by asking the search engines which pages of your website they know about: type the command "site:www.yourdomain.com" into the search field. When you press Enter, the search engine returns a list of the indexed pages of your website; these are the pages that can appear in search results, together with the description users will see, which is why this is so important. If the URL did not change while you browsed your site, this list is likely to be very short. The cause is usually that your website is built with Flash, with AJAX or with frames. In any of these three cases you must change the programming of your site so that each different page is identified by a different, unique URL. Only then will the different pages of your website have a chance of appearing in the results of different searches.
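
As a purely hypothetical illustration, compare a site where every content has its own address with one where everything hangs from a single URL:

    One URL per page: each content can be indexed separately
      http://www.yourdomain.com/products/chairs/
      http://www.yourdomain.com/products/tables/
      http://www.yourdomain.com/contact/

    One URL for all the content (typical of Flash, AJAX or frame sites)
      http://www.yourdomain.com/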

3. Crawlable links: links that help search engines find the content of our website.

Links are very important for search engine robots, which follow them, like a spider moving along its web, to find new content, new pages and even new websites. Robots use them to reach new pages, so their role within a website is vital.

Any of the tools mentioned in point 1 will help us see the crawlable links, the ones robots will follow to reach more and more content and pages. In the cached version they appear as underlined blue text, while in a spider simulator they occupy a specific section of the analysis.

If the "site:www.mydomain.com" search from point 2 lists only a few pages of your site, it may also be because the links on your pages are not crawlable; run a robot simulator on your page to check.

If necessary, replace drop-down menus programmed in JavaScript or Flash with normal HTML links, or duplicate the most important links in a line of links in the footer. This way all your pages can be crawled: robots can jump from one page of your website to another and index all of them, and the site remains just as navigable for users.
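
A minimal sketch of such a footer link line, with invented section names, could look like this:

    <!-- Plain HTML links in the footer, duplicating the JavaScript/Flash menu -->
    <div id="footer-links">
      <a href="/">Home</a> |
      <a href="/products/">Products</a> |
      <a href="/services/">Services</a> |
      <a href="/contact/">Contact</a>
    </div>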

4. Beware of pop-up windows

It is still very common in online stores: we browse through the sections, reach a product family, click on a product sheet and, voilà, it opens in a new, smaller window with no navigation controls: a pop-up window.

Product sheets are among the most valuable information on any online store. By opening them in a new window this way we run the risk that the pop-up will be intercepted by the blockers built into many browsers; Firefox and Safari, for example, block these windows automatically, so the information is never shown to the user.

More importantly, we also prevent robots from reaching and indexing these pages, which contain highly valuable information and often summarize the essence of the site's content, because the links that open them are normally not crawlable.

They are links programmed in JavaScript, which can pose problems for search engines. If this is your case, the solution is to integrate the product sheets into the general design of your website so that they are simply one more page, opened without a new window, or to show the product details directly on the main product page.
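
As a hypothetical sketch (invented file names), the first link below opens the product sheet in a pop-up that robots will usually not follow, while the second makes it an ordinary, crawlable page:

    <!-- Pop-up: may be blocked by the browser and is rarely followed by robots -->
    <a href="#" onclick="window.open('sheet123.html','','width=400,height=300'); return false;">View product</a>

    <!-- Crawlable alternative: the product sheet is just another page -->
    <a href="/products/sheet123.html">View product</a>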

5. Fear of the depths: how to organize information on a website.

Search engine robots normally consider the home page to be the most important page of a site, and assume that importance decreases as the distance in clicks from it increases, that is, with the depth at which a page sits.

The indexing process therefore starts with the pages on the first levels, and it is harder for robots to reach pages that have few incoming links, sit at the deepest levels of navigation or hide in the most remote corners of the site.

It is therefore important to design information architectures with few levels of depth, growing horizontally rather than vertically, and to establish alternative navigation routes that bring internal pages within a few clicks of the home page. We can do this with sections of related links, most-searched items, featured products, and so on.

6. It impresses users and leaves search engines indifferent: Flash, Silverlight...

Despite how long it has been around on the web, Flash technology still presents multiple problems for search engines, especially indexing problems.

Most content built with these so-called Rich Media technologies is difficult for search engines to index and, depending on how the site is programmed, it can mean that none of our content is crawled, preventing the website from being indexed correctly.

For the moment there is no alternative but to build an HTML version that contains enough indexable content, offers users links to the Rich Media content, and meets the requirements search engines set for good indexing.
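
One common way to do this (a sketch with invented file names and texts) is to embed the Flash movie with an object tag and place the alternative HTML content inside it, where crawlers and users without the plug-in will read it:

    <object type="application/x-shockwave-flash" data="presentation.swf" width="800" height="600">
      <param name="movie" value="presentation.swf">
      <!-- Fallback shown to crawlers and to users without Flash -->
      <h1>Company name: products and services</h1>
      <p>Indexable descriptive text about what the company offers.</p>
      <a href="/products.html">Products</a> <a href="/contact.html">Contact</a>
    </object>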

7. Avoid frames.

Back when bandwidth was a scarce commodity, the use of frames was fully justified. Pages were divided into fixed elements, such as the navigation, header and footer, and a dynamic content area. The different sections were programmed as frames so that, once the site had loaded, the browser only had to reload the part that changed: the page occupying the content frame. These pages are easy to identify because they contain vertical or horizontal scroll bars that do not span the entire browser window.

A website built with frames (framesets or iframes) has many indexability problems: search engines often cannot crawl the content inside the frame, and the URL on these sites generally never changes. Even when the framed pages are indexed, there is a risk that a user who clicks on one of them in the results lands on an "orphan" page, loaded in the browser outside its frame structure and therefore without navigation, header, footer and so on.

The widespread adoption of broadband makes frames unjustifiable in most cases. Given the indexability problems they pose, it is advisable to transform a frame structure into individual pages that integrate all the elements.
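
For reference, a frame-based page (sketched below with invented file names) is easy to recognise in the code: the address shown in the browser is always index.html, while the real content lives in content.html, which, if indexed, appears as an orphan page without menu, header or footer:

    <!-- index.html: the address shown in the browser never changes -->
    <frameset cols="200,*">
      <frame src="menu.html" name="menu">
      <frame src="content.html" name="content">
      <noframes>Alternative text for browsers and robots that do not support frames.</noframes>
    </frameset>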

8. Use of internal search engines

Many large portals, such as news or real estate sites, contain far more content than can be reached from the menus on the home page or the section headers. These portals use internal search engines so that users can filter the contents and reach the pages that interest them. Search engine robots, however, cannot fill in search forms with different criteria to reach those contents.

As a result, a very significant part of this content is never indexed. The solution is to create groups of content that can be navigated, through links, under different criteria. Sometimes this navigation structure will resemble a directory, as on a real estate portal, or a calendar, as on a news site. In any case, the strategy for guaranteeing indexability is to create alternative navigation routes made of links that search engines can crawl.
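
For example, a real estate portal (all names below are invented) could complement its search form with a crawlable directory of links like this:

    <!-- A browsable directory duplicating what the search form can reach -->
    <h2>Flats for sale by city</h2>
    <ul>
      <li><a href="/flats/madrid/">Flats for sale in Madrid</a></li>
      <li><a href="/flats/barcelona/">Flats for sale in Barcelona</a></li>
      <li><a href="/flats/valencia/">Flats for sale in Valencia</a></li>
    </ul>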

9. Pages that are too heavy for browsers to download.

In the early years of search engines it was recommended that pages not be too heavy, that is, that their file size not be excessive, to ensure that the engines would index the full content of the page. Today this recommendation matters less, since search engines have evolved past that kind of limitation.

However, it is still a good rule to keep the file size as small as possible, free of junk code and as close to the W3C standards as you can. This ensures that search engines crawl the page correctly and brings several beneficial side effects. First, a very long page is likely to be diffuse in its content, talking about several different things, and such pages rank worse than pages clearly focused on one topic. Second, reducing the file size makes navigating the website faster for users, which results in a more positive experience.

10. The internal order of the website: domains, subdomains and subdirectories

Your company is global, serving many markets in many languages. How should you structure your website from the point of view of indexability? Here are some general recommendations:

Search engines favour websites from the searcher's own country, so if you operate in several countries it may be worth acquiring the domain with the country extension of each market you serve: mydomain.com, mydomain.co.uk, mydomain.fr, etc.

If you do not target different countries but do have content in different languages, it may be appropriate to group it into subdomains, such as english.yourdomain.com, francais.yourdomain.com, etc.

If your only concern is structuring the sections of your website well, the obvious choice is subdirectories: www.yourdomain.com/section1, www.yourdomain.com/section2, etc.

11. Redirects

Occasionally you may have acquired domains in other countries just to keep them away from unfair competitors, or with a view to future expansion. What is the best way to send the traffic those domains may generate to your primary domain? Pointing them all at the same IP as the primary domain? From the user's point of view there may be no difference, but from the perspective of search engines it is better to set up a permanent 301 redirect from each of these domains to the main one. This permanent redirect tells search engines, in a language they understand, that these domains currently have no content of their own and that the main domain is the one the visit is redirected to.
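
If the secondary domain is hosted on an Apache server with mod_rewrite enabled, for example, a minimal .htaccess like the following (domain name invented) would do it; other servers offer equivalent directives:

    # .htaccess on the secondary domain: permanent (301) redirect to the main domain
    RewriteEngine On
    RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]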

There are multiple HTTP header analysers on the web with which you can check how your domains respond. Your primary domain should answer with a 200 OK status, while your redirected domains should answer with a 301.

12. How to make an effective 404 page

On a dynamic website with frequently updated content it is common for a link, sooner or later, to end up pointing at a page that no longer exists. Even if your website has some kind of check to detect broken links, it is always possible that a link on another website, or in a search engine, points to a page you once decided you no longer needed. In these cases servers usually return a generic error message with the 404 code, which indicates that the page does not exist.

This generic message can be customized so that the server returns a page formatted with the corporate design which explains that the requested content no longer exists. There are good reasons, both for users and for search engines, to add to this error message a small directory of links pointing to the main content groups of the site. Your users will read it as: "Okay, the page you were looking for no longer exists, but here is what we can offer you so you can stay with us and continue your visit." And the search engine robots will have new "stepping stones" from which to keep jumping to new content to index on your website. In both cases your website wins.
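
On an Apache server, for instance, a custom error page can be wired in with a single directive (the file name is invented); the 404.html page itself should carry the corporate design and the small directory of links described above:

    # .htaccess: serve /404.html (still with the 404 status) for missing pages
    ErrorDocument 404 /404.html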

13. The website map

Although we usually read books sequentially, from beginning to end, the index clearly plays a fundamental role in finding specific content again later. The index is, on the one hand, an outline that summarizes all the contents of the book clearly and, on the other, a way to jump to a specific passage via its page number. Similarly, the site map shows, on a single page, the complete outline of the website we are on and lets us, through its links, jump straight to specific content without using the navigation menus. The site map is therefore very useful for users.

But it is also very interesting from the point of view of indexability. Navigation menus usually allow only a few options, typically fewer than ten, in the main menu; from those few options we reach further content through submenus, drop-down menus and so on. This increases the distance in clicks between certain contents and the home page, which, as we have seen, makes indexing harder. The site map displays, on a single page, a much larger number of links that are just one click away from the home page. This lets the home page's popularity flow better towards the internal pages and makes it much easier for search engine robots to move through the website.
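
A minimal sketch of such a site map page, with invented sections, is simply a well-organized page of plain links:

    <!-- sitemap.html: every important page one click away from here -->
    <h1>Site map</h1>
    <h2>Products</h2>
    <ul>
      <li><a href="/products/chairs/">Chairs</a></li>
      <li><a href="/products/tables/">Tables</a></li>
    </ul>
    <h2>Company</h2>
    <ul>
      <li><a href="/about/">About us</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>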

14. The robots.txt file

Everything said so far is about making sure search engines can index all the content of our website. But what if we want exactly the opposite, that certain content not be indexed? There is a special file called robots.txt in which we can easily specify which areas, subdirectories or files of our website should not be crawled by search engines.

It is important to configure this file correctly, especially in content management systems (CMS) that generate it automatically, since areas that should be crawled can accidentally end up marked as off-limits.
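
A minimal robots.txt, with invented directory names, looks like this: everything is crawlable except the listed areas:

    # http://www.mydomain.com/robots.txt
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Disallow: /internal-drafts/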

15. How to make a sitemap file

Finally, there is one more special file worth mentioning: the sitemap file, usually an XML file invisible to users that search engines consult to discover all the pages of your website you want them to index. There are many tools on the Internet for generating this file easily. Once it is generated and uploaded to the server, you can register it with the search engines through the webmaster tools of Yahoo! or Google or, in the case of Live Search, by adding a simple line such as "Sitemap: http://www.mydomain.com/misitemap.xml" to the robots.txt file.
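
A minimal sitemap file, sketched here with invented URLs, follows the standard sitemaps.org XML format; the optional Sitemap line for robots.txt is shown after it:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.mydomain.com/</loc>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.mydomain.com/section1/</loc>
      </url>
    </urlset>

    # Line to add to robots.txt so engines can discover the sitemap
    Sitemap: http://www.mydomain.com/misitemap.xml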

In large portals, the use of the sitemap file may be the most effective strategy to achieve high levels of indexing.

With everything in sight

The objective of indexability is to ensure that a website takes full advantage of its traffic generation potential.

To do this, we must ensure that each and every one of its contents has had the chance to be indexed by search engines. That means all the text has been crawled, the searches for which it should appear have been correctly identified and, as far as possible, its relevance is greater than that of the equivalent content on other websites it will compete with on the results pages.

Think of each page of your website as a hook waiting in the sea of the search engines: if you have only one indexed page, you have only one hook. If you have a few pages in the indexes, it is as if you had several hooks waiting for a catch. If all the pages of your website are indexed, your website becomes a trawling net, making the most of its traffic generation potential. Apply these fifteen points and your website will surely look much more like that net full of potential customers.
