This article is about search engine optimization (SEO). It explains how to get a new blog post indexed quickly by Google. How long a new post takes to be indexed by Google depends on several factors:
- Blog popularity. Popularity is determined by factors such as backlinks and social media interaction. Social media interaction brings in more traffic through Facebook, Twitter, and so on, so you should provide a social bookmarking tool that makes it easy for readers to share your articles on those networks.
- Crawlable content. Create content that search engine robots can crawl according to Google's standards, i.e. content that bots/spiders can read and understand. Avoid posting content only as images, Flash, or JavaScript, since bots prefer to read text. Do not forget to equip your blog template with meta tags.
- A good website structure (interlink structure). An easy way to interlink SEO landing pages is to add a "Related Posts" list at the bottom of the page or in the sidebar, so that all interlink targets are structured. The main purpose of the link structure is to help visitors find information on your website. By making the anchor text in that structure your focus keyword, you help search engines determine the relevance of the information, and you help users by using language that matches how they search.
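As an illustration of keyword-focused anchor text, here is a minimal Python sketch (the post titles and URLs are made up) that renders a "Related Posts" list in which each link's anchor text is that post's focus keyword:

```python
def related_posts_html(posts):
    """Render a 'Related Posts' list whose anchor text is each post's focus keyword.

    posts: list of (focus_keyword, url) pairs.
    """
    items = "\n".join(
        f'  <li><a href="{url}">{keyword}</a></li>' for keyword, url in posts
    )
    return f'<ul class="related-posts">\n{items}\n</ul>'


# Hypothetical posts, for illustration only.
html = related_posts_html([
    ("SEO friendly blog template", "/2024/01/seo-friendly-template.html"),
    ("submit sitemap to Google", "/2024/02/submit-sitemap.html"),
])
print(html)
```

Because the anchor text is the keyword itself, both users and crawlers see consistent wording for each target page.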
The following steps help newly posted articles get indexed quickly by Google's search engine robots:
- SEO-friendly blog template. Use a Blogger template that is SEO friendly; many websites provide templates, and many are free.
- On-page SEO optimization. To check your blog's SEO quality, use an online SEO tool such as https://chkme.com/page-seo-tools.
- Avoid duplicate content. If you use the Blog Archive widget, pay attention to your robots.txt settings: search engines treat the Blog Archive as duplicate content, which can lower the perceived quality of your blog or website. Change your robots.txt settings as shown below.
- Use widgets for internal links. Widgets such as Recent Posts (latest articles), Related Articles, and Popular Posts work well here, since they improve the quality of the interlink structure.
- Set search preferences. Go to Settings → Search preferences → Custom robots.txt, enable it, click Edit, and add the following code in the box provided in the Blogger settings:
```
User-agent: *
Disallow: /search
Disallow: /undefined
Disallow: /*archive
Disallow: /p/
```
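You can sanity-check these rules locally with Python's standard `urllib.robotparser` before saving them. The example URLs are placeholders; note that the stdlib parser does not expand the `*` wildcard in `/*archive`, so that rule is best verified against your live blog instead:

```python
from urllib.robotparser import RobotFileParser

RULES = """\
User-agent: *
Disallow: /search
Disallow: /undefined
Disallow: /*archive
Disallow: /p/
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# Ordinary posts stay crawlable; label searches and static pages are blocked.
print(rp.can_fetch("*", "https://example.com/2024/01/my-post.html"))  # True
print(rp.can_fetch("*", "https://example.com/search/label/seo"))      # False
print(rp.can_fetch("*", "https://example.com/p/about.html"))          # False
```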
- Register your blog/website with webmaster tools. Available services include Google Webmaster, Bing Webmaster, Alexa Webmaster, and Yandex. Verifying your blog/website with these webmaster tools helps it get indexed by search engines more quickly.
- Fetch your blog/website or article URL with Fetch as Google in Google Webmaster Tools. To fetch the blog itself, leave the URL box blank; to fetch a new article, enter the article's URL in the box provided. Then click FETCH or FETCH AND RENDER, and finally click Submit to index.
- Submit your blog's sitemap in Google Webmaster. The Google Webmaster dashboard menu is available in Blogspot: click Overview and look at the bottom right for the Open Webmaster Tool link, or go to Google Webmaster directly.
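Blogger generates the sitemap automatically at /sitemap.xml. As a sketch (using a made-up two-entry sitemap), this Python snippet shows how to parse the standard sitemaps.org format and list the URLs your sitemap submits:

```python
import xml.etree.ElementTree as ET

# A made-up sitemap in the standard sitemaps.org format.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/2024/01/post-one.html</loc></url>
  <url><loc>https://example.com/2024/02/post-two.html</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs declared in a sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

print(sitemap_urls(SITEMAP_XML))
```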
- Ping your blog. Ping your blog or website with an online ping tool, for example googleping.
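Most ping services accept the standard weblogUpdates.ping XML-RPC call. This sketch builds that request with Python's standard library; the endpoint URL is a placeholder, so substitute the URL of whichever ping service you use:

```python
import xmlrpc.client

# Placeholder endpoint: replace with the URL of your chosen ping service.
PING_ENDPOINT = "http://rpc.example.com/ping"

def ping_payload(blog_name, blog_url):
    """Build the XML body of a standard weblogUpdates.ping request."""
    return xmlrpc.client.dumps((blog_name, blog_url),
                               methodname="weblogUpdates.ping")

body = ping_payload("My Blog", "https://example.com/")
print(body)

# To actually send the ping over the network:
#   proxy = xmlrpc.client.ServerProxy(PING_ENDPOINT)
#   proxy.weblogUpdates.ping("My Blog", "https://example.com/")
```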
- Check server responses. The server response code should be 200 / OK. To check your blog's or website's server response, use the tools provided by CEOSentro.
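You can also check the response code yourself with a small script using only Python's standard library (the function name here is my own):

```python
import urllib.request
from urllib.error import HTTPError, URLError

def check_server_response(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code  # non-2xx responses still carry a status code
    except URLError:
        return None    # DNS failure, refused connection, timeout, etc.
```

A healthy blog should return 200; a 301/302 means a redirect, and 404 or 5xx codes need fixing.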
The complete custom robots.txt, including the sitemap line, looks like this (replace YOUR WEBSITE URL with the URL of your own blog):

```
User-agent: *
Disallow: /search
Disallow: /undefined
Disallow: /*archive
Disallow: /p/
Allow: /
Sitemap: https://YOUR WEBSITE URL/sitemap.xml
```