Saturday, January 19, 2013

Strong Google SEO Algorithm Update On 17th January 2013

The rankings for most keywords have dropped out of the top 10 positions following a Google update on 17th January 2013.

Google's official blog has not yet confirmed it, but most of the popular SEO blogs have reported a Google algorithm update that has affected both rankings and traffic for many keywords.

I am quoting a few popular blogs below:

Monday I reported some weaker signs of a Google update; things settled down a bit, but activity is now spiking higher this morning.

The WebmasterWorld thread sparked up again with a lot more feedback and chatter from webmasters about Google changes.

Source : Technology Updates

Wednesday, December 7, 2011

What is Code to Text Ratio?

Code to Text Ratio

The Code to Text Ratio is the percentage of actual text in a web page. A content ratio tool extracts the paragraph text and anchor text from the HTML code and calculates the Code to Text Ratio from that information.

Why Is the Code to Text Ratio Important for SEO?

The code to text ratio is used by search engines to help assess the relevancy of a web page. A higher code to text ratio is considered good for ranking purposes.

An example of calculating the code to text ratio of a web page is given below:

Web Page Size: 17,481 bytes (≈ 17 KB)

Code Size: 13,692 bytes (≈ 13 KB)

Text Size: 3,789 bytes (≈ 4 KB)

Code to Text Ratio = Text Size / Web Page Size × 100

Code to Text Ratio = 3789 / 17481 × 100 ≈ 21.67%
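The calculation above can be sketched in Python. This is a simplified illustration, not the exact logic of any particular content ratio tool: it strips script/style blocks and HTML tags with regular expressions, then compares the remaining visible text size to the full page size.

```python
import re

def code_to_text_ratio(html: str) -> float:
    """Return the percentage of visible text bytes in an HTML page."""
    # Remove script and style blocks, which are code, not readable text.
    no_scripts = re.sub(r"(?is)<(script|style).*?</\1>", "", html)
    # Strip all remaining tags and collapse whitespace.
    text = re.sub(r"(?s)<[^>]+>", "", no_scripts)
    text = re.sub(r"\s+", " ", text).strip()
    page_size = len(html.encode("utf-8"))
    text_size = len(text.encode("utf-8"))
    return text_size / page_size * 100

# A tiny invented page for demonstration:
page = "<html><head><title>Demo</title></head><body><p>Hello, world!</p></body></html>"
print(round(code_to_text_ratio(page), 2))  # roughly 22% for this tiny page
```

Plugging in the sizes from the worked example (3,789 text bytes out of 17,481 total) gives the same ≈ 21.67% figure.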

Wednesday, June 8, 2011

New Google Product 2011

Google, the most popular search engine worldwide, launched several new products and services in 2011. Our SEO expert explains some of the new Google products and services of 2011 that everyone is talking about:

  • Gmail Call Phone: Now Gmail users can easily make free calls to landlines and cell phones from within the Gmail interface.

  • Gmail Priority Inbox: Google made its Gmail service smarter with Priority Inbox. This tool is very useful, as it surfaces a user’s most important messages, helping them sort mail and pare down inbox clutter.

  • Google TV: Google, being one of the top search engines, can easily search the web for anything. Furthermore, you can now also surf TV channels, peep into TV channels worldwide, and watch what the world is watching.

  • Google News Revamp: Google News has been given a significant overhaul in an effort to provide users with more personalization options and greater relevancy. With this new feature, users can custom-tailor the news feeds they are most interested in, whether business, sci-tech, world news, or any other subject. This means users will be able to find news stories matching their interests, and those stories will be surfaced on the basis of personal interest rather than simply by their day of publication.

  • Chrome OS: Google Chrome OS is an open source, lightweight operating system that will initially be targeted at netbooks. Speed, simplicity and security are the key aspects of Google Chrome OS. Chrome OS is designed to be fast and lightweight, to start up and get onto the web in a few seconds. Google Chrome OS is created for people who spend most of their time on the web, and is being designed to power computers ranging from small netbooks to full-size desktop systems.

  • Nexus One: The Nexus One, built by Taiwan's HTC Corporation, is a smartphone from Google. It runs the open-source Android mobile operating system. Its features include voice-to-text transcription, an additional microphone for dynamic noise suppression, and voice-guided turn-by-turn navigation for drivers.

  • Google Social Search: Google Social Search helps you discover relevant content from your social connections, a set of your online friends and contacts. Content from your friends and acquaintances is sometimes more relevant and meaningful to you than content from any random person. Your Social Search experience is personal, and the highlighted content you see is unique to you and your social connections. The content you may see includes: websites, blogs, and other content shared or created by your friends; images shared by your social connections; relevant articles from your Google Reader subscriptions; profiles of people you know beneath results for social sites; and web content recommended or shared by others.

  • Google Buzz: Google Buzz is a new way to start conversations about the things you find interesting. It's built right into Gmail, so you don't have to build an entirely new set of friends from scratch — it just works. Buzz brings your existing network to the surface by automatically setting you up to follow the people you email and chat with the most. Buzz integrates tightly with your Gmail inbox, so you're sure to see the stuff that matters most as it happens, in real time.

Friday, March 18, 2011

New Google Update March 2011

New: Google Lets You Block Any Site From Search Results

Google announced you can now hide or block certain sites from showing up in the Google search results.

When you search on Google, click a result, and then return to the search results, a new link reading “Block all results” appears near the “Cache” link. Clicking it blocks that site from showing up in your Google results.

If you are not logged in, Google will immediately block the result and confirm they did so. But if you want the site to remain blocked on future searches, you have to login and confirm the block request.

At the bottom of the search results it will show you that there are blocked sites. It will let you show the blocked results or manage your blocked sites.

Here is the confirmation page (screenshot not shown):

Here is the manage blocked sites page (screenshot not shown):

You can access this page under your “Search Settings.”

Google said:

We’re adding this feature because we believe giving you control over the results you find will provide an even more personalized and enjoyable experience on Google. In addition, while we’re not currently using the domains people block as a signal in ranking, we’ll look at the data and see whether it would be useful as we continue to evaluate and improve our search results in the future. The new feature is rolling out today and tomorrow in English for people using Chrome 9+, IE8+ and Firefox 3.5+, and we’ll be expanding to new regions, languages and browsers soon. We hope you find it useful, and we’ll be listening closely to your suggestions.

Thursday, February 17, 2011

SEO Technical / Tactics Interview Questions

Every SEO prefers certain tactics over others, but familiarity with many could indicate a deeper understanding of the industry. And while every SEO doesn't need to have a web developer background, having such skills can help set someone apart from the crowd.

Technical / Tactics

1. Give me a description of your general SEO experience.
2. Can you write HTML code by hand?
3. Could you briefly explain the PageRank algorithm?
4. Have you created any SEO tools, either from scratch or pieced together from others?
5. What do you think of PageRank?
6. What do you think of using XML sitemaps?
7. What are your thoughts on the direction of Web 2.0 technologies with regards to SEO?
8. What SEO tools do you regularly use?
9. Under what circumstances would you look to exclude pages from search engines using robots.txt vs meta robots tag?
10. What areas do you think are currently the most important in organically ranking a site?
11. Do you have experience in copywriting and can you provide some writing samples?
12. Have you ever had something you've written reach the front-page of Digg? Sphinn? Or be Stumbled?
13. Explain to me what META tags matter in today's world.
14. Explain various steps that you would take to optimize a website?
15. If the company whose site you've been working for has decided to move all of its content to a new domain, what steps would you take?
16. Rating from 1 to 10, tell me the most important "on-page" elements.
17. Review the code of past clients/company websites where SEO was performed.
18. What do you think about link buying?
19. What is Latent Semantic Indexing (LSI)?
20. What is Phrase Based Indexing and Retrieval and what roles does it play?
21. What is the difference between SEO and SEM?
22. What kind of strategies do you normally implement for back links?
23. What role does social media play in an SEO strategy?
24. What things wouldn't you do to increase rankings because the risk of penalty is too high?
25. What's the difference between PageRank and Toolbar PageRank?
26. Why might you want to use nofollow on an internal link?
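The distinction behind question 9 can be sketched with Python's standard library. robots.txt blocks crawling of a URL, while a meta robots "noindex" tag lets the page be crawled but asks that it not be indexed. The domain, paths, and robots.txt content below are invented for illustration:

```python
from urllib import robotparser

# An assumed robots.txt for a hypothetical site: crawlers may not
# fetch anything under /private/.
robots_txt = """
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked from crawling entirely (the URL could still appear in
# results if other sites link to it, since the page is never fetched):
print(rp.can_fetch("*", "https://example.com/private/page.html"))

# Crawlable; to keep a crawlable page OUT of the index, the page
# itself would carry: <meta name="robots" content="noindex">
print(rp.can_fetch("*", "https://example.com/public/page.html"))
```

In short: use robots.txt when you want to save crawl budget or keep bots away from a section entirely, and the meta robots tag when the page must be fetched (for example, to follow its links) but should not be indexed.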

I hope this helps and that you all like it. Soon I will follow up with SEO Analysis Interview Questions. Keep reading!

Best Link Builder, Best SEO, Top Link Builder

Tuesday, February 15, 2011

Google update cracks down on duplicate content

Those within the SEO industry have known for some time that duplicate content is a big no-no with the search engines, especially Google. However, despite the lectures and best practices slamming the use of duplicate content, scraper sites and plagiarised web copy, sites that steal content still managed to slip through the cracks in Google’s algorithm and rank within its pages.

This caused much ire at Google, and among the many website owners and SEO experts who played by Google’s rules. Google (as ever) has been busy working behind the scenes to stamp out this threat to the purity of its SERPs. In its last big algorithm update in January, Google moved to deal with duplicate content and content scraper sites, working to keep its index free of spam and thievery.

In Google’s January update, the search engine has put in place more measures to drive spam content sites from its pages, rewarding those sites with unique, quality content.

Matt Cutts, the head of Google’s webspam team, commented on the update:

“[I mentioned] that ‘we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.”

“That change was approved at our weekly quality launch meeting last Thursday and launched earlier this week.”

“This was a pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice.”

“The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site’s content.”

Google is talking tough, but what exactly will be affected by this change, and how can you avoid being adversely affected by the ‘Google Content Update’?

Firstly, if your website utilises content from other sites, meaning that you aggregate content in some way (such as via an RSS feed), then your site can be seen as a scraper site or content farm. None of the content is your own, and it is therefore of no use to Google in terms of offering it to its users.

Secondly, if your website uses content that is very low in quality and very high in quantity, in a bid to increase the size of your website and the number of keywords it ranks for, you could also be in trouble under this new update. Some websites use content this way, sourcing it from offshore companies using non-native speakers because of the lower cost, regardless of the quality.

The way to protect yourself and your website against the Content Update (snappy, eh?) is to ensure that any content added to your website is high quality, relevant to your site and your industry, and unique.

For Google to trust a website enough to rank it highly in the SERPs it needs to have unique, relevant content, and to feature strong backlinks. All of this can be achieved through ethical SEO.


Thursday, February 3, 2011


Google accused of criminal intent over StreetView data

Google is "almost certain" to face prosecution for collecting data from unsecured wi-fi networks, according to Privacy International (PI).

The search giant has been under scrutiny for collecting wi-fi data as part of its StreetView project.

Google has released an independent audit of the rogue code, which it has claimed was included in the StreetView software by mistake.

But PI is convinced the audit proves "criminal intent".

"The independent audit of the Google system shows that the system used for the wi-fi collection intentionally separated out unencrypted content (payload data) of communications and systematically wrote this data to hard drives. This is equivalent to placing a hard tap and a digital recorder onto a phone wire without consent or authorisation," said PI in a statement.

This would put Google at odds with the interception laws of the 30 countries that the system was used in, it added.

Scotland Yard

"The Germans are almost certain to prosecute. Because there was intent, they have no choice but to prosecute," said Simon Davies, head of PI.

In the UK the ICO has said it is reviewing the audit but that for the time being it had no plans to pursue the matter.

PI however does intend to take the case to the police.


"I don't see any alternative but for us to go to Scotland Yard," said Mr Davies.

The revelation that Google had collected such data led the German Information Commissioner to demand it handed over a hard-disk so it could examine exactly what it had collected.

It has not yet received the data and has extended the original deadline for it to be handed over.

The Australian police have also been ordered to investigate Google for possible breach of privacy.

'Systematic failure'

According to Google, the code which allowed data to be collected was part of an experimental wi-fi project undertaken by an unnamed engineer to improve location-based services and was never intended to be incorporated in the software for StreetView.

"As we have said before, this was a mistake. The report today confirms that Google did indeed collect and store payload data from unencrypted wi-fi networks, but not from networks that were encrypted. We are continuing to work with the relevant authorities to respond to their questions and concerns," said a Google spokesman.

"This was a failure of communication between and within teams," he added.

But PI disputes this explanation.

"The idea that this was a work of a lone engineer doesn't add up. This is complex code and it must have been given a budget and been overseen. Google has asserted that all its projects are rigorously checked," said Mr Davies.

"It goes to the heart of a systematic failure of management and of duty of care," he added.
