Thursday, February 17, 2011

SEO Technical / Tactics Interview Questions

Every SEO prefers certain tactics over others, but familiarity with many of them can indicate a deeper understanding of the industry. And while not every SEO needs a web development background, having such skills can help set someone apart from the crowd.

Technical / Tactics

1. Give me a description of your general SEO experience.
2. Can you write HTML code by hand?
3. Could you briefly explain the PageRank algorithm?
4. Have you created any SEO tools, either from scratch or pieced together from others?
5. What do you think of PageRank?
6. What do you think of using XML sitemaps?
7. What are your thoughts on the direction of Web 2.0 technologies with regards to SEO?
8. What SEO tools do you regularly use?
9. Under what circumstances would you look to exclude pages from search engines using robots.txt vs. the meta robots tag? (See the sketch after this list.)
10. What areas do you think are currently the most important in organically ranking a site?
11. Do you have experience in copywriting and can you provide some writing samples?
12. Have you ever had something you've written reach the front page of Digg or Sphinn, or be Stumbled?
13. Explain to me what META tags matter in today's world.
14. Explain the various steps you would take to optimize a website.
15. If the company whose site you've been working for has decided to move all of its content to a new domain, what steps would you take?
16. On a scale of 1 to 10, rate the most important "on page" elements.
17. Review the code of past clients/company websites where SEO was performed.
18. What do you think about link buying?
19. What is Latent Semantic Indexing (LSI)?
20. What is Phrase Based Indexing and Retrieval and what role does it play?
21. What is the difference between SEO and SEM?
22. What kind of strategies do you normally implement for back links?
23. What role does social media play in an SEO strategy?
24. What things wouldn't you do to increase rankings because the risk of penalty is too high?
25. What's the difference between PageRank and Toolbar PageRank?
26. Why might you want to use nofollow on an internal link?
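
A few of these questions lend themselves to concrete illustrations. For question 3, the formula from the original Brin and Page paper is a reasonable starting point, where d is the damping factor (commonly quoted as around 0.85), T1 to Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

    PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

For questions 6, 9 and 26, a minimal sketch might look like the following; the URLs, paths and anchor text below are made up purely for illustration.

    <!-- a one-entry XML sitemap (question 6); the URL is a placeholder -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
    </urlset>

    # robots.txt - stops compliant crawlers fetching these URLs at all,
    # though a blocked URL can still show up in results if others link to it
    User-agent: *
    Disallow: /search-results/
    Disallow: /print/

    <!-- meta robots - the page must remain crawlable for this tag to be seen;
         it keeps the page out of the index while still letting its links be followed -->
    <meta name="robots" content="noindex, follow">

    <!-- nofollow on an internal link, e.g. a login page you have no wish to rank -->
    <a href="/login/" rel="nofollow">Log in</a>

In short, robots.txt is the blunter tool (good for whole sections such as internal search results or print versions), while the meta robots tag is the safer choice when you want the page crawled and its links counted but the page itself kept out of the index.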

I hope this helps and that you all like it. Soon I will follow up with SEO Analysis interview questions, so keep reading.


Tuesday, February 15, 2011

Google update cracks down on duplicate content

Those within the SEO industry have known for some time that duplicate content is a big no-no with the search engines, especially Google. However, despite the lectures and best practices slamming the use of duplicate content, scraper sites and plagiarised web copy, sites that steal content still managed to slip through the cracks in Google’s algorithm and rank within its pages.

This caused much ire at Google, and among the many website owners and SEO experts who played by Google’s rules. Google (as ever) has been busy working behind the scenes to stamp out this threat to the purity of its SERPs. In its last big algorithm update, in January, Google moved to deal with the issue of duplicate content and content scraper sites, aiming to keep its index free of spam and thievery.

In the January update, the search engine put in place more measures to drive spammy content sites out of its results, rewarding sites with unique, quality content.

Matt Cutts, the head of Google’s webspam team, commented on the update:

“[I mentioned] that we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.”

“That change was approved at our weekly quality launch meeting last Thursday and launched earlier this week.”

“This was a pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice.”

“The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site’s content.”

Google is talking tough, but what exactly will be affected by this change, and how can you avoid being adversely affected by the ‘Google Content Update’?

Firstly, if your website utilises content from other sites, meaning that you aggregate content onto your site in some way (such as via an RSS feed), then your site can be seen as a scraper site or content farm. None of your content is your own, and it is therefore of no use to Google in terms of offering it to its users.

Secondly, if your website uses content that is very low in quality and very high in quantity, in a bid to increase the size of your website and the number of keywords that your site ranks for, you could also be in trouble with this new update. Some websites use content in this way, receiving it from offshore companies that use non-native speakers because of the lower cost, regardless of the quality.

The way to protect yourself, and your website, against the Content Update (snappy, eh?) is to ensure that any content added to your website is high quality, relevant to your site and your industry, and unique.
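
If you do need to republish or syndicate a piece, one commonly cited safeguard (not something Google spells out in the announcement above, so treat this as a general illustration with made-up domain names) is a cross-domain canonical tag on the republished copy, pointing back at the original article:

    <!-- placed in the <head> of the syndicated copy on the aggregator site -->
    <link rel="canonical" href="http://www.original-site.com/original-article/">

This tells Google which version should be treated as the source, so the duplicate copy is less likely to compete with, or be mistaken for, the original.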

For Google to trust a website enough to rank it highly in the SERPs, it needs to have unique, relevant content and strong backlinks. All of this can be achieved through ethical SEO.



Thursday, February 3, 2011

Google accuses Bing of 'copying' its search results

Google accused of criminal intent over StreetView data

Google is "almost certain" to face prosecution for collecting data from unsecured wi-fi networks, according to Privacy International (PI).

The search giant has been under scrutiny for collecting wi-fi data as part of its StreetView project.

Google has released an independent audit of the rogue code, which it has claimed was included in the StreetView software by mistake.

But PI is convinced the audit proves "criminal intent".

"The independent audit of the Google system shows that the system used for the wi-fi collection intentionally separated out unencrypted content (payload data) of communications and systematically wrote this data to hard drives. This is equivalent to placing a hard tap and a digital recorder onto a phone wire without consent or authorisation," said PI in a statement.

This would put Google at odds with the interception laws of the 30 countries that the system was used in, it added.

Scotland Yard

"The Germans are almost certain to prosecute. Because there was intent, they have no choice but to prosecute," said Simon Davies, head of PI.

In the UK, the Information Commissioner's Office (ICO) has said it is reviewing the audit but that, for the time being, it has no plans to pursue the matter.

PI, however, does intend to take the case to the police.

"I don't see any alternative but for us to go to Scotland Yard," said Mr Davies.

The revelation that Google had collected such data led the German Information Commissioner to demand that Google hand over a hard disk so the regulator could examine exactly what had been collected.

It has not yet received the data and has extended the original deadline for it to be handed over.

The Australian police have also been ordered to investigate Google for possible breach of privacy.

'Systematic failure'

According to Google, the code which allowed data to be collected was part of an experimental wi-fi project undertaken by an unnamed engineer to improve location-based services and was never intended to be incorporated in the software for StreetView.

"As we have said before, this was a mistake. The report today confirms that Google did indeed collect and store payload data from unencrypted wi-fi networks, but not from networks that were encrypted. We are continuing to work with the relevant authorities to respond to their questions and concerns," said a Google spokesman.

"This was a failure of communication between and within teams," he added.

But PI disputes this explanation.

"The idea that this was a work of a lone engineer doesn't add up. This is complex code and it must have been given a budget and been overseen. Google has asserted that all its projects are rigorously checked," said Mr Davies.

"It goes to the heart of a systematic failure of management and of duty of care," he added.
