Google Search Set To Change
March 26, 2012
by Jeanine Vecchiarelli
The blog: the lifeblood of our social media existence. Micro, macro, informational, promotional; ultimately, everything online comes back to our blogs. Small wonder, then, that it is critically important to optimize our compositions to ensure maximum visibility.
With the ever-evolving algorithms Google is experimenting with and implementing, attaining and maintaining maximum visibility is like chasing a constantly moving target. What used to work doesn't anymore; in fact, it can now work AGAINST us. Hence, it is crucial to stay abreast of the changes Google is making to its search and ranking parameters.
Lately there has been a flurry of reports covering the various ways Google is looking to change its criteria. Some changes began a couple of years ago and are still works in progress; others are rolling out imminently. The most important aspects of these changes are:
- Google’s bots are being “trained” to downgrade sites that appear overly optimized. Remember how keyword density used to be a great way to get noticed? Now a Google search spider will label a site that uses keywords excessively as spammy, and will downgrade the “offending” site’s ranking.
- The same holds true if Google’s bots find too many links on a page. Like keywords, links are still important, but overusing them will lower your site’s ranking.
- This is really more of a reminder than a change, but I thought I’d add it anyway because it is significant. It is still advisable to add pictures to your blog posts; in fact, pictures offer yet another avenue for gaining visibility thanks to sites like Pinterest. But it’s important to remember to add alternative (“alt”) text to those pictures. Google’s bots don’t actually “see” photos, but they will read and score contextually optimized descriptions of them.
- While continuing to experiment in this area, Google is phasing “semantic” capabilities into its search. Its objective is to go beyond simple word recognition in queries, to a point where the bots can actually begin understanding what is being asked. To this end, Google acquired Metaweb, the company behind Freebase, an open, collaboratively built knowledge base. By building infrastructure layers on a knowledge base like this one, Google aims to use Freebase as a tool to aid the creation of more knowledge. The goal is for Google’s bots ultimately to understand the actual context of a search query, or of a web page they are crawling. This is why good quality user-generated content has gained so much significance in the rankings contest: the search bots are learning to recognize and award rank value to contextually relevant written passages.
- Google has also begun incorporating content from Google Plus profiles, the Picasa photo-sharing service, and content approved for sharing on those two platforms, in its quest to deliver smarter, more personalized search results and rankings. This is something else we must bear in mind going forward as we continue to create content while aiming to win the best possible visibility.
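To make the keyword-density point above concrete, here is a rough sketch of how density is typically measured: occurrences of a keyword as a percentage of total words. This is only an illustration; Google does not publish its formula or thresholds, and the function name and sample text below are my own.

```python
import re

def keyword_density(text, keyword):
    """Return the percentage of words in `text` equal to `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# A deliberately over-optimized sample sentence:
post = ("Social media tips: our social media guide covers social media, "
        "social media, and more social media.")
print(keyword_density(post, "media"))  # → 31.25
```

A density this high (5 of 16 words) is exactly the kind of keyword stuffing the new spiders are said to flag; natural writing lands far lower.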
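For the alt-text reminder above, a blogger can quickly audit a post for images missing descriptions. This is a minimal sketch using Python's standard-library HTML parser; the class name and sample markup are illustrative, not part of any Google tool.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the src of every <img> tag lacking a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

post_html = """
<p>My spring garden.</p>
<img src="tulips.jpg" alt="Red tulips blooming in a spring garden">
<img src="birdbath.jpg">
"""

checker = AltTextChecker()
checker.feed(post_html)
print(checker.missing)  # images that still need alt text
```

The second image would print as missing; a contextually descriptive alt attribute, like the first image's, is what the bots can actually read and score.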
So…are you ready to retool the way you create your content? What will you change to maximize visibility, given these new Google search and ranking parameters? Please share your thoughts in the “comments” section below!