It looks as if the HTTPS protocol is no longer going to be the privilege of e-commerce, where security plays a crucial role. Google's recent announcement that it is considering HTTPS as a ranking signal might dramatically change the rules of the ranking game. It means that, all other factors being equal, HTTPS websites will get preference over HTTP ones. If you think that's not enough, Google may go a step further and mark HTTP websites as "unsafe", letting users know they are taking a risk by visiting them. So what should your next step be if the search giant goes this way? The cheapest option to start with is a basic SSL certificate, which from GoDaddy costs only £3.99 a year. One tier up will set you back nearly £55, but, as one of the selling companies promises, it "boosts your site's Google ranking", and an EV SSL certificate turns the browser bar green for your website. Called by some another monopolistic move, Google is trying to convince everyone it is making the internet safer, but at the same time it is making life difficult for the average user who just wants to run a typical blog or website. The theoretical question that arises at this point is what happens once we have all moved to HTTPS. Will rankings go back to normal? What do you think?
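Beyond buying a certificate, migrating a site to HTTPS also means updating internal links so they no longer point at the old scheme. As a minimal sketch (the `example.com` URL is just a placeholder), a helper like this could rewrite `http://` links during such a migration:

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url):
    # Rewrite an http:// URL to https://; any other scheme passes through unchanged.
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(upgrade_to_https("http://example.com/blog"))  # https://example.com/blog
```

Running every internal link through a function like this (for example while templating pages) avoids the mixed-content warnings browsers show when an HTTPS page loads HTTP resources.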
Matthew "Matt" Cutts is a software engineer who used to lead the web spam team at no less a company than Google. Cutts is known for writing the first version of Google's family filter, SafeSearch, and was involved in search quality and web spam at Google for a good number of years. At present, however, he is on extended leave from his position on Google's search quality team, where he worked on search engine optimization concerns.

Education and Career
Matt Cutts earned his Bachelor's degree in 1995 in both Mathematics and Computer Science from the University of Kentucky. He then earned a Master of Science degree in 1998 from the University of North Carolina at Chapel Hill. His early career in search started while he was still working on his Ph.D. at the University of North Carolina at Chapel Hill, and he joined Google as a software engineer in 2000. During PubCon in 2007, Cutts explained that his field of study was mostly computer science, but that he then moved into information retrieval and on to search engines, a shift that may have been inspired by two outside classes he took in the Information and Library Science department at the University of North Carolina. Cutts is primarily known as the designer of SafeSearch, work that earned him the nickname "Porn Cookie Guy" because he would hand out his wife's homemade cookies to any colleague who could give him a list of undesirable porn sites or content in Google's search results. Cutts used to be the key resource on the web spam team and provided many helpful insights into the use of Google as a search engine and other pertinent issues. He is currently busy with his blog, which he uses to share his personal thoughts on the relevance of Search Engine Optimization (SEO) and related issues.
The Rise of Social Media and SEO
Nowadays, Internet marketers rely heavily on social media to promote their websites or blogs. One technique that many have utilized is guest blogging. However, Cutts explained through his blog that this technique is actually futile: it simply does not work. He further explained that guest blogging has become a spammy practice that will just annoy people and will not do your website any good. Although guest blogging worked for a time and was once deemed quite effective, the practice should now be put to a stop. The main reason guest blogging is considered passé is that the method has become downright annoying and spammy. The exception to the rule is when the guest bloggers are credible and can create rock-star content.

Does Social Media influence SEO?
Matt Cutts was asked in one of his video blogs whether Twitter and Facebook signals affect Google's ranking algorithms. This is a very valid question, since many website owners and SEO experts think social media plays a vital role in the whole SEO process. Many people assume that Google indexes every single page on the web, but Cutts explained that Google does not consider every page important. The number of web pages increases every single day, which is why Google is quite selective about which pages it indexes. Social media has become a very popular platform, and recent reports state that there are over 500 million tweets a day, which makes it understandable that Google does not even attempt to index most, let alone all, of the social media posts being generated. Matt Cutts also revealed that Google has a limit on how much it can crawl in social media.
This means that Google, as powerful as it may seem, is not always able to crawl all of the pages on Twitter or Facebook, two of the most popular social networking sites. Google may actually see every single tweet in real time, but that does not mean it is able to index every tweet. A study conducted back in June of this year showed that Google indexes less than 4% of the total number of tweets. One of the most common Internet marketing practices is to use social media sites as platforms to promote a business or introduce a website. However, Google does not presently use signals such as Twitter or Facebook followers in search ranking, because Google will only use a signal when it is highly confident in it. The Google team must first make sure they understand what the signal means, and must check internal signals and all other connections regarding the content of a site before they can move forward. Since Google cannot completely crawl Twitter or Facebook, it is missing out on a lot of data that could help it make a correct evaluation. Social media sites are also prone to many changes every day: Facebook users change their posts, relationship status, and whatnot, and there may even be instances where users block or unfollow other users, all of which affects signal confidence on Google's side. Internet marketers have long believed that social signals were treated as indications of quality and trustworthiness by Google's search algorithm. However, metrics on Twitter or Facebook do not imply the authority or influence of a particular profile, which in the end means they do not in any way affect search rankings.
There are now several studies that support this announcement from Matt Cutts, including SearchMetrics' 2013 Rank Correlation study as well as several case studies from Quicksprout. For more details, or if you need more convincing, check out Matt Cutts' video posted in December 2010.
Google's recently announced rollout of Accelerated Mobile Pages (AMP) will, without a doubt, have consequences for content publishers and search marketers. With Mobilegeddon as the flash and AMP as the strike that follows, the question arises: how will this affect mobile SEO? AMP pages should give users almost-instant content, with a 15-85 percent speed improvement and support for Google pre-rendering. That means that, at least on mobile, your page should get an instant boost and a slight ranking edge from Google (provided, of course, you fulfil all the other requirements). Similar to the news box, the AMP module will appear at the top of the SERP and push everything else down, from which you can draw your own conclusions. Not opting in to the Accelerated Mobile Pages project, however, is likely to have little or no impact on paid search, we would suggest. Some platforms make it easy, such as WordPress with its AMP plugin, while others might be a little trickier and could require some code to implement, but overall this should be a pretty straightforward task. Investment in SEO, and in the mobile segment in particular, should be seen not as an option but as a necessity, since search traffic there increases year after year, so ignoring it at any stage of your online presence would be a little foolish. You can find more information about this topic here.
Back in 2009, Google made a proposal to make AJAX pages crawlable. Back then, its systems were not able to render and understand pages that use JavaScript to present content to users, so Google proposed a set of practices that webmasters could follow to ensure their AJAX-based applications were indexed properly by search engines, Google's own in particular. Today, as long as Googlebot is not blocked from crawling JavaScript or CSS files, Google has no problem rendering and understanding web pages the way modern browsers do. To reflect that change, Google has updated its technical Webmaster Guidelines and now recommends against disallowing Googlebot from crawling any site's CSS or JS files. As Google also mentions, webmasters should make the experience and look of pre-rendered pages served to Googlebot match their equivalents offered to users: serving Googlebot different content than a normal user would see is considered cloaking and is against the Webmaster Guidelines. What's more, immediate changes are not necessary; only when building a new site or restructuring an existing one is it recommended to simply avoid introducing _escaped_fragment_ URLs.
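For context on what is being deprecated: under the 2009 scheme, a crawler seeing a "hashbang" (#!) URL would request an alternative URL with the fragment moved into a percent-encoded `_escaped_fragment_` query parameter. A small sketch of that mapping (the `example.com` URL is just a placeholder):

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    # Map a #! (hashbang) URL to the _escaped_fragment_ form that a crawler
    # following the 2009 AJAX crawling scheme would request instead.
    if "#!" not in url:
        return url  # nothing to translate
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="")

print(escaped_fragment_url("http://example.com/page#!state=1"))
# http://example.com/page?_escaped_fragment_=state%3D1
```

These are exactly the URLs Google now suggests you avoid introducing on new or restructured sites, since Googlebot can render the JavaScript version directly.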
It hasn't been long since Google began encouraging clients to jump on the Google Authorship trend, convinced that doing so could help improve web rankings and eventually increase site traffic. Many did indeed try it and saw results. However, in 2014, Google decided to let go and kill Google Authorship. Before we get to the reasons, let us first look at the story behind this Google feature.

What is Google Authorship?
Google wanted to credit authors who write on various topics, especially those who wrote extensively on specific topics that other writers would quote, cite, or use as a reference. The articles are not exclusive to a particular site or blog but are also seen as contributions to other blogs and sites; thus, credit should go to the authors, which includes not only the writers but the webmasters and publishers as well. Google Authorship made sure that authors' names were always linked and connected to their published content. It was done using a piece of markup: the rel=author attribute. Its main function, therefore, was to track the content of individual authors and analyze authoritativeness as part of the ranking criteria; within the search results, content linked to verified online profiles tended to rank higher than content that lacked verification.

Why Google Killed Authorship
The intention was good; however, disappointment followed. In other words, the feature failed to deliver its promise of higher rankings in search engines. There are two possible reasons why. First, there was a low adoption rate among writers and publishers. Implementation was perceived as faulty, even for those who adopted Google Authorship methods correctly, and Google was not successful in automatically attributing authorship.
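For readers who never used it: the rel=author markup was an ordinary link from a page to the author's Google+ profile, decorated with the `rel` attribute. A minimal illustration (the profile URL and author name here are made-up placeholders, not a real profile):

```html
<!-- Byline linking the article to the author's Google+ profile.
     The numeric profile ID below is a placeholder. -->
<a href="https://plus.google.com/112233445566778899000" rel="author">Jane Doe</a>
```

Google would then match this link against a reciprocal "Contributor to" entry on the Google+ profile to verify the connection.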
The second reason for the failure is that showing the authorship of content turned out not to actually matter to users. Users did not find the author images and compiled author content a meaningful experience; instead, these tended to clutter the mobile experience. In addition, supporting Google Authorship was quite costly compared with the benefit it was supposed to generate.

Why You Should Not Worry
If you analyze it closely, there is no need to panic over Google killing Authorship. It is not an invitation to plagiarize and steal content. First, take note of the reason Google killed it: authorship did not help site ranking significantly. Thus, its end will not affect search rankings or web traffic to your website. In other words, you can keep your authorship markup, as it has no bearing on how Google performs searches. The Authorship experiment has failed, but the search for original content continues, and copyright, plagiarism, and content stealing are what matter when it comes to intellectual property. According to Google experts, this is not yet the time to dismantle your Google profile, as Author Rank is still functional. Google has other means of identifying authorship, and it is still safe to presume that high-quality, authoritative content will be rewarded. Here are some important steps to take as a result of the dismantling of Google Authorship. Continue to use profiles and bylines, especially when you participate in guest blogging, so that the connection between your content and your website stays active. Focus on and improve your Google Plus activities: the more useful, interesting, and authoritative your posts are, the more reason users have to share them, resulting in more people reading and seeing them, and as a result you are more likely to appear at the top of relevant searches. Improve networking as part of building authority.
Given that authority building is not all about accumulating more and stronger followers but rather about connecting and linking with the right people, always search for top-ranked authors and communities and connect with them. Linking with these people increases the chances of your content being shared, thereby boosting your authoritativeness. Find your own place in the sun: build your own niche. The old adage "jack of all trades, master of none" still applies when it comes to building content; the more sporadic and diverse your blog and site, the less likely you are to build and increase authority. Do not forget that content still rules. You can never argue against the benefits of high-quality content. Customers and users will definitely be turned off by boring content, so provide them with real, honest, quality information as part of the solution to their problems.

How to Avoid Content Stealing
One of the biggest worries about the removal of Google Authorship is content stealing and plagiarism. It is important to point out that these illegal invasions of intellectual property were never eliminated while Google Authorship was in place. Protecting your content remains a matter of personal responsibility, and here are a few reminders. First, discover whether your content has been stolen by using the various tools available, such as Copyscape, which helps you find where your content has been accessed and republished on other sites. Use referral traffic to see which sites are sending you visitors, focus on sites with spikes, and check whether your content appears on them. Display your content guidelines prominently so that users are aware you are serious about protecting your intellectual property, and write those guidelines so that users know how to properly cite your content and authorship. In your RSS feeds, include a few paragraphs that link back to your content.
There are content robbers who use bots to steal content directly. If it is visual content, include your brand logo to show that you own it. Only time can tell whether dismantling Google Authorship will lead to an increase in plagiarism and content stealing; after all, it never went away in the first place, even while Google Authorship was still active.
Google's autocomplete feature, or rather its API, will disappear in August; perhaps it is better to say it will stop working as intended. It is really difficult, even impossible, to conduct a search on Google without running into the autocomplete feature. This handy feature, which predicts queries as you type them, was engineered to be used with Google Search, and that is going to be the only place on the web where it is used. You have probably seen Google's autocomplete functionality outside of Google Search, used by developers on other sites thanks to the autocomplete API the search giant used to provide. However, Google wants to end the unrestricted use of this unofficial, unpublished autocomplete API outside of its core search product. From August 10th, 2015, Google is imposing restrictions on the unpublished autocomplete API, blocking unauthorized access to the feature. This means that developers and publishers will no longer be able to use Google's autocomplete service on their own sites. What's more, autocomplete, as mentioned earlier, will only be used as it was intended to be used, and that is in Google Search. If you have been using Google's autocomplete API on your site, Google recommends using Google Custom Search Engine instead. You will still have autocomplete functionality, just delivered in a slightly different way (through a different connection to Google Search services).
As we have learned today, besides the head of the web spam department there will be a few more people responsible for Google Search. The head of the anti-spam team, Matt Cutts, will be supported by Gary Illyes, a Webmaster Trends Analyst at Google. This seems to be confirmed by Cutts recently taking a side road away from spam-related information, at least in what he posts on Twitter. It came to everyone's attention when Matt Cutts was not the main attraction at the "You & A" session at this year's SMX Advanced 2015 in Seattle. The important things Gary mentioned related to the Panda and Penguin updates as well as to the unannounced 'Quality update' that rolled out last week. As a quick recap: Panda will roll out in 2-4 weeks, while Penguin is heading towards real-time deployment rather than the regular updates we have seen until now. The team responsible is quite close to accomplishing this, yet we have to wait a bit longer. Another crucial and interesting thing Gary mentioned was the Penalty Tracker, a tool the Google Search team uses to monitor and manage the penalties that bring particular websites down in the rankings. Unfortunately, there are no plans to make this tool available to webmasters. One thing we don't completely understand is the role of clicks and their relevance to Google learning users' behavior. As an example, Gary used "Apple" and Google's learning algorithm, which determines the next move and is supposed to improve search the next time the user searches for something. The question here is: what if we were looking for the Apple iPhone using just the word 'Apple', and somewhere in the future we searched for the fruit? Would Google serve up the Cupertino company every time, because the first time around it learned that from our clicks? Luckily, rumors about the increased relevance of HTTPS as a ranking factor were not confirmed, and the Mobilegeddon fear that has grown over the past months was given more explanation.
To finish, Gary mentioned two quite different matters: duplicate content and Google Authorship. "There is no duplicate content penalty," Gary said, because there are other things webmasters should be more concerned about. As for Google Authorship, the tool is still in use, though its purpose is in decline.
As many as 93 percent of the people who took a recent Pew survey said it is extremely important to control access to their personal information, and 90 percent care about the type of information collected about them. That is why Google decided to loosen the strings a bit and give users control over their privacy and security. The search giant also set up a new website, privacy.google.com, which focuses solely on answering user concerns about the company's data collection practices across its online services. By accessing Gmail's My Account, users get access to the settings and tools that enable them to tweak their security settings to fit their needs. With My Account, users are able to safeguard their data, better protect their privacy, and decide what information is used to make Google services work best for them. Some of the most important 'My Account' settings available to everyone include: the Privacy Checkup and Security Checkup, step-by-step guides that take the user through important privacy and security settings; the ability to manage the information used from Search, Maps, YouTube, and other Google services, where users can, for instance, turn settings such as Web and App Activity on and off (with the setting switched on, search results are quicker and more relevant, so leaving it on is advised); easy customization of the Ads Settings tool to control ads based on user interests and searches; and the ability to control which apps and sites are connected to one's account. It is also worth mentioning that privacy.google.com answers some very important questions, such as how Google is able to serve relevant ads without selling people's personal information, how encryption and spam filtering help keep user data safe, and how having information about its users enables Google to customize the user experience across its properties.
Adverts are literally everywhere, even in search engines like Google, which present a clean, simple page design at first, only to serve hot ads as a starter. Enjoy! Consider some interesting facts about search engines: more than seventy percent of users do not click through to the next page when searching; at least seventy to eighty percent of search users do not click on paid adverts; email and searching are the top online activities; and companies that blog are indexed more and therefore get more sales leads than those that don't. In addition, a majority of users click on natural search results and rarely on sponsored ones.

What are sponsored ads?
You have probably already encountered sponsored ads or search results when performing searches or opening your email. These are links to advertisers' web pages that appear as sponsored links adjacent to natural, organic search results. Companies that want to advertise in this way submit their product information in the form of keyword listings to different search engines. Bid values are attached to individual adverts to secure a position in the competitive listings on the search engine results page, and because billing is tied to clicks, advertisers pay only each time their ads are clicked by users.

The effect of adverts on users
Search and email are free online services. However, the influx of search engine advertising has quietly altered the meaning of free: from costless to annoying. As more companies find sponsored ads beneficial, online users have to be extra careful when browsing their mail and reading search results, as they might click the wrong link. Users have also become suspicious of adverts, believing they are malware ready to attack their personal computers.
That was the case when Microsoft accused Google of promoting paid results in its shopping section, and Google fired back with a similar accusation. Users have been exposed to distracting adverts: they have to get past the adverts before reaching the list of organic search results, or they see sponsored links at the top of their inbox instead of their latest emails. Thus, Google has altered the meaning of free from gratuitous to distracting. As a result, users spend extra time deleting sponsored links and adverts, or find themselves scrolling down to escape adverts that take up half a page of their browsers. Because adverts have become annoying and distracting, a majority of online users are indeed complaining. The problem is that their complaints are limited to cursing to themselves rather than filing legitimate ones. Apparently adverts have become a way of life, and users have accepted them as part of the natural order. Thus, Google has turned free email and search into a stoic form of online experience: users are immune to the adverts and sponsored links and no longer care whether the links are actually the ones they are looking for. One effect of all this is that more and more online users are finding ways to eliminate adverts from their systems. Users are searching for software that can block these things, and some are even willing to pay for it just to get rid of distracting and annoying adverts.
Penguin update #6, also known as Google Penguin 3.0, rolled out in October. Many website owners thought this update would have a huge impact on them, since Google was trying hard to manage the millions of new links added to the World Wide Web every day. It took about a year to roll out the new update; however, the impact of Google Penguin 3.0 was not as harmful as expected. Keep reading to learn more about the Google Penguin 3.0 update and how hard it hit you this time. The most significant change in the Google Penguin 3.0 update is a new system that allows refreshes to happen more frequently, much as the Panda update did. The update was not focused on a specific industry or niche; it affected websites in general, and unfortunately some lost about 70 percent of their search visibility. As mentioned earlier, it took Google some time to launch the update. The main reason was that they wanted to avoid punishing websites that do not have spammy content: Penguin is an algorithm, and it is hard for an algorithm to determine exactly whether a website is spammy. In the end they managed to roll out an effective update and punish websites that actually have spammy links. This time, Google did not punish websites for having bad links; instead, it erased the value those bad links carried. The update went much deeper than the previous ones, and its biggest impact fell on smaller sites. Google did not want to punish websites with a solid foundation, as that would cause a lot of frustration among webmasters; that is why they settled on removing the juice from bad links, leaving website owners something to think about. This was a worldwide update, affecting websites of all regions and languages.
Google's tolerance for spam was enforced much more effectively through this update, helping it remove a lot of spammy content, and the update is visible in all versions of Google. The victims of Google Penguin 3.0 can be divided into two main categories: negative-SEO victims and spammy websites. The negative-SEO victims are the unfortunate websites that were hit incorrectly by the update. If you own such a website, you don't need to worry: we will share a few effective strategies that can be implemented to undo the negative effects of this update. People who were punished for spammy content can also try those strategies, because Google was kind enough not to punish sites by removing them completely from the search results.