Category Archive: Guerilla Marketing

Yelp That Hurts

As a small company with limited advertising resources, I try to leverage everything I can to make myself visible on the Internet. One of the most important tools a small business can use besides blogging is getting positive reviews from major sites. One of the bigger sites with a large influence is Yelp.com. If you are new to a town or just visiting, Yelp is a great way to find out if that service you are planning to use is well thought of. Well, the Web being what it is, people soon realized the importance of reviews on Yelp, and before long they were gaming the system. So Yelp fought back and came up with algorithms to fight fraudulent reviews, and the arms race began.

Think about it: the accuracy of reviews is critical to your trusting a review site. What would you think if you used a service that had great reviews but was in reality just the opposite? Yelp's popularity grew initially not because of the accuracy of the reviews but because of the ease with which one could post those reviews and share them with others. The 800-pound gorilla that Yelp has become has made it a target for manipulation. Think about the last time you went to a hotel review site and bought a cheap room that looked too good to be true and in fact was: not good, not good at all.

Yelp introduced several anti-fraud measures to help control reviews. They added the ability to respond to a negative review, put a waiting period on anyone new coming into the system, and built in several other not-so-obvious controls. Yelp of course will not disclose how its algorithms work; they are the secret sauce that helps it control the content, just like Google's search algorithms. There are problems, of course, and that's where it hurts. Yelp considers a review to be fraudulent if the reviewer does not post any further reviews within thirty days. Logical, I guess, since it stops a fake user from being created just to post a positive review. Except, guess what: there are companies that have created hundreds of fake reviewers for hire. How do I know this? My doctor has to use such services to prevent his reputation from being besmirched by angry, doctor-shopping prescription drug abusers. These patients know they can screw with a doctor by leaving a negative review, and they tell the doctor as much during the appointment. So how does the doctor combat this? He hires a reputation agency.

The reputation agency has hundreds of fake users with seemingly real backgrounds, complete with pictures, biographies, Facebook profiles, etc. The agency merely floods the doctor's reviews with many positive reviews, which drives the algorithm to push the low review to the bottom. To prevent Yelp from detecting this, these agencies treat the fake reviewers as part of their toolkit: they encourage (or pay) their employees to keep posting reviews from the fake accounts so that not all of the reviews are in one category or all positive. They have this down to an art. Gee, and you were worried about an arms race with China.

In my case I do not have the budget or the need to bury negative fraudulent reviews, but Yelp has marked all of my reviews as fraudulent, since every client that posted a review never posted another. My clients need help with their computers and are too busy running their businesses to be running amok on the Internet telling everybody about their last bowel movement or the latest gossip about their neighbor. In short, my typical client does not participate in the social frenzy of the present generation.

So what does one do in this situation? Well, for now it's a work in progress. I am trying to use other review sites that are not as popular as Yelp but nevertheless seem to drive some traffic. Sites such as Thumbtack.com, Manta.com, etc. are places I suggest my clients leave reviews. I know it's only a matter of time, though, before these sites have to adopt countermeasures against unscrupulous companies and individuals. Even so, I find that I sometimes have to do an enormous amount of free work to get someone to write a review, and I guess that's just the price of doing business. Someday I will get a client who leaves a positive review on Yelp that sticks, but for now: Yelp, that hurts…

My Experience with SEO – Chasing Bugs

Search the web for Search Engine Optimization (SEO) and you will find a plethora of web sites and people ready to sell you advice. Visit the many blogs and user groups and you will find references to snake oil salesmen and supermen; there seems to be no end to the promises made and broken by some so-called experts and web sites. Getting your web site onto the first page of a search is an accomplishment worthy of praise. There are many things you can do on your own to improve your chances, and I am by no means an SEO expert. My intent here is to relay my own experiences in getting SEO working on my web site. SEO is at best a black art and a project that is never finished.

Google, Bing and Ask do not make their crawl algorithms public, as they are the secret sauce worthy of hefty protection; except imagine a secret sauce that changes constantly. The algorithms are constantly adapting to those who find ways to beat them. That being said, what I am writing today could be completely irrelevant in a year, so for an in-depth look at the what, why and how, go to my links and look at the publication "The Beginner's Guide to SEO". My tale here is not an instructional guide to SEO but a story about just getting to the starting line of the race.

My site is built on the Open Source Content Management System (CMS) called Joomla. I started with the Webmaster tools from Google and Bing (see my links), in particular the Google tools. Crawl errors had become the bane of my existence; my site had dozens. I was shocked and happy to find out that Google had already been crawling all over my site without me really trying. In Joomla I had SEO-friendly URLs turned on, but I still had major problems that were causing issues with Google and Bing. Simple URLs are best, and all good intentions aside, I had enabled multilanguage support even though it really was not appropriate for the local markets I am targeting. Disabling multilingual support changed my URLs from 'http://www.zypath.com/index.php/en/resources' to 'http://www.zypath.com/index.php/resources', but now I had my second issue: why was there an 'index.php' sitting in the middle of each URL? Users were being redirected from 'http://www.zypath.com' and 'http://zypath.com' to 'http://www.zypath.com/index.php', and Bing did not like that at all; in fact, Bing refused to index the site while Google didn't seem to notice – go figure. Once again a default Joomla setting was the culprit, and the issue was easily fixed.
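For anyone fighting the same 'index.php' battle, a minimal sketch of the fix: Joomla ships a file called htaccess.txt which, on an Apache server with mod_rewrite available, you rename to .htaccess, and then you enable 'Use URL Rewriting' in Joomla's Global Configuration. The core rewrite rule inside it looks roughly like this (simplified from the shipped file):

```apache
# Simplified from Joomla's htaccess.txt – assumes Apache with mod_rewrite.
RewriteEngine On

# If the request is not a real file or directory on disk...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# ...hand it to Joomla's index.php internally, so the visible URL stays clean.
RewriteRule .* index.php [L]
```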

These first issues, though, were not the source of my crawl errors; there was a big elephant sitting in the room as I blissfully went about my business. It turns out all of my Tutorials and Blog articles had an ID number in front of them, and Google did not think that was very SEO friendly. My search to figure out why this was happening turned up a glaring problem in Joomla that I hope will get fixed. Joomla's infinite flexibility comes at a price: a menu item must point to each piece of content visible on the site, otherwise Joomla adds an ID to the URL. I really do not want to swell my menus up with hundreds of items, so what did I do? I created a hidden menu (not assigned to any module position) that linked to each article of content – a pain and a workaround, but it worked. The IDs were gone and Google and Bing were happy.

One final problem remained, and this was URL canonicalization; say that fast three times and you will sound like an old mafia boss… Canonicalization describes how a site can use slightly different URLs for the same page – for example, people often leave off the 'www' when typing a web site URL. The separation can cause lost link value (according to Google – sounds more like a stock portfolio) and hurt rankings for your page; Google even describes this and how they attempt to determine the best page match for the URL. Fixing this required going into the web server and creating what is known as a 301 redirect, or rewrite rule, so that all pages point to the same URL. Bing-bada-boom, I am on fire now; Bing finally starts responding…
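On an Apache server the rewrite rule is only a few lines. A sketch, assuming mod_rewrite is enabled and using my own domain as the example; your host may offer a control-panel setting that does the same thing:

```apache
# Permanently (301) redirect the bare domain to the www version,
# so search engines see a single canonical URL for every page.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^zypath\.com$ [NC]
RewriteRule ^(.*)$ http://www.zypath.com/$1 [R=301,L]
```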

Unfortunately we are still not done with the basics. We now have Google and Bing able to find and index our pages, but we are really just getting started with SEO. We will review that in another blog post, where we will cover using some free online SEO tools.

My Experience with SEO – Squashing Bugs

Now that we have Google and Bing crawling all over our site, we need to make sure they behave the way we want; in other words, we need to make sure we are SEO compliant. Now we get to see how well (or how poorly, if you're a glass-half-empty kind of person) our site will play in the big bad web. This requires some validation against known successful sites, and for this we turn to a site called 'SEOSiteCheckup'. Entering our web site URL rewards us with a grade and the details to back it up. Now, I am normally an 'A' student, so getting a 'C' my first time was a bit of a shocker. I thought I had covered all of the bases, but that's where I was surprised: I had missed several key areas and had several incomplete tasks. Time to squash some bugs.

I had a glaring problem with images: some were too big and most did not have the 'alt' tag. Alternative ('alt') text is text associated with an image that serves the same purpose and conveys the same essential information as the image – that's a Wiki mouthful. In situations where the image is not available to the reader (perhaps because they have turned off images in their web browser, or are using a screen reader due to a visual impairment), the alternative text ensures no information or functionality is lost. Absent or unhelpful alternative text is a source of frustration for blind users of the Web. I set the 'alt' tags wherever possible, but sometimes the template restricts setting them, especially if there is a lot of JavaScript in the page. Thankfully the search engines do not seem to penalize you too much if a few tags are missing. Several of my images were initially over 120 KB; the maximum preferred size is 50 KB – ouch! Using GIMP (yup, it's on my links page) I experimented with changes in image quality until I could get file sizes that were small enough yet still pleasing to the eye. I was able to shrink the file sizes substantially by setting the JPEG quality parameter to 25%. Be warned: do not muck with anything else on a JPEG conversion unless you are a graphics guru; trust me, there are too many things to screw up.
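For anyone unsure what an 'alt' tag actually looks like, here is the before-and-after in plain HTML (the file name and description are made-up examples, not tags from my actual site):

```html
<!-- Without alt text: a screen reader can only announce "image". -->
<img src="/images/laptop-repair.jpg">

<!-- With alt text: the meaning survives even when the image does not load. -->
<img src="/images/laptop-repair.jpg"
     alt="Technician replacing a failed laptop hard drive">
```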

A very important part of SEO is a 'sitemap' – just don't tell a guy that; keep driving, honey, I am sure it's around the next corner. I am using an extension called Xmap, which creates both an 'HTML' and an 'XML' sitemap – and no, it doesn't wash the cat too… The 'XML' sitemap is critical for getting Google and Bing to index your pages. On the Webmaster pages for both Google and Bing there is a configuration area where you can input the location of your 'XML' sitemap; this will greatly speed up the indexing of your site. The Ask search engine is not as forgiving as Google or Bing, and it requires a sitemap in a very specific 'XML' standard – I have to ASK why. Unfortunately for me, it will not accept the sitemap Xmap generates, as Ask considers the format non-standard, so I will have to use external software to generate the sitemap Ask wants. The 'HTML' sitemap can be used as a page on your site and is very useful for people wanting to find something. Xmap also creates hyperlinks within the HTML so the user can jump right to a page by clicking it.
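To demystify the 'XML' part: a sitemap in the standard sitemaps.org protocol is just a list of URLs with optional hints for the crawler. A minimal hand-written example (the date and priority values are illustrative, not from my actual sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.zypath.com/resources</loc>
    <lastmod>2013-05-01</lastmod>      <!-- when the page last changed -->
    <changefreq>monthly</changefreq>   <!-- hint for how often to re-crawl -->
    <priority>0.8</priority>           <!-- relative importance, 0.0 to 1.0 -->
  </url>
  <!-- ...one <url> entry per page you want indexed... -->
</urlset>
```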

My original low grade was primarily due to how I had constructed my meta tags. These tags are not seen by users browsing your page but are used by search engines to associate your web pages with keywords and phrases. The meta tag title and meta tag description are used to build the short description of your web site that appears in search results. My error was not due to omitting these tags, as many people do, but to using tags that did not appear in my content. Someone figured out that they could just stuff all of the most popular keywords into their meta tags to improve their position in search results – now you know why the secret sauce keeps changing. Tags were being used that had nothing to do with the content of the web page; they were simply popular keywords that helped raise a website's ranking. Search engines responded by verifying that the content actually contained the title, description and keywords used in the meta tags; otherwise the ranking was penalized – do not collect $200, go directly to jail. I corrected this error by using a Joomla component, module and plugin called SEOBoss, which highlighted the keywords listed in my meta tags throughout my page content, so I could verify that my content did indeed contain the meta tags I was using. The SEOBoss component also scanned my website and brought together the meta tag settings for each page (yes, you need to set meta tags for all of your pages, though the front page is the most important).
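For reference, the tags in question live in the page's head section. A hypothetical example (these are not my site's actual tags); the point is that the same phrases must also appear in the visible page content:

```html
<head>
  <!-- Shown as the clickable headline in search results. -->
  <title>Small Business Computer Support | ZyPath</title>

  <!-- Shown as the snippet under the headline in search results. -->
  <meta name="description"
        content="On-site computer repair and IT support for small businesses.">

  <!-- Only credited if these phrases actually appear in the page body. -->
  <meta name="keywords"
        content="computer repair, IT support, small business">
</head>
```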

Joomla and the template I had bought thankfully supplied several more features needed to complete the SEO work. Heading status, robots.txt, image expiry tags and the favicon are handled automatically by Joomla and/or the template. Heading status is the use of h1 and h2 HTML tags to highlight important keywords and describe the sub-topics of a page. The robots.txt file is a small text file that gives instructions to robots (search engine crawlers) about how to behave on the site, such as telling them that certain parts of your server are off limits to some or all robots – danger, Will Robinson. Image expiry tags tell a browser to cache an image until a specified date so that it does not keep re-fetching unchanged images from your server. Finally, the 'favicon' is the small icon that appears in your browser's URL navigation bar, so it's important that it's your logo or some other symbol related to your business, as this is the symbol used when visitors set a Bookmark or Favorite for a page. And you thought little fairies came along and put it there…
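Two of these are easy to show. First, a bare-bones robots.txt in the spirit of the one Joomla ships (the folder list is trimmed, and the sitemap URL is a placeholder, not my real one):

```text
# Keep all crawlers out of Joomla's internal folders.
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /logs/
Disallow: /tmp/

# Optional: point crawlers straight at the XML sitemap.
Sitemap: http://www.zypath.com/sitemap.xml
```

Second, the image expiry, which on an Apache server can be done with mod_expires in .htaccess (assuming that module is enabled):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Let browsers cache images for a month before re-fetching them.
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
</IfModule>
```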

There is one last very important area that is actually a relatively new part of your web site's search ranking, and that is links. Not links you create to someone else, but links from other sites to your content. The links also need to point to meaningful content. The old thinking was to create short blogs of 500 words or less, and lots of them; now Google wants links to meaningful content that is between 750 and 1500 words. The last word is about social metrics and their effect on rankings. Those seemingly useless Facebook 'Like', Google +, Tweet and LinkedIn inShare buttons really are important for content. They provide a way for non-web Yodas to show that they like your content without having to create a phreaking link to it. Take a bow: you have now made it probable that your web site will show up in the first 10,000 results. Just kidding… let's hope it's listed within the first ten to twenty search results.