Symantec's MessageLabs today revealed a list of the most spammed states in the US, with some surprising results. And the award for the state that receives the most spam goes to... Idaho.

According to MessageLabs, Idaho is the spam capital of the US with a 93.8% spam rate, far exceeding the global spam rate for September of 86.4%. MessageLabs says Idaho has jumped 43 spots since 2008, when it was ranked the 44th most spammed state. According to the security firm, the jump can be attributed to "the resilient and aggressive" botnet market, as well as a higher volume of global spam since the beginning of the credit crisis in late 2008.

Here's a look at the top ten:

1. Idaho
2. Kentucky
3. New Jersey
4. Alabama
5. Illinois
6. Indiana
7. Massachusetts
8. Pennsylvania
9. Arizona
10. Maryland

I have to say that based on the incredible amount of spam I receive in my inbox on a daily basis, I am not entirely surprised that our state of Kentucky is high on the list.

MessageLabs - Most Spammed States in the US

"Some of the high spam levels seen across the US can be attributed to the economic challenges experienced globally since the end of 2008 as well as Internet advancement including the high adoption of social networking," said Paul Wood, MessageLabs Intelligence Senior Analyst, Symantec. "Spammers have taken full advantage of both the economic uncertainty of some and the trustworthiness of others for their own rewards. Automated tools, resilient botnets and targeted spam campaigns are all part of the spammers’ toolkit and they are constantly evolving these techniques to outsmart any effort to stop them. No state is immune to the affects of spam."

MessageLabs says there are currently between 4 and 6 million computers across the globe that have been compromised to form botnets. Cybercriminals use these machines to send out over 87% of all spam, which equates to about 151 billion emails every day.

So who's not getting very much spam?

States mentioned as having the least amount of spam include Montana, Alaska, Kansas, South Dakota, Tennessee, Vermont, Rhode Island, Wisconsin, and Florida. Puerto Rico's spam rate also falls below the global average.


Time spent on social networking and blogging sites accounted for 17 percent of all time spent on the Internet in August 2009, according to a new report from Nielsen.

"This growth suggests a wholesale change in the way the Internet is used," said Jon Gibs, vice president, media and agency insights, Nielsen's online division.

"While video and text content remain central to the Web experience - the desire of online consumers to connect, communicate and share is increasingly driving the medium's growth."

Nielsen - Significant Gains in Advertising

Year-over-year, estimated online advertising spend on the most popular social network and blogging sites increased 119 percent, from about $49 million in August 2008 to $108 million in August 2009. The share of estimated spend on these sites has also grown, increasing from a seven percent share of total online ad spend in August 2008 to a 15 percent share in August 2009.

While a number of industries decreased their overall online ad spend year-over-year in August, spending on the top social network sites increased across the board. The Entertainment Industry led in growing its online ad dollars, increasing ad spending on the top social network sites by 812 percent in August. Travel Advertisers followed suit, increasing their ad spend on these sites by 364 percent.

"In the past, advertisers had significant concerns with social media advertising," said Gibs. "The considerable increases we've seen in ad spending over the past year suggest that many of these concerns have subsided or been addressed."

"In particular, advertisers that want to connect with core fan bases, such as movie studios, are allocating more and more dollars to online communities like Facebook and MySpace, where they can engage in an ongoing dialog with their target market."

Year-over-Year Percent Change in Online Advertising
Top Social Networking Sites Advertised on, by Industry


Twitter was raising funds at a $1 billion valuation eight days ago. Since then, the amount of money Twitter's supposed to be raising has doubled, and more details about who's supplying the cash have surfaced. As you might imagine, all of this has generated quite a lot of discussion.


Below, we'll try to provide a roundup of people's reactions. Fair warning: incredulous takes appear to outweigh supportive ones by a significant margin. And in case you've missed all of the big headlines, here are the latest facts as reported by the Wall Street Journal: Twitter is raising $100 million from investors including T. Rowe Price.

As for responses, Jason Fried's Onion-like comeback is perhaps getting the most attention. In a fake press release, he wrote, "37signals is now a $100 billion dollar company, according to a group of investors who have agreed to purchase 0.000000001% of the company in exchange for $1." The piece has received more than 160 comments, most of which amount to pats on the back.

Then there's the observation of Dan Frommer and Kamelia Angelova to consider. In an article titled "Twitter Raises Cash Pile As Traffic Growth Slows," they noted, "During August, the company attracted 55 million unique visitors (worldwide) to Twitter.com, according to comScore. That's up about 3 million, or 6%, over July. That's solid, but nothing like its go-go month of April, when it grew by 13 million uniques (~70% m/m) or June, when it grew by 7 million uniques (~20% m/m)."

Still, there were some neutral and even positive remarks about the investment round. David Carr just went for a sort of comedic approach with the following tweet:

Then Larry Dignan raised a very good point about T. Rowe Price's involvement. "This mutual fund firm, which I know well, isn't exactly a run-and-gun investment house," he wrote. "These folks play long term and tout planning for the long run."

So perhaps Twitter's strategy to make money - which at this point, may consist of selling premium accounts and/or data analytics services, along with introducing ads - is further along than most critics thought. Anyway, we'd be interested to hear your opinions concerning the $1 billion valuation in the comments.


You may recall that at SMX Seattle earlier this year, Google's Matt Cutts talked at length about paid links. He touched on the topic of Google being able to read JavaScript, after having advised for so long that JavaScript could be used to keep Google from reading paid links.

When asked about this, Matt said Googlebot had gotten smarter. He noted that Google began changing its messaging on the subject around 2007-2008, no longer recommending JavaScript and instead advising webmasters to use nofollow or a redirect through a URL that is blocked via robots.txt.

Cutts noted that even for onclick handlers in JavaScript, the crawl and indexing team had submitted code so that Googlebot would respect a rel="nofollow". So you can put a rel="nofollow" attribute on a link that's driven by JavaScript, and more often than not, Google will make sure it doesn't flow PageRank even if it's executing the JavaScript.

Cutts did say, however, that if you want to be completely safe, to nofollow or link through things that are blocked.
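As a rough illustration of both approaches Cutts describes (the sponsor URL, the `trackClick` handler, and the /out/ redirect directory here are all hypothetical placeholders, not anything Google prescribes), a paid link can either carry rel="nofollow" directly in the markup, even when the click is handled in JavaScript, or be routed through a redirect path that robots.txt blocks:

```html
<!-- Option 1: rel="nofollow" on a JavaScript-driven link.
     Google says it will generally respect this even when
     executing the onclick handler. -->
<a href="http://sponsor.example.com/" rel="nofollow"
   onclick="trackClick(this.href); return true;">Sponsor</a>

<!-- Option 2: route the paid link through a redirect
     directory that is blocked in robots.txt. -->
<a href="/out/sponsor">Sponsor</a>
```

```
# robots.txt -- keep crawlers out of the redirect directory
User-agent: *
Disallow: /out/
```

The second option is the "completely safe" route Cutts mentions, since the crawler never follows the blocked redirect at all.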

Cutts revisited the topic in a recent upload to the Google Webmaster Central YouTube channel, in response to the following user question:

Now that Google can crawl JavaScript links, what is going to happen with all those paid links that were behind JavaScript code? Will Google start penalizing them?

Matt reiterated that Google has gotten better at crawling JavaScript, and that URLs you put into JavaScript that you didn't think would be crawled might now be crawled and indexed. He says the vast majority of people who use JavaScript links are ad networks, and that Google handles these very well.

He then reiterated the use of nofollow, even within the JavaScript code, along with the use of robots.txt to block out URLs, and redirects.

"We find that the vast majority of paid links are typically not done with javascript," says Cutts. "They're typically completely straight text links. so that's where we've been spending the vast majority of our time."

Cutts says that Google is not currently penalizing paid JavaScript links, but that it may start to down the line. He says it hasn't been a big issue at all in his experience, though.

"If you're selling text links, just make sure they don't flow page rank and they don’t effect search engines," he says.


To most people and companies, a $350 million purchase would represent a huge, long-lasting commitment. Yahoo, on the other hand, apparently wants out of its ownership of Zimbra after a period of two years.

Kara Swisher wrote earlier today, "According to numerous sources, Yahoo has also been shopping around Zimbra, the open-source email company it bought in late 2007 for $350 million."

And although Swisher is almost always dead-on anyway, the report's especially plausible since Yahoo's been selling and shutting down quite a lot of other stuff as of late.

For a little Zimbra history: we first heard about the Yahoo-Zimbra deal on September 17th, 2007. To be fair, a lot has changed since; at that point in time, Jerry Yang was in charge of the larger corporation and Yahoo's stock was sitting pretty at around $25 per share. Now, it's Carol Bartz's show and investors are stuck with shares worth about $17 each.

If Yahoo takes too big a loss on Zimbra, however, the recession and other extenuating circumstances may not be enough to save it from another significant PR hit.

Swisher listed Comcast and Google as potential Zimbra buyers.


Google has added a new way for webmasters to tell it which parameters in URLs they wish to be ignored. It has added a new feature to Google Webmaster Tools called, simply, "Parameter Handling." Google provides the following explanation with the feature:

Dynamic parameters (for example, session IDs, source, or language) in your URLs can result in many different URLs all pointing to essentially the same content. For example, http://www.example.com/dresses?sid=12395923 might point to the same content as http://www.example.com/dresses. You can specify whether you want Google to ignore up to 15 specific parameters in your URL. This can result in more efficient crawling and fewer duplicate URLs, while helping to ensure that the information you need is preserved. (Note: While Google takes suggestions into account, we don't guarantee that we'll follow them in every case.)

The feature is yet another option webmasters can use when trying to eliminate duplicate content issues, which as we all know can be harmful to rankings, even though Google says it's not a penalty. Either way, eliminating duplicate content when possible is likely to be in your best interest.
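Separately from the Parameter Handling setting, one complementary technique for consolidating parameterized duplicates is the rel="canonical" link element, which tells search engines which version of a page you consider authoritative. As a quick sketch (reusing the example.com URLs from Google's explanation above):

```html
<!-- Placed in the <head> of
     http://www.example.com/dresses?sid=12395923
     to point search engines at the preferred URL -->
<link rel="canonical" href="http://www.example.com/dresses" />
```

Like the Parameter Handling suggestions, this is treated as a hint rather than a directive, so it's worth using both where duplicate URLs are a problem.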

To use the feature, just go to Google Webmaster Tools, click on Site configuration, then Settings. There you will find the "Parameter handling" option.

Barry Schwartz at Search Engine Roundtable notes that Yahoo has had a similar feature for quite some time, which it calls the "Dynamic URLs" feature. Ex-Googler Vanessa Fox has a very informative piece on the topic of URL parameters available here.


Google has announced the DoubleClick Ad Exchange, which it refers to as a real-time marketplace for helping online publishers and ad networks/agencies buy and sell display ad space. Prices in the marketplace are set in a real-time auction.

Google says it has three principles for its approach to display advertising:

1. Simplify the system for buying and selling display ads: For example, our DoubleClick ad serving products help advertisers and publishers manage campaigns and ad formats across thousands of websites and from thousands of advertisers.

2. Deliver better performance that advertisers and agencies can measure: We're building a host of new features to help advertisers to run display ad campaigns across the Google Content Network (comprising hundreds of thousands of AdSense partner sites) and on YouTube. We're also developing better measurement and reporting technology so they can figure out what's working and what's not.

3. Open up the ecosystem: We want to democratize access to display advertising and make it accessible and open, like search advertising. We recently launched the Display Ad Builder to help businesses easily set up and run display ad campaigns. 80% of advertisers who use that product have never run a display ad campaign before.

AdWords advertisers can run ads through the exchange with the same AdWords interface. Ad Exchange sites are considered part of the Google Content Network, and Ad Exchange placements will appear like any other Content Network placement in AdWords reports. Users can still use the Placement Performance Report to see where their ads have run, and which ones performed best. Google does note, however, that Ad Exchange sites can choose to remain anonymous, and in such cases, the site will appear in your reports with an anonymized label like "123456.anonymous.google." You have the power to exclude these placements, though.

Google says AdSense publishers will also benefit from more advertisers coming through the exchange. The company recently announced plans to give AdSense publishers a new way to generate revenue by allowing multiple Google-certified ad networks to compete for display ad space on their sites. This is related to the Ad Exchange announcement. The Google-certified ad network capability is powered by the DoubleClick Ad Exchange.

"Certified ad networks are Ad Exchange participants who have gone through an additional certification process in order to be able to bid for your ad space through AdSense," Google says. "We call this feature 'yield management', because it offers you the most revenue for each ad that shows on your site in real time, regardless of whether it's Google or another certified party who can offer you the highest bid."

Publishers using the Ad Exchange can use real-time data and bids to allocate ad space to whatever pays the most at any particular second. They get access to more advertisers, and Google manages billing and payments from the networks, so publishers receive one monthly payment.

Ad networks and agencies get access to more publishers, more ad space, real-time bidding, and a new API, which lets them integrate their own functionality and systems when using the Ad Exchange.

Google's move is largely seen as its way of cutting into Yahoo's share of the display advertising pie. This is one area where Yahoo has been quite successful, as Google has dominated the text ad market.


In some areas and industries, non-compete clauses are a way of life; companies don't want their best and brightest working for competitors soon, if ever. California law isn't too keen on non-compete clauses, however, and it looks like Google wants to accelerate the rate at which Yahoo employees jump ship.

Matt Cutts, who is of course one of Google's most visible figures, mentioned (and perhaps bragged about) one defection on his blog last night. "I was talking to an excellent new Googler that joined from Yahoo this week, and that reminded me that I meant to do this post a little while ago," he wrote.

Cutts then segued into what sounds a lot like a job offer, continuing, "[I]f you're an excellent Yahoo engineer with solid experience in search, Google is hiring. If you want to apply for a Software Engineer (SWE) position in Mountain View, use this job page and the application will make it to the right recruiters. Thanks!"

It's a little bit hard to know what to make of the post; the act of advertising a single opening (or even several) on Cutts's blog seems like the HR equivalent of fishing with dynamite.

Still, the key point, as far as we can tell, is that Google's getting even more aggressive about draining Yahoo of talent.


As you're probably aware, the plan for the deal between Microsoft and Yahoo that dominated many of the headlines this summer is for Bing to take over Yahoo Search in terms of algorithmic ranking. Basically, Bing will handle the back-end, while Yahoo will handle the front-end design of the new Yahoo Search. That should be happening sometime next year.

With Bing taking over Yahoo Search, webmasters are going to need to evaluate their own sites with regard to optimizing for Bing. While optimizing for Bing is generally a good idea anyway, those who see a good deal of traffic from Yahoo Search are going to want to give this some special attention.

Presumably, it doesn't matter if you rank well in Yahoo now, if you don't rank well in Bing. At least it won't matter when the change comes. If you're ranked number 1 in Yahoo, but you're on the 7th page in Bing, you've got some work to do.

Ranking Number 1 in Bing
iCrossing Search Strategist David Shapiro gave some good advice in a recent blog post. To summarize, he said if Yahoo is driving a significant amount of traffic to your site, you need to determine what keywords you rank well for in Yahoo, but not in Bing, and before next year, you need to work on raising these rankings. He also said you need to determine which Yahoo terms you rank 6-10 for that may return "Quick Tabs".

"With the way Bing displays search results for these queries, ranking 6-10 is significantly less valuable," says Shapiro. "Bing returns the top five results for the primary keyword you entered, then displays the top three results for up to five related terms, providing a list of 20 possible listings for the user to select."

Dave Shapiro "If you currently rank 6-10 for any of these keywords you should work on building links to move up into the top five, and focus on achieving top three results for the terms that Bing has chosen for the Quick Tabs, especially considering these terms are more targeted and likely convert better," he adds.

There are differences between Google and Bing, but Microsoft's stance on SEO isn't all that different from Google's. There are different algorithms at play, but both like quality, relevant links and good content. In fact, if you've optimized for Live Search in the past, you should be happy to know that Bing's not that different from it either.

"There have been no major changes to the MSNBot crawler during the upgrade to Bing," Microsoft says in a Bing white paper (pdf) for webmasters. "However, the Bing team is continuously refining and improving our crawling and indexing abilities. Note that the bot name hasn't changed. It will still show up in the web server access logs as MSNBot."

Do yourself a favor and read that white paper. As Shapiro says, you would also do well to make sure your sites are listed with Bing Webmaster Tools. He also suggests that in some cases, it may be a good idea to increase your paid budget, just to circumvent any lost organic traffic in the transition period.

There is a good chance you are getting a lot more traffic from Google than from Yahoo, so if that's the case, luckily you still have that going for you. In addition, social networks like Twitter and Facebook (not to mention blogs) are driving a lot of traffic to websites as well.


Google has released a new feature for its Website Optimizer tool called Experiment Notes. The feature is designed to help users include documentation as part of their testing. For any experiment, users can now add their own annotations.

If you are unfamiliar with Website Optimizer, it's a free tool from Google that webmasters can use to increase conversions by making adjustments to design and text elements to see what works.

"What you put in your note depends on what stage your experiment is in," explains Trevor Claiborne of Google's Website Optimizer team. "If you're still designing your experiment you might include your testing hypothesis or some variations you're considering. As your experiment is running, you might include any external factors that might have an impact on your conversion rate. And as your experiment concludes you might include some thoughts on why variations performed as they did."

Google Website Optimizer Experiment Notes

"Experiment notes are also great if you have several teams coordinating on a Website Optimizer experiment," says Claiborne. "For example, your IT team might update the note once they've installed the Website Optimizer tags on the test page. Your creative team can then start creating variations in Website Optimizer and update the note."

The Experiment Notes feature can be found on the settings page for any experiment in Google Website Optimizer.


Google has plans to introduce a micro payment system aimed at helping online publishers earn additional revenue.

In a document submitted to the Newspaper Association of America in response to a request the NAA made to a number of companies, Google outlined its proposed payment plan.


In a document posted by Harvard University's Nieman Journalism Lab, the company said, "Google believes that an open web benefits all users and publishers. However, 'open' need not mean free. We believe that content on the Internet can thrive supported by multiple business models - including content available only via subscription."

The company said its payment system would be an extension of Google Checkout, and would be "available to both Google and non-Google properties within the next year."

"While we believe that advertising will likely remain the main source of revenue for most news content, a paid model can serve as an important source of additional revenue," Google said.

"Google has experience not only with our e-commerce products; we have successfully built consumer products used by millions around the world," it said. "We can use this expertise to help create a successful e-commerce platform for publishers."

Key features of Google's proposed payment plan include:

  • Single sign-on capability for users to access content and manage subscriptions
  • Ability for publishers to combine subscriptions from different titles together for one price
  • Ability for publishers to create multiple payment options and easily include/exclude content behind a paywall
  • Multiple tiers of access to search, including 1) snippets only with "subscription" label, 2) access to preview pages, and 3) "first click free" access
  • Advertising systems that offer highly relevant ads for users, such as interest-based advertising


In case you haven't noticed by now, Google has gotten bigger. The homepage and search results pages now have a bigger search box, complete with a bigger font when you type in a query.

"Although this is a very simple idea and an even simpler change, we're excited about it — because it symbolizes our focus on search and because it makes our clean, minimalist homepage even easier and more fun to use," says Marissa Mayer, Vice President, Search Products & User Experience at Google.

Besides the larger font in the search box, the text for the suggestions below is larger as well. Here is a look at the difference between the old Google and the new Google:

Google Super Sized

"Over the past 11 years, we've made a number of changes to our homepage. Some are small and some are large," says Mayer. "In this case, it's a small change that makes search more prominent."

"Google has always been first and foremost about search, and we're committed to building and powering the best search on the web — now available through a supersized search box," she adds.

The only reason we're discussing Google getting bigger in terms of design elements is that Google is so big in terms of search market share and in the daily lives of many, many users. And hey, this is the search industry we're covering.


Real-time search is still an emerging concept. At this point, using a real-time search engine will bring you results ordered by time/date. This doesn't always cater to relevancy, which is why there is still a lot of work to be done in this field.

So, if real-time results are based upon time/date and the user's query, it stands to reason that timing and those queries are the most important components in getting your content found in these types of searches.

1. Use Keywords

This seems obvious, but use keywords not only in your content, but also in your titles and your updates. If you're writing an article, you have to consider what people are going to include in their updates if they share it on a social network, whether that's Facebook, Twitter, or anything else.

More often than not, they are going to include the title. If the right keywords are in the title, then those keywords are also more likely to appear in any ensuing tweets, Facebook updates, etc. If someone searches for those keywords, they will be more likely to find your content in a real-time search.

The same goes for your own Tweets/status updates. Even if you are not sharing an article, if you want your update to be found, use relevant keywords. Again, obvious, but true.

Real Time Search tweet

2. Talk About Timely Events

Simply mentioning events that are current will put you directly into the results for any searches having to do with that topic, provided the right keywords are in play. This is a method that could be (and surely is) exploited by spammers, but that doesn't mean you can't provide legitimate conversation and simply put yourself on more people's radars, without throwing links at them every time.

Michael Jackson status update
3. Have a Lot of Followers

A lot of followers on Twitter

If you have a lot of followers or friends on social networks, or even just readers of your blog, you are going to get more people sharing your content. The more people sharing your content, the more impressions of your content will be making their way into real-time searches.

There is no easy way to instantly get a bunch of legitimate readers/followers. It will take some promotion. Provide useful content that people will link to and it will spread virally. Provide clear ways for them to follow you (like links to Facebook pages and Twitter accounts on your blog).

4. Promote Conversation

Whether on your blog or on a social network, spark conversations. Talk about topics that people are interested in. This is tied to number 2. The more conversations you are involved in, the more retweets (and their equivalents on other networks) you are likely to get. And again, this means more impressions in real-time searches.

5. Include Calls to Engagement

I recently talked about why there is more to retweeting than meets the eye for businesses. I mentioned the use of buttons like Tweetmeme's and Digg's. These are buttons you can put on articles that show the amount of retweets/diggs that article has. They kind of act as a meter for engagement.

These buttons are certainly not all-encompassing. They only represent the conversation on two channels, and not the web in general. I'm sure there are other buttons that can be used in addition.

More importantly though, they provide a "call to action" to share the content. People can digg or retweet a story with a simple click, and you're one step closer to being found in somebody's real-time search.

Wrapping Up

Real-time search is much more basic (at least so far) than, say, Google Search. You're not ranking for relevancy. Really, you could hardly call it ranking at all. It's about visibility. That means you have to get people talking about your content/updates.

Social media by nature is viral. Real-time search is nothing more than putting things in chronological order. You have to keep people talking to stay relevant to "right now."


I thought that one of the more interesting topics addressed at Search Engine Strategies San Jose a while back was that of SEO and the publishing industry. This is an industry seemingly at war (at least partially) with entities like Google, even though there are clearly measures publishers could take that would make Google, and Google News in particular, work to their advantage.

Google News is a very useful resource for online news seekers. It seems to get more and more useful as time goes on. For example, Google just started incorporating real-time search suggestions into news queries. Publishers should embrace a tool like Google News that users themselves embrace, and one that can ultimately gain them more traffic.

Google Suggest on Google News

This week, Google has shared some insight into search engine optimization practices for news search. Publishers could learn a lot from the following video.

In addition to the video, Google's Maile Ohye answered a couple of questions about Google News SEO on the Google News blog. For one, she says that adding a city to the title of the publication will not help publishers target their local audience, because Google extracts geography and location information from the articles themselves.

"Changing your name to include relevant keywords or adding a local address in your footer won't help you target a specific audience in our News rankings," she says.

She also says that Google only wants recently added URLs in publishers' News Sitemaps, because they direct Googlebot to the publishers' breaking information. "If you include older URLs, no worries (there's no penalty unless you're perceived as maliciously spamming -- this case would be rare, so again, no worries); we just won't include those URLs in our next News crawl," says Ohye.

A few weeks ago, a patent was granted to Google for "systems and methods for improving the ranking of news articles." The patent was originally filed way back in 2003, so there is no question that some of the details have changed, but within it a number of factors are highlighted, some of which may be ranking factors Google News considers.

In one "implementation consistent with the principles of the invention," here are some factors that are mentioned:

- a number of articles produced by the news source during a first time period

- an average length of an article produced by the news source

- an amount of important coverage that the news source produces in a second time period

- a breaking news score

- an amount of network traffic to the news source

- a human opinion of the news source

- circulation statistics of the news source

- a size of a staff associated with the news source

- a number of bureaus associated with the news source

- a number of original named entities in a group of articles associated with the news source

- a breadth of coverage by the news source

- a number of different countries from which network traffic to the news source originates

- the writing style used by the news source

A couple months ago, Google posted a Google News publisher FAQ page. That answers questions like:

- Can I suggest my personal website for inclusion in Google News?

- What requirements do I have to meet in order to be included in Google News?

- My website was accepted in Google News a few days ago, but I still can't find my articles. Is something wrong?

- Why aren't my images showing up in Google News?

- Why do all my articles have a strange title in Google News, like "Share this" or "By Jane Q. Journalist"?

- What is the "unique number" or "3 digit" rule?

- Should I submit a News sitemap?

- Why can't I see the option to submit a News sitemap in Webmaster Tools?

- Once I've submitted a News sitemap, do I have to resubmit it each time I publish a new article?

- If I submit a News sitemap, will Google News stop crawling my regular section pages?

- How often does Google News crawl my News sitemap? In Webmaster Tools, it appears to be crawled only once per day.

- Why have my articles stopped appearing in Google News, even though they've been showing up previously?
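For those who do submit a News sitemap, the format is a standard XML sitemap carrying Google's news extension. As a rough sketch (the URL, publication name, date, and headline below are placeholder values, not from any real publisher), a single entry looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/business/article55.html</loc>
    <news:news>
      <news:publication>
        <news:name>The Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2009-10-01</news:publication_date>
      <news:title>Example Headline About a Breaking Story</news:title>
    </news:news>
  </url>
</urlset>
```

Per Ohye's advice above, only recently published URLs belong in the file; older entries aren't penalized, they simply won't be included in the next News crawl.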

The moral of the story is that there are a lot of things you can look at if you are serious about getting traffic from Google News, whether you are already being picked up or not. The best part is that most of it is straight from Google itself.

More tips from Search Engine Strategies can be found here.
