Introducing a New Kind of Tweetup: Social Media Battles

If you’ve seen me speak or read my blog, you’ve probably heard me rail against unicorns-and-rainbows advice. Having gone to lots of social media conferences and read a lot of what’s written about the industry, I’m noticing a disturbing lack of healthy debate: nobody disagrees with anyone else (at least not by name, in public).

Taking an idea from hip hop culture, I’m organizing a new kind of Tweetup: Social Media Battles.

Each battle will pit 2 people with opposing viewpoints on a social media topic against each other. Each will get 2 minutes to make their case and a 1-minute rebuttal to their opponent, and then the audience will decide the winner. Fast, simple, and honest. We’ll do a bunch of these, depending on how many people want to step up to the plate and battle.

The first one is going to be during FutureM week here in Boston, on the evening of October 7th at the HubSpot offices in Cambridge, so mark it on your calendars.

If you’ve got ideas for topics to debate, please leave them in the comments. And if you’re feeling brave and want to battle, email me.


The Most ReTweetable Words Finder Tool

I’ve done a bunch of research about The Most ReTweetable Words and people seem to like it, but the overall top 20 is a bit too generic for many niches. So I made a tool that will allow you to find the most ReTweetable words about your specific topic.

This tool will show you the 20 most ReTweetable words about any given topic. Simply enter a keyword (like “marketing”) and click analyze. The tool will return a list of words that were found to be related to that word and highly ReTweetable. It will also display the number of Tweets and ReTweets analyzed to generate the list.

Each word is recalculated after 24 hours, and the tool analyzes up to 1500 Tweets and 1500 ReTweets per word, so the most ReTweetable words displayed represent the most recently contagious topics.

The tool works by comparing how often each word appears in ReTweets against how often it appears in non-ReTweeted Tweets, and identifies the words that show up proportionally more often in ReTweets.
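To make that comparison concrete, here is a minimal sketch of that kind of word-level scoring in Python. It is illustrative only: the function name, the word-splitting regex, and the smoothing constant are my own assumptions, not the tool’s actual code.

```python
from collections import Counter
import re

def most_retweetable_words(retweet_texts, tweet_texts, top_n=20):
    """Rank words that appear proportionally more often in ReTweets
    than in ordinary, non-ReTweeted Tweets."""
    def relative_freqs(texts):
        words = []
        for text in texts:
            words.extend(re.findall(r"[a-z']+", text.lower()))
        counts = Counter(words)
        total = sum(counts.values()) or 1
        return {word: count / total for word, count in counts.items()}

    rt_freq = relative_freqs(retweet_texts)    # word frequencies in ReTweets
    plain_freq = relative_freqs(tweet_texts)   # word frequencies in non-ReTweets

    # Score each word by how over-represented it is in ReTweets;
    # the small constant keeps unseen words from dividing by zero.
    scores = {word: freq / (plain_freq.get(word, 0) + 1e-6)
              for word, freq in rt_freq.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

In the live tool, the two text samples would come from the (up to) 1500 ReTweets and 1500 Tweets pulled for the keyword you enter.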

It does take a few moments to analyze a new word for the first time, so please be patient. And this tool is very beta, so expect some (read: plenty) bugs. There are some known issues with multiple-word searches.


New Experiments Question the Power of Social Proof on the Web

In a lot of my presentations and research, I’ve talked about social proof, and I’ve hypothesized that it has an effect on social and viral behavior online, but I had never actually proven it. So a few weeks ago, I began a series of experiments designed to test the assumption that the effects of social proof and social conformity can be exploited on the web.

In the first two experiments, I split-tested ReTweet buttons showing different ReTweet counts on blog posts. First I compared “0 Tweets” with “776 Tweets.” The results were exactly the opposite of what I expected: after 36 hours, the button showing no Tweets had been clicked more than twice as many times as the other one. The sample size and the difference in performance were large enough to be statistically significant, at a 96% confidence level.
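For context on what a confidence level like that means, here is a rough sketch of the two-proportion z-test that split-testing tools commonly use to score a difference in click-through rates. The click and impression counts in the example are made up, not the numbers from my test.

```python
from math import sqrt, erf

def split_test_confidence(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: returns the confidence level (in percent)
    that variants A and B have different click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = abs(p_a - p_b) / se
    cdf = 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z
    return (2 * cdf - 1) * 100          # two-tailed confidence level

# Hypothetical counts for a "0 Tweets" button vs. a "776 Tweets" button.
print(round(split_test_confidence(52, 2400, 24, 2400), 1))
```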

While discussing these results with Alison, she suggested that they may have been due to a “first post” effect, where people want to be the first to share a piece of content. So I tested a button showing “15 Tweets” against one showing “776 Tweets.”

While the post I used for this test was more popular, the experiment showed a far smaller difference between the two buttons. The “15 Tweets” button performed marginally better, but the low confidence level means there is probably no meaningful difference between the two.

The results of the first two tests had me questioning whether social proof has the online effect I thought it did. My next step was to test the Feedburner RSS subscriber-count button, which I believed was perhaps more likely to exhibit traditional social conformity effects.

I began by testing a button displaying “12 Subscribers” against one displaying “62172 Subscribers.” The higher variation was clicked a slight 0.13% more often, and again the experiment’s confidence level was too low for the result to be significant.

Finally, I decided to test the “first post” effect on the RSS button by comparing a “0 Subscribers” button against the “62172 Subscribers” button. Again, the 62172 version did a little better, but the difference failed to reach statistical significance.

In spite of the insignificant results I found in 3 of 4 tests, I believe my findings are interesting for a few reasons. First, and perhaps most importantly, they represent a first step toward “contagiousness testing,” which would allow marketers to apply split and multivariate testing methods to content virality.
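As one illustration of what that testing infrastructure could look like, the Python sketch below consistently assigns each visitor a button variant and tallies impressions and clicks per variant, so the counts can later be fed into a significance test like the one above. The variant labels and function names are hypothetical, not the setup I actually used.

```python
import hashlib
from collections import defaultdict

# Hypothetical button variants to compare.
VARIANTS = ["0 Tweets", "15 Tweets", "776 Tweets"]

impressions = defaultdict(int)
clicks = defaultdict(int)

def assign_variant(visitor_id: str) -> str:
    """Stable bucketing: the same visitor always sees the same button."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

def record_impression(visitor_id: str) -> str:
    variant = assign_variant(visitor_id)
    impressions[variant] += 1
    return variant  # render the button with this ReTweet count

def record_click(visitor_id: str) -> None:
    clicks[assign_variant(visitor_id)] += 1
```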

A longer test may reveal that showing a higher subscriber count on an RSS button does lead to a small but significant click-through increase.

These 4 experiments also suggest that there may be a powerful “first post” effect that marketers can leverage in certain situations. I plan to do more research into this in the future.
