In building my latest site Geek Flirt (a social network/dating site for geeks) I realized that lolcats would be the perfect vector for targeting geeky girls, so I gave it a shot. (click for larger images) I’d love to hear what you all think of these.
It’s a basic truth of the human condition that everybody lies. The only variable is about what.
House often didn’t see his patients so they wouldn’t have a chance to lie to him. People lie, data doesn’t.
The same is true with analytics. Designers, developers and owners lie, statistics don’t. People have ulterior motivations and egos, numbers don’t.
I’m probably the last person to figure this out, but I was just doing some planning and worked out this easy little formula for estimating average CPA:
Average Cost Per Acquisition (CPA) = Average Cost per Click / Conversion Rate
David and I just used that formula plus some algebra to come up with a budget estimation.
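In Python, that formula plus the budget algebra looks something like this; a minimal sketch, and the function names and sample numbers are mine, not from our actual planning:

```python
def average_cpa(avg_cpc, conversion_rate):
    """Estimated cost per acquisition: average cost per click
    divided by the fraction of clicks that convert."""
    return avg_cpc / conversion_rate

def estimated_budget(avg_cpc, conversion_rate, target_acquisitions):
    """The algebra: budget = CPA x the number of acquisitions you want."""
    return average_cpa(avg_cpc, conversion_rate) * target_acquisitions

# Example: $0.50 clicks converting at 2% cost $25 per acquisition,
# so 100 acquisitions need roughly a $2,500 budget.
print(average_cpa(0.50, 0.02))             # 25.0
print(estimated_budget(0.50, 0.02, 100))   # 2500.0
```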
It used to be an easy target to warn against thinking about search engine friendliness only after a site was built; every few weeks another “SEO expert” would come out and tell stories of fully built sites that had to be re-engineered to give spiders the best possible access to their content. And while, at least in my little corner of the web world, that lesson has been learned and is starting to sound redundant and obvious, those who refuse to learn from past mistakes will repeat them.
These days I’m finding it very common that things like conversion and usability, grounded in real user testing and analytics data, are only thought about after a site is built. Often highly arbitrary “best practices” guidelines are followed (or not) during an entirely aesthetically-driven design process, which leads to a pretty-looking site with a huge bounce rate and horrible conversions. Then 6 months or a year later an analyst is called in and ClickTracks is expected to save the day.
Some common justifications I’ve heard are “user testing is too expensive” (sure, but it’s less expensive than doing a free redesign because the client’s site looks nifty but doesn’t make them any money), or that only after a site is live do you have “real” data. That’s true only in the narrowest analytics sense of the word; qualitative user testing is certainly possible with 5 to 8 subjects before a site is live, and if even that is too much to ask, someone with knowledge of usability and conversion enhancement should be involved early in the design process, not just graphic designers.
Jakob Nielsen’s most recent Alertbox column touches on this a little:
Having a good designer doesn’t eliminate the need for a systematic usability process. Risk reduction and quality improvement both require user testing and other usability methods.
Designers are just that, designers: not user experience or interaction specialists, and certainly not target users.
And on a side note, I’m curious, what would you guys call an “average” homepage bounce rate?
I heard a client say recently that trying to make changes to an established site to increase its conversion rate was just haphazard guessing, and they were corrected by someone who said that the right way to do it would be to guess and then verify with multivariate tests. I disagree.
The scientific method says we should study the subject first, then make a hypothesis and test it. Study the site visitor’s current behaviors first through analytics or user testing/surveys. Then you can begin to make assertions about what problems said site visitors are having on the site and you can make the necessary conversion enhancements and test them.
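To be concrete about the “test them” step, here’s a toy two-proportion z-test sketch in Python; the visitor counts and conversion numbers are invented for illustration:

```python
from math import sqrt

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did the variant's conversion rate
    really differ from the control's, or is it noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 5,000 visitors per branch; 100 conversions on the old page, 135 on the fix.
z = z_score(100, 5000, 135, 5000)
print(f"z = {z:.2f}")  # z ~ 2.31; |z| > 1.96 is significant at the 95% level
```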
Email me if you want to talk about boosting your conversion rates.
PS: this site’s first birthday was 2 days ago.
Let’s say you’ve got a website. Consumer e-commerce. You get lots of visitors and you have lots of pages. Most of your traffic is from search engines, and your keyword range is wide, with a natural head-tail power-law curve to it. Some of these people buy things, most do not, and like any other business owner you want to figure out how to make more of them buy stuff from you. Nothing on the site is broken or screams for help, like broken search or poor navigation, and the site does sell a few things, just not a lot.
For me, the key has been segmenting visitors into behavioral cross-sections. I’ll first look at the keywords visitors came from and see if there are any keyphrases or root-keyword groups with abnormally low average time-on-site or conversion rate. If one or many do, I stop paying per click on those words (if I was) and exclude them from further research. I do this to rule out keyword intent as a possible source of low conversion rate.
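As a sketch, assuming you can export per-visit rows with the referring keyword, time on site, and a converted flag (the file name, column names, and the 50% cutoff are all my own placeholders):

```python
import pandas as pd

# Hypothetical per-visit export from your analytics tool:
# columns: keyword, time_on_site (seconds), converted (0/1).
visits = pd.read_csv("visits.csv")

by_keyword = visits.groupby("keyword").agg(
    visits=("time_on_site", "size"),
    avg_time=("time_on_site", "mean"),
    conv_rate=("converted", "mean"),
)

# Flag keyphrases performing abnormally badly against the site-wide
# averages; the 50% threshold is an arbitrary starting point, not a rule.
site_time = visits["time_on_site"].mean()
site_conv = visits["converted"].mean()
suspects = by_keyword[
    (by_keyword["avg_time"] < 0.5 * site_time)
    | (by_keyword["conv_rate"] < 0.5 * site_conv)
]
print(suspects.sort_values("visits", ascending=False))
```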
Then I’ll look at entry pages the same way. If I find an underperforming landing page, I’d also expect to see a high bounce rate there. If I don’t have another landing page relevant to the traffic the bad one is receiving, it becomes a candidate for testing.
The same analysis would then be applied to key pages in the shopping and checkout experience, and if I found any irreplaceable pages I’d move them to the testing phase.
The pages selected for testing would be broken down into component pieces, and variants of each chunk would be run through tests. Each variation would differ substantially, and I’d only test 2 or 3 of each, perhaps even including a deliberately “bad” one. Testing a low number of variations lets me run the test quickly and look for which page sections produce the biggest range in performance among their variations. Those are the important parts; a sketch of that selection step follows.
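A toy sketch: for each page section, compute the spread in conversion rate across its variations, then rank sections by that spread (all section names and numbers here are invented):

```python
# Observed conversion rate per variation, keyed by page section.
results = {
    "headline":       {"A": 0.021, "B": 0.034, "C": 0.019},
    "hero_image":     {"A": 0.025, "B": 0.026},
    "call_to_action": {"A": 0.018, "B": 0.041},
}

# Spread = best variation minus worst variation for that section.
spread = {
    section: max(rates.values()) - min(rates.values())
    for section, rates in results.items()
}

# Sections with the widest spread are the ones worth iterating on.
for section, delta in sorted(spread.items(), key=lambda kv: -kv[1]):
    print(f"{section}: range of {delta:.3f} in conversion rate")
```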
Then I’d tap the skills of experts in design or communication for iterative testing and fine-tuning of those important page elements. When your improvements plateau, return to an earlier step and repeat the process.
Ok, ok. Enough with the hint dropping. SponsoredPosts.com has been officially announced and you can submit your email address to receive notification when it fully launches as well as for access to the private beta when that opens. And yeah, how sweet is that domain name?
So if you wouldn’t mind doing me a favor, give it a digg.
I know lots of you out there are either bloggers or SEOs/webmasters, and many of you are both. As you may know by now, I love the paid-post model for link development and targeted advertising, so I’d really love some feedback about what features you think would be super-cool in a PayPerPost-type sponsored-posts marketplace.
Drop ’em in the comments, thanks.
After surviving the webmaster deathmatch at PubCon Las Vegas, we were able to secure beta access to Google’s sweet new multivariate testing system, Google Website Optimizer. I’ve been using it for a few weeks now and I’m liking it so far. For a free product it’s awesome, but it does seem very beta still. Here are a few points of contention I have with it:
Once you start an experiment, there’s no way to edit variations or any of the other experiment settings, and if you stop an experiment you can’t restart it; you have to create a new one from scratch.
When you’re creating variations for a page section to test, you have to hit save before you can preview the page; again, not a huge problem, but a little confusing.
If you’ve created an experiment and not finished building it (like if you messed up) there’s no way to delete it from your list of experiments.
The navigation of the Website Optimizer seems inconsistent; sometimes I can’t find my way back to a screen I was previously on.
Sometimes I log into other Google accounts (like my personal one, rather than our AdWords account), and when I log back into AdWords I can access everything but the Optimizer. If I click on its tab I’m sent momentarily back to the login screen, then redirected to the AdWords account homepage. I’m forced to restart my browser (this is happening on Firefox).
The only success metric it seems I can measure is a direct conversion; sometimes I’d like to just check time on site or pageviews. Perhaps this is doable via integration with Google Analytics, but I’m not sure.
But again, these are mostly little convenience issues, and overall I love the app, especially at the price (free).
They’re probably not going to do a spring conference; instead, likely more of the one-day-type events.
Several of the search engines have announced a unified sitemap structure: MSN, Google, and Yahoo. Ask is invited too.
Yahoo Search’s mission is to enable people to find, use, share, and expand all human knowledge. Find: enable people to find what they are looking for; people search not for the sake of searching but to achieve a purpose. Share: sharing knowledge with people you connect with, and connecting with people you share knowledge with. Expand: Yahoo Answers helps expand the amount of information on the web.
Getting into the search index: link to the URL from an already-indexed URL; use simple, shallow URLs, 3 to 4 levels deep; get good authoritative links.
Unique content, page-specific titles, page-specific meta tags. Separate pages only when there is separate content; multiple domains only when there are distinct businesses.
Excessive doorway pages/domains.
Link farms and massive domain interlinking.
Off-topic links dilute valuable links.
Internal links don’t help or hurt rankings.
Regexp in robots.txt: they support the Google-style format, with wildcards and anchors.
If rules disagree, the more specific statement supersedes the shorter command.
Noarchive; they also have crawl-delay.
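Something like this hypothetical robots.txt illustrates the wildcard, anchor, and specificity behavior they described (the paths and values are invented, not from the session):

```
User-agent: Slurp
# Wildcard: block any URL containing a session-id parameter.
Disallow: /*?sessionid=
# Anchor: block only URLs that end in .pdf.
Disallow: /*.pdf$
# Blocked in general...
Disallow: /private/
# ...but the longer, more specific rule supersedes the shorter one.
Allow: /private/press/
Crawl-delay: 5
```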
Search Builder: customized web search, site search, and customized vertical news search. Strong international adoption; monetization in 2007. He’s going through how it works: 4 easy steps to create a search box; you put the code on your blog or site, and it creates a co-branded Yahoo SERP with the publisher’s branding and gives you keyword reporting. Topic-oriented search, tweaking of organic results, a popular-search-term cloud, and query-history reports for publishers.
You can log in via Yahoo ID, then add as many sites as you’d like, and see your feeds. There is also site authentication: you can manage site feeds, add feeds, and, if you’re authenticated, get info on their processing. You can also go to the Site Explorer page without the management overhead of an ID to submit a page, feed, or sitemap. You put an authentication key on your site, and within 24 hours they’ll check it and you can get lots of useful information on processing. Site Explorer lets you see inlinks and indexed pages; Tim says they’re one of the only search engines that provide the full backlink data.
Ysearchblog will post about what is to come.
Eytan Seidman, MSN
He’s going to talk about Live Search. He’s asking what products everyone is using: IE, Firefox, XP, Mac; who’s doing white hat and who’s doing spam.
Web, news, local, images.
He’s showing Live image search with the endless scroll bar and a slider; there are lots of sexy Ajax features. He’s talking very fast.
He’s showing that if you search for a person in image search, they’ll show related people.
People want a tremendous number of images; you can consume lots of images, and people search for people in image search a lot.
The infinite scroll bar works well in images, not so well in web search.
Local search: maps.live.com, 2D and 3D maps for zooming around and seeing what a place is like.
Core relevance: their relevance is improving all the time.
Crawling: deeper crawls, support for new content types (e.g. HTTPS).
Improving display: using a University of Arizona example, descriptions of sites are reading more like the queries.
Better tools for searching: noodp, crawl commands, and robots.txt.
New query commands:
ip: all the pages on a certain IP. (These look cool.)
And many others…
Search macros, e.g. macro:livelabs.msdn select; macros concatenate search commands.
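For example, the syntax looks roughly like this (the IP address is a documentation placeholder; the macro example is the one he showed):

```
ip:192.0.2.1                pages hosted on that IP address
macro:livelabs.msdn select  run the shared "livelabs.msdn" macro against the query "select"
```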
Sitemaps: they support the new sitemap format.
They want to hear from us: MSNdude, the WebmasterWorld forums, stickymail, and the Live Search blog.
Peter Linsley, Ask.com
They support all the noodp and robots-type stuff.
He’s showing some financial data. They’re the 4th biggest search engine.
Dynamic community clustering and ExpertRank.
Search verticals for blogs, images, news, maps, etc.
Search-enhancing tools such as related search, Binoculars, Smart Answers, and a unique user interface.
He’s showing a SERP with onebox-type results; instead of sponsored ads on the right there are related searches, for narrowing or expanding the search, and related names. With Binoculars you can preview the site itself.
Bloglines was acquired by Ask.
Blog search is currently tracking millions of feeds. Subscriptions in Bloglines will guarantee they know about your feed. Basic SEO and standard blog-specific rules apply to gaining popularity: blogging frequently helps, as does blogging original content that is useful. “Suck a little bit less than the other guy.”
Highly popular blogs can appear as Smart Answers on the Ask SERPs.
He’s showing a screenshot of the Bloglines search for WebmasterWorld, with subscriptions and opinion vs. news-feed blogs.
Best practices for Ask SEO:
ExpertRank and related search mean more opportunities for ranking.
Garner links from pages that are on topic; this yields higher returns than links from off-topic pages. Generic links are not bad.
Configure robots.txt to allow the Ask.com crawler.
Language and country detection:
Search engines have to guess language and country; you can improve their accuracy.
A page may not appear in the SERP if incorrectly detected, or may gain an unwanted “translate this page” link.
Improve accuracy of language detection with plenty of text and by using Unicode.
Avoid spelling mistakes.
Improve accuracy of country detection with a ccTLD or by hosting locally.
Matt is curious about the demographics of SEO: lots of people are older than 30. He’s also asking about Democratic vs. Republican.
The unified sitemaps (sitemaps.org): a common format.
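For reference, a minimal sitemap in the sitemaps.org format looks like this (the URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```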
He’s recapping what’s happened in the last year.
Google Calendar, Gmail, Google Docs, Google Checkout.
Webmaster Central, Maps, Custom Search Engine, Google Reader, Calendar, Gmail, hosted services/apps, Picasa Web Albums, Notebook.
Webmaster Central: he opens up a dialog and it shows 40 different types of errors (404s, robots.txt).
It shows you what you’re showing up for and what people are clicking on.
Pretty graphs were just introduced: how many pages a day are getting crawled, how many kilobytes Googlebot is using, and how long the pages are taking to load.
Google Earth and Google Maps.
Google Earth added 16 historical maps.
Custom Search Engine:
It can index as many sites as you want, anything from site search to a list of as many sites as you want. You can apply a boost to specific sites and use regular results, and you can let people add sites. Bookmarklets.
It has been totally rewritten. Matt has a list of blackhat sites in his Reader, including a few people in this audience.
Live everywhere by February.
Deeper crawling and fresher indexing.
Supplemental results are determined by PageRank.
If you get more links you will show up in the normal results.
126.96.36.199 is still old PageRank for a while.
Most of the new infrastructure will be transparent.
Communication updates: GoogleGuy showed up on WMW 5 years ago.
Today there are 25 Googlers, 5 people working on communication.
A webmaster trends analyst was just hired to watch the blogosphere for problems.
Blogs and forums.
Webmaster console (robots.txt problems, penalties).
Matt says Brian White is the best guy on spam in the world; he’s being sent to Europe to crack down on European spam.
He’s showing a graph of how Google handles spam reports.
It’s in the Google webmaster console; you can also report spam via this console.
Reports receive more weight through Webmaster Central; it’s more trusted.
Is there a webmaster credit history? Matt wouldn’t “put it that way.” He thinks it would be cool, though.
Tim Mayer says it’s all about trust.
There is discussion at Yahoo about implementing a Big Daddy-type cache server.
Someone is asking about scraping.
There are some tests running on different ways to spot scraping.
Do you get penalized by being linked to from a scraper? Matt says no inlink can hurt you.
She’s saying that AdSense is the reason for scraping.
A random slice of AdSense content is mostly good.
Will custom search have AdSense? If you’re not a nonprofit you can get a revshare on a custom search engine.
Will YPN ever go to Canada, or do you hate us? Tim can’t answer that question; he’s on the web search side.
Any plans to allow a linkfromdomain command?
Tim thinks it’s a cool feature; they don’t have a launch date to announce, but it’s cool.
Google has an ip: operator; allinanchor and allintitle don’t get a ton of use from normal users. They might offer this type of stuff from the webmaster console.
He’s suggesting showing where 404s came from in the consoles.
For some really specific phrases, some sites may be on the fringe of having enough links to be in the crawl.
Someone asked what the dilly is with SearchMash, and Matt is showing it.
It’s a place for fun experimenting, a playground.
Brett wants NOODP, noarchive, and such in robots.txt.