Super Session: Search and Research on a Rail

Brett’s wife is pregnant.

Tom Hughes
He’s going to talk about brand positioning research. He starts off by taking a high-level view, putting aside CTRs and conversion. While we’re delivering website performance, we’re delivering brand experience; the website is the front line for delivering brand experience. Are we delivering the intended experience for the brand? Optimal brand performance increases the efficiency and consistency of communications, gets everyone on the same page, drives strategy for search optimization, and connects website efforts to sustainable business performance. Where do we have strengths over our competitors? Beware of “green fees” issues: things everyone has to have just to be in the game, things every brand has (like airlines talking about safety); look for advantages that are unique to the brand. What does the customer’s ideal look like, and how can we close the gaps?
Secondary research: Syndicated studies, market data and industry reports.
Primary research: focus groups, individual interviews, surveys.
DIY: talk to customers, talk to employees, gather secondary research information.
Get help with: questionnaire design and analysis of data; leave these to the pros.
He’s using Jaguar as an example: every employee knew that “style” was the brand.

Takeaway: a positioning and messaging template:

  • Position: the essence of how you are compellingly different
  • Identity: who you are and what you do
  • Differentiation: how you are better
  • Significance: why customers should care
  • Messaging support: key messages that deliver the brand (point-at-ables: what can we point at that will reinforce this?)
  • Emotional support: key motivational needs delivered by the brand

Deliver optimal brand experience on the home page.
Incorporate pictures or taglines on site pages that reinforce the positioning.
Define and audit search terms.
Launch banner ad campaigns that reinforce the brand positioning at the brand level, the product level, and the customer segment level.

Dell was at a crossroads for large vertical segments: use generic or vertical positioning? They conducted research to find pain points for the federal government, built a positioning to answer them, and launched a Flash-based ad. (The browser he was trying to demo the ad on didn’t have Flash installed.) The first banner led you to a tour; there was a DC and a Baghdad area to click, each leading to a separate tour, the Baghdad one defense-oriented. They hit every pain point they discovered. It increased click-through by over 200% and drove buy rates over 15%. The campaign was not rocket science; it’s about merging web optimization and customer intelligence.
Keep a line of sight on the position and use it to guide web optimization.

Gord Hotchkiss
Keynotes at the show: Guy Kawasaki. Gord bumped all the font sizes up on his PowerPoint.
He also liked John Battelle; he didn’t see a lot of people writing down his important point, the notion of intent and how search is the navigation of Web 2.0. Gord says that’s really, really, really important.
Why is research important?
Search Marketing:

  1. Right message
  2. Right person
  3. Right place
  4. Right time
  5. Right experience

Accept that the customer is in control. He’s using HP as an example of how not to do this: he’s showing a Google SERP for digital cameras, where HP is showing an ad for desktops and notebooks. The ad sends you to the main page, and there are no cameras on that page. Gord says “what the hell”. He’s offering two hours of free consulting so he doesn’t have to keep pointing it out.
Now he’s showing a brilliant eye-tracking study that he conducted. Guy Kawasaki has his own entry under “how did you hear about this” on iStockphoto.
Intent is key
get inside your customer’s mind
can’t do it based on query
You need to research. Start simple, just do it: ask your customers, “when you bought from us, talk me through the process.”
Personas: stuff all that info into one framework.
How intent impacts searching:
Say you forgot to buy your brother-in-law a birthday present. You don’t really like him.
You know he likes John Irving.
He lives out of town.
You want to spend $20-$25.
You’ll likely start in the sponsored results; the Amazon ad matches the intent exactly.
as opposed to
You have lunch with someone who talks about the book; you’ve read two Irving books, liked one, didn’t like the other.
Most people will look in the organic results with this intent; the second result has a good scent: it’s about a review.
Intent should impact scanning behavior: for research-type queries we tend to “thin slice” sponsored content out of the way;
for purchase-type queries we focus more on the sponsored ads.
80 searchers on an eye-tracking station.
One research scenario looking for info about the Bellagio, another purchase scenario about making reservations.
People tend to take the top three results and break them off into a consideration set; sponsored ads are included. We like to have at least one organic result in the consideration set. For a purchase, the consideration set is more top-heavy; for research intent it’s more bottom-heavy. There was much heavier CTR on organic results for the research queries: 100% of the clicks were organic, while in the purchase group it was split about 50/50.
The purchase group focused on the area needed to make a reservation but ignored the text; the research group scanned the entire site. The user has control; don’t try to fool around with that. You can’t force them to scan parts of the page they don’t want to scan.
Intent is important: understand the target, understand the intent, get the message in the right place, get the message right, get it there at the right time, deliver the right experience.

Glenn Alsup
He’s talking about Malcolm Gladwell’s book Blink, and how a psychologist can thin-slice newlyweds and predict which couples will stay together.
The area between the user and the object: casinos know the stats, but can’t know how a user will interact with the games.
qualitative vs quantitative
Qualitative researchers reject the idea that social science can be measured like the natural sciences; they feel human behavior is always bound to context.
Quantitative researchers believe human behavior is measurable.
User research: understand situation and background, goals and objectives, use cases
tactics: strategic focus groups
model observation
Case study: Agilent, maker of low-cost, technically complex products.
the goal of the research was ways to improve the search.
Surveys show engineers prefer to do their own research.
Heuristics study, qualifying questionnaire, usability lab testing.
Test plan: first pass, 27% success rate (he’s showing a video of the testing; the users are frustrated).
Second pass, 53% success rate, by using cascading menus on the new design; the engineers enjoy the experience more.
The roles: the participant, the facilitator, and observer(s).
There are labs that can recommend participants for your test.
The facilitator is the most important; they are the people who conduct the study. They’ve used PhDs and people with common sense.
The observers are outside the lab, behind two-way mirrors or cameras.
The Denver election commissioner was recently fired. They do lots of testing on voting machines; there is a testing doc that has to be followed before a machine can be approved.
Remote usability: it doesn’t have to take place in a traditional lab. They’re using the restaurant Aureole at Mandalay Bay, which uses tablet PCs for the wine list. You can actually see and hear the interactions with the menus from Denver. Remote testing is much cheaper.
He’s talking about branding and showing airplane tails. There are pictures of animals on Frontier Airlines planes; when they were doing testing, people were asking where the animals were, because they were concerned with which animal they were flying on.

Dana Todd SEMPO
SEMPO does lots of research, tracking baseline data; they track and trend over 100 data points:
sizing the market, pricing shifts, resource allocation, click pricing, click fraud, product demand.
SEO still has strong demand as an SEM tactic.
Migration of SEO and PPC is toward in-house.
Click price elasticity is nearly capped out.
Awareness of and concern about click fraud is growing.
percent of advertisers engage in:
80% engage in SEO
78% in SEM
40% paid inclusion
More people than ever before are offering SEO; the dollar value going into SEO is lower than PPC.
There is a migration of SEO and SEM in-house; dissatisfaction with agencies is high.
Huge influx of training resources; whole new big product markets.
21% say they’re tapped out on PPC budgets. 4 out of 5 could tolerate increases.
Agencies are more concerned about click fraud: a full 33% say it’s a problem they’re tracking, and another 33% think it’s a problem but aren’t tracking it. See Gord to donate click data for the fraud study. The 2006 survey is up; one respondent gets a prize, and it runs until Nov 29th. If you take the survey you get a copy of the results.

Brett Tabke
They’re doing a follow up survey to their older customer satisfaction survey, there were a lot of surprises last time.
He mentions a book, “The Big Question”:
Customer satisfaction surveys suck.
WMW is looking to run a new one on this model: how likely are you to recommend product X to a friend, on a scale of 1 to 10?
1-6 are detractors, 9-10 are promoters.
Promoters minus detractors = NPS, the net promoter score.
He asks that question about PubCon: about 80% are promoters.
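The arithmetic above can be sketched in a few lines of Python (the sample scores below are made up for illustration):

```python
def nps(scores):
    """Net promoter score from 1-10 'would you recommend?' answers.
    Promoters score 9-10, detractors 1-6 (7-8 count for neither);
    NPS = % promoters minus % detractors."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# A hypothetical room of survey respondents:
print(nps([10, 9, 9, 8, 7, 10, 3, 9]))  # -> 50.0
```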


Duplicate Content Issues

Amanda Watlington
She says it’s an exciting topic, because she’s seeing a lot of it. (She’s talking very loudly and enunciating powerfully.)
Typical causes of duplicate content

  • Multiple domains
  • Re-designs
  • CMS
  • Subdomains
  • Landing pages
  • Syndicated, scraped, or shared content

Tools for detection: your ability to search is your best tool. Use a 10-word unique snippet. (She lists a few tools.)
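The snippet trick is easy to script; this sketch just pulls a run of consecutive words out of your page copy and quotes it for pasting into an engine (the helper name is mine, not hers):

```python
def snippet_query(text, n=10, start=0):
    """Build a quoted n-word phrase query from page copy, suitable for
    searching an engine to find duplicate copies of the page."""
    words = text.split()
    return '"' + " ".join(words[start:start + n]) + '"'
```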
Multiple domains
These can occur when ownership of domains changes, when domains are purchased, when there is an IT and marketing disconnect, or when there are lots of personnel changes.
She’s using an example of two domains where the second was purchased later.
Use the unique snippet and check the content; check for redirections (200 vs. 301) and change them to 301s. Remember, when there is one there may be more.
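A rough sketch of that 200-vs-301 check in Python (stdlib only; `fetch_status` does a HEAD request without following redirects, and the advice strings are my own paraphrase of her point):

```python
import http.client
from urllib.parse import urlparse

def fetch_status(url):
    """HEAD a URL without following redirects, returning (status, Location)."""
    parts = urlparse(url)
    Conn = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = Conn(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

def redirect_advice(status):
    """A duplicate domain should answer with a 301, not serve content."""
    if status == 200:
        return "serving content directly: potential duplicate, change to 301"
    if status == 301:
        return "permanent redirect: OK"
    if status == 302:
        return "temporary redirect: use 301 for a moved domain"
    return "check manually"
```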

site redesign
When a site is redesigned and the URLs are changed without an SEO plan in place. No type of site is immune. Platform changes (HTML to PHP); a site with good search traction will simply have a bigger footprint to fix.
She’s using an example site.
Check for more pages in the engines than the site has.
Depending on how the site is built, this can be done via a series of rules and 404 trapping.
Use 301 redirects.
Make sure any actions give the results you want.
Forward planning beats follow-up.
Make sure you have a custom 404.

CMS
These problems are usually driven directly by the platform, and are sometimes the result of how products are merchandised; look for the problem at the product-level page. She’s using a lighting store as an example.
(This is a very disjointed, badly organized session, and her loud delivery is harsh.)
check product-level pages.
Typically this problem requires a complete URL re-architecture. The goal is to have a single URL associated with a product page, no matter which categories link to it. This is not easy, and the fix depends on the site’s system.

Subdomains
Multinationals and corporate sites often have this problem.
Her example is a site that also has information on rats and mice, so they made separate subdomains for each.
It’s not hard to detect if you know the reasons behind it.
To repair this, treat them as separate sites; a content strategy should be used to make sure that the same content does not appear more than once.

landing pages
When a webmaster is testing multiple creatives and landing pages.
Look at the search results. This is not a repair issue, this is a prevention issue.
Use 301s if this happens, but use robots.txt before it happens: prevent it before it happens.
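As a sketch, prevention could be as simple as a robots.txt rule covering the test URLs before the campaign launches (the paths here are hypothetical):

```text
# Hypothetical robots.txt: keep landing-page test variants out of the index
User-agent: *
Disallow: /landing-test/
Disallow: /lp/variant-
```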

syndication and scraping
Large sites are targets; some duplication is the result of contractual agreements.
There are legal issues. One site makes its content available to another; this happens often with manufacturers.
Normally you can just ask where the content comes from.
To fix it you must try to add value and make the content unique.
discuss scrapers with your legal department.

Bill Slawski SEObyTheSea
The main problem with duplicate content is that SEs don’t want to show the same content over and over, but which version do they show? And are there crawling issues related to having lots of copies of the same page?
A large commercial site had 3,500 pages, but Google showed 95k, and he started seeing patterns: one page showed up in Google 15k times with different URLs. It was a Lotus-based site with little widgets that expanded pieces of data; some pages had 21 of the widgets, and each click produced a new URL which was getting indexed. Internal link popularity was suffering, and not all the pages on the site were being indexed. The widget was changed to JavaScript that didn’t create new URLs; the rankings went up and the number of pages in Google went down.
Sites that practice e-commerce but take product descriptions right from the manufacturer: when lots of other people do the same thing, you’re hurting yourself. You should get unique content.
Alternate print pages: you can use alternate stylesheets instead of alternate pages, or use robots.txt or noindex meta tags on the print pages.
Syndicated feeds: most people like to use full feeds, but Bloglines sometimes ranks better than your blog. How do you become the authority?
Canonical domain name issues: 301 the non-www URLs. It is a flaw with MSN’s algo that redirected pages may be classified as duplicates.
Session IDs and multiple data variables: keep stuff simple. Risk mitigation: keep it as simple as possible and don’t let the SEs decide.
Pages with content that is too similar: title, meta, and navigation are all the same. Don’t just change little bits of pages.
Copyright infringement: some pages get copied so much it’s not possible to run to the legal dept every time; the key is to make your page the authority.
Subdomains: a site sells subdomains as a premier service, and each one had a lot of the same pages.
Article syndication: is it worth doing? It brings in pages; if the pages are different enough they might get indexed.
Mirrored sites may get ignored at crawl time; which mirror shows up?

Don’t Crawl the DUST: Different URLs, Same Content
There is a white paper and a Google tech talk. It looks at some of the basic ways an SE can handle duplicate issues, like trailing slashes and www vs. non-www pages.
Shingles are a hash-based way of comparing pages.
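A minimal sketch of the idea: break a page into overlapping word windows ("shingles"), hash them, and compare the sets. This uses Python's built-in hash for illustration; real systems use stable fingerprints such as Rabin hashes.

```python
def shingles(text, w=4):
    """Hashed w-word shingles (overlapping word windows) of a page's text."""
    words = text.lower().split()
    return {hash(tuple(words[i:i + w])) for i in range(max(1, len(words) - w + 1))}

def resemblance(a, b, w=4):
    """Jaccard overlap of two shingle sets; close to 1.0 means near-duplicates."""
    sa, sb = shingles(a, w), shingles(b, w)
    return len(sa & sb) / len(sa | sb)
```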
DustBuster looks at rules for how a page’s URLs are formed and how there may be dupes, and decides what it should ignore and what it shouldn’t, like two URL variants of the same page where one is a PR10 and one is a PR7.
The DUST paper does not detail which pages are kept and which are discarded.
Try to avoid duplicate content as much as possible. You can’t control people outside of your site so much, but you can control your own site.

Collapsing Equivalent Results.
An MSN paper: it tries to go into which pages should be shown out of a set of dupes. His example site has .com mirrors.
How should the SE handle these?
Results storage: it keeps all the results and uses a query-independent rank factor, like PR or page quality, plus navigational context.
It selects .com because “users prefer .com”; users prefer the shorter version of the URL, prettier URLs. It might also consider the navigational version: whatever has fewer redirects and less latency. You could be clicking on one URL but actually being sent to another. You should look at white papers and such to gain insight into what engineers think may be important, but don’t take all of this as gospel. Keywords in the URL may also indicate which URL will be shown.
Some sites have lots of different TLDs, and the local TLD may be the one an SE shows you. Which page is more popular, by link popularity or clickthroughs?
There are lots of potential ways content can be duplicated, and it can negatively influence your site.
You should plan carefully. SEs are likely working on solutions, and some decisions about which page to display may surprise you.

Tim Converse Yahoo
He says all the suggestions so far are right on.
A lot of what he does is anti-black-hat measures, and he says that might make him sound adversarial.
He uses “dupes” as an abbreviation.
There are two bad user experiences: showing the same content 10 times, and showing only one result.
They use crawl-time filtering, index-time filtering, and query-time filtering. They only show 2 results from one site.
They often want to retain some results instead of just not crawling them, because they use local preference for searchers, slight variations, and redundancy.
Alternate document forms, legitimate syndication, multiple regional markets, partial-dupe pages from boilerplate, and accidental duplication are all legit reasons why content may be duplicated.
Accidental duplication may be due to session IDs and soft 404s.
Now he’s switching to blackhat issues.
Dodgy duplication issues:
duplication across multiple domains unnecessarily
aggregation, which can range from useful to abusive
identical content with minimal added value
repeated statements
Scraper spammers: they attempt to combat this stuff and do their best to show the original content, and they won’t say much about how. This problem is becoming worse.

Brian White Google
He’s going over the different types of dups.
duplicate detection happens at different places in the pipeline. different types of filters are used for different types of duplicated content. The goal is to serve one version of the content.
He says to pick a specific URL and block the non-preferred version, for example with robots.txt; use 301s, and block printable versions.
Do you call them fragments or named anchors? Most people like to call them named anchors. These may cause issues: the whole page will be used, and anything after the # will be ignored.
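That behavior is easy to see with the standard library (the URL is illustrative):

```python
from urllib.parse import urldefrag

# Crawlers index the page once; everything after '#' is client-side only.
url, fragment = urldefrag("http://example.com/guide.html#chapter-3")
print(url)       # http://example.com/guide.html
print(fragment)  # chapter-3
```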
Boilerplate can cause issues: lots of boilerplate = lots of similarity.
the same content in different languages is not duplicate
the same content on local TLDs is not duplicate
multiple domains with the same content should be 301’d.
Syndication: if you syndicate your stuff, include an absolute link to the original article. If you use syndicated articles, just be aware they may not rank.
There are several ways to handle scrapers; SEs are working on it, including blog bots spoofing Googlebot.
make your pages unique and valuable.

Google just said: nofollow should really be called “untrusted”. They will follow nofollowed links for discovery, but a link with the nofollow tag won’t affect rankings.


Site Structure for Crawlability

Tim Converse Yahoo
He doesn’t work for the crawler group, but he knows what crawlers find annoying. Crawlers are simple-minded: visit a URL, store its contents, extract all links, decide when to crawl those links, and decide when to refresh the page. Their crawler runs continuously. External links and domain registration are how they find new domains. Crawlers are a few years behind modern browsers in JavaScript, Flash, and CSS; your links should be in vanilla HTML. Don’t rely on fancy other things being (or not being) processed.
The perfect site for crawlers:
all pages reachable from the root page, in a tree structure; link to a sitemap.
links extractable from plain HTML (view source, or use an old-school web browser).
every distinct URL matches up with distinct content; multiple URLs to the same content look like dupes.
limited dynamic parameters.
sessions and cookies that do not determine content.
They won’t crawl stuff that is blocked in robots.txt.
They do pay attention to internal anchor text.
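The "plain HTML links" point can be checked mechanically: a crawler-eye view of a page is roughly what a bare HTML parser extracts from `<a href>` tags, nothing more. A sketch with the standard library (the function names are mine):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect hrefs from plain <a> tags: roughly what a simple-minded
    crawler sees (no JavaScript, no Flash, no CSS tricks)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(base_url, html):
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]
```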
Dynamic sites
“Constructed on the fly” is almost an obsolete meaning; here it means the URL has arguments after the question mark. SEs look at the URL: if there are lots of parameters, the content is likely to be duplicated.
Stay away from session-ID URLs.
Map non-? URLs to dynamic content,
or provide a session-ID-free way to navigate the site.
Soft-404 traps are bad: if the URL is bogus, send a 404.
The worst case is a status 200 on a bad URL with a link that doesn’t exist; a soft-404 trap makes it hard to get to real content.
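One way to test a site for a soft-404 trap, sketched here with the standard library: request a path that cannot exist and see what comes back (the classification strings are my own shorthand for the point above):

```python
import uuid
import urllib.error
import urllib.request

def probe_bogus_url(site):
    """Request a guaranteed-nonexistent path and return its HTTP status."""
    bogus = site.rstrip("/") + "/" + uuid.uuid4().hex
    try:
        return urllib.request.urlopen(bogus, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code

def classify(status):
    """Interpret the status a site returns for a deliberately bogus URL."""
    if status in (404, 410):
        return "hard 404: correct"
    if status == 200:
        return "soft-404 trap: bad"
    if status in (301, 302):
        return "redirects bogus URLs: check the target"
    return "unexpected"
```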

Evil bots may not obey robots.txt, but all major SEs will.
Don’t accidentally screen good bots out; robots.txt files often screen every bot but a single one.
Yahoo added extensions to robots.txt: wildcard syntax. Google also supports the same syntax; use it to screen out dupes.
When the rules disagree, the longest pattern wins.
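A toy model of the matching rule, assuming the `*`/`$` wildcard extension described above (this is my sketch of the semantics, not any engine's actual code):

```python
import re

def robots_match(rules, path):
    """Pick allow/disallow for a path from (verdict, pattern) rules,
    where patterns may use * (any characters) and a trailing $ (end of
    URL). When rules disagree, the longest matching pattern wins."""
    best_verdict, best_len = "allow", -1  # unmatched paths are allowed
    for verdict, pattern in rules:
        regex = "^" + re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
        if re.match(regex, path) and len(pattern) > best_len:
            best_verdict, best_len = verdict, len(pattern)
    return best_verdict
```

So with `Disallow: /print/` and `Allow: /print/index.html`, the index page stays crawlable because its pattern is longer.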

moving sites
301 to the new site and hang on to the old domain as long as possible; as much as they can, they will migrate ranking to the new site (this is being reworked). Map old paths to new paths rather than just redirecting to the root.

site explorer
It shows links, when the last crawl was, and what the representation in the index is.
Authenticate your site at Site Explorer and specify a feed of URLs to crawl.

He’s now showing a list of resources and help pages.

Brett says there will be a big announcement from the SEs tomorrow at the super session about this stuff!

Vanessa Fox Google
Whatever works well for visitors will work well for the SEs. How well can a visitor navigate your site? Are all the pages crawlable via links? How accessible is your site, i.e. with extras turned off or in a mobile browser?
Take an objective look using a text browser, have someone look at your site and see how easily they can find things.
Use technology wisely: have alternate ways to access stuff. Don’t use splash pages, minimize Flash, use alt attributes, don’t put text in images, don’t use frames, minimize JavaScript. Use transcripts for videos.
Get the most out of your links: use anchor-text links, minimize redirects, and make sure every page on your site is accessible by static text links. Have an HTML site map and link to it from your main page.
Webmaster Central: crawl errors, bot activity, robots.txt problems, use a sitemap file.
Check your site in the SERPs: she’s using an example where a site: query shows redirect pages, bad title tags, and incorrectly optimized Flash pages.
If only the URL appears in the SERPs, you may be blocking crawler access.
If all titles and descriptions are the same, that is bad; have a unique description and title tag on every page.
If your description is “loading… loading… loading”, you’ve got problems.
She’s showing webmaster central.

Mark Jackson
Don’t haphazardly jump into a redesign. He’s showing a picture of Michael Jackson young and old; it says “Be Careful”.
Think about SEO early in the process, beginning to end. Make sure you have the content. Use keyword research for information architecture. Balance cool design with strategy; no all-Flash sites. Assign keywords to each page and each URL, about 3 phrases per page. Validate the site and make it Section 508 compliant. Copy should complement keywords. Use 301s.
Information architecture
Authority sites are very deep. Write good content for each product and put keywords in descriptions. Keep old content, like archived newsletters. Study analytics; current pages could be ranking.
Size matters: Wikipedia is a very, very deep site, and it’s an authority site.
SE-friendly doesn’t have to be ugly (he shows an ugly site).
He’s using an example of a site with a Flash popup on the homepage. You should have keywords on your homepage and have navigation.
Avoid image, Flash, or JavaScript navigation; it is better to use CSS/text navigation. Use keywords in anchor text, and name pages using keywords.
Validate all pages on the site, especially the homepage and sitemap. Limit use of JavaScript; put it in an include file and use CSS. Watch page load times. Host on Apache or IIS and use URL rewriting.
He’s using an example of Cookies by Design, where they cut the page size from 120k to 26k and put keywords in the left-hand navigation.
Don’t use spacer.gif. The site ranks pretty well (#19 on “gift baskets”?).
250 words of content for interior pages, 400 on the homepage; no keyword stuffing, make it readable. Link to relevant pages; internal linking is very important. Don’t use “Click Here”.
He’s using an example where they use “web conference” in internal links and they now rank at 25.
Watch URL structure and the canonical domain: pick www or non-www and stick to it.
Use URL rewriting.
Use 301s.
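The pick-one-host rule can be sketched as a normalization function (stdlib only; the www-preference flag and the fragment-stripping are my illustration choices):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical(url, prefer_www=True):
    """Normalize a URL to one canonical host form: lowercase the host,
    apply a single www/non-www policy, and drop the fragment (crawlers
    ignore it anyway)."""
    scheme, netloc, path, query, _ = urlsplit(url)
    host = netloc.lower()
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[4:]
    return urlunsplit((scheme, host, path or "/", query, ""))
```

Every internal link would then use the canonical form, with a 301 covering the other host.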
Hyphens or underscores? Very unimportant; use either.
Unique title, description, and keywords, put the most important words first.
don’t stuff keywords tag, but use a call to action in your description tag.
Now he’s using an example: they put modular homes/manufactured homes/mobile homes in the title tag and they rank well now. Wikipedia beats them in some places (size matters).
use good IA, use keywords, keep URLs static and use 301s.

Brett Tabke
He’s talking about WebmasterWorld. They have about 2.5 million pages, and the content can be rolled a lot of different ways. They also have a mobile version of the entire site, and a printer-friendly version: something like 20 million pages the SEs could possibly see. About two years ago the rankings totally disappeared because a bot got into the printer-friendly version and caused duplicate content problems.
They just reworked the entire site’s URL structure, the most challenging programming he’s ever done. They didn’t want to redirect the old URLs; they just used the new URLs for new content. The new keyword URLs worked better for SEs; they’re turning up for more keywords. Users’ bookmarks were a big cause of confusion. They’re using a dozen different variations. The SEs haven’t hassled them, and they’re indexed better than ever before.
The site is all custom software Brett wrote. He set up a big network of sites for John Deere and got a lot of questions about how to structure a site. Try to find the sweet spot between SEs and users: the SEs are digital and the users are analog. Just before Infoseek was sold they were working on a theme engine, where a site would be indexed as a whole for 20-50 keywords. He brings up the theme pyramid table and the WebmasterWorld table. The structure is simple and based on keywords. They call it the “longer tail”.

someone suggests making your site so that a blind person can use it.

Someone is asking about hidden divs and saying Google traffic has tailed off.
Vanessa says the divs should be viewable when JavaScript is turned off; hide the divs via JavaScript.

If you use a Google sitemap, the sitemap only augments the natural links. “In addition to the free crawl,” says Yahoo.
Google seems to be case-sensitive; Vanessa suggests redirecting to the chosen uppercase or lowercase form. Yahoo agrees.

Someone is asking about page load time.
Yahoo says page load time and page size don’t matter for the crawler, but slow pages are not good for users; Yahoo doesn’t penalize pages that take a long time to load or that are large. Google agrees: there is a timeout, and they’ll only use up a certain amount of bandwidth on a site. Google won’t penalize for long load times or large size; Matt did a post about it. It’s about sales, not traffic.

Someone is asking about multiple urls pointing to the same site.
There will be a session about duplicate content after lunch.


European and International SEO

Wow, the exhibit hall just opened, and the Google guys were taking 100 beta testers for their Website Optimizer; it was mobbed, pushing and shoving. Brutal.
Christine Churchill is asking how many people are doing international marketing, quite a few hands went up.

Dixon Jones:
He says he only speaks English, but Europe is his home turf. He points people to MSN Spaces for the speeches and things that he does.
Receptional is an international marketing agency; they have two top-10 rankings for “internet marketing”.
Europe has 90MM more internet users than the US right now, and another half billion are coming online soon. There are 40MM English speakers in the UK. In the UK and Sweden, 10% of all ad spend is online; that’s bigger than the US.
This should tell you that your market is 5 times larger than you previously thought.
The US is only the 7th richest place by GDP, but by population it’s the richest country. Real GDP growth rate shows the “wealth of the future”: China is #10, India #24, the US 139th (between Namibia and Guatemala).
Attack is the best form of defence (unless you are Guy Kawasaki, in which case military metaphors are banned).
Language differences can be a boon for SEOs: in English, “london” has 50 million results, but in Spanish only 4 million, and in Greek 700k. Yet in a few years there will be as many people speaking Spanish as English on the web.
He says there is no easy way to do international SEO. You will need a website in the local language from top to bottom.
You can do the SEO yourself in the language, or you can find a local SEO.
If you do try to do SEO yourself:
Buy a TLD for the country,
or host the site in the target country.
Regionally theme your links.
Use native translators.
Minimize legal issues and “troops on the ground” until you are confident.

Jessica Bowman
This is her second day on the job with her new company.
She asks how many people have optimized, or are going to optimize, for a language they don’t speak.
Keyword research
You need a seed list: brainstorm with a native speaker and go to the competition.
Often other countries’ SEOs aren’t up to snuff, so keyword lists on competitor sites won’t always be good.
Keyword tools are a challenge: Wordtracker and Overture are not good; she recommends Trellian’s Keyword Discovery.
The volume of search results in English tends to correspond to the volume of searches. Same with competition numbers.

Page Optimization
Find a native copywriter, but it’s not always an option.
Translation companies: not many know SEO.
Learn the translation process inside and out; breakdown is inevitable.
They use automated tools, which are not good for SEO. Get a demo of these tools.
Sometimes translations are out of sync.
Consider SEO and non-SEO versions of the translation data: sometimes certain words need to be translated a certain way, but they won’t want to do the whole site like that.
Check the translated copy. Examples of problems:
“you are not allowed to login” for an incorrect-password error;
wrong words;
different words referencing the same thing on the same page.
Make sure your validators check against the English version; otherwise they’ll miss inaccuracies.
Sometimes copy comes back very long and doesn’t fit in the allocated space.
Content added to expand on the meaning: sometimes the translation services add stuff (with legal issues) that was purposely left out.
Train the translator on SEO; give examples of good and bad.
Send useful info: screenshots, whether it is a button or a link, character limits.
Send corrections back to the translation company.
“Car rental” vs. “car hire”: the company could not translate UK English.
Germans have high expectations; translation needed to be accurate.
German tends to come back 3 times longer than the English.
German grammar is a killer for SEO, like concatenation.
There are two forms of the German language: formal vs. informal.
Treat UK vs. American English as different languages:
the British are more verbose.
You need a native speaker to review copy.
If you are translating into Spanish you must consider which region you’re targeting:
use North American Spanish for US Spanish speakers;
Spain’s Spanish is very different.

Michael Bonfils
10 steps to crack Southeast Asia: about 300 million internet users, the majority Chinese speakers (118MM), plus Japan (86MM) and South Korea (34MM). In China only 9% of the population is online; if China had the same penetration rate as Taiwan they’d have 784MM users, which would almost double total internet usage.

  1. understand your audience
  2. understand domain names
  3. hosting
  4. SEs
  5. translations
  6. keywords
  7. paid search
  8. organic search
  9. analytics
  10. red tape

Understanding your audience: what is funny and creative over there may not be funny and creative here. (He’s showing some corny cartoons.) The internet is used for researching, but the buying is often done downstairs. Products are cheaper; branding and research work great, but direct marketing may not work as well. In China about 50% of all usage comes from a cafe.

Know the major SEs: the big SE in China is Baidu (62.1% of the market); Google has 25% and Yahoo has 10%. Baidu's growth is huge. In Taiwan 90% use Yahoo and 5% Google; in Japan 55% use Yahoo, 35% Google and 5% MSN; in South Korea 63% use Naver, 14% Daum and 11% Yahoo Korea. He's showing a "Bruce Lee search relationship chart".

Get an Asian domain name: make sure your domain name is pronounceable. Google is not pronounceable in China; they call it "googoo" because they can't make the "gle" sound. The TLDs are .cn, .kr, .jp, .tw and .hk; look at competitor TLDs. You can usually purchase Asian TLDs easily.

Get hosting: you can host here, where access is very slow for Asian users but economical and easy to pay for. Hosting in Asia can be very expensive. Chinese hosting is very fast for Chinese users, but providers are politically regulated (the bigger, the more regulated). Online payment systems can be hard to find; you often have to wire funds. He showed some hosting companies, e.g. Xinnet.

Translate well; don't make mistakes:
Pepsi: meant to say "brings you back to life"; it really said "Pepsi brings your ancestors back from the grave."
Coca-Cola: they found a word that sounded like Coca-Cola, but the word means "bite the wax tadpole."
ad:tech: "cost per lead" translated badly to "cost per leadership."

Develop your keywords: the same expression in English may expand to 15 variations across Simplified Chinese, Traditional Chinese (Hong Kong) and Taiwanese.

Implement paid search: start with paid search, since SEO is a "tough nut to crack". In Baidu, paid listings are sometimes placed in the center of the organic listings.

Organic search: SEO is similar to the US. Link popularity is not as strong a factor (in Google it still is). Think Asian (WebCrawler- and Infoseek-era tactics); think local: local company favoritism, local hosting, local domain. (He shows a list of submission links.)

Understand reporting: daily spend reporting; he shows a Baidu keyword report. It's hard to find cost-per-impression reporting in China. Yahoo is similar. Implement Google Analytics.

Understand the red tape: in China customer service is very limited and not as friendly as in the US; they'll hang up on you, etc. No APIs yet; eBay is one of the only companies that has an API from Baidu. CPM reporting is missing. Kickbacks and discounts are very commonplace there, as is bribery for favoritism; SEs sometimes give discounts to agencies. Poor payment methods: you'll probably have to wire funds. Political favoritism: Chinese competitors are partnered with the government; money flowing in is fine, but they don't like money flowing out. High-speed access is limited; most people are on 56k, except in Hong Kong and Taiwan.

Barry Lloyd is in China, so:
Andy Atkins-Kruger:
He says Brits are verbose, heh. He runs a multilingual search blog.
He's showing a sort of Venn diagram of SEO, PPC, PR and something else (I missed it).
Now a graph of how the internet grew; Africa, Asia and Europe are on top.
Research is key. He uses "football boots" as the British term, which would be "soccer cleats" in American English, but Spain would be the biggest market for that term. You can choose central control or an in-country agency.
Build a keyword list.
You do need a local domain; it's not always possible, but if you can get it, you should. Local links are also important, but beware of duplicate content issues.
Most of the time the number of SERP results is correlated with the number of searches.
Language pitfalls for multilingual keyword research:

  • Plurals and singulars are often very different across languages: football boots vs. scarpe da calcio/scarpa da calcio. "Football boots" is never searched for in the singular, but in Italian people search for both.
  • Prepositions and particles: people search both with and without prepositions.
  • Accents: people search both with and without accents; 50% of searches by French users omit accents where they're supposed to appear.
  • Alternate characters: Germany had a spelling reform in 2000, so some people search with the correct characters and some don't.
  • Aggregation/concatenation: keywords often run together, and people commonly split terms when they search (there's less competition on the less-correct forms).
  • Declensions: SEs often don't do well with non-Roman languages; local algorithms are written to handle these better.
  • Free online translation tools mistranslate; don't use them.
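The accents point is easy to automate: generate the accent-stripped variant of each keyword so both forms can be tracked. A minimal Python sketch (the French keywords are just illustrative):

```python
import unicodedata

def strip_accents(term):
    # Decompose accented characters (NFD), then drop the combining marks
    decomposed = unicodedata.normalize("NFD", term)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

# Illustrative French keywords; both forms belong in the keyword list,
# since many French searchers omit accents
keywords = ["référencement", "qualité", "vêtements"]
variants = {kw: strip_accents(kw) for kw in keywords}
```

Note the same trick only partially covers German umlauts (ü becomes u, not the conventional ue), so a language-specific mapping may still be needed.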
There are many markets where Google isn't first; go local.

Read More

John Battelle Keynote

John said that New Orleans PubCon was the first time he talked about Federated Media, and if this audience didn't boo him off the stage he felt it could be a good idea.
He's talking about digitizing the back office, showing a graph of people who were involved in technology: a little over 10 million in 1970. Then digitizing the front office, with over 100 million people touched by technology in the 90s. His Mac just showed a blue screen and he said it's an Intel Mac and it's "totally screwing with him". He says it's bipolar; it doesn't have any idea what it wants to be. (He asked somebody to blog it, and I just did.) Now the graph shows the digitization of the customers (web 2.0), with lots of people touched by technology. He says he wishes he had made a bigger deal about search-as-interface in his book "The Search".
He's showing the DOS interface with dir, then the desktop GUI with its "hunt and poke" interface. He points out there is still a list inside the window.
He says we're in the 1.0 of search, the command line of search-as-interface, with a list of results, like dir. The big difference is that we're using natural language now instead of arcane computer code. Natural language as a way of navigating computing is a huge jump, but we're just getting started; we're still at the command line. Search as steering wheel.
He's talking about using a handheld device to search for a wine label while you're shopping: price comparison, reviews, collaborative filtering, company info, merchants that are selling it, buy now, get it shipped. He says product inventory information has to become open and available.
Search is the driver of web 2.0 businesses, and it is a turning point for all forms of marketing.
Intent over content: old marketing was content driven, but with search it is intent driven. Intent becomes a proxy for audience. It is non-interruption marketing.
Search drives social media. You expect every community to know the rules of social media, and the community expects businesses to know how to deal with it. Marketing becomes dialog.
It used to be all about distribution; print media distribution channels charge extortionate rates, but now it's all about attention.
Conversation over dictation
New marketing is driven by permission, not interruption.
How do you do brand marketing where people are already talking to each other and to the site? Turn over control to the consumer. The best businesses on the web let their customers help build their business: Amazon, eBay, MySpace. Can we do that in marketing?
He's using the ThinkPad as an example: they were about to go with a Chinese manufacturer and consumers were concerned about the brand being tarnished. He suggested that IBM take customer input, but the company freaked out: "our customers buy the stuff we make". They ran a poll about titanium vs. black ThinkPads. Titanium won; they're going to make both.
Microsoft was running one creative across all FM properties and the authors were not happy. One suggested context-specific “author copy” and they ran both. The author-copy improved performance 60%.
Symantec started a blog and wrote a story about zero viruses on the Mac, and it was Dugg; now Symantec only runs RSS ads across FM properties.
Cisco wanted to drive discussion of "The Human Network". They were going to put up a wiki with all the FM authors writing what they thought the human network was; instead it was suggested they move over to a commercial wiki, and people voted on which definition they liked. Later a Wikipedia entry just happened on its own.
Dice is an IT jobs board. They ran a "rant banner": it lets you type into a little box a rant about how you hate your job, and every rant is shown across all properties. Almost an IM chatroom. Conversation through marketing. The average interaction time with this ad was 8 minutes.
There are two ways to work with FM: self service or the sales department. Reader surveys across properties feed into the self-service system. He logged into the self-service system (with a 3-character password!). He's showing the demographic targeting of properties; it shows relevance data for properties that match your demo search.
He noticed that his search blog was getting into the tens of thousands of readers. Boing Boing had a $500 bandwidth bill for half a million visitors, and they came to John with this "problem". So they bundled this into FM and built some reporting and analytics software. Their mission is to cultivate relationships with authors by focusing on quality and engagement. All advertising is approved by authors.
FM Mores:
voice and POV
FM has almost 100 sites and more than 750MM impressions/month: great demos, over a million a month in revenue, a sales force of 15, an engineering staff of 4, an author services staff of 4, and almost 1,500 advertisers on the self-service platform.
Church and state: editorial and advertising.
The thing that blogging allows that old media doesn't is transparency and trust. He says that you shouldn't "sell your words on your editorial site". The blogger has a conversational relationship with the audience.
Google's radio, print and TV ads.
He called his conference "web two oh" (no decimal point). You have to have a conversation with your audience. FM is in the "cream" business and Google is in the "milk" business; they're partnering with Google.
What's the most unique thing he's seen a blogger do to increase readership?
Lists do really well, but there is no replacement for a high-quality conversation and author passion. There are two ways to publish: write about what you're passionate about, or look where the market is. The most passionate and high-integrity voices always win in blogging.
Somebody is asking about the "legacy business of hosting", saying he's one of the original hosts that nearly everyone used to be on, and asking how to take customer input.
John suggests opening a forum or site where you can ask people what they want, listen to mass feedback and engage in a conversation.
Someone is asking about pricing models for "cream" vs. "milk": is cream always going to be CPM? Why not CPC or CPA? And what percentage of advertising is rejected by FM authors?
Cream pricing will generally always be CPM, because it is difficult to measure when the "loop is closed" for brand advertising. When an author can say no, a yes is an affirmative vote; it's about 98% yes.

Read More

Afternoon Opera Keynote Jon S. von Tetzchner

Brett is talking about how long he's been using Opera and how much he respects it for charging for a product the two other major competitors gave away for free. He's had over 500 computers and never run an anti-virus program, because he doesn't run a Microsoft email client and he uses Opera.

Jon starts by talking about what Opera does as a company. They've been doing browsers since 1994; they set up the first intranet in Norway. They take one piece of code and make it run on any device; as opposed to Pocket Internet Explorer, Opera uses the full browser on all devices. People wanted them to make an OS, an office suite or go cross-platform, and they chose the third option.
He says the browser is the glue between devices.
It's a worldwide web, so they're a worldwide company with offices all around the world. People come from all parts of the world to work for Opera.

  • 39 million people have downloaded Opera on the desktop this year
  • 150 million downloads since 1996
  • 40 million cellphones shipped with Opera preinstalled
  • 7 million active Opera users
  • more than 500,000 community members

He says they work with many of the best vendors around the world.
They have 340 people in the company. He says they dream about browsers; it's the only thing they've done for 12 years. Jon is one of the two founders; the other founder passed away in April.
He says "dedication leads to innovation". They focus on what people want: tabs, speed, running on any hardware (even very old, crappy machines), sessions, zooming (Opera did it in '96; IE is just doing it this year), mouse gestures and security.
He says "there's only one web", so browsers should work the same on all platforms.
There is a trend that many new apps are coming out as web apps and practically all of these are based on AJAX. People are making widgets that will run anywhere there is opera and they’re trying to make their widget standards open with the w3c.
They are very pro-standards, they passed the Acid2 test, they support AJAX, SVG, and 3D-Canvas.
They are all about finding ways to protect you; they would rather err on the side of security.
They have cutting edge development tools. Viewing the mobile version of any page, and viewing the DOM tree.
They use the same code base on all their different platforms.
They do data compression in the opera mini to speed up browsing and cut down on bandwidth costs.
The Nintendo Wii will ship with Opera, as do the Nintendo DS, the Sony mylo and the Nokia 770.

Why do developers want to code for Opera?
It's based on standards, so coding for Opera is good for you; they have a significant number of desktop users, and especially mobile browsers.
They have a group of people who work with webmasters to make sites work in Opera; one of the biggest problems is "if (opera) do nothing"-type code.

Where do you guys find the balance between standards, user experience and security? (Brett uses styled forms as an example.)
They try to err on the side of security, but they often find ways to make standards and user experience work. They're working on high-security certificates.

Firefox is gaining momentum and IE7 is coming out; is Opera getting squeezed out?
They’ve outlived all the other browsers that people have said they couldn’t compete with.

Read More

Corporate Media Site SEO Management

This panel is also called “big fucking site management”.
Andrew Gerhart Primedia Automotive:
The first speaker is talking about a site he just took over and the website evaluation they did on it.
It was a strong, established site: a market-leading authority with lots of incoming links, existing content, fresh content, and some previous SEO training. The site was built on a modified open-source CMS that output flat HTML files; the code was a mess, the majority of the content was not optimized, and it was input manually. The site's structure was not optimized, it linked out to partners for specific content, and the forum was not SE friendly. There was room for content expansion, and minimal changes would create big improvements. It did not yet rank for its main target keywords, but it had existing brand awareness that could be leveraged for links, and new traffic could be monetized immediately. Time, resources and politics were all limitations on the project. It was also built on a multiplatform system that could not easily be integrated. They set out a list of goals: training, optimizing existing content, restructuring the site and optimizing the code, as well as building new content and links to target keywords.
SEO was labeled a priority and contacts were established throughout the site. A policy was made that all new content had to be optimized and reviewed by the SEO team. A baseline was established and new content was monitored. Basic SEO training was undertaken, including documenting best practices and training sessions for everyone who would touch new content.
They went through all the internal content to optimize it. All new and existing URLs were documented, and changed URLs were 301'd. The existing site code was optimized, including the homepage for the main target keywords. The internal linking structure and sitemap were optimized.
The new content was where the biggest gains were seen. New pages were built outside of the CMS, and links out were replaced with on-site content. They also relaunched the forums and blogs on new SE-friendly systems.
Links were obtained, internally and externally, to correctly target keywords. The brand was utilized to gain new links, and their network was leveraged. They saw 70% increases in traffic and eventually were able to completely redesign and relaunch the CMS.

Robert Carilli
Their mission is to create and deliver the most compelling shopping experiences for their customers. They started out as and became in 2004. The main driver of their growth has been search engine marketing (he says the domain has helped too). They have tens of millions of web pages, plus British and Japanese sites with more in the works.
Team Organization:

  • Keyword review
  • Account Management
  • SEO technology
  • SEO Marketing
  • Analytics

Keyword review
They review tens of thousands of keywords per week; they have millions in their database, use technology to manage the list, keep it constantly under review, and try to identify new keywords. He uses "small child" as an example, where advertisers are bidding on "small child for sale"; he says anyone with millions of keywords should be reviewing their lists to prevent this sort of thing.
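A keyword review pass like the one he describes can be sketched as a simple blocked-phrase filter. This is a hypothetical illustration, not the company's actual system; the phrase list and function name are invented:

```python
# Hypothetical blocked-phrase list; a real review team would maintain a
# much larger, curated set
BLOCKED_PHRASES = ["for sale"]

def review_keywords(keywords, blocked=BLOCKED_PHRASES):
    # Split a keyword list into approved terms and terms flagged for manual review
    flagged = [kw for kw in keywords if any(p in kw.lower() for p in blocked)]
    approved = [kw for kw in keywords if kw not in flagged]
    return approved, flagged
```

At the scale of millions of keywords this would run as a batch job over the database rather than over in-memory lists.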
Account Management
This is focused on their PPC efforts, updating bids and keyword campaigns. They work closely with partners and generate reports and do analysis on their campaigns.
Keyword Landscape
They have to plan for seasonality and changes in the bid market, stay up to date on the nuances of the search engines, anticipate current events, manage trademark issues, and track ROI at the keyword level. He uses Steve Irwin's death as an example of changes in user intent: they got a lot of people searching for videos, not to buy products.
Bot SEO, datafeed SEO and Marketing.
Bot SEO is traffic that is driven via SE crawling and indexing of their pages. They are constantly reviewing to make sure they’re applying SEO best practices.
Datafeed SEO is based on product driven sites.
Marketing SEO revolves around link building and link baiting, which leads into content development and enhancement. This also includes SMO, press releases, reputation management and monitoring, and, again, regular reporting and management. They also work closely with creative and marketing teams.

Chris Boggs Avenue A/Razorfish:
They’re working with and SEMPO.
He starts with an example, "insert printing". He does a Yahoo Site Explorer search on the top result: it only has a few incoming links and none are great. He has a theory that its ranking is due to internal linking power, structure and anchor text; they use CSS sidebar links. He says internal linking can be very powerful. He's basing the rest of the presentation on a large site he can't name: a Fortune 500 with tens of thousands of "sites" in subfolders. He says the site has a "too many chefs" problem. He says if you're user friendly, you're search engine friendly. Directory folder structure is often a problem, often meaning losing continuity in URLs; he says it is worth taking the time to rewrite the URLs or rework the CMS to use clean URLs. He often finds client-side redirects, including JavaScript and meta-refresh. There were multiple webmasters using different techniques, which led to a lot of content not being indexed; he also says session IDs can be a problem.
He goes on to discuss best practices for internal linking. A big problem is JavaScript navigational links; he suggests using CSS links that mimic the JavaScript rollover effects. He recommends good categorization, breadcrumbs and good link anchor text. He says there is very little an automated system can do to optimize internal linking the way a human being can; focus on relevance. He says "link popularity" can be deceiving, and often with a huge site you need to increase deep linking. He says building internal links is like a business relationship: approach it as if you are entering into a business relationship with another company. It's all the same SEO, just on a much larger basis.

Aaron Shear:
He doesn't have a full presentation. He's asking about sites with 500+ hits a day and how SEs can handle load-balancing systems, specifically thousands of URLs pointing to the same content; this was the biggest nightmare he walked into when he came to the company. He says that response time is also important. Now he's talking about focusing on the long tail. He says the site is a search site, which provides some challenges when trying to optimize for other search engines. He says the most effective approach he uses in a large company is selling SEO as a needle mover for the company and giving engineers the credit.

Todd Friesen
If you’re sitting on a large site, you have a lot of power inside the site, you just have to unlock the content and leverage internal linking, especially with the long tail terms. You very often don’t have to go outside of the site.

Read More

Link and SEO Dev Site Review Forum
Canonical domain issues, non-www vs. www versions. Rae says it will take Google months and months to fix the problem.
Using MSN's linkfromdomain: tool, it was shown that he is "suitably stingy" in linking out.
You can create extra content that will get links, based on video and stories about what people do in bed. The site has some duplicate content, and it should be blocked so search engines can't get to it. The two sites are on two different IPs, so it's a little shady; both are also on shared hosting with lots of other sites on the same IP. The descriptions are all identical and irrelevant. They built the catalog on a database that came from the manufacturer; they need to better edit the descriptions, because there are lots of other sites that are very similar. There are a lot of links on the left-hand side that are repeated across the whole site; it is recommended that the template code be slimmed down. Rae suggests putting JavaScript and style code in external files.
Also has canonical domain issues. There is some possibly misleading anchor text about financial management under the "quick search" headline; those links could be more relevant, including entertainment-career-type keywords. As they are, those links tell the SE that those pages are about accounting and finance. There are a lot of other sites on the same IP that were purchased and masked to the root domain; it is suggested they be 301'd to the main site, which in theory will pass the links to the new address. Don't use JavaScript, meta-refresh or multi-domain masking.

Rae is using, great tool, same thing I use.
The marketing department buys links; lots of radio station links make it very obvious they're paid. They use a dynamic phone number (based on where the visitor came from) in the title tag, and lots of people are phoning the number. Thousands of 404 URLs are still indexed; it looks like there was a problem that was fixed. There is a Flash front page, which was recommended by an ad agency. The marketing department is only buying obviously paid links, and it is recommended that they learn about organic link development. It was suggested that they create a Wikipedia page about the Sleep Number with a backlink.
Rae says the site is very nice looking and usable. The main page title tag should be expanded. There is a correct 301 from non-www to www. There is a list of countries on the left side of the page; they're all image links that could be changed to anchor text. The site has 79,000 links; some are paid, but they're pretty relevant. The site ranks decently for its main phrases, but they're still looking for more traffic. He said he was just looking to show off.
They question whether the domain will rank well in other countries. They need to 301 the non-www version to the www version of the page. It is suggested that they build out separate language-specific domains. They have another domain (a .com); it's recommended they invest in extra copywriting to make each site unique. They need more links; only 1,500 are showing in Yahoo. They've also got translated domains across country-specific TLDs, and they're asking how to cross-link those domains; it seems that should be decided on a user-centric, case-by-case basis, but it should be OK if they all have unique content. It's suggested that it be done in "small doses". There is an email address in a simple text link on the page. They also should not put up little sites if they won't put "oomph" into them.

It's recommended that all changes made to the site be recorded in a master document, along with who made them.
There are 7,300 nutritional database pages. They should group, theme and cross-link those pages. The default directory index is set to a non-index .asp page, creating possible duplicate content issues; as far as Google is concerned it's a duplicate. They are using a different URL with keywords that are also in the domain; the suggestion is to 301 that page to the root and use the root as the homepage. The homepage features "bad food" links to pages of branded food; they recommend pulling the trademark symbol out of the links to these pages to make them rank for the brand keywords. Some URLs on the site are long and filled with GUIDs. Again Rae mentions the code-heavy pages and moving JavaScript and styles into external files.
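Replacing GUID-filled URLs with keyword slugs is usually a small rewrite. A minimal slug function, with an invented food name as the example:

```python
import re

def slugify(title):
    # Lowercase, collapse runs of non-alphanumerics into single hyphens, trim ends
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# e.g. a nutritional-database page could live at
#   "/foods/" + slugify("Cheddar Cheese, Shredded")
# instead of at a long GUID URL
```

A real CMS would also need to keep the slug stable once published and 301 any old GUID URL to the new one.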

There was an MSN study released some months ago that showed that if a page does not have a lot of non-display content it is very likely to be spam.
Non-www should 301 to www. They have some older domains using meta-refresh to their main domain; those should all be 301'd. The homepage is Flash; there have been some suggestions of browser detection and delivering non-Flash pages, but they're told to be careful going down that route, as it could possibly be mistaken for cloaking. They ask about using sIFR but are cautioned against it. It is recommended that they use consistent URLs to link to specific pages inside the site. Another reference to pulling JavaScript out to an external file.
A very, very competitive niche, but there are only a few hundred links. It's recommended that they do a backlink search on the old competitor and get all the same links. When new sites are put up, it's very important to map out old pages and 301 them to the new sites. It's recommended to look at logfiles for 404 errors and try to 301 those pages to real pages on the site. Rae suggests putting the site on its own server/IP. It's suggested that they register a "snazzier" domain name and 301 it into the main site.
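The 404-mining suggestion is easy to script: pull the most-requested missing URLs out of the access log, then 301 the top offenders to real pages. A sketch assuming Apache/NGINX combined log format (the regex depends on that layout):

```python
import re
from collections import Counter

# Matches the request path and a 404 status in a combined-format log line
LOG_404 = re.compile(r'"[A-Z]+ (\S+) [^"]*" 404 ')

def top_404s(log_lines):
    # Count 404'd URLs so the most-hit ones can be 301'd to real pages
    hits = Counter()
    for line in log_lines:
        m = LOG_404.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits.most_common()
```

Run it over the raw log file, then write redirect rules for the paths at the top of the list.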

Read More

Yahoo Search Marketing Lunch Panel: The New Advertising Platform

Dan Boberg starts off by talking about PubCon history and Yahoo's involvement with it. Then he mentions Panama; they've talked to a lot of users to get input on the new system, and they're going to demo it. They're moving away from a search-listing position toward a marketing position, and away from manual editorial review toward algorithmic review. He says the current system is very linear and the new one should be more dynamic. He mentions a/b testing, demographic targeting, click-to-call, geotargeting, click forecasting, quality index, optimization and analytics, and fast ad activation.
John (something) takes over. He's going into their production environment to walk us through the creation of a campaign. The new homepage has a new section of alerts: communications pushed out to advertisers to show things that are not working, like running out of budget, ads not showing, and editorial rejections. They'll also show opportunities there; the advertiser can specify what types of alerts they'd like to see. He shows an alert that says funds are low, and the link leads to the section to add more funds.
They also have a section for trending information on performance metrics, a "really quick snapshot" of how they've done over the last period of time. He offers to take cards to get entered into a pool for testers.
He starts to create a campaign for a hypothetical client, a rental home in Colorado, and sets some geotargeting parameters. Rather than just set up a geotarget for Colorado, he's going to use advanced features to target searchers outside of Colorado who might want a home there. The target criterion is "city and surrounding area"; he finds the city by drilling down into a state, targeting San Francisco searchers who'll want a vacation home in Colorado. The map highlights the targeted area. They've made some acquisitions in the geotargeting space that should give advertisers better reach into a geo area; the technology is better at disambiguating search keywords like "soho". He's now setting up "targets", or keywords; there is a choice between sponsored search and content match (YPN). He puts in 3 seed terms and his URL, and the system extracts keywords from the landing page he identified to return a relevant list; it looks similar to Google's keyword tool. You can select keywords you want and set up negative keywords, and there is a collaborative filter to further develop the keyword list. The system gives him a refined set of keywords, and he picks the top 4. There is a bug in the step toward the bidding process: supposedly there was a problem with geotargeting and click forecasting, so for the purposes of the demo he sets the geo parameters to a national campaign.
He shows the new "bid landscape" page. Based on a bid entered by the user, the system shows estimated clicks, cost and position; there is also a slick AJAX tool for checking out the tradeoffs in bid pricing.
He writes an ad. There's a campaign review screen where he can view all his parameters. There is a daily bid limit, which can be used to estimate the number of clicks the ad will get. The forecast system lets agencies go to advertisers with opportunity and budget data. There's another review section.
It looks like he's explaining some user behavior data gathered from Yahoo search data. They've got some features for indirect conversion data, like which pages "assist" in a sale.
John Slade takes the mic now. He asks how many people use TiVo to skip ads, and how many know how to strip ads out of websites with Firefox. He believes that eventually people will be able to block irrelevant ads. He says bid-based ranking allows irrelevant ads to be shown; the new ranking system will show users those ads that are most useful and meaningful to them. An ad's performance will influence how you should manage your bids; better ads will be rewarded for good relevance. He says bid wars do not help create compelling messages, and that is what the new system is designed to reward.
What will the upgrade look like and how will I get in?
He mentions early adopters vs late, people who want to get in before the holidays and those who do not want to disrupt the holidays. They are scheduling upgrading so they can take these different users into account. They want to be able to iterate quickly and take into account user input. If you want to get in, they’ll show a URL how to get in quickly.
The upgrade will not cost anything.
The new ranking method will take over sometime in q1 after most advertisers are on the new system.
Agencies and tools should be able to transition seamlessly.

Is there a mass update tool?
The new version will be able to upload campaigns via spreadsheets.
There is some effort going into new advertiser offers: set-it-and-forget-it setups for novice webmasters, as well as vouchers.
Will there be a client center for agencies?
They’ve made some choices about features in releases, and they think they need to improve multi-site management features. He’s really emphasizing that iteration will be quick and easy.
Is there a plan to clean up the API?
There is a new API to go with the new system, with greater stability and more features; they’re rolling out API access broadly in Q1.
When will the new system be open to new accounts?
No date is being announced for new accounts, but it should be soon.
The ability to view reports across markets is not in this release.
Will 3rd party bid management tools take on a lesser role in the new system?
They work closely with tools and agencies; they believe it is a market decision whether these tools survive in the non-pure-auction bid ranking system. They think there will be different types of tools: instead of bid optimization, advanced click forecasting is used as an example.
Do quality ranks take ad position into account?
Yes; clickthrough and relevance will also be taken into consideration.
Are they removing the 10-cent minimum bid?
No plans, but they’re thinking about it.
When will the UK rollout be?
They will roll out to international markets after the US rollout. The UK is an important market, so they’ll want to get that out soon. (more details to come)

Read More

Link Development and Linking Optimization

Rae Hoffman: Delegating link development.
Outsourcing link dev: exchanging, link buying, spamming, public relations, link bait, ad agencies. The right link dev firm can help you achieve great success; the wrong one can cause a lot of problems. A lot of people out there are looking to take advantage of you. Use a firm you’ve gotten a recommendation for, use a sponsor list like PubCon’s, or just jump in; but if you do that, don’t do it with a site you’re paying your mortgage with.
Research your choices: use the SE of your choice to see what backlinks and buzz the company gets. How do they develop links? Do they utilize networks under their control (i.e., when you stop paying, all your links go away)? What kind of training do they do, how do they access your site, do they work for your competitors, what is the average cost and number of links, will they sign a non-compete? Expect at least a general answer to all her questions. You are who you hire: their actions could be received well or received badly, could get you good ranks or could get you penalized.
Hiring and training an in-house link developer. The number and quality of links gained by an in-house person will surpass an outside firm. You must know SEO to have an in-house link dev program; if you don’t, hire an expert to come set up your program for you.
Ask them: what is their favorite SE, can they name 3 SEs, do they know what a blog is, what a link is, what a message board is, what is their favorite website, do they use instant messenger (you want someone who does use it, a lot), do they know what an email client is, what browser do they use and why, and can you find me a Canon SD-200 digital camera I can actually buy? It is easier to hire someone who is familiar with the internet but has no marketing background than to hire a marketer.
Training docs are important: different types of links you want to obtain, how to find those links, a glossary of terms and acronyms (like ROS), information on competitive intelligence (very important), reciprocal linking policies (nofollow tags), links to link dev articles that follow your philosophy, a listing of link myths (like “don’t get low PR links”), a spreadsheet template, email templates, a document template, a clear listing of expectations (like quotas; realistic expectations), and a list of SEO tools. Wow, she talks fast.
In the beginning you’ll want to task your link developers specifically about links. As you help them gain independence, give them freedom with smaller, less important sites, not your cash cows.
Measures: how many links they obtain, what kind, what quality, what anchor text, where the links are, how they are retained. Most importantly, your search engine rankings.
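As a sketch of the spreadsheet template she mentions, one possible set of tracking fields covering those measures might look like this (the field names are my own, not from her actual template):

```python
# Hypothetical link-tracking template: one row per obtained link, covering
# the measures above (type, quality, anchor text, location, retention).
# Field names are illustrative assumptions, not Rae's actual template.
import csv
import io

FIELDS = ["url", "link_type", "quality", "anchor_text",
          "page_location", "date_obtained", "still_live"]

rows = [
    {"url": "http://example.com/resources.html", "link_type": "reciprocal",
     "quality": "high", "anchor_text": "widget reviews",
     "page_location": "sidebar", "date_obtained": "2006-11-15",
     "still_live": "yes"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

A shared sheet like this makes the “how are they retained” measure checkable: re-crawl each URL periodically and flip `still_live` when a link disappears.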
Inhouse vs outsource factors:
expectation of quality, ability to house employees, value of rankings vs. cost of each, expertise to train and manage, and whether you want the added work of more employees.
Bottom line on outsourcing: get referrals from friends, do research, ask questions.
Bottom line on in-house: hire internet-capable people that you think you can train, task them in detail, keep them away from important sites, and create docs and resources.

Joel Lesser: Reciprocal linking.
When it’s relevant. A cost-effective way to get traffic, even before Google.
Link exchanges have gotten a lot of bad press, as have most online marketing methods. Avoid services with no editorial control and services that guarantee links. Link exchanging should be a time-consuming process, but it is still useful. Most people will not respond to link requests without something in return; it’s a give-and-take world. Relevant and related sites should exchange links.
How to identify a partner: sites that have lots of good related content and have a lot of incoming links.
Reciprocal linking, when done right is a great, cost effective way to build links. There is a lot of misinformation, be careful what you read on the web.
Don’t get a lot of links in a short amount of time, retain editorial control, and use link exchange request forms if available. Make linking decisions for users, not SEs.
Link lists are perfectly OK, but there are other ways to do it: sidebars, linklets, articles.
He shows two quotes from Cutts and Zawodny; he says they’ve never said not to exchange links.
The most important part of a Google patent: “a large spike in incoming links could indicate spam,” especially in sites “without editorial control.”
Watch your volume

Roger Monti:
Look for relevance, no mentions of PageRank, no ads for non-relevant sites, year-long purchases. Smaller offline magazines. Giving away banner ad space. Look for time-based, not impression-based costs.
“advertise with us” keyword -cpm
“rate card” -cpm advertising
allintitle:”sponsors” -cpm keyword
“job fair”
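The operator queries above can be assembled mechanically for any keyword. A minimal sketch (the helper name is mine; the operator templates mirror the ones in the notes):

```python
# Sketch: fill the ad-space search operators in for a given keyword.
# build_ad_queries is a hypothetical helper name; the operator strings
# follow the session notes.
def build_ad_queries(keyword):
    templates = [
        '"advertise with us" {kw} -cpm',
        '"rate card" -cpm advertising',
        'allintitle:"sponsors" -cpm {kw}',
    ]
    return [t.format(kw=keyword) for t in templates]

for query in build_ad_queries("scuba gear"):
    print(query)
```

The `-cpm` exclusion filters out sites selling impression-based placements, since the advice is to favor time-based pricing.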

buying whole websites
look for
interactive websites,
search for “temporarily down for maintenance” or allintitle:”site is offline”
underperforming websites.

sites of the month
look for
archived links
niche dedicated
search “site of the month” + keyword
site of the day and week

check for archived newsletters
search newsletter keyword sponsors
newsletter “Advertising rates” keyword

remember to dig outside of the .com, .net and .org space (like

industry associations
charity groups
concentrate on .org’s (search: keyword sponsors)
research competitor backlinks
.edu job fairs (like above)

He very rarely uses Google to search for link opportunities; use Yahoo.

proxy sites
cultivate leads with informational sites
create inbound links with satellite sites
take advantage of the power of blogging: get involved in the conversation, comments, blogrolls.
He uses the Trader Joe’s magazine, a blog on paper, as a good example of what content to put on a blog.

youtube and google video
Links on videos: in YouTube, “more stats” links to the people who are linking to that video; on Google Video, you can put direct links in descriptions when you upload. Both of these methods show up in your backlinks and can create real traffic.

Selling software
PAD file
submit to software directories; here he talks about using parasite SEO in software directories to dominate a keyword phrase.
charity design: might not pass PR, but if it’s related it could be a good thing to do.

Read More