Instrumental+Music Percussion+Music Classical+Music

Tuesday, February 28, 2006

 

Selecting Keywords for your Site

I was prompted to include this article on keywords today because we have just seen a significant increase in the arrival of searchers using '+' and specifically '+mp3'. As we have discussed before, our visitor volume is driven by a large number of keywords from Google - often over 100 per day - with the detailed pattern constantly shifting but overall volumes pretty steady. Now read on:

Anyone who knows anything about how search engines work will
tell you that you need to choose the perfect keywords in order
to improve your search engine ranking. Yet, if you are
relatively new to the website creation scene, you may have very
little knowledge about what keywords are, what they do and how
they enhance search engine optimization (SEO). Nevertheless,
understanding what keywords are and how they work is critical
to your websites success and the amount of traffic you will
receive at your website. Let's take a look at what keywords are
and how to choose the right keywords to improve your SEO
techniques. In doing so, you will immediately be able to put
such knowledge to use and improve your page ranking in search
engines.

Keywords are used by search engines to determine whether or not
your web pages will address the needs of Internet users that are
conducting a search. The more keywords that are found on your
web pages, the more likely your webpage will be listed as a
response to a user's query. Yet, a webmaster must bear in mind
that too few keywords on a webpage will have little to no
effect on a web page's search engine ranking, while a web page
with too many keywords may also have little effect. Why? Quite
frankly, an obvious attempt to sway a search engine's listings
actually reduces the quality factor of a web page and a search
engine will rate it lower than it would a web page that
contains quality information.

Therefore, choosing keywords is part of the website marketing
process: having the right keywords appear on a webpage will
result in increased traffic to your site. There are a number of
web resources that will help you determine what keywords are
frequently used, such as keyword generators, but the
problem with such tools is that they don't necessarily identify
the right keywords that will ultimately draw traffic to your site.
Instead, you need to be choosy about what keywords you use.
First, you can go with the most popular keywords suggested, but
the most popular keywords don't always work in
drawing traffic to your site, nor do they always help you
improve your SEO and page ranking within major search engines.

First, you will want to try and use truly specific keywords
when generating keyword enriched web pages. Try not to be too
broad in your descriptions - if a keyword generates 20,000 pages
in a search engine, chances are your keyword enriched page
isn't going to make a huge difference in your web traffic. Yet,
if you are using keywords that fewer than a hundred sites use, you
may find that your search engine ranking is increased because so
few sites use the specific keywords you have chosen. Obviously
then, it serves to tell you that one keyword enriched webpage
may not make a huge difference in your overall web traffic, but
several keyword enriched pages will. Just be sure to use
uniquely defined, themed keywords within your web pages and
watch your ranking improve in the major search engines.

Another unique way of selecting keywords can be found in
viewing similar websites to your own. Find out what other
webmasters are using to draw traffic to their website. How do
you do this? First, review many websites that are similar to
yours. Watch for repeated themes within websites - do you see a
pattern? Take that pattern and use it to your advantage, using
a keyword generator to determine which distinctive keywords best
fit the pattern that other websites are using. Next,
generate a few web pages that are keyword enriched with your
selected keywords. Alternatively, if you are familiar with HTML,
you can view a webmaster's source code and see what Meta Tags
they are utilizing. Meta Tags reveal the keywords of a page and
can give you a good clue as to how the competition is
approaching the keyword enrichment process, if at all.
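To make that concrete, here is roughly what such tags look like
in a page's source - a minimal sketch, with the keywords and
description invented for the example:

<head>
<title>Instrumental Percussion Music - Classical MP3 Downloads</title>
<meta name="description" content="Classical percussion
recordings available as legal MP3 downloads.">
<meta name="keywords" content="classical music, percussion,
mp3 downloads">
</head>

Because these tags sit in the page header as plain text, any
visitor can read them with the browser's 'view source' command.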

Whatever you do, when in the process of searching for keywords,
don’t just throw your best guess out there and go with it.
Although it is possible that you may get lucky and actually
improve your website traffic and search engine ranking, it is
unlikely. Most keywords that are the product of a webmaster's
guess are far too broad to make any kind of difference in
search engine ranking or web traffic.

There are a lot of little tricks one can use to make the best
of the keyword enrichment process. You can write articles that
focus solely on the keywords you have selected or you can hire
freelance writers to do the work for you. You can make use of a
variety of free and subscription web tools to improve the type
of keywords you choose. Either way, once you have found the
right keywords you will immediately note an increase in your
web traffic - an increase that undoubtedly accompanies your
improved search engine ranking - an improvement that is the
direct result of your improved SEO techniques.


About The Author: Daniel Smith writes about directory text
links http://www.linksattack.com


In my experience you need to focus on relevant keywords with relatively few competitors. Don't be lured into using keywords where your site cannot fulfil the expectations of visitors - that way leads to really high bounce rates and no extra business. But don't leave out the obvious highly competitive words, because sometimes they will get picked up in combination with something you never expected.

..... now where are those ******* +'s coming from?

Monday, February 27, 2006

 

Communicate with Visitors

I have had this article in the possibles stack for a couple of months, which is very unusual. We have been worrying about gaining visitors and getting them to listen to at least one of our samples, but it is clear that we need to work on our communication with our visitors - to engage them and encourage them to consider a purchase or two.
Here is a rather thoughtful piece on this topic:

This article might be a bit different than you've come to
expect from a webmaster-related article. The reason is that, in
my opinion, it is a facet of the internet many people don't
really think about. And that is communication. But, by
communication I mean more than just talking or writing. Read
on...

What is Communication?

Communication is defined simply as the transferring of an idea
or concept from one point to another with full duplication on
the receiving side. This last component is one often forgotten.
So, a full one-way communication would be: Fred has an idea about
a widget. He propels that idea across space to Ted. Ted receives
the communication, understands it, and has full duplication on
his end of the exact concept Fred was thinking about that
widget. Now, a full two-way communication would be the above
process, but with the addendum that Ted then acknowledges
Fred for his communication, sending the acknowledgement across
space to Fred, at which point Fred receives the acknowledgement
and fully duplicates and understands the acknowledgement. Fred
would then thank Ted for this acknowledgement.

So, what we have here is an interchange of ideas with full
understanding on both sides, as well as the full understanding
on both sides that their communication(s) is/are being
received.

Why Do I Bring This Up?

Good question. The above communication formula applies to
everything. Interpersonal relations, business, family, etc.
However, we are in the business of websites. A website, by its
very nature, is a communication medium. If your site does not
properly observe this communication formula, you may be
spinning your wheels posting and emailing your
visitors/customers and they might still not really be receiving
your communication. And, again, I emphasize that receipt. When
you email a customer, I am not referring to whether that email
arrives in their inbox. No, I am referring to whether that
person fully duplicated your communication and got the exact
point you were trying to make.

Have you ever written an article which, to you, makes sense,
but others seem to not be able to grasp? Have you written sales
copy that generated little to no sales? Have you dealt with a
client who, no matter how hard you try, just doesn't seem to
"get it"? If anything like this is out, then your communication
formula is out. For one reason or the other, they are not
getting the same understanding you are.

How Does This Apply?

It's far easier to observe proper communication in person than
over the internet. First, when the person is right in front of
you, you get immediate feedback as to whether they are
understanding you. On the internet, if someone doesn't
understand your writing, they will just leave and you'll never
know. In fact, on the internet, if the communication is out in
any way, you'll probably just lose the visitor. And the nature
of the medium is that you won't know.

But, how can you do your part to enforce a proper communication
formula on your website? Let's look at that:

1. Definitions of Words. Words are part and parcel of the
language. But, if someone does not understand the words you are
using, the language will not communicate to them. In fact, as
human nature has it, when a person is reading something they
don't understand, they will first forget they read it, second
they will individuate from it. Ultimately, they will just leave
and not come back. So, it is in your interest to use words that
your visitors will understand. Do not use big, fancy words just
for the sake of looking learned. It doesn't work. Also, if your
site discusses topics which are technical in nature, do your
best to describe things in an easy-to-follow way. Lastly, it is
my opinion that every site which is an instructional type of
site should include a glossary. Maintain a glossary of commonly
misunderstood terms and, in your content, hyperlink those words
to the definition. You could even use ALT tags or DIV layers to
make the definition pop up when you hover over the word. However
you choose to employ it, making sure your writing communicates
to your reader is in your interest. And this starts with using
words they understand.
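As a minimal sketch of that pop-up idea (the term and the
glossary page name are invented for the example, and strictly
it is the title attribute, rather than the ALT tag, that
produces hover text on a link):

<a href="glossary.html#bandwidth"
   title="Bandwidth: the amount of data a connection can carry">
bandwidth</a>

Hovering over the linked word shows the short definition, and
clicking it takes the reader to the full glossary entry.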

2. Acknowledge Your Visitors. As discussed above,
acknowledgement is half of the communication cycle. So, when
your visitors send you an email, acknowledge it. If you do not,
your visitor will think you are ignoring them and they may
become upset. I don't intend to make your visitor seem like a
child there, but it is true. Visitors who are in good
communication with a website are more likely to remember that
website. Their like for the site will increase and they will
have a higher level of agreement with the people behind it.
That is good for you. On the other hand, if you ignore them and
do not reply to their messages, then the communication really
doesn't exist. Therefore, they will write you off. So, organize
your site's email lines so that emails are replied to. At the
least, set up an auto responder to let them know their message
was received. Ideally, though, you will send them an actual
reply.

3. When needed, enforce acknowledgement. Sometimes, a visitor
will initiate a transaction of some variety and then abandon
it. For example, they may sign up for your mailing list but
fail to confirm their subscription. Well, the communication
formula is out. They never acknowledged the confirmation email.
Maybe it never arrived. Maybe they forgot. Regardless, you need
to repeat the question. Just as you would in real life if
somebody does not answer your question, you repeat until you
get an answer. In our example, you may send them a series of
reminders (enabled via cron) until they confirm their
subscription. After a few tries, you can write them off.
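As a sketch of how such a cron-driven reminder series might be
wired up - the file, table and column names are assumptions for
the example, not anything from the article:

# crontab entry: run the reminder script once a day at 6am
0 6 * * * php /home/site/scripts/remind.php

<?php
// remind.php - a minimal sketch assuming a MySQL table of
// subscribers with 'confirmed' and 'reminders_sent' columns.
$db = new mysqli('localhost', 'user', 'pass', 'site');
$result = $db->query(
    "SELECT email FROM subscribers
     WHERE confirmed = 0 AND reminders_sent < 3");
while ($row = $result->fetch_assoc()) {
    mail($row['email'],
         'Please confirm your subscription',
         'Just click the link in our confirmation email.');
}
// Count this attempt; after three tries they are written off.
$db->query("UPDATE subscribers
            SET reminders_sent = reminders_sent + 1
            WHERE confirmed = 0 AND reminders_sent < 3");
?>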

4. Design to Communicate. The design of your website needs to
lend itself to the message you are trying to communicate. Badly
designed, a site's design can impede the message. It would be
like trying to communicate to someone over the loud noise of a
jet engine. In this case, you may be talking fine, but you need
to handle the environment around you to make your communication
arrive. Stop the jet engine or go somewhere else where you can
get your message across. Online, your site is the medium. Your
content can be well-written and the words defined, but if the
site is a chore to use, then your site becomes the din that
will keep the message from arriving in the mind of the reader.
I will reserve design theories for another article, but pay
attention to things like (1) cross-browser compatibility, (2)
your main message being very apparent when the user arrives at
your homepage, (3) functional layout. On #2, do not overcrowd
the page with so much information that the user doesn't know
where to focus.

5. Allow Others to Communicate. Communication is the engine
which powers life. It is what makes the world go round. This is
the reason why interactive elements on a website make the site
more sticky and more trafficked. People love to communicate.
So, set up online forums, interactive quizzes, anything that
will invite feedback and participation on the part of your
visitor. Your site does not need to be a one-way flow from you
to them. In fact, a successful website will actively get the
return flow from them to the site.

6. Speak to Their Reality. This could take an article in and of
itself, but I will give it a brief mention here. Even observing
all of the above, you have the simple fact that everybody has a
certain way of looking at things. And everybody thinks their way
of looking at the world is right. In fact, they KNOW they are
right. If you are communicating to them assuming things that
are not part of their reality, then the communication will not
arrive to them. A person has to be receptive to your
communication. An example would be trying to explain the health
benefits of meat to a member of PETA. So, in any situation, you
need to find what the reality of your target audience is and
then tailor your communication to THAT reality in order for
your communication to really arrive. This is where inviting
visitor feedback comes into play. Keep a running record of
feedback to see what their reality is. Do surveys. Find out
what their experience is and speak to that. By doing this and
observing the proper communication formula, you WILL be the
authority for them in your field. Look at Oprah. She is very
wealthy, so much so that most other people cannot really
identify with that kind of wealth. But, Oprah is out there,
talking to everyone. And she positions herself in such a way
where people identify with her. She doesn't put up a front. She
is REAL to people. She speaks to their reality. And she is very
successful because of it.

Wrap It Up

I have touched on some things in this article which I can
easily expand upon at a later date. And I probably will. But,
this will get you started. Judge the communication to and from
your website and see how it measures up. Any successful website
cannot sit there on the internet as its own little island. It
has to communicate and communicate in a big way. It has to
serve as that 6 lane highway with traffic going in both
directions. Make your site do this and you'll be on your way.



About The Author: David Risley is a web developer and founder
of PC Media, Inc. (http://www.pcmedianet.com). He specializes in
PHP/MySQL development, consulting and internet business
management. He is also the founder of PC Mechanic
(http://www.pcmech.com), a large website delivering
do-it-yourself computer information to thousands of users every
day.

Some feedback would really help, so I am going to have a serious look at that - our experiment with del.icio.us doesn't seem to have produced any reaction, so we clearly need another mechanism. Perhaps reviews, ratings or requests are angles of attack that we could develop effectively.

In the meantime we are working on the use of Dreamweaver templates to give us an easy-to-use facility to promote individual recordings and/or composers of the week.

Friday, February 24, 2006

 

Improving Search Engine Rankings – Patience Is The Key

The author of today's article is a man after my own heart. As regular readers will have noticed, our enthusiasm for publishing articles has been substantially dampened by the absence of response to our first attempts, so I think perhaps there must be a message here for us:

I have written before about the benefit of articles and how they provide one way links back to your website and therefore improve the importance of your website and thus its ranking in the search engine results. The articles you write have the chance of spreading slowly around the web and ending up within ezines or on other websites or blogs, all of which provide links back to your website. So once you have over thirty articles out there the links can grow exponentially. Just consider what can happen when you have one hundred articles published - or two hundred.

The magic figure is said by some experts in this field to be 250 articles; it is then that this exponential growth in links and traffic to your site can be truly seen. I do not have personal experience of this yet, as my own figure sits at 35, and this number is already having remarkable results, as reported in a previous article. If you average 10 articles a month, which is my own aim, then you will reach this magic figure of 250 in a little over two years.

At the same time your website will have aged a further two years which will also benefit it and it will be viewed as one of the more important web sites.

Patience is the key element in promoting your website. If you want your site to climb up through the rankings then you must have patience. Do not be swayed into parting with money on the promise of instant high rankings. They can be achieved but will not be permanent, and on Google in particular this is becoming increasingly difficult as the algorithm becomes more complex.

Why do you have to be patient? Google in particular requires a little patience. Google likes to see one way links pointing back to your site but will only count and use links after they have matured for some time. So your articles may not appear as links for about 2 months and then as they mature further they will assume more and more importance in the eyes of Google. Consequently you may have 100 recorded backlinks but another site with only 50 backlinks is still ranked higher. This is presumably due to your links being still young and not 'fully matured'. Once they do so you will leap ahead.

With Yahoo the situation is similar but the time delay is less. You should start to see these links appearing after possibly 2 - 3 weeks.


MSN is the fastest of the search engines to pick up and recognize these links which may very well happen within days.

Consequently you should see very nearly immediate effects in your ranking on MSN, then a few weeks later with Yahoo, finally followed after 2 - 3 months by your rapid rise through Google.

Have patience and keep the articles flowing as you will ultimately see the rewards for this endeavor and the most important thing is that your rise in the rankings will not be short lived.



About The Author: David Andrew Smith is the owner of a contract cleaning company which operates its services across the whole of the UK. He has developed and continues to maintain and promote the company's website. http://www.wesparkle.co.uk

If you are impressed by these thoughts you might find it instructive to Google David's website and look at where his links actually come from. David is clearly also a believer in reciprocal links, as you can see from the link pages on his site, and that also appears to be paying off for him. Our experience is that we can find no real pattern in Google's selection of the links which it includes in the 'linking to' listing, and I'm not entirely sold on the age-of-links theory.

By the way, McDar's Google Data Center tool is back up again and Big Daddy is now visible on roughly half the data centers. Although there are some anomalies in the quick checks made this morning, the overall pattern seems reasonably consistent at the moment.

Thursday, February 23, 2006

 

Data Center Quick Check is Down Again!

The McDar Data Center Quick Check is a free tool which has been put up to give us visibility of the results of searches over several of Google's Data Centers, and we have made reference to it from time to time over the last few months.

As Google watchers are only too aware new developments are tested by Google on particular data centers before being rolled out across the whole network. As these data centers have different URLs it is possible to search on them directly.

Big Daddy is referred to as an infrastructure change and is possibly linked to a major upgrade of Googlebot. Even with Jagger there were several phases to the algorithm updates, which were tested and then released quite quickly over successive weeks. Big Daddy has been rolled out to a few centers beyond the first two where it was announced, but I think we have also seen some roll back. The three exceptional surges measured by Rankpulse.com are suggestive of this sort of activity, although they have made no comment so far. None of this is too surprising with an infrastructure upgrade - there may well be hardware, network and software variations at some of the data centers which react differently to the new stuff. We should just be grateful that Google have a robust distributed system and effective change control processes, and be patient.

Nevertheless, every day we have been doing a quick check on a few of our keywords where we know that Big Daddy is making a difference, but we have been thwarted this week because the link to their 'business' end at sleeping-bag.com has not been working. I don't suppose it is sinister - there have been similar downtime periods in the past. We'll keep you posted.

Wednesday, February 22, 2006

 

Legal Music Downloads

I thought that this topic could do with a bit of an airing:

With the scandals in past years involving the famous (or
infamous) Napster, more and more net users who love music are
searching for good quality free legal music downloads. The
Record Industry Association of America and the FBI are
threatening to find and prosecute people who pass along illegal
downloads. This is not an idle threat, since it is easier to
trace people who download music illegally than it is to find
people who make unauthorized recordings through conventional
devices. Therefore, legal music downloads are a safer way of
finding the music you want to listen to at no cost. Although
some complain about the quality of most legal music downloads,
fellow browsers can give valuable hints and advice.

There was once a vast library of legal music downloads with
millions of recordings to choose from. This music "library" was
http://MP3.com, which was later bought by another company and
taken offline (no doubt the purchaser realized that the company
was giving jewels away for free).

Although more than one lover of legal music downloads compared
the dissolution of http://MP3.com to the burning of the great
library in Alexandria, there are ways to find great legal music
downloads, although it takes much more investigation than was
required in the heyday of http://MP3.com.

Many independent musicians who are promoting their work provide
legal music downloads from their sites for free. Although they
may not be big names, they can sound like those at the top of
the charts. It can be fun and exciting searching for these up
and coming bands and downloading their music. You can get tips
from other music lovers on what bands provide the best free
music downloads, or many musicians will give you their
information at concerts.

The reason musicians can afford to offer free legal music
downloads is that they see it as a way of promoting their
music, and they know that those who enjoy the downloads often
buy their CDs. It works on the same principle as the
bookshop/cafes that allow shoppers to sip coffee and browse
through their titles for free; samples increase actual sales.
Since these independent musicians do not yet have big names,
free legal music downloads are sometimes the only way they can
give potential CD buyers an idea of their music.

One of the best ways to find legal music downloads is through
iRATE radio, which has a very large database of downloads, and
enables the listener to download music, give reviews and rate
each recording. You can look at other ratings and reviews as
you submit your own, and from this feedback, you can find the
best legal music downloads. iRATE radio will send you samples
of music that fits your stipulated categories, and if you like
a certain artist or style, it will automatically download new
recordings for you.

Word of mouth is the best way to locate your favorite legal
music downloads, and several friends can get together and make
a hobby out of searching for the best new downloads. It is a
good idea to look through music magazines and e-zines for
reviews of the best up and coming bands and artists. The
reviewer will often give you a link to the artist's website
which might contain free legal music downloads.

However you find your music online, keep in mind that it is
much easier and safer to stick to legal music downloads than to
take advantage of offers that might involve a violation of
copyright laws. In the old days of VHS, it was easy to just
make a recording without anyone knowing about it, but most PCs
can be easily traced, and your music downloads can be
investigated by the authorities.

Therefore, if your dissatisfaction with the quality of many
legal music downloads is tempting you to bend the rules, it
might be a good idea to do a more thorough search among legal
music downloads to find truly good ones. There are great
independent bands out there who sound even better than the big
names.


About The Author: Matt Garrett for http://www.LMG.Org and
http://www.musiccdsonline.net


Our approach has been to carefully avoid breaching anyone's copyright in the music we have recorded, and then to copyright the work that we have done to create our interpretations of others' compositions and transcriptions. We have also selected relatively high quality MP3 encoding rates to ensure that the quality is comparable to, if not better than, CD recordings. Legal and decent to our fingertips!

Tuesday, February 21, 2006

 

Google Robot Developments

The spiders on the web have always fascinated me. We know that we are visited virtually every day by all the main spiders - not just the main HTML pages but the e-commerce PHP pages as well - triggering redirects etc.

Could The New Google Spider Be Causing Issues With Websites?

Around the time Google announced "Big Daddy," there was a new
Googlebot roaming the web. Since then I've heard stories from
clients of websites and servers going down and previously
unindexed content getting indexed.

I started digging into this and you'd be surprised at what I
found out.

First, let's look at the timeline of events:

In late September some astute spider watchers over at
Webmasterworld spotted unique Googlebot activity. In fact, it
was in this thread:
http://www.webmasterworld.com/forum3/25897-9-10.htm that the
bot was first reported on. It concerned some posters who
thought that perhaps this could be regular users masquerading
as the famous bot.

Early on it also appeared that the new bot wasn't obeying the
Robots.txt file. This is the protocol which allows or denies
crawling to parts of a website.
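For reference, robots.txt is just a plain text file at the root
of a site; a minimal example that lets all bots crawl everything
except one directory looks like this:

User-agent: *
Disallow: /private/

A crawler that honours the protocol fetches this file first and
stays out of the disallowed paths - which is why reports of the
new bot ignoring it caused concern.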

Speculation grew on what the new crawler was until Matt Cutts
mentioned a new Google test data center
http://www.mattcutts.com/blog/good-magazines/#comment-5293. For
those that don't know, Matt Cutts is a senior engineer with
Google and one of the few Google employees talking to us
"regular folk." This mention happened in November.

There wasn't much mention of Big Daddy until early January of
this year when Matt again blogged about it asking for feedback.
http://www.mattcutts.com/blog/bigdaddy/

Much feedback was given on the accuracy of the results. There
were also those that asked if the Mozilla Googlebot (known as
"Mozilla/5.0 (compatible; Googlebot/2.1;
+http://www.google.com/bot.html)" in your visitor logs) and Big
Daddy were related, but no response was made.

Now I'm going to begin some of my own speculation:

I do in fact believe the two are related. In fact, I think this
new crawler will eventually replace the old crawlers just as Big
Daddy will replace the current data infrastructure.
http://www.textlinkbrokers.com/blogs/comments/310_0_1_0_C/

Why is this important?

Based on my observations, this crawler may be able to do so
much more than the old crawler.

For one, it emulates a newer browser. The old bot was based on
the Lynx text based browser. While I'm sure Google added
features as time went on, the basic Lynx browser is just that –
basic.

Which explains why Google couldn't deal with things like
JavaScript, CSS and Flash.

However, with the new spider, built on the Mozilla engine,
there are so many possibilities.

Just look at what your Mozilla or Firefox browser can do itself
– render CSS, read and execute JavaScript and other scripting
languages, even emulate other browsers.

But that's not all.

I've talked to a few of my clients and their sites are getting
hammered by this new spider. It has gotten so bad that some of
their servers have gone down because of the volume of traffic
from this one spider!

On the plus side, I have clients who went from a few hundred
thousand indexed pages to over 10 million in just a few weeks!
Literally, since December 2005 there's been a 3500% increase in
indexed pages over an 8-week period! Just so you know, this is
also the client's site that went down because of the huge
volume of crawling happening.

But that's still not all.

I have another client which uses IP recognition to serve
content based on a person's geographic location. If you live in
the US you get American content and pricing; if you live in the
UK you get UK content and pricing. As you may imagine, the UK,
US, Canadian and Australian content is all very similar. In
fact about the only thing noticeably different is the pricing
aspect.
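As a rough sketch of the kind of IP recognition being described
- country_from_ip() here is a hypothetical stand-in for whatever
geolocation lookup the site actually uses:

<?php
// Serve regional content based on the visitor's IP address.
// country_from_ip() is a hypothetical GeoIP database lookup.
$country = country_from_ip($_SERVER['REMOTE_ADDR']);
switch ($country) {
    case 'GB': include 'content_uk.php'; break;
    case 'CA': include 'content_ca.php'; break;
    case 'AU': include 'content_au.php'; break;
    default:   include 'content_us.php'; // US and everyone else
}
?>

Since the old Googlebot crawled only from US addresses, it would
only ever see the default US branch - the very assumption the
new spider now seems to be breaking.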

This is my concern – if the duplicate content gets indexed by
Google what will they do? There's a good chance that the site
would be penalized or even banned for violation of the
webmaster quality guidelines set forth by Google here:
http://www.google.com/webmasters/guidelines.html#quality

This is why we implemented IP recognition – so that Googlebot,
which crawls from US IP addresses, only sees one version of the
site.

However, a review of the server logs shows that this new
Googlebot has been visiting not only the US content but also
the content of the other sections of the site. Naturally, I
wanted to verify that the IP recognition was working. It is.
This leads me to wonder, then: can this browser spoof its
location and/or use a proxy?

Imagine that – the browser is smart enough to do some of its
own testing by viewing the site from multiple IP addresses. If
that's the case then those who cloak sites are going to have
problems.

In any case, from the limited observations I've made, this new
Google – both the data center and the spider – is going to
change the way we do things.



About The Author: Rob Sullivan is an SEO Consultant and Writer
for http://www.textlinkbrokers.com. Textlinkbrokers is a link
building company. Please provide a link directly to
Textlinkbrokers when syndicating this article.


From our point of view the immediate impact is relatively modest because the main search engines have been picking up all our pages for some time. Looking at this, I'm not sure whether the new robot is one of the unidentified ones or the one identified as Googlebot. Of more interest is the development of the new robot's capabilities to look at Flash and JavaScript elements on the page, and the impact that could have on optimisation. Up to now we have been able to ignore those parts of the page, safe in the knowledge that they are invisible to Google - and some naughty people have even been known to take advantage of this to cheat. In our case we have duplicated some links in HTML and JavaScript on the same page (see the sketch below) and we will soon be able to stop doing that, which is a good thing.
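For anyone unfamiliar with the practice, this is roughly the
pattern we mean - a minimal sketch, with the page name invented
for the example:

<script type="text/javascript">
// Link written by script - historically invisible to Googlebot
document.write('<a href="samples.html">Hear our samples</a>');
</script>
<noscript>
<!-- plain HTML duplicate that crawlers and non-JS browsers can follow -->
<a href="samples.html">Hear our samples</a>
</noscript>

If the new spider really does execute JavaScript, the plain
duplicate becomes redundant.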

In the meantime the propagation of Big Daddy seems to have stalled, Rankpulse had another convulsion on Friday and our visitors are coming and going in blissful ignorance of all of this.

Friday, February 17, 2006

 

Authority Websites

I thought I'd give a little air time to a theory that floats around from time to time.

Authority Websites

Search engines have become wiser and apply more scrutiny when deciding page rankings and which websites are really important. Spending time on traditional on-page optimization and link building strategies will not get you that top ranking in Google. Don't get me wrong, you cannot stop doing these tasks; however, the marketing of your website will need to include a variety of other methods if you want to get top rankings. Search engines are looking for clean quality content to provide to their users, and we will see the search engine market become more competitive.

Traditional search engine optimization will continue to be part of your efforts, except that our methodology will change. You will continue to work on link exchanges, keywords, descriptions, titles, headlines, image alt descriptions, comments, and content. Now you will focus your keywords to be very specific and watch your density. Your descriptions will not be as long; concentrate on more concise descriptions. You will want to watch your HTML coding to make the process easier for the search engine spiders. Try using fewer tables and consider Cascading Style Sheets. The link exchanges will have to be more specific to your industry and with websites that are authoritative in nature.

Authoritative websites are the direction to consider for 2006. Not only making your website an authority for your target market, but exchanging links with other websites that are authoritative for that same target market. What are authoritative websites to link with? Basically, there is a relatively small set of websites that can be trusted as authority, or expert, websites. Government sites, university websites, well-recognized news sources, and recognized industry news sources are all examples of sites that can be considered "expert" websites. These sites, unlike the average website on the Internet, can be trusted to "link honestly".

Take the following example. I recently went to Google, MSN and Yahoo and searched on the term "internet marketing consultants", and the results on the first page for all three search engines were somewhat surprising. Instead of finding a page full of businesses that provide internet marketing services, over 75% of the results were websites that were resources for internet marketing. A careful review of each of these websites revealed all of them to be content rich in their field of internet marketing. They did not provide any services other than information for others to use about internet marketing. For the purpose of this writing these results indicate (1) that making your site an authority for your field and industry is very important to achieve top ranking, and (2) that you want to seek out these types of websites to create inbound links where possible and feasible. Otherwise engage in a reciprocal link exchange.

Getting noticed by authority websites can be difficult, but with some creativity it can be done. If you make your business newsworthy, news outlets within your industry and beyond will pick up your news story and hopefully link over to your website. Traditional marketing and public relations require you to make your company known as an authority within your industry. You want your clients to know that you are the best source for whatever it is you sell, and that they should trust you. You gain this trust by being visible, not just through your website, but through the websites of other trusted sources such as news outlets. These types of marketing methods include engaging in public relations, networking, attending trade shows, and talking to news sources both within your industry and outside of your industry.

Your website is a business. No different from a retail brick-and-mortar store, you will spend as much time working on your website as on any regular business. Do not believe that the comfort of your home, or the fact that you are working with computers and technology, means you are able to run your business any more easily. The only thing that has changed is the venue; everything else is the same. You have to provide quality content, engage in activities that make your website known, and make your website the absolute best in your target market.


Melih Oztalay is the CEO of SmartFinds Internet Marketing. Internet marketing is not only about knowledge and experience, but also about imagination. Visit SmartFinds at http://www.hsfideas.com

Most of this is unexceptional, but it does not properly acknowledge the rich variety of markets on the web and their different characteristics. Some, like the consultancy areas, are relatively cerebral and do generate the kind of authoritative sites referred to. In music, for example, there are a whole series of authoritative sites on the life and works of specific composers - we know because we used them as part of the research for some of our pages. They are mostly pretty aloof from the hurly burly of commerce and I see no signs of them sharing links with download sites.

I would argue that we have set up authoritative pages on iPod and iTunes use of standard MP3s because there was a gap in the market, and this is the more common position in the music business, with the vast bulk of the searches going on the most popular keywords to the sites with huge collections of material. We are still struggling to convert that into interest in our recordings.

We also seem to do pretty poorly in the article publication line - most of the articles we feature are about the business of running the site and attracting visitors rather than the music itself because we simply don't see that sort of material very often.

Thursday, February 16, 2006

 

Second Rankpulse Spike

Yesterday saw the second unprecedented spike on Rankpulse in three days, and McDar shows some further movement of Big Daddy to more data centers, but no major impact on our Google visitor numbers so far. Of course we don't know what data centers Rankpulse use - it is possible that they just use one, in which case we might just be seeing Big Daddy coming and going, and it could happen again.

On the browser front Firefox is now up to nearly 19% of our visitors, but of course with Analytics that includes me.

Music production is on hold at the moment because we are updating software and hardware, and as ever this is not without its little hiccoughs and travails. The Creamware update has gone smoothly but it has not resolved the pre-existing problem. It has also opened my eyes to some more of the possibilities which that hardware offers. On the hardware front we are looking to harness a second PC using MIDI and S/PDIF connections. MIDI is working OK but S/PDIF connectors/cables etc are proving a little more challenging.

Wednesday, February 15, 2006

 

Google Analytics - Story so Far

Time for a bit of reflection on Google Analytics - so to start us off here is an independent piece:

Google Analytics

Google Analytics is a product based on the popular web traffic
analytics Urchin platform, as a result of Google's earlier
acquisition of Urchin. As a surprise to the webmaster
community, Urchin's technology, which is one of the most
complete and well designed web analytics software in the
industry, is being given at no cost to webmasters when they
sign up at www.google.com/analytics. This move has raised
suspicion as to what Google's motives are and what the
consequences will be of Google having so much website traffic
data in their power. Currently, Google can track traffic only through
the traffic they refer from their portals and an estimate
through the Google toolbars installed in people's computers.
However, this toolbar is not very representative of the general
population, since most users are webmasters trying to see the
PageRank value for web pages displayed in the toolbar. With
direct traffic data from websites, Google could potentially add
an enormous array of variables to their search engine algorithm.
This has webmasters concerned.
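For context, Analytics collects this data through a small piece
of JavaScript pasted into every page. From memory, the snippet
of this era looks broadly like the following (the UA account
number is a placeholder):

<script src="http://www.google-analytics.com/urchin.js"
        type="text/javascript"></script>
<script type="text/javascript">
// account number from the Analytics control panel
_uacct = "UA-XXXXXX-X";
urchinTracker();
</script>

Every page view is reported back to Google directly from the
visitor's browser, which is how Google ends up with the
first-hand traffic data that has webmasters concerned.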

Many webmasters, despite the search engine ranking
possibilities that this tool could bring, are jumping on the
bandwagon to get a slice of Google's "gift" to the webmaster
community. To many webmasters, their only source for traffic
intelligence and reporting was through free tools available,
such as Webalizer and Awstats. These tools were not too precise
in reporting and gave straightforward information like visits,
page views, search terms, and country. This data was valuable
to some extent, but lacked many web analytics that were only
provided through expensive services from Urchin and Webtrends.
Google Analytics changes everything, something that could hurt
Webtrends' sales tremendously.

With Google Analytics, webmasters have many convenient and
competitive features. Google Analytics provides a single
control panel for all of your websites. Here is some analysis
that is provided:

1. In the main control panel, the user is presented with an
executive summary that summarizes the most important traffic
data: visits and pageviews, percentage and number of one-time
visitors and returning visitors, geographic distribution of
visits, and source of visits.

2. In addition to the Executive summary, the user has the option
of these condensed reports: Conversion, Marketing, and Content
summaries. Conversion summary presents the amount of visitors
and their conversion rate, as well as the conversion rates for
established goals. The Marketing summary displays the top five
sources of traffic, search keywords, and campaigns.
Lastly, the Content summary identifies the top five entry
points, exit points, and most visited pages.

3. Finally, the system provides for very specific reports
divided into two categories: Marketing Optimization and Content
Optimization. Both categories also have sub categories. This in
turn goes into very specific details and analysis. Sub
categories for Marketing Optimization include: Unique Visitor
Tracking, Visitor Segment Performance, Marketing Campaign
Results, and Search Engine Marketing. Content Optimization goes
into sub categories that include: Ad Version Testing, Content
Performance, Navigational Analysis, Goals & Funnel Process, and
Web Design Parameters.

As you can see, Google Analytics isn't anything like the free
traffic statistics tools out in the market. Google Analytics is
a full traffic analysis tool that will provide very valuable
information to webmasters and businesses about their website's
performance. No other free program can match Google Analytics'
diverse set of reports. If you can live with the fact that
Google will have knowledge of all of your website's traffic,
including where they come from, how well your marketing efforts
convert, and how Google performs against other sources of
traffic you receive, then start taking advantage of this
powerful tool.



About The Author: Rafael Sosa has been at the forefront of
e-business development and digitizing of documents in Puerto
Rico. Since 1999, he has worked extensively in the construction
of websites and internet systems through the integration of an
efficient international team. His articles can be found at
http://www.WebArticles.com/

After a few weeks of acclimatisation I would broadly endorse Rafael's conclusions. But I do miss the individual visit visibility that I got with Hitslink. I also miss the immediacy of Hitslink - I can cope with restricting myself to a daily review of the data, but it is disappointing that yesterday's results are consistently understated even after midnight Pacific time. In Google Analytics the tendency is to emphasise the predominant patterns rather than highlight the interesting exceptions - that does support a focussed, prioritised, action-oriented approach but makes for a less exciting monitoring experience.

The security question is not something we need to concern ourselves with, and the Beta experience has gone well so far. I hope Google are getting what they need out of it to improve their users' experience, as we are committed to continuing with it.

Tuesday, February 14, 2006

 

Big Daddy is on the move!

Yesterday saw the biggest spike on Rankpulse for the last year, and our checks on McDar suggest Big Daddy search results are now offered by nearly half the data centers that we can see.

No reaction on the SEO forum yet - so you could have heard it here first!

Not too sure what the implications are for us - we slip a bit on some keywords and improve on a few.

The SEO forum view seems to be that the spam sites are still very much in evidence, but then you don't tend to hear much from the spammers who have been caught and eliminated. On the other hand the Google spin seems to be that this is the introduction of new infrastructure that will progressively clean things up, and if that turns out to be correct, searchers and white hat optimizers can only benefit. Thankfully this does not seem to be much of a problem in our neck of the woods - Classical Cat and Karadar consistently take the first pages, and then we get some specialists and some oddities like Camper Van Beethoven competing with us. The ringtone arena is a little more problematic, but even there the spammers are not too evident; it is more an issue of overlapping generations of technology.

Monday, February 13, 2006

 

Five Common Myths About Search Engine Optimization

Here's some nice common sense on a sometimes confusing topic:

Common Myths About Search Engine Optimization

Picture this scene, an adolescent boy walks into a barber shop and says to the barber, "Don't touch me, I'm only here because my mom forced me." Search engine optimizers are sometimes put into the position of the barber. They are knowledgeable and willing to work on their client's site, but the client doesn't want any modifications done to the text that is visible on her web pages. This kind of dilemma occurs due to general misconceptions about search engine optimization. Let's look at these misconceptions.

1. SEO only involves writing meta tags and working on 'invisible' code

Many people want to get a high ranking for various keywords or keyword phrases, but if you look at the text on their web pages you can hardly find these vital words. They come to a search engine optimizer and think that he or she will sprinkle these words into the meta tags and it will work like magic. This is a major misunderstanding.

It is true that your main keywords and key phrases should be in your title tag and your description meta tag, and even in the keywords meta tag, but they must also appear on the page itself, and in some strategic places on that page. Some clients say, "But I like the way it looks now." You may like the way it looks, but the search engines will not recognize that your page is truly about Electronic Widgets unless these words appear in headlines on the page, in the opening paragraph, in the file or domain name, in link text and in the body text of your page.
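Using the article's own Electronic Widgets example, here is a
minimal sketch of that placement - the file name and copy are
invented for illustration:

<head>
<title>Electronic Widgets - Buy Electronic Widgets Online</title>
<meta name="description" content="Electronic widgets for every
application, with secure online ordering.">
</head>
<body>
<h1>Electronic Widgets</h1>
<p>Our <a href="electronic-widgets.html">electronic widgets</a>
are built to order...</p>
</body>

The key phrase appears in the title tag, the description meta
tag, a headline, the file name, link text and the opening body
copy - not just in the 'invisible' tags.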

So, by all means if you already have copy that works, that can convert visitors into buyers or otherwise accomplish the purposes of your site, keep it. But you should also be ready to listen to what the optimizer has to say about modifications that will enable search engines to select your site when a potential buyer makes a query for your key words or phrases.

2. Search Engine Optimization is Tricking the Search Engines

Some clients say, "Don't touch the visible copy but put in the modifications invisibly." Using invisible text is something that can get you banned from a search engine. The main purpose of search engine optimization is to give your website the best possible chance to come up in good positions when someone makes a query for your keywords or key phrases. The key to doing this is to design web pages and write copy that is intelligible to search engines, without sacrificing the experience and understanding of your end-users, the people who visit your site. So, don't ask your SEO professional to try to trick the search engines, but work with him or her to present your website in the best possible way.

3. Search Engine Optimization deals mainly with onsite modifications

Even if your website is well designed, has proper meta tags and has keyword-rich text, this alone does not guarantee that your site will rank high in competitive queries. All of these factors, design, meta tags, and copy, are on-site factors. Search engines certainly take them into consideration, but they also value off-site factors such as how many high quality or authoritative websites link to you. This means that hand-in-hand with your on-site optimization you and your promotion team will have to embark on a campaign to get links to your websites coming from websites that are already highly regarded by the search engines and by the public in general.

4. Search Engine Optimization works instantly

Don't expect to get a flood of traffic right after your site has been optimized. Some search engines work in a fairly rapid manner, but the main search engine at the present moment, Google, is believed to have deliberately put an aging delay into its algorithm. This means that it may take several months before your site makes it into the top results for your particular category, especially if it is a newly created site. During this initial period you will also have to consider using other promotional methods such as pay per click advertising, article marketing, joint ventures, paid advertising in ezines and offline advertising.

5. Search Engine Optimization is Prohibitively Expensive

While it is true that very large organizations, ordering services from the top SEO companies, can end up spending thousands of dollars on their optimization campaigns, search engine optimization can be the most inexpensive and cost-effective option for web site promotion.

If you launch a modest pay per click campaign and pay five cents per click and get 100 clicks per day, then your cost is $5.00 per day or $1825 per year. If you learn how to optimize your pages by yourself you may be able to get natural search engine traffic without paying the pay per click fees. This is in fact what many webmasters do. Or, if you opt for a modest search engine optimization package from a professional you can end up spending less than the pay per click fees.

So the next time you hear one of the myths about search engine optimization don't accept it blindly.



About The Author: Donald Nelson is a web developer, editor and social worker. He is the proprietor of A1-Optimization http://www.a1-optimization.com and provides search engine optimization, copywriting, reciprocal linking and article marketing services.

As regular readers know we do our own site optimisation and have established a reasonable level of organic traffic from Google - our sales conversion rates are still not high enough to justify pay per click advertising.

Friday, February 10, 2006

 

Word Research for Copy Writers

Here is an interesting article that came in this week:

Marketing Research: Individual Words

We all know that phrases like "Who else wants to know" in a
headline can improve our sales. Have you ever thought
about the individual words and their impact on your
profitability?

I recently performed a statistical analysis on several
thousand ads while looking at individual words and
profitability.

The first task was to determine the profitability of each
ad being analyzed. This was done using the age-old
mailorder marketing method. Basically, if you see an
advertisement month after month and year after year, it is
probably profitable. If you see an ad only once or twice
and then it changes or disappears completely, the
advertisement was probably not very profitable.

The next task was to simply look for the occurrence of a
list of words in each ad while noting whether the ad was
profitable or not. The results were tallied and lots of
words were removed from the list because there simply
wasn't sufficient data to come up with a statistically
significant result.

I won't bore you with the rest of the details. Here is a
list of some of the words found much more often in
profitable ads than in ads that didn't produce a profit:

accessories, an, best, blue, buy, by, causes, cheap,
discount, discover, easily, fast, find, guaranteed, has,
improve, increase, lower, more, nationwide, near, need,
of, on, one, order, payments, powered, pricing, rates,
reduce, stop, superb, the, view, what, with

Here is a list of the words found much more often in ads
that were NOT profitable:

affordable, after, and, as, at, before, better, help,
here, how, else, excellent, experience, for, led, listings,
loan, method, money, mortgage, naturally, now, options,
photos, search, secret, secrets, sell, step, to, try,
unlimited, us, who, you, your

Now keep in mind that correlation cannot prove causality.
This research isn't saying that all ads that use the word
"excellent" are doomed to being unprofitable. However, it
is saying that a statistically significant percentage of
ads that use the word "cheap" are profitable and a majority
of those that use the word "affordable" are not profitable.

If your ad copy currently uses the word "affordable" (a
word from the "bad" list above) and you change that word
to "cheap" (a word from the "good" list above), will your
profitability increase? There are no guarantees. There
are an unlimited number of factors that could impact that
result. Not ALL ads that use the word "cheap" were
profitable. Not ALL ads that use the word "affordable"
were unprofitable. However, the use of the word "cheap"
instead of "affordable" is more likely to improve your
profitability.

You still need to split test to find out the answer in any
particular situation. But, why not start out with the most
likely words to be profitable in ad copy generally
speaking?

Take a look at your current ad copy and see if you can find
any of the words in the "bad" list that have good
replacements in the "good" list. Run a split test and see
if your profitability increases. What can it hurt to put
some math on your side?
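As a minimal sketch of such a split test - the cookie name and
the ad copy are invented for the example:

<?php
// Assign each new visitor to version A or B at random and
// remember the choice in a cookie for consistency.
if (!isset($_COOKIE['ad_version'])) {
    $version = (rand(0, 1) == 0) ? 'A' : 'B';
    setcookie('ad_version', $version, time() + 86400 * 30);
} else {
    $version = $_COOKIE['ad_version'];
}

// Version A uses 'affordable' (from the bad list); version B
// swaps in 'cheap' (from the good list).
echo ($version == 'A')
    ? 'Affordable widgets delivered nationwide'
    : 'Cheap widgets delivered nationwide';
?>

Log orders against the cookie value, and once each version has
had enough traffic you can see which word actually converts
better.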




James D. Brausch is the creator of QuitThatJob.com,
a step-by-step coaching membership site to help you build
an Internet business with residual income that will help
you Quit That Job! QuitThatJob.com is based on James'
actual method of building his own business and real
research like you found in this article.
http://www.QuitThatJob.com
qtjarticle@yahoo.com


Some of the recommended 'good' words, such as 'cheap', jar on this side of the Atlantic, but there is a lesson there too, because most of our visitors come from the US - could we be falling between the two stools of a supposedly common language?

Some food for thought...

Thursday, February 09, 2006

 

Welcome to Visitors from Mainland China

One of the startling things about working on the internet is the occasions when you spot the impact of some international event on your business within days or weeks of it happening.

So it was very pleasant to note that the controversial action taken by Google to offer a censored service to the citizens of mainland China is already having a visible impact on their activity and their interactions with the outside world. In the last couple of weeks we have had our first visitors from the cities of Beijing and Shanghai - so welcome to you. I'm not sure if PayPal or BitPass have got to you yet, but in the meantime you are welcome to our free offerings and of course our streaming samples.

Wednesday, February 08, 2006

 

Traffic Steady : Sales Sporadic

Having established a reasonably steady and high level of visitors we have experienced some good sales periods but we have also seen some droughts. It is very hard to know how to respond apart from getting really depressed in the droughts.

It is evident from the bounce analysis that up to half our visitors can see at a glance that our site does not have what they want - whether that is to do with the extent of our catalog, our emphasis on MP3 and payment, or the absence of sheet music - and they leave promptly. Google make good use of our description tag content, so the search results there are usually pretty clear on those issues, but I suspect that many searchers ignore them.
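For anyone wondering what the bounce analysis amounts to: a "bounce" is normally a session with exactly one pageview, so the rate is simple to compute. A minimal sketch, with invented session data:

```python
# Minimal sketch of a bounce-rate calculation: a "bounce" is a
# session with exactly one pageview. The sessions are invented.
sessions = {
    "visitor-1": ["/", "/samples", "/buy"],
    "visitor-2": ["/itunes-help"],
    "visitor-3": ["/"],
    "visitor-4": ["/", "/ringtones"],
}

bounces = sum(1 for pages in sessions.values() if len(pages) == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # 50% for the data above
```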

Some, like the visitors to our iTunes help page, take their time to read the content we offer and then move on to other sites, ignoring our offers of original MP3s. As it happens they do get to hear a sample playing while they read, and even that fails to entice them further into the site. Some visitors move on to listen to a few ringtones, but the vast majority choose to listen to at least one of our samples and unaccountably still don't make a purchase.

Apart from purchases, all this activity gives us very little feedback about the quality of our music, so perhaps that is where we need to apply some effort.

Tuesday, February 07, 2006

 

BigDaddy is still evolving

By keeping an eye on some of our keywords through MacDar each day, we have seen how the results on the BigDaddy data centers have been evolving. In several cases they are still changing - today coming back towards the current rankings, in others drifting further away. At the same time Rankpulse is showing a consistently higher level of activity - nearly double the usual number of results changes - over the last week or so.

Analytics is back after its short intermission, and we now have sufficient data for week-to-week comparisons. There is a slight decline on both Google and Yahoo, so it doesn't look as if this churn is having a significantly adverse effect on our visitor volumes. It also looks as if our efforts on the iTunes page were not totally in vain, as we have reduced the bounce rate slightly. In this instance the average viewing time suggests that we have succeeded in providing relevant content, because it is long enough for most people to read the body copy quite carefully. It looks as if we will have to hope that they come back later to hear more of our music - perhaps a bookmark button would be appropriate.
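That reading-time observation is easy to sanity-check with a back-of-envelope calculation; the figures below are illustrative rather than our actual numbers:

```python
# Back-of-envelope check: does the average time on page leave room
# to read the body copy? All figures here are illustrative.
BODY_COPY_WORDS = 600        # word count of the page's body copy
READING_SPEED_WPM = 200      # a typical adult reading speed
AVG_TIME_ON_PAGE_SECS = 210  # as reported by Analytics

needed_secs = BODY_COPY_WORDS / READING_SPEED_WPM * 60
print(f"Time needed to read the copy: {needed_secs:.0f}s")
print(f"Average time on page:         {AVG_TIME_ON_PAGE_SECS}s")
# If the average meets or exceeds the time needed, most visitors
# are plausibly reading the copy rather than skimming and leaving.
```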

Monday, February 06, 2006

 

Analytics Hiccup

Yes, I had to check the spelling of Hiccup or Hiccough - both are OK.

At least we hope that is what it is - Analytics has stopped reporting traffic since early on Friday. No comment from Google so far, but the user forums have confirmed that they were not just picking on us!

Still, it is beta, and we have commented before that you have to expect some problems with beta software - and it's free.

More worrying are the comments that it has been affecting the performance of some sites by slowing page loading.

Fingers crossed while they are not drumming.

Friday, February 03, 2006

 

Google Big Daddy SearchQuake About to Rumble Your Ranking?

I thought you might like to see another point of view on the BigDaddy front:

Google Big Daddy SearchQuake About to Rumble Your Ranking?

Copyright © February 1, 2006 by Mike Banks Valentine

Running ranking reports for clients is a standard part of an
SEO's job. This week I created a position report for a client
- one for which we'd made significant gains in ranking for
their targeted search phrase - and proudly sent off the
report to them before a scheduled conference call to discuss
our progress and status.

The client sent an email upon receiving the report saying
"There is something wrong with your report - we rank higher
than this report claims." I went back to Google and typed in
the search phrases to find rankings exactly where the report
showed them the previous day.

I explained to that client that Google has (at last count)
nine data centers which serve up search results and that they
were getting results from a data center in the Eastern US
which showed different results from those shown to us here
in California.

The difference was substantial enough to move the client from
page two to page one in the search results and therefore made
a dramatic difference in their satisfaction with our work.
Differences are rarely that substantial in previously
observed ranking reports, so it prompted me to dig a bit
deeper into the issue and I sent the note below to the client.

"Take a look at this link where Google datacenter IP addrresses
are listed in detail."

http://www.webworkshop.net/seoforum/viewtopic.php?t=548

"Here is an overview of a coming update to all Google
datacenters expected in February or March of 2006."

http://directmag.com/searchline/1-25-06-Google-BigDaddy/

"So you ARE ranking better from your area of the country and
that particular data center which returns results to you.
Things usually update to match in all data centers, but
sometimes you may do better in one data center than
in others. If you search from each individual IP address in
that list discussed in the forum linked above, you'll see
different rankings and may find datacenters where you
rank at the bottom of page two of results."

You might also search from that new "Big Daddy" data center
referenced in that article above, which discusses upcoming
Google ranking algorithm changes due soon.

http://66.249.93.104

Where I'm seeing you ranked at #17 (bottom of page two).

It's a measure of where you might expect to be when Google
moves to that new algorithm for all data centers in February
or March. (Of course we continue to work to achieve better
results before then.)

This upcoming change in algorithm and the interestingly named
server "Big Daddy" were publicly posted on Matt Cutts' blog
for beta testing by SEOs (and other Google watchers) who
read him regularly. (For those who don't know, Cutts is a
software engineer at Google and shares SEO tips on his blog.)

http://www.mattcutts.com/blog/

Of course this news was a bit much for the client to digest
in one chunk, and he had little time to read the articles I
referenced in my note above, but it was enough to assure him
that I knew what I was talking about and to explain the
differences between my report and his own keyword searches at
his end of the country. It's a bit odd trying to explain to a
client that "there are different Googles." Few know or
understand this.

Another issue cropped up later in the day when I was doing
further research for a different client and found, while we
were speaking on the phone, that his results differed from my
own on specific query operator searches. We were using the
"site:businessdomain.com" query operator and the
"allinurl:pick-your-own-URL" query operator to limit search
results and got vastly different numbers of results and
rankings for the same searches.

The first stunning thing in this example was that we are less
than 25 miles apart in Southern California. The second
shocker was that I tried simply hitting the "Search" button a
second time after getting the first results page and things
changed again! All of this happening in a single day makes me
believe that some percolating of results is going on as
Google eases into an algorithm change.

Perhaps this is not all that unusual, but in seven years of
this work, I've not seen the volatility noted in January of
2006. Are we about to have a major SearchQuake? Is Google
about to split the earth and spew volcanic new results? Stand
by for the BigDaddy SearchQuake sometime this month or next.

Mike Banks Valentine blogs on Search Engine developments
from http://RealitySEO.com and can be contacted for SEO work
at: http://www.seoptimism.com/SEO_Contact.htm He operates
a free web content distribution site at: http://Publish101.com

I'm a bit surprised at the geographic comments but I suppose we tend to look at these things a little differently on this side of the pond.

As far as the impact on us goes, the picture that seems to be emerging is that we are being hit hardest on the more competitive keywords while some of the others are standing firm - probably still early days, although some propagation has begun.
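For anyone wanting to repeat that data centre comparison themselves, here is a rough sketch. The only IP listed is the Big Daddy address quoted in the article; the query and domain are placeholders, and automated querying of this sort may well fall foul of Google's terms of service, so treat it purely as an illustration:

```python
# Rough sketch: check whether a domain appears in the first 100
# results returned by individual Google data centres. The IP below
# is the Big Daddy address quoted above; the query and domain are
# placeholders. Illustration only - automated queries may be blocked.
import urllib.request

DATACENTERS = ["66.249.93.104"]     # extend with IPs from the forum list
QUERY = "classical+percussion+mp3"  # illustrative search phrase
DOMAIN = "example.com"              # your own domain goes here

for ip in DATACENTERS:
    url = f"http://{ip}/search?q={QUERY}&num=100"
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("latin-1", "ignore")
    status = "listed" if DOMAIN in html else "not found"
    print(f"{ip}: {DOMAIN} {status} in the first 100 results")
```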

Thursday, February 02, 2006

 

Google Analytics - Overlay Feature

Google have just introduced an overlay feature which gives a graphical indication of how often visitors follow each link on the pages of our site.

This is the first time I have seen anything like this, and it is both shocking and encouraging at the same time. The number of unused links is the shocking part; more gratifying is seeing the effect of some of the design changes we have made.

As yet it is not quite clear to me how the JavaScripted links are handled in this feature - some of the indicators are detached from their buttons, so we have a little more detective work to do there.

Of course some of the links we provide are largely there for optimisation purposes so we don't need to worry about them too much.

This should be a really valuable tool in page text and layout work - good stuff.
