Ajax Apps w/ XMLHttpRequest

Ajax-type behavior can be created in Web applications in a few different ways. One of the more “old school” and cross-browser compatible methods was using a hidden IFRAME element to perform client-server transactions in the background. This can still be an appropriate technique today, but most modern browsers have some degree of support for the XMLHttpRequest object in their JavaScript implementation. If you aren’t already familiar with it, explore What is XMLHttpRequest for details, usage examples and history.
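
For the unfamiliar, a minimal era-appropriate sketch of the object in use might look like the following. The createRequest fallback and the /data.txt URL are illustrative assumptions, not from any particular app:

```javascript
// Minimal cross-browser XMLHttpRequest sketch. The ActiveXObject branch
// covers older IE versions that predate a native XMLHttpRequest.
function createRequest() {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest();                     // standards browsers
  }
  if (typeof ActiveXObject !== "undefined") {
    return new ActiveXObject("Microsoft.XMLHTTP");   // older IE
  }
  return null;                                       // no Ajax support at all
}

function ajaxGet(url, onDone) {
  var req = createRequest();
  if (!req) return false;                // caller can fall back (e.g. IFRAME)
  req.open("GET", url, true);            // true = asynchronous
  req.onreadystatechange = function () {
    if (req.readyState === 4 && req.status === 200) {
      onDone(req.responseText);          // hand the response to the caller
    }
  };
  req.send(null);
  return true;
}
```

In a browser, ajaxGet("/data.txt", callback) fetches the resource in the background and passes the response text to the callback once the request completes.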

What Defines Web 2.0 and The Evolution of the Internet?

A new blog by Gabriel Harper discusses Web 2.0 and how the evolution of the Internet led to Web 2.0. Web 2.0 and the Evolving Web goes into greater detail.

Perhaps not everyone will agree with me on this, but the common thread among definitions of Web 2.0 is an evolving Web. The bubble bursting forced the strong to be stronger, and shoved the weak out of the game entirely. What’s left is the Web that evolved to meet the challenges of a lucrative yet harsh user-centric world.

Finding Hidden Web Marketing Tactics

The best online marketing is low-cost marketing. These marketing offerings are often right under your nose, but very few people can see them because they are so well hidden. The best forms of low-cost marketing may just be bundled with services from companies you already use.

Your Web Design Company: Every company is trying to weather the financial slump in the United States, and the web design company you may have already employed to create your website is no different. Many web design companies now offer marketing along with design in order to keep their customers happy and coming back again and again.

Copywriting and SEO: While many online businesses use SEO tactics and copywriting to promote their products and services, they often turn to bidding sites for the marketing articles they need. In many cases, the same copywriting company used for the more professional side of writing will also employ SEO experts who can write articles perfect for the search engines.

These are just a few simple examples of how you can turn a situation around to find hidden marketing gems. When choosing marketing tactics, looking more closely at the companies you already work with may save a lot of hassle and a little bit of money on the bottom line. Every company wants to earn new customers, but keeping the older ones happy is just as important.

Scriptalicious Payment System Updated

Scriptalicious payment system received some updates:

We are glad to announce that we have updated our payment system to fix problems with orders stuck in PENDING status. Any orders placed prior to this update should have been activated, and any orders you place in the future will be automatically activated once your PayPal payment is complete!

We want to apologize for the inconvenience this has caused the customers who waited for their downloads. This fix will ensure that your payments are processed instantly in the future.

These updates affect orders for the PageRank Script – the free MySpace script is unaffected and can be downloaded by registered users regardless of the previous problems, since no payment is involved.

Quick Search Menu for phpLinkBid

It’s easy to create a quick search menu for phpLinkBid like the one seen on Link Bid Guide (left-hand column, under Quick Search of course). These menu items are added in the header template, but a link to a phpLinkBid search can be placed anywhere. Quick search links can help your visitors find information on any number of keywords; for example, a quick search for “software” could bring up relevant links beyond the software category alone.

A phpLinkBid search term can be passed in the URL parameter q=searchterm. For example, a link along the lines of http://www.example.com/search.php?q=software (substitute your own domain; the search.php path here is illustrative) would bring up directory links on your phplb directory. It’s that simple!
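
As a sketch, a hypothetical helper for building these quick search links. Only the q parameter comes from phpLinkBid itself; the base URL is whatever your installation uses:

```javascript
// Hypothetical helper for building phpLinkBid quick-search URLs.
// "base" is your directory's search URL; only "q" is phpLinkBid's parameter.
function quickSearchUrl(base, term) {
  return base + "?q=" + encodeURIComponent(term); // escape spaces etc.
}
```

Dropping the result into an anchor tag in the header template gives you a quick search menu item for any keyword you like.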

Effect of using keywords in URLs with Google

Many years ago I hardly knew what a search engine friendly URL was, but when I stumbled upon the concept and started moving away from dynamic URLs to friendly, keyword-rich URLs, I noticed a direct and immediate positive impact on how often I was crawled and how well I ranked in Google. Is that still the case? I still use friendly URLs whenever I can, but there doesn’t seem to be much agreement on the subject.
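
For anyone making the same move, the conversion itself is trivial. A hypothetical slug helper (the exact rules are a matter of taste) might look like:

```javascript
// Turn a page title into a keyword-rich "friendly URL" slug.
function slugify(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")  // runs of non-alphanumerics become hyphens
    .replace(/^-+|-+$/g, "");     // trim leading/trailing hyphens
}
```

A dynamic URL like index.php?page=42 becomes something like /effect-of-using-keywords-in-urls/, putting the keywords where crawlers can see them.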

I’m trying to get everyone’s opinion over at eLanceTalk with my Impact of keywords in URLs on Google performance post. It would be great to see some more data on the direct impact. So far the most recent and applicable studies made publicly available are lackluster at best.

phpLinkBid v1.4.1 fixes link approvals, adds CAPTCHA

phpLinkBid v1.4.1 has been released and is available for download immediately. This update resolves the annoying link approval bug, adds CAPTCHA to the contact form and paging to Links Admin, and fixes a few minor bugs.

No template files have been modified since the v1.4 release. If you’re using a custom template, download the Changed Files Only package, or skip the template files when uploading the Full Install.

Read more: phpLinkBid v1.4.1 Released

Never Use Undo When You Need a Warning

I’ve developed a strange habit over the years of obsessively pressing “CTRL-A, CTRL-C” whenever I’m typing – be it in MS Word or WordPress. My own little personal “undo”. Well, I’ve had this problem with my mouse lately where it loves to register a double click when clicked once. For some reason this makes Firefox upset, and as I’m typing in a textarea (such as this one), it will occasionally turn the textarea completely white. The only way to recover is to refresh the page, and in most cases I lose whatever I was typing. Ironically enough, as I was wrapping up this post, guess what happened? I lost everything. Strange habits can be useful.

A List Apart commented on this subject in Never Use a Warning When You Mean Undo, an interesting article that offers no real solutions but sparks some great conversation. It seems like everyone is seeking the “One Way” of appropriate oopsie handling, but the reality of the situation requires a much more complicated policy. The title of this post may suggest disagreement, but in fact I merely suggest that undo is only part of a complete policy.

The most important concept I take from this article is the necessity we face as developers to create apps for humans. Humans are imperfect, and the more robust an app is, the better it should handle that fact. Humans make mistakes, and most Web apps currently do a flimsy job at best of helping a user recover from those mistakes. Sometimes the mistake isn’t even a result of human error, but of technological deficiency in other systems and software.

However, I do not believe that we want apps which condone or promote stupidity. I do believe that we want apps which manage stupidity, and help create better users.

When it comes to managing data, especially in volume, a quick transparent undo could be extremely helpful. But a picture comes to mind of one particular e-commerce app I built which allowed the admin to edit an unlimited number of product records all on one screen. In addition to 20+ fields of data per record, a number of child records were editable for each product on this same screen. Handling and validating all that data was a party unto itself. Products had groups, groups had categories, categories were recursive… recording each and every change would have been insane, and the maintenance and development requirements would be unreasonable. Revisions and rollbacks were important on this project, however, since multiple users were managing multiple records for a high-volume production cart. We deployed two databases so users could play in the sandbox, roll back edits, deletes and creates from the last published revision if needed, and publish their completed work to the production server.

A bunch of solutions have been presented to the oopsie: better confirmation boxes, status flags, logging all changes, etc. And these are all appropriate solutions (or at least part of an appropriate solution) for the right application. For example, I would not allow a transparent undo on data that was part of a collaborative effort, as in the example above, since that undo could affect everyone else’s workflow; I would, however, allow a change request or published revision. A confirmation box is an important element in preventing the cascading effects of human error, and in an enterprise scenario a verbose confirmation process might be your best friend. Another seemingly more intelligent policy would allow transparent undos until the data was committed by another process for the first time, but then we have to ask if the policy is still understandable and manageable.
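
That “transparent undo until committed” policy can be sketched in a few lines. This is purely an illustration, not the actual implementation of any app mentioned here:

```javascript
// Per-record undo stack that is wiped on commit: edits stay freely
// reversible until another process commits the data for the first time.
function UndoStack() {
  this.stack = [];
}
UndoStack.prototype.record = function (previousValue) {
  this.stack.push(previousValue);  // remember the value being replaced
};
UndoStack.prototype.undo = function (currentValue) {
  // Roll back one step, or keep the current value if nothing is undoable.
  return this.stack.length ? this.stack.pop() : currentValue;
};
UndoStack.prototype.commit = function () {
  this.stack = [];                 // no transparent undo past a commit
};
```

Whether users could actually follow such a policy is exactly the question raised above.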

Single-user scenarios are the most appropriate place for transparent undo, but most Web apps probably lie somewhere in between. For example, in most popular forum software users manage their own posts, but a thread is a collaborative effort. Does a post reply “depend” on its parent, and on what attributes of the parent? The rather simple solution to oopsies in vBulletin or phpBB is an edit button that lets you change your post. In some cases there is a time limit on the edit, as in the case of Digg mentioned in the above article. In general it’s a pretty effective solution for a collaborative forum/blog type situation where the dependencies are low, but this still isn’t a single-user scenario, and we could do better.
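
The time-limited edit policy boils down to a single comparison. The 15-minute window below is an arbitrary example, not Digg’s or any forum’s actual setting:

```javascript
// Returns true while a post is still inside its edit window.
function canEdit(postedAtMs, nowMs, limitMinutes) {
  return (nowMs - postedAtMs) <= limitMinutes * 60 * 1000;
}
```

Once the window closes, the post is treated as part of the collaborative record and further changes would need a different mechanism.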

This is a pretty unorganized post, but overall I think confirmation boxes are perfectly acceptable given the state of the Internet. My best advice to a developer is to consider the needs of your user before diving into an overly-complex solution, but I urge anyone to experiment. I am personally trying to improve the usability of my apps every day, and believe that a great spec will identify the right combination of AJAX methodologies, confirmations, clear user messages, logging, states and sessions for the particular project.

Just my ideas – I could probably while away on this for a while. Eheh.

Rand Fishkin Discredits Directories

Rand Fishkin of SEOMoz.org makes several points in this new video diminishing the value of Web directories. I believe Rand made some good points about general link quality that directory owners need to think about whether they like it or not. However, his video is an example to me of how little he understands directories. The fact that he says DMOZ is “good” only underscores this.

The point he was basically making is that buying and selling PageRank is not a healthy business model nor a good plan of action for SEOs. Presumably he decided to pick on directories to make this point since directories (especially paid directories) often tout things like PageRank to build perceived link value while their goal is ultimately to earn money – not to get you traffic or high quality links.

Much more goes into the quality of a link than the PageRank. I couldn’t agree more with this. However, dburdon on DP makes a good point on what is really valuable.

It all depends from where you’re starting. No PR and no traffic makes many directories seem valuable. Once you’re at the top of the tree you can afford to be more discriminating.

Fishkin provides a short list to illustrate what makes a directory link have less value than… well, he doesn’t really specify that part. At any rate, his list reads as follows:

  1. Admit anyone
  2. General subject
  3. Few relevant inlinks
  4. Not a “trusted domain”
  5. = Very low link value

The four points that result in “very low link value” are all things to look out for when adding your site to a directory. Heck, any Web site with no submission guidelines or standards, no relevancy and low link popularity is a pretty big waste of time if you ask me!

Let’s apply these factors to a directory site. What can a directory owner do to combat not only these tangible issues, but devaluing by the less tangible perception that these values exist when they do not?

  1. Provide clear submission guidelines and only accept links that meet these guidelines, even if it means refunding someone who paid for a submission.
  2. Create well-targeted categories with relevant content. You don’t have to be niche to be relevant.
  3. Obtain high quality, relevant links for your directory. Not just links from other directories to your main page, but relevant links to categories from respective niches.

Q3 Google Updates I: New backlinks and oncoming PR?

Some changes are happening with Google that might indicate the expected Q3 PageRank update is really beginning. Webmasters are starting to report changes on certain DCs (64.23.xx.xx, 72.14.xx.xx). Google backlinks on some of these DCs are fresh, as well as reported PR 0 on domains previously unranked (reporting N/A). Whether desired or not, the impact of a Google PageRank update is fairly significant for Search Engine Optimizers, directory and site owners, and Web advertising and SEO/SEM in general. As much as PR is discredited, and as much as Google tries to control the economy of PageRank, it has influence and will be helping some Webmasters either make or lose a lot of money!