Bonanzle: A Year One Retrospective, Part I

I just realized that Bonanzle’s one year anniversary (since beta launch) passed by a month ago without me taking a minute to chronicle the madness of it all. But I’ve never been much of one for hazy-eyed generalities (e.g., “starting a business is hard work,” “business plans are useless,” etc.), which is most likely what this blog would become if written in free form.

I much prefer hard facts and examples, the sort of thing that gets drawn out through the efforts of a skilled interviewer. Unfortunately, since Brian Williams isn’t returning my calls, we’re going to have to settle for a purely fictional interviewer, whom I will name Patsy Brown, on account of the fact that she’s a brownnosing patsy that tosses me softballs to make me sound as smart as possible. Thanks a bunch, Patsy. Mind if I call you Pat? Oh, you’d love it if I called you that? Well, great! Let’s get started.

Pat: So, Bill, it has been an incredible year one for Bonanzle. It’s really something how, with no investment money, this creation has garnered more traction than sites with far more money and far more experience. Even more improbably, it’s been done with a team of two full timers and an ever-changing legion of contractors when comparable sites field teams of 10+ employees. How has this first year compared to your expectations?

Bill: Well thank you, Pat. That’s really nice of you to say. It has been pretty wild, hasn’t it? To be honest, there were very few moments in the early evolution of Bonanzle where I thought I knew what would happen next in terms of traffic, sales, or revenues (early interview tangent: I firmly believe the founders of Twitter had no idea what they were building at first either. I still remember in 2007 when they were advertising it as a means to let your friends know what you were doing on a minute-by-minute basis, which was a pretty dumb premise for a business, if you asked me or the other 3 people that were using it at the time. I am already dreading the inevitable declarations of genius that revisionist historians will bestow upon those founders in the months to come. Anyway, we now return to your regularly scheduled interview. And yes, I have no business interrupting this interview before the first paragraph has been finished). I mean, we spent more than a year building Bonanzle as a site that would compete with Craigslist, and it wasn’t until a couple months after we launched (in beta) that we realized that there was simply no market for a better Craigslist.

Once we figured that out and re-geared ourselves as a utopia for small sellers, the first few months were pretty unreal: we grew 10-15% larger with every passing week. That was incredibly tough to manage, because at the time we were increasing the load on our servers by a factor of 2-3 every month, and I was still only learning how to program web sites. If I had set expectations, those early months would have certainly blown them out of the water.

Pat: Can you give us a story to get a sense of just how hectic those early months were?

Bill: Sure, Pat. One memorable Bonanzle moment for me was a week back in October, when I was housesitting for some friends. This was at a time when our traffic was starting to push into the hundreds of thousands of unique users, and our servers were in what I think could best be termed “perpetual meltdown mode.” I remember one particular night where I was up until 4 AM, fiendishly working on some improvements to our search so that it wouldn’t crash. The Olympics were on TV in the room, and I felt like I had an intimate bond with the athletes — I mean, it takes a certain type of insanity to work out for thousands of hours to become the best athlete in the world, and it takes a similar type of insanity to lock oneself up in a room for 12-14 hours per day and try to scale a site up to hundreds of thousands of visitors with no prior experience. Generally, a team of many experienced programmers is going to be required for that amount of traffic, but, being on the brink of going broke, that wasn’t an option. So I pried my eyelids open until I finished the search upgrades, and wearily made my way back to bed to get up early and repeat.

It turned out that “getting up early” in this case meant about three hours after I went to sleep, when I received a then-common automated phone call from our server monitoring center telling me that Bonanzle was down. I dragged myself out of bed, slogged down to the hot computer room, and spent another couple hours figuring out what had gone wrong. When it was fixed, I turned the Olympics back on and basked in our shared absurdity at 8 AM that morning.

Pat: What were some of the key lessons you learned during those months?

Bill: Well, other than technical lessons, the most salient lesson was that, when you find a way to solve a legitimate pain, amazing things can happen. In our case, by building a marketplace as easy as Craigslist with a featureset that rivaled eBay’s, we had what seemed like every seller online telling us how relieved & empowered they felt to have discovered us. Then they told their friends via blogs and forums. It was heady stuff. Our item count rocketed in a way that we were told had never been seen amongst online marketplaces not named “eBay,” as we shot from 0 listings to one million within our first few months.

Pat: But with great success comes great responsibility, right? Tell me about how you dealt with managing the community you suddenly found at your (virtual) doorstep.

Bill: That was a real challenge, but something that was really important to us to get right. I have frequently said that, being a Northwest company, one of my foremost goals is for us to live up to the legacy of the great customer-centric companies that have come from this region, like Costco, Amazon, and Nordstrom. Customer service is a company’s best opportunity to get to know its users and develop meaningful trust. As we started to appreciate the amount of time and effort required to keep thousands of users satisfied, we realized it was going to become a full-time job, and that’s when Mark was anointed the full-time Community Overseer.

Pat: Tell me about your relationship with Mark and what he has been to Bonanzle.

Bill: The pairing of Mark and me is the sort of thing you read about in “How to Build a Business” books. Our personalities are in many ways diametrically opposed, but in a perfectly compatible way for a business that requires multiple unrelated skillsets. Mark is patient, I am impatient. Mark is happy dealing with people for hours on end, I am happy dealing with computers for hours on end. Mark is content to be on an amazing ride, I am never, ever satisfied with what we have and constantly looking forward.

Fortunately for us, there are also a few qualities that we have in common. We are both OK with constant chaos around us, which is assured in any startup (though I’d say that goes doubly for any community-driven startup). We both enjoy what we do, so we don’t mind 10-12 hour days, 6 days per week (right, Mark? :)). And I think we’re both generally pretty good at putting ourselves in other people’s shoes, painful though that sometimes can be when we can’t get something exactly the way we want it due to resource constraints.

Pat: I hear “community” as a common theme amongst your answers. Tell me about the makeup of the Bonanzle community and what their role has been in the building of Bonanzle.

Bill: Well, I think it’s pretty obvious within a click or two of visiting the site that Bonanzle is the community. Almost all of the key features that differentiate us from other marketplaces revolve around letting the many talents of our community shine through. It starts from the home page, which is composed of catchy groups of items curated by our numerous users with an eye for art/uniformity. Real time interaction, another Bonanzle cornerstone, relies on our sellers’ communication talents. And the traffic growth of the site has been largely driven by the efforts of our sellers to find innovative ways to get people engaged with Bonanzle: from writing to editors about us (it was actually the efforts of a single user that got us a feature story in Business Week), to organizing numerous on-site sales (Christmas in July, etc.) that drive buyers from far and wide.

I think that, from a management standpoint, it’s our responsibility to strive to keep out of the way of our sellers. In so doing, the embarrassment of riches we have in member talent can continue to build Bonanzle in ways that we’d have never even considered.

Pat: If you’re just joining us, I’m talking with Bill Harding, Founder of Bonanzle.com, about the experience of his first year of running Bonanzle. Please stay tuned — when we return, I’m going to talk to Bill about what he sees in today’s Bonanzle, and what he predicts for the future of Bonanzle. With any luck, I’ll even get him to answer the eternal question of which is tastier between pizza, nachos, and pie. But for now, a word from our sponsor:


Determine location of ruby gems

Not hard:

gem environment

You can even get free bonus information about where ruby and rubygems are… but only if you order now. For a limited time, we’ll even throw in the command that got us this gem of an answer: “gem help commands”
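In case you want to inspect the merchandise before ordering, the output looks roughly like this (abbreviated and illustrative; your versions and paths will differ):

RubyGems Environment:
  - RUBYGEMS VERSION: 1.3.5
  - RUBY VERSION: 1.8.7 (2009-06-12 patchlevel 174) [i486-linux]
  - INSTALLATION DIRECTORY: /usr/lib/ruby/gems/1.8
  - RUBY EXECUTABLE: /usr/bin/ruby1.8
  - GEM PATHS:
     - /usr/lib/ruby/gems/1.8
     - /home/you/.gem/ruby/1.8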

Copy aptitude packages between Linux (Ubuntu) Computers/Systems/Installations

Surprising this isn’t better Google-documented. Here’s how I do it.

Make a list of your existing packages in a text file (run this from the machine with the packages already installed):

sudo dpkg --get-selections | awk '{print $1}' > installedpackages

Copy the file to your new system. Then run:

cat installedpackages | xargs sudo aptitude install -y

That will run through each line of your installedpackages file and run “sudo aptitude install -y” on each package.
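If the two machines can reach each other, you can skip the intermediate file entirely. A sketch, assuming you have ssh access to the old box (the user@oldbox hostname is hypothetical):

ssh user@oldbox "dpkg --get-selections | awk '{print \$1}'" | xargs sudo aptitude install -y

The escaped \$1 keeps your local shell from expanding it before the command reaches the remote machine.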

Copy Ruby Gems Between Ruby Installations/Computers/Systems

Surprising this isn’t better Google-documented. Here’s how I do it.

Make a list of your existing gems in a text file (run this from the machine with the gems already installed):

gem list | tail -n+4 | awk '{print $1}' > gemlist

Copy the file to your new system. Make sure your new system has the packages installed that will be needed to build your gems (such as ruby1.8, ruby1.8-dev, rubygems1.8). Then run:

cat gemlist | xargs sudo gem install

That will run through each line of your gemlist file and run “sudo gem install” on each gem. If you don’t install your gems with sudo, you can remove the sudo bit.

Or if you want to go for the gold with a single command line:

ssh -o 'StrictHostKeyChecking=no' #{gem_server} #{path_to_gem} list | tail -n+1 | awk '{print $1 $2}' | sed 's/(/ --version /' | sed 's/)//' | tail -n+3 | xargs -n 3 #{path_to_gem} install

After installing REE 1.8.7, I used a slight permutation of this to install my gems with Ruby 1.8.7 from their previous Ruby 1.8.6 installation (the #{gem_server} and #{path_to_gem} bits above are Ruby string interpolation from the deploy script that one-liner was lifted from; in my case, the previous gem install was accessible as “gem”, and the REE 1.8.7 gem as “/opt/ruby-enterprise-1.8.7-20090928/bin/gem”):

gem list | tail -n+1 | awk '{print $1 $2}' | sed 's/(/ --version /' | sed 's/)//' | tail -n+3 | xargs -n 3 /opt/ruby-enterprise-1.8.7-20090928/bin/gem install

That reinstalled all my existing gems with the REE 1.8.7 gem manager.

Justice Served: My Favorite Part of the Internet

I’ve read two blog posts in the last day that remind me about my favorite part of the Internet, from a consumer perspective: the opportunity for justice to be done to companies whose draconian policies make our lives as consumers worse.

The first post was Joel’s indictment of Circuit City (spot on in my experience… the only store I had ever gone to where I couldn’t make a return, with a receipt in hand for an unopened product purchased less than a week earlier). The second was Lo Toney’s scathing assessment of some random IT company.

As a consumer, there is no more frustrating feeling than the one that comes when you’re dealing with a company that believes it has little to gain by making you happy. In addition to Circuit City, I’ve felt this way dealing with Staples (wouldn’t let me make a totally reasonable return), and Chase+Paypal (pages take 10+ seconds to load on a regular basis). Comcast has made me feel that way quite a bit too, although less so in the last couple months.

What is frustrating as a consumer is even more infuriating as an entrepreneur, because I understand that companies like these often have millions of dollars they could spend to improve service. But when the Powers that Be sit in their boardroom to discuss how to dole out the bounty of revenue, they want to find an equation that describes why they should invest in an improved customer experience. When they can’t find one, the question becomes “Who cares if our unreasonable return policy upsets a few thousand people? Our customers number in the mieeeeeellons.”

But, by facilitating frictionless communication between a huge body of consumers, the Internet has proven to be the great equalizer for these anti-customer companies. And it seems that they are increasingly meeting the justice they deserve. Circuit City is now gone, and Comcast has been spending money in hopes of repairing its image. Just as telling, companies that do care about their customers, like Costco, Amazon, and Nordstrom, have thrived in the “Communication Age” we now live in.

It seems that the cost of leaving your customers frustrated is increasing with every new Facebook, Twitter, or LiveJournal account that gets registered (not to mention Yelp). As an avowed lover of justice, I count this trend among the best developments of the last 10 years. I hope & believe we’ll continue to see movement toward customer-friendly policies, as the communication pathways afforded by the web lay waste to old-guard companies that still don’t “get” that a frustrated customer costs a lot more than whatever they saved by frustrating them.

Set/Increase Memory Available in Rubymine

I did not find anything to love about the results I was getting searching for queries like “Increase rubymine memory” and “set rubymine memory”. Here’s the location of the file that dictates Rubymine memory usage in my Ubuntu install:

[RUBYMINE_DIRECTORY]/bin/rubymine.vmoptions

Inside you can specify lines like:

-Xms800m
-Xmx1200m
-XX:MaxPermSize=1000m
-ea

In order, those set the initial heap size (-Xms), the maximum heap size (-Xmx), and the cap on the JVM’s permanent generation, where class metadata lives (-XX:MaxPermSize); the last line (-ea) just enables Java assertions.

[Hops onto soapbox]
I love Rubymine, but I sometimes wish that instead of adding a ridiculous number of new features, they’d just make the damn thing run consistently fast, and not make me worry about memory usage. I know that with my web site I always have a preference toward adding cool new stuff, but the responsible thing to do is often to perfect the pieces that are already there. Almost everything negative I’ve heard about Rubymine (which isn’t all that much) is in this same vein, but I’ve never heard the Rubymine team admit that it wishes the product ran faster or managed memory better.
[Hops back down]

For my money, it’s still definitely the best choice amongst Linux IDEs.

New Relic Apdex: The Best Reason So Far to Use New Relic

Since we first signed up with New Relic about six months ago, they’ve impressed me with the constant stream of features that they have added to their software on a monthly basis. When we first signed up, they were a pretty vanilla monitoring solution, and impressed me little more than FiveRuns had previously. They basically let you see longest running actions sorted by average time consumed, and they let you see throughput, but beyond that, there was little reason to get excited at the time.

Since then, they’ve been heaping on great additions. First, they added a new view (requested by yours truly, amongst others) that let actions be sorted not just by the average time taken, but by the arguably more important “Time taken * times called,” which tends to give a better bang-per-buck idea of where optimization time should be spent.

They’ve also been rearranging which features are available at which levels, which has made the “Silver” level a much more tempting proposition, with both the “Controller Summary” (described in the last paragraph) and “Transaction Traces,” which allows you to see which specific database calls are taking longest to complete.

But by far my favorite New Relic feature added is their brand new “Apdex” feature. If you’re a busy web programmer or operator, the last thing you want to do is spend time creating subjective criteria to prioritize which parts of your application should be optimized first. You also don’t want to spend time determining when, exactly, an action has become slow enough that it warrants optimization time. Apdex provides a terrific way to answer both of these prickly, subjective questions, and it does it in typical New Relic fashion — with a very coherent and readable graphical interface.

I’ve included some screenshots of the Apdex for one of our slower actions at right. These show (from top to bottom): the actions in our application, ordered from most to least “dissatisfying”; the performance breakdown of one of our more dissatisfying actions; and the degree to which this action has been dissatisfying today, broken down by hour and put onto a color-coded scale that ranges from “Excellent” (not dissatisfying) down to “Poor.” Apdex measures “dissatisfaction” as a combination of the number of times that a controller action has been “tolerable” (takes 2-8 seconds to complete) and “frustrating” (takes more than 8 seconds to complete).
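For the curious, the score itself isn’t a New Relic invention: it comes from the published Apdex standard, where satisfied requests count fully, tolerating requests count half, and frustrating requests count nothing. A minimal Ruby sketch of the math, using the 2 second threshold mentioned above (the method name is mine, not New Relic’s):

# Apdex_T = (satisfied + tolerating / 2.0) / total samples
# satisfied:   response time <= T (2 seconds here)
# tolerating:  response time between T and 4T (2-8 seconds here)
# frustrating: response time over 4T (8+ seconds here), contributes nothing
def apdex_score(response_times_in_seconds, t = 2.0)
  satisfied  = response_times_in_seconds.select { |rt| rt <= t }.size
  tolerating = response_times_in_seconds.select { |rt| rt > t && rt <= 4 * t }.size
  (satisfied + tolerating / 2.0) / response_times_in_seconds.size
end

apdex_score([0.4, 1.2, 3.0, 9.5]) # => 0.625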

New Relic is to be commended for tackling an extremely subjective problem (when and where to optimize) and creating a very sensible, objective framework through which to filter that decision. Bravo, guys. Now, hopefully after Railsconf they can spend a couple hours running Apdex on their Apdex window, since the rendering time for the window generally falls into their “dissatisfaction” range (greater than 8 seconds) 🙂

But I’m more than willing to cut them some slack for an addition this useful (and this new).

Rails Ajax Image Uploading Made Simple with jQuery

Last week, as part of getting Bloggity rolling with the key features of WordPress, I realized that we needed to allow the user to upload images without doing a page reload. Expecting that a task as ordinary as this would be well covered by Google, I dutifully set out in search of “rails ajax uploading” and found a bunch of pages that either provided code that simply didn’t work, or claimed that it couldn’t be done without a Rails plugin.

Not so, as long as you use jQuery and the jQuery Form plugin.

The main challenge in getting AJAX uploading working is that the standard remote_form_for doesn’t understand multipart form submission, so it isn’t going to send the file data Rails seeks back with the AJAX request. That’s where the jQuery Form plugin comes into play. Here’s the Rails code for it:

<% remote_form_for(:image_form, :url => { :controller => "blogs", :action => :create_asset }, :html => { :method => :post, :id => 'uploadForm', :multipart => true }) do |f| %>
 Upload a file: <%= f.file_field :uploaded_data %>
<% end %>

Here’s the associated Javascript:

$('#uploadForm input').change(function(){
 $(this).parent().ajaxSubmit({
  beforeSubmit: function(a,f,o) {
   o.dataType = 'json';
  },
  complete: function(XMLHttpRequest, textStatus) {
   // XMLHttpRequest.responseText will contain the URL of the uploaded image.
   // Put it in an image element you create, or do with it what you will.
   // For example, if you have an image element with id "my_image", then
   //  $('#my_image').attr('src', XMLHttpRequest.responseText);
   // will set that image tag to display the uploaded image.
  }
 });
});

And here’s the Rails controller action, pretty vanilla:

def create_asset
  @image = Image.new(params[:image_form])
  @image.save
  render :text => @image.public_filename
end
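Incidentally, the uploaded_data= setter and public_filename method used above are attachment_fu conventions. Assuming that’s the attachment plugin in play, a minimal Image model might look something like this (the storage and size settings are illustrative assumptions, not prescriptions):

class Image < ActiveRecord::Base
  # attachment_fu handles storage and filename bookkeeping for the upload
  has_attachment :storage => :file_system,
                 :content_type => :image,
                 :max_size => 5.megabytes
  validates_as_attachment
end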

As you can see, all quite straightforward with the help of jQuery. I’ve been using this for the past few weeks with Bloggity, and it’s worked like a champ.

Me No Blog Hella Ugly!

Welcome to the 2000’s, self!

I’m ever so excited to be blogging at a blog that not only understands code highlighting, but doesn’t look like it was crafted by a mad scientist with cataracts in 1992. Now it looks more like it was crafted by a mad scientist without cataracts circa 2008 — which is an entirely more accurate representation of the truth.

That’s the good news.

The bad news? I don’t have anything meaningful to report in this post.

Maybe I’ll just write some highlighted code instead.

# ---------------------------------------------------------------------------
# options[:except_list]: list of symbols that we will exclude from this copy
# options[:dont_overwrite]: if true, all attributes in to_model that aren't #blank? will be preserved
# options[:save]: if true, to_model will be saved after the copy
def self.copy_attributes_between_models(from_model, to_model, options = {})
	return unless from_model && to_model
	except_list = options[:except_list] || []
	except_list << :id
	to_model.attributes.each do |attr, val|
		to_model[attr] = from_model[attr] unless except_list.index(attr.to_sym) || (options[:dont_overwrite] && !to_model[attr].blank?)
	end
	to_model.save if options[:save]
	to_model
end
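If you’d like to take that code out on the town, here’s a quick usage sketch (the model and variable names are hypothetical, and this assumes the method is defined on the model class or a shared base class):

# Copy everything except the timestamps from the old record to the new one,
# preserving any attributes the new record has already filled in:
User.copy_attributes_between_models(old_user, new_user,
  :except_list => [:created_at, :updated_at],
  :dont_overwrite => true,
  :save => true)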

Hey hey hey code, you're looking quite sexy this evening -- you come around here often?