Copy Ruby Gems Between Ruby Installations/Computers/Systems

Surprisingly, this isn’t well documented on Google. Here’s how I do it.

Make a list of your existing gems in a text file (run from machine with gems already installed):

gem list | tail -n+4 | awk '{print $1}' > gemlist

Copy the file to your new system. Make sure your new system has the packages installed that will be needed to build your gems (such as ruby1.8, ruby1.8-dev, rubygems1.8). Then run:

cat gemlist | xargs sudo gem install

This runs through each line of your gemlist file and runs sudo gem install on it. If you don’t install your gems with sudo, you can drop the sudo bit.

Or if you want to go for the gold with a single command line:

ssh -o 'StrictHostKeyChecking=no' #{gem_server} #{path_to_gem} list | tail -n+1 | awk '{print $1 $2}' | sed 's/(/ --version /' | sed 's/)//' | tail -n+3 | xargs -n 3 #{path_to_gem} install

After installing REE 1.8.7, I used a slight permutation of this to install my gems with Ruby 1.8.7 from their previous Ruby 1.8.6 installation (previous gem install was accessible as “gem”, REE 1.8.7 gem accessible as “/opt/ruby-enterprise-1.8.7-20090928/bin/gem”):

gem list | tail -n+1 | awk '{print $1 $2}' | sed 's/(/ --version /' | sed 's/)//' | tail -n+3 | xargs -n 3 /opt/ruby-enterprise-1.8.7-20090928/bin/gem install

That reinstalled all my existing gems with the REE 1.8.7 gem manager.
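To demystify the awk/sed stage of those one-liners, here’s the same per-line transformation sketched in Ruby (the sample gem and version are made up):

```ruby
# Each `gem list` line looks like "rails (2.3.2)". The pipeline glues the
# name to the parenthesized version, then rewrites the parens so the line
# becomes the three tokens that `xargs -n 3` hands to `gem install`.
line = "rails (2.3.2)"
name, version = line.split
args = (name + version).sub('(', ' --version ').sub(')', '')
puts args  # => rails --version 2.3.2
```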

Justice Served: My Favorite Part of the Internet

I’ve read two blog posts in the last day that remind me about my favorite part of the Internet, from a consumer perspective: the opportunity for justice to be done to companies whose draconian policies make our lives as consumers worse.

The first post was Joel’s indictment of Circuit City (spot on in my experience… the only store I had ever gone to where I couldn’t make a return, with a receipt in hand for an unopened product purchased less than a week earlier). The second was Lo Toney’s scathing assessment of some random IT company.

As a consumer, there is no more frustrating feeling than the one that comes when you’re dealing with a company that believes it has little to gain by making you happy. In addition to Circuit City, I’ve felt this way dealing with Staples (wouldn’t let me make a totally reasonable return), and Chase+Paypal (pages take 10+ seconds to load on a regular basis). Comcast has made me feel that way quite a bit too, although less so in the last couple months.

What is frustrating as a consumer is even more infuriating as an entrepreneur, because I understand that companies like these often have millions of dollars they could spend to improve service. But when the Powers that Be sit in their boardroom to discuss how to dole out the bounty of revenue, they want to find an equation that describes why they should invest in an improved customer experience. When they can’t find one, the question becomes “Who cares if our unreasonable return policy upsets a few thousand people? Our customers number in the mieeeeeellons.”

But, by facilitating frictionless communication between a huge body of consumers, the Internet has proven to be the great equalizer for these anti-customer companies. And it seems that they are increasingly meeting the justice they deserve. Circuit City is now gone, and Comcast has been spending money in hopes of repairing its image. Just as telling, companies that do care about their customers, like Costco, Amazon, and Nordstrom, have thrived in the “Communication Age” we now live in.

It seems that the cost of leaving your customers frustrated is increasing with every new Facebook, Twitter, or LiveJournal account that gets registered (not to mention Yelp). As an avowed lover of justice, I count this trend as one of the best developments of the last 10 years. I hope and believe we’ll continue to see movement toward customer-friendly policies, as the communication pathways afforded by the web lay waste to old-guard companies that still don’t “get” that a frustrated customer costs a lot more than that one customer’s business.

Set/Increase Memory Available in Rubymine

I did not find anything to love about the results I was getting searching for queries like “Increase rubymine memory” and “set rubymine memory”. Here’s the location of the file that dictates Rubymine memory usage in my Ubuntu install:

[RUBYMINE_DIRECTORY]/bin/rubymine.vmoptions

Inside you can specify lines like:

-Xms800m
-Xmx1200m
-XX:MaxPermSize=1000m
-ea

These set the initial heap size (-Xms), the maximum heap size (-Xmx), and the maximum PermGen size (-XX:MaxPermSize); -ea just enables Java assertions.

[Hops onto soapbox]
I love Rubymine, but I sometimes wish that instead of adding a ridiculous number of new features, they’d just make the damn thing run consistently fast, and not make me worry about memory usage. I know that with my web site I always have a preference toward adding cool new stuff, but the responsible thing to do is often to perfect the pieces that are already there. Almost everything negative I’ve heard about Rubymine (which isn’t all that much) is in this same vein, but I’ve never heard the Rubymine team admit that it wishes the product ran faster or its memory management were better.
[Hops back down]

For my money, still definitely the best choice for Linux IDEs.

New Relic Apdex: The Best Reason So Far to Use New Relic

Since we first signed up with New Relic about six months ago, they’ve impressed me with the constant stream of features that they have added to their software on a monthly basis. When we first signed up, they were a pretty vanilla monitoring solution, and impressed me little more than FiveRuns had previously. They basically let you see longest running actions sorted by average time consumed, and they let you see throughput, but beyond that, there was little reason to get excited at the time.

Since then, they’ve been heaping on great additions. First, they added a new view (requested by yours truly, amongst others) that let actions be sorted not just by the average time taken, but by the arguably more important “Time taken * times called,” which tends to give a better bang-per-buck idea of where optimization time should be spent.

They’ve also been rearranging which features are available at which levels, which has made “Silver” level a much more tempting proposition, with both the “Controller Summary” (described last paragraph) and “Transaction Traces,” which allows you to see which specific database calls are taking longest to complete.

But by far my favorite New Relic feature added is their brand new “Apdex” feature. If you’re a busy web programmer or operator, the last thing you want to do is spend time creating subjective criteria to prioritize which parts of your application should be optimized first. You also don’t want to spend time determining when, exactly, an action has become slow enough that it warrants optimization time. Apdex provides a terrific way to answer both of these prickly, subjective questions, and it does it in typical New Relic fashion — with a very coherent and readable graphical interface.

I’ve included some screenshots of the Apdex for one of our slower actions at right. These show (from top to bottom): the actions in our application, ordered from most to least “dissatisfying”; the performance breakdown of one of our more dissatisfying actions; and the degree to which this action has been dissatisfying today, broken down by hour and put onto a color-coded scale that ranges from “Excellent” (not dissatisfying) down to “Poor.” Apdex measures “dissatisfaction” as a combination of the number of times that a controller action has been “tolerable” (takes 2-8 seconds to complete) and “frustrating” (takes more than 8 seconds to complete).
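If you’re curious how the score itself is computed, the generic Apdex formula (this comes from the public Apdex spec, not from New Relic’s code) weights satisfied samples fully and tolerating samples at half, with the tolerating window running from T to 4T seconds; New Relic’s 2-8 second “tolerable” band corresponds to T = 2:

```ruby
# Minimal Apdex sketch: t is the "satisfied" threshold in seconds.
# Samples at or under t are satisfied, t..4t are tolerating, over 4t
# are frustrated (and contribute nothing to the score).
def apdex(response_times, t = 2.0)
  satisfied  = response_times.count { |rt| rt <= t }
  tolerating = response_times.count { |rt| rt > t && rt <= 4 * t }
  (satisfied + tolerating / 2.0) / response_times.size
end

apdex([0.5, 1.2, 3.0, 9.0])  # => 0.625
```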

New Relic is to be commended for tackling an extremely subjective problem (when and where to optimize) and creating a very sensible, objective framework through which to filter that decision. Bravo, guys. Now, hopefully after Railsconf they can spend a couple hours running Apdex on their Apdex window, since the rendering time for the window generally falls into their “dissatisfaction” range (greater than 8 seconds) 🙂

But I’m more than willing to cut them some slack for an addition this useful (and this new).

Rails Ajax Image Uploading Made Simple with jQuery

Last week, as part of getting Bloggity rolling with the key features of WordPress, I realized that we needed to allow the user to upload images without doing a page reload. Expecting a task as ordinary as this would be well covered by Google, I dutifully set out in search of “rails ajax uploading” and found a bunch of pages that either provided code that simply didn’t work, or claims that it couldn’t be done without a Rails plugin.

Not so, provided you use jQuery and the jQuery Form plugin.

The main challenge in getting AJAX uploading working is that the standard remote_form_for doesn’t understand multipart form submission, so it isn’t going to send the file data Rails expects with the AJAX request. That’s where the jQuery Form plugin comes into play. Here’s the Rails code for it:

<% remote_form_for(:image_form, :url => { :controller => "blogs", :action => :create_asset }, :html => { :method => :post, :id => 'uploadForm', :multipart => true }) do |f| %>
 Upload a file: <%= f.file_field :uploaded_data %>
<% end %>

Here’s the associated Javascript:

$('#uploadForm input').change(function(){
 $(this).parent().ajaxSubmit({
  beforeSubmit: function(a,f,o) {
   o.dataType = 'json';
  },
  complete: function(XMLHttpRequest, textStatus) {
   // XMLHttpRequest.responseText will contain the URL of the uploaded image.
   // Put it in an image element you create, or do with it what you will.
   // For example, if you have an image element with id "my_image", then
   //  $('#my_image').attr('src', XMLHttpRequest.responseText);
   // will set that image tag to display the uploaded image.
  }
 });
});

And here’s the Rails controller action, pretty vanilla:

 def create_asset
  @image = Image.new(params[:image_form])
  @image.save
  render :text => @image.public_filename
 end

As you can see, all quite straightforward with the help of jQuery. I’ve been using this for the past few weeks with Bloggity, and it’s worked like a champ.

Me No Blog Hella Ugly!

Welcome to the 2000’s, self!

I’m ever so excited to be blogging at a blog that not only understands code highlighting, but doesn’t look like it was crafted by a mad scientist with cataracts in 1992. Now it looks more like it was crafted by a mad scientist without cataracts circa 2008 — which is an entirely more accurate representation of the truth.

That’s the good news.

The bad news? That I don’t have anything meaningful to report in this post.

Maybe I’ll just write some highlighted code instead.

# ---------------------------------------------------------------------------
# options[:except_list]: list of symbols that we will exclude from this copy
# options[:dont_overwrite]: if true, all attributes in from_model that aren't #blank? will be preserved
def self.copy_attributes_between_models(from_model, to_model, options = {})
	return unless from_model && to_model
	except_list = options[:except_list] || []
	except_list << :id
	to_model.attributes.each do |attr, val|
		to_model[attr] = from_model[attr] unless except_list.index(attr.to_sym) || (options[:dont_overwrite] && !to_model[attr].blank?)
	end
	to_model.save if options[:save]
	to_model
end

Hey hey hey code, you're looking quite sexy this evening -- you come around here often?
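In case the copy semantics aren’t obvious from a skim, here’s a runnable sketch of what the helper does, using a hash-backed stand-in for ActiveRecord (the Model class and all the attribute values here are made up, and blank? is approximated with a nil/empty-string check so it runs outside Rails):

```ruby
# A hash-backed stand-in for an ActiveRecord model.
class Model
  def initialize(attrs); @attrs = attrs; end
  def attributes; @attrs; end
  def [](key); @attrs[key]; end
  def []=(key, val); @attrs[key] = val; end

  def self.copy_attributes_between_models(from_model, to_model, options = {})
    return unless from_model && to_model
    except_list = options[:except_list] || []
    except_list << :id
    to_model.attributes.each do |attr, _val|
      blank = to_model[attr].nil? || to_model[attr] == ''
      next if except_list.index(attr.to_sym) || (options[:dont_overwrite] && !blank)
      to_model[attr] = from_model[attr]
    end
    to_model
  end
end

from = Model.new('id' => 1, 'name' => 'alice', 'email' => 'from@example.com')
to   = Model.new('id' => 2, 'name' => nil,     'email' => 'keep@example.com')
Model.copy_attributes_between_models(from, to, :dont_overwrite => true)
to['name']   # => "alice" (was blank, so copied over)
to['email']  # => "keep@example.com" (non-blank, preserved by :dont_overwrite)
to['id']     # => 2 (:id is always excluded)
```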

Rails Blog Plugin Bloggity v. 0.5 – Now Available for Consumption

Made another pass at incorporating my newer changes to bloggity this evening. Now in the trunk:

  • FCKEditor used to write blog posts (=WYSIWYG, WordPress-like text area)
  • Images can be uploaded (via AJAX) while creating a blog post. You can then link to them via the aforementioned FCKEditor
  • Added scaffolding for blog categories, and allowing categories to have a “group_id” specified, so you could maintain different sets of blogs on your site (i.e., main blog, CEO blog, user blogs, etc. Each would draw from categories that had a different group_id)
  • Blog comments can be edited by commenter
  • Blog commenting can be locked
  • Blog comments can be deleted by blog writer

With new features come new dependencies, but most of these are hopefully common enough that you’ll already have them:

  • attachment_fu (if you want to save images)
  • jquery and jquery-form plugin (if you want to upload images via AJAX. The jquery-form plugin is bundled in the bloggity source code)
  • FCKEditor (if you want to use a WYSIWYG editor)

If you’re already running bloggity, you can update your DB tables by running the migration under /vendor/plugins/bloggity/db/migrations. If not, you can just follow the instructions in the previous bloggity post and you should be good to go.

I’m hoping in the next week to do some more testing of these new features and add a README to the repository, but it’s too late for such niceties this evening.

P.S. Allow me to pre-emptively answer why it’s in Google’s SVN instead of Github.

Rails Fix Slow Loads in Development when Images Missing

I have found it useful to populate my local development database with data from our production server in order to be able to get good test coverage. However, a perpetual problem I’ve had with this approach is that it introduces an environment where sometimes images are available and sometimes they aren’t (the database knows about all the images, but some were uploaded locally, some reside on our main servers, and some are on S3).

What I’ve found is that even though Rails doesn’t give exceptions when it finds missing images, it does start to get painfully slow. Each missing image it has to process usually takes about 2 seconds. On pages with 5-10 missing images, the wait could be quite painful.

So I finally got fed up yesterday and wrote a hacky patch to get around this problem. Here it is:

def self.force_image_exists(image_location)
 default_image = "/images/dumpster.gif"
 if(image_location && (image_location.index("http") || File.exists?(RAILS_ROOT + "/public" + image_location.gsub(/\?.*/, ''))))
  image_location
 else
  default_image
 end
end

This function is part of a utility class (named “UtilityGeneral”) that we use for various miscellaneous tasks. I call this method from a simple mixin:

if RAILS_ENV == 'development'
 module ActionView
  module Helpers #:nodoc:
   module AssetTagHelper
    # Replace the stock image path lookup with one that falls back to a
    # default image when the file doesn't exist locally.
    def path_to_image(source)
     original_tag = ImageTag.new(self, @controller, source).public_path
     UtilityGeneral.force_image_exists(original_tag)
    end
   end
  end
 end
end

If anyone else works locally with images that may or may not exist, this wee patch should come in handy to save you from load times of doom on pages that are missing images. It just subs in an alternate image when the real image doesn’t exist locally.
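If you want to watch the substitution behavior in isolation, here’s the same fallback logic restated as a standalone method (RAILS_ROOT is stubbed to a fresh temp directory, and all the paths are made up):

```ruby
require 'fileutils'
require 'tmpdir'

# Stub RAILS_ROOT to a fresh temp directory so no images exist yet.
RAILS_ROOT = Dir.mktmpdir

def force_image_exists(image_location)
  default_image = "/images/dumpster.gif"
  if image_location && (image_location.index("http") ||
      File.exist?(RAILS_ROOT + "/public" + image_location.gsub(/\?.*/, '')))
    image_location
  else
    default_image
  end
end

# Remote URLs pass straight through.
force_image_exists("http://s3.example.com/pic.jpg")  # => "http://s3.example.com/pic.jpg"

# A local path that doesn't exist (cache-buster query string and all)
# falls back to the placeholder.
force_image_exists("/images/missing.png?123")        # => "/images/dumpster.gif"

# Once the file exists on disk, the original path is returned again.
FileUtils.mkdir_p(RAILS_ROOT + "/public/images")
FileUtils.touch(RAILS_ROOT + "/public/images/missing.png")
force_image_exists("/images/missing.png?123")        # => "/images/missing.png?123"
```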

P.S. When I grow up, I want a blog about coding that lets me paste code.
P.P.S. 4/10: I grew up!

Is it just me?

Or do you notice that every time you visit a website or submit a form, and you get the indefinite spinner of doom, the URL always seems to end in .aspx?

Hmmmm!