Not hard:
gem environment
You can even get free bonus information about where ruby and rubygems are… but only if you order now. For a limited time, we’ll even throw in the command that got us this gem of an answer: “gem help commands”
Surprising this isn’t better Google-documented. Here’s how I do it.
Make a list of your existing packages in a text file (run from the machine with the packages already installed):
sudo dpkg --get-selections | awk '{print $1}' > installedpackages
Copy the file to your new system. Then run:
cat installedpackages | xargs sudo aptitude install -y
That’ll run through each line of your installedpackages file and run “sudo aptitude install -y” on it.
Surprising this isn’t better Google-documented. Here’s how I do it.
Make a list of your existing gems in a text file (run from the machine with gems already installed):
gem list | tail -n+4 | awk '{print $1}' > gemlist
Copy the file to your new system. Make sure your new system has the packages installed that will be needed to build your gems (such as ruby1.8, ruby1.8-dev, rubygems1.8). Then run:
cat gemlist | xargs sudo gem install
That’ll run through each line of your gemlist file and run sudo gem install on it. If you don’t sudo install your gems, you can remove that bit about sudo.
Or if you want to go for the gold with a single command line:
ssh -o 'StrictHostKeyChecking=no' #{gem_server} #{path_to_gem} list | tail -n+1 | awk '{print $1 $2}' | sed 's/(/ --version /' | sed 's/)//' | tail -n+3 | xargs -n 3 #{path_to_gem} install
After installing REE 1.8.7, I used a slight permutation of this to install my gems with Ruby 1.8.7 from their previous Ruby 1.8.6 installation (previous gem install was accessible as “gem”, REE 1.8.7 gem accessible as “/opt/ruby-enterprise-1.8.7-20090928/bin/gem”):
gem list | tail -n+1 | awk '{print $1 $2}' | sed 's/(/ --version /' | sed 's/)//' | tail -n+3 | xargs -n 3 /opt/ruby-enterprise-1.8.7-20090928/bin/gem install
That reinstalled all my existing gems with the REE 1.8.7 gem manager.
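If the shell pipeline above looks too cryptic, here’s a rough Ruby equivalent of the same idea. This is my own sketch, not part of the original command; the gem path is just the REE one from above, so swap in whatever binary you’re targeting.
# Parse `gem list` output like "rails (2.3.5, 2.2.2)" and reinstall each gem
# at its newest listed version using a different gem binary.
new_gem = "/opt/ruby-enterprise-1.8.7-20090928/bin/gem"   # adjust for your install
`gem list`.each_line do |line|
  next unless line =~ /^(\S+) \(([^)]+)\)/
  name, newest_version = $1, $2.split(", ").first
  system(new_gem, "install", name, "--version", newest_version)
end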
I’ve read two blog posts in the last day that remind me about my favorite part of the Internet, from a consumer perspective: the opportunity for justice to be done to companies whose draconian policies make our lives as consumers worse.
The first post was Joel’s indictment of Circuit City (spot on in my experience… the only store I had ever gone to where I couldn’t make a return, with a receipt in hand for an unopened product purchased less than a week earlier). The second was Lo Toney’s scathing assessment of some random IT company.
As a consumer, there is no more frustrating feeling than the one that comes when you’re dealing with a company that believes it has little to gain by making you happy. In addition to Circuit City, I’ve felt this way dealing with Staples (wouldn’t let me make a totally reasonable return), and Chase+Paypal (pages take 10+ seconds to load on a regular basis). Comcast has made me feel that way quite a bit too, although less so in the last couple months.
What is frustrating as a consumer is even more infuriating as an entrepreneur, because I understand that companies like these often have millions of dollars they could spend to improve service. But when the Powers that Be sit in their boardroom to discuss how to dole out the bounty of revenue, they want to find an equation that describes why they should invest in an improved customer experience. When they can’t find one, the question becomes “Who cares if our unreasonable return policy upsets a few thousand people? Our customers number in the mieeeeeellons.”
But, by facilitating frictionless communication between a huge body of consumers, the Internet has proven to be the great equalizer for these anti-customer companies. And it seems that they are increasingly meeting the justice they deserve. Circuit City is now gone, and Comcast has been spending money in hopes of repairing its image. Just as telling, companies that do care about their customers, like Costco, Amazon, and Nordstrom, have thrived in the “Communication Age” we now live in.
It seems that the cost of leaving your customers frustrated is increasing with every new Facebook, Twitter, or LiveJournal account that gets registered (not to mention Yelp). As an avowed lover of justice, I’d call this trend one of the best developments of the last 10 years. I hope & believe we’ll continue to see movement toward customer-friendly policies, as the communication pathways afforded by the web lay waste to old-guard companies that still don’t “get” that a frustrated customer costs them a lot more than just that one customer’s business.
So you installed the Unbox player when you were logged in as one user at Amazon, and later you purchased and downloaded videos as another user?
Well you’re going to hell, says Amazon.
I’ve just spent half an hour poring exhaustively through every nook and cranny of the Unbox UI, Unbox executable files, and even trying to go through the Modify/Repair options afforded by Windows, all to no avail. So far as I can tell, the Amazon account that you chose when you installed Unbox is the one and only account that you will be using with Unbox, unless you uninstall the entire player and reinstall it.
This is one of the few times that Amazon has chapped my hide. Usually they are decent, reasonable people.
If any commenter knows of a trick to get around this, a gaggle of Google searchers will thank you for it. Me? I just uninstalled and reinstalled Unbox, and wrote a blog about it to pass the time while I waited for the install to download & finish.
Update!
Mega awesome commenter Nick Haas has the answer! Says Nick:
I think I figured it out on Windows 7 at least:
c:\programdata\Amazon
This folder contains another folder which you can just move somewhere else on your computer. The next time you open up Unbox, it will ask you to log in again. I logged in with my other account and also changed the Computer Name and the Download Folder to remind me which account this one was. I then downloaded a free movie with the new account and it worked like a charm. When you want to go back to your other account just switch back to the folder that you moved originally.
I did not find anything to love about the results I was getting searching for queries like “Increase rubymine memory” and “set rubymine memory”. Here’s the location of the file that dictates Rubymine memory usage in my Ubuntu install:
[RUBYMINE_DIRECTORY]/bin/rubymine.vmoptions
Inside you can specify lines like:
-Xms800m
-Xmx1200m
-XX:MaxPermSize=1000m
-ea
These set the initial heap size, the maximum heap size, and the maximum size of the permanent generation, respectively; the -ea at the end just enables Java assertions.
[Hops onto soapbox]
I love Rubymine, but I sometimes wish that instead of adding a ridiculous number of new features, they’d just make the damn thing run consistently fast, and not make me worry about memory usage. I know that with my web site I always have a preference toward adding cool new stuff, but the responsible thing to do is often to perfect the pieces that are already there. Almost everything negative I’ve heard about Rubymine (which isn’t all that much) is in this same vein, but I’ve never heard the Rubymine team admit that it wishes the product ran faster or that its memory management were better.
[Hops back down]
For my money, still definitely the best choice for Linux IDEs.
Since we first signed up with New Relic about six months ago, they’ve impressed me with the constant stream of features that they have added to their software on a monthly basis. When we first signed up, they were a pretty vanilla monitoring solution, and impressed me little more than FiveRuns had previously. They basically let you see longest running actions sorted by average time consumed, and they let you see throughput, but beyond that, there was little reason to get excited at the time.
Since then, they’ve been heaping on great additions. First, they added a new view (requested by yours truly, amongst others) that let actions be sorted not just by the average time taken, but by the arguably more important “Time taken * times called,” which tends to give a better bang-per-buck idea of where optimization time should be spent.
They’ve also been rearranging which features are available at which levels, which has made “Silver” level a much more tempting proposition, with both the “Controller Summary” (described last paragraph) and “Transaction Traces,” which allows you to see which specific database calls are taking longest to complete.
But by far my favorite New Relic feature added is their brand new “Apdex” feature. If you’re a busy web programmer or operator, the last thing you want to do is spend time creating subjective criteria to prioritize which parts of your application should be optimized first. You also don’t want to spend time determining when, exactly, an action has become slow enough that it warrants optimization time. Apdex provides a terrific way to answer both of these prickly, subjective questions, and it does it in typical New Relic fashion — with a very coherent and readable graphical interface.
I’ve included some screenshots of the Apdex for one of our slower actions at right. These show (from top to bottom): the actions in our application, ordered from most to least “dissatisfying”; the performance breakdown of one of our more dissatisfying actions; and the degree to which this action has been dissatisfying today, broken down by hour and put onto a color-coded scale that ranges from “Excellent” (not dissatisfying) down to “Poor.” Apdex measures “dissatisfaction” as a combination of the number of times that a controller action has been “tolerable” (takes 2-8 seconds to complete) and “frustrating” (takes more than 8 seconds to complete).
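For the curious, the underlying math isn’t New Relic magic; it’s the standard Apdex formula: (satisfied + tolerating / 2) / total samples. Here’s a quick Ruby sketch of my own, using the 2-second threshold implied by the buckets above (so “tolerating” is 2-8 seconds and “frustrating” is anything longer); the sample response times are made up.
def apdex_score(response_times, t = 2.0)
  satisfied  = response_times.select { |secs| secs <= t }.size
  tolerating = response_times.select { |secs| secs > t && secs <= 4 * t }.size
  (satisfied + tolerating / 2.0) / response_times.size
end

puts apdex_score([0.4, 1.2, 2.5, 3.1, 9.7, 0.8])   # => roughly 0.67; 1.0 is perfect, 0 is abysmal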
New Relic is to be commended for tackling an extremely subjective problem (when and where to optimize) and creating a very sensible, objective framework through which to filter that decision. Bravo, guys. Now, hopefully after Railsconf they can spend a couple hours running Apdex on their Apdex window, since the rendering time for the window generally falls into their “dissatisfaction” range (greater than 8 seconds) 🙂
But I’m more than willing to cut them some slack for an addition this useful (and this new).
Last week, as part of getting Bloggity rolling with the key features of WordPress, I realized that we needed to allow the user to upload images without doing a page reload. Expecting a task as ordinary as this to be well covered by Google, I dutifully set out in search of “rails ajax uploading” and found a bunch of pages that either provided code that simply didn’t work, or claimed that it couldn’t be done without a Rails plugin.
Not so, as long as you use jQuery and the jQuery form plugin.
The main challenge in getting AJAX uploading working is that the standard remote_form_for doesn’t understand multipart form submission, so it’s not going to send the file data Rails expects along with the AJAX request. That’s where the jQuery form plugin comes into play. Here’s the Rails code for the form:
<% remote_form_for(:image_form, :url => { :controller => "blogs", :action => :create_asset }, :html => { :method => :post, :id => 'uploadForm', :multipart => true }) do |f| %>
Upload a file: <%= f.file_field :uploaded_data %>
<% end %>
Here’s the associated Javascript:
$('#uploadForm input').change(function(){
  $(this).parent().ajaxSubmit({
    beforeSubmit: function(a, f, o) {
      o.dataType = 'json';
    },
    complete: function(XMLHttpRequest, textStatus) {
      // XMLHttpRequest.responseText will contain the URL of the uploaded image.
      // Put it in an image element you create, or do with it what you will.
      // For example, if you have an image element with id "my_image", then
      // $('#my_image').attr('src', XMLHttpRequest.responseText);
      // will set that image tag to display the uploaded image.
    }
  });
});
And here’s the Rails controller action, pretty vanilla:
def create_asset
  @image = Image.new(params[:image_form])
  @image.save
  render :text => @image.public_filename
end
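The Image model isn’t shown in this post, but uploaded_data and public_filename are attachment_fu conventions, so a minimal model along these lines would fit. This is a guess at what it looks like; your actual model (storage backend, size limits, and so on) may well differ.
# Minimal attachment_fu-style model (hypothetical; adjust to taste).
class Image < ActiveRecord::Base
  has_attachment :storage => :file_system,
                 :content_type => :image,
                 :max_size => 5.megabytes
  validates_as_attachment
end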
As you can see, all quite straightforward with the help of jQuery. I’ve been using this for the past few weeks with Bloggity, and it’s worked like a champ.
Welcome to the 2000’s, self!
I’m ever so excited to be blogging at a blog that not only understands code highlighting, but doesn’t look like it was crafted by a mad scientist with cataracts in 1992. Now it looks more like it was crafted by a mad scientist without cataracts circa 2008 — which is an entirely more accurate representation of the truth.
That’s the good news.
The bad news? That I don’t have anything meaningful to report in this post.
Maybe I’ll just write some highlighted code instead.
# ---------------------------------------------------------------------------
# options[:except_list]: list of symbols that we will exclude from this copy
# options[:dont_overwrite]: if true, all attributes in to_model that aren't #blank? will be preserved
def self.copy_attributes_between_models(from_model, to_model, options = {})
  return unless from_model && to_model
  except_list = options[:except_list] || []
  except_list << :id
  to_model.attributes.each do |attr, val|
    to_model[attr] = from_model[attr] unless except_list.index(attr.to_sym) || (options[:dont_overwrite] && !to_model[attr].blank?)
  end
  to_model.save if options[:save]
  to_model
end
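And a hypothetical usage example, assuming the method is defined on (or mixed into) your model class and that User is one of your ActiveRecord models; the attribute names here are just for illustration.
# Copy everything except timestamps from an existing user onto a fresh record,
# then save the result.
old_user = User.find(1)
new_user = User.new
User.copy_attributes_between_models(old_user, new_user,
                                    :except_list => [:created_at, :updated_at],
                                    :save        => true)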
Hey hey hey code, you're looking quite sexy this evening -- you come around here often?
Made another pass at incorporating my newer changes to bloggity this evening. Now in the trunk:
With new features come new dependencies, but most of these are hopefully common enough that you’ll already have them:
If you’re already running bloggity, you can update your DB tables by running the migration under /vendor/plugins/bloggity/db/migrations. If not, you can just follow the instructions in the previous bloggity post and you should be good to go.
I’m hoping in the next week to do some more testing of these new features and add a README to the repository, but it’s too late for such niceties this evening.
P.S. Allow me to pre-emptively answer why it’s in Google’s SVN instead of Github.