The Mothership has Landed

http://www.bonanzle.com

There is no way you could post items for sale faster anywhere on the Internet.

Please support this blog and put some stuff up for sale. The time to take pictures + visit the site + post the items is nearly guaranteed to be less than an hour in total.

Ultimate Guide to Setup SSL on Rails and Apache 2 (with Ubuntu seasoning)

UPDATE 5/24/08: There was a recent APB security bulletin for all those running Debian-based OSes (including Ubuntu) with OpenSSL 0.9.8c (released 2006) onward. You can read about it here. Long story short: if you're running a flavor of Debian, you should run "sudo apt-get update && sudo apt-get upgrade" before you start these instructions, to ensure that you're using the patched version of OpenSSL. And now back to your regularly scheduled programming.

For something that a million billion web sites have to do, the state of documentation on how to get your Rails app running with SSL in a standard Apache configuration is.. how shall we say it.. ass. Even the previously lauded "Advanced Rails Recipes" only scratches the surface of what it really takes to get a Rails app running with SSL. This post aims to change that with a (relatively) comprehensive, front-to-back guide to what I learned in setting up our app with SSL over the last couple of days. Maybe it isn't really the "ultimate" guide, per se, but let's say we're speaking relatively here.

Step 1: Understand as little as possible about what SSL is and how it works.

SSL is the technology that’s working when your browser visits an https:// address. It is typically used by production websites for logging in, updating user data, and of course, submitting top secret data. The point of SSL is to encrypt sensitive user information so that it isn’t sent across the web in plain text form, where it could be read and used maliciously by evil pimpled pirates.

To set up an SSL connection, you need to create a key and certificate. You then need to get that certificate endorsed by a "certificate authority" (henceforth, a CArtel… more on the name in a minute). For testing purposes, that CArtel can be your own website, but if you endorse your own certificate, visiting users will be asked if they want to trust your site as a certificate authority before proceeding to your https:// pages. So real production sites rely on the syndicate CArtels. The biggest CArtels include Verisign, Geotrust, Thawte, and GoDaddy. Generally speaking, the bigger the CArtel, the more they charge you for the same service. Like the Corleone family. In the process of setting up your SSL, you will quickly learn to despise all of the above CArtels for a variety of reasons, most obviously because they charge ridiculous amounts ($20-$500 yearly for the most basic certificate, $1500+ for an EV one) for a trivial service, and despite that exorbitant cost, all three that I tried were broken to varying degrees and a pain in the ass.

Where was I? Oh yes, SSL. An ultimate guide to it.

Step 2: Generating your SSL key and Certificate Signing Request

Before you can enjoy engaging with your CArtel of choice, you must generate an SSL key and a Certificate Signing Request (CSR). The SSL key is used 1) to generate your CSR and 2) by Apache, to compare against the certificate given to you by your CArtel. All the CSR is is a plain text file containing a big base64-encoded string that you'll submit to the CArtel when you're ready.

The process of creating the key and subsequent CSR varies per operating system. My OS is Gutsy Ubuntu, so if you’re running that (or later versions), you’re in luck, cuz I’ve got some explicit instructions ready for you without leaving the comfort of this blog. If you’re not, these instructions may still work with other versions of Linux, but I don’t know what you do in Windows (seriously, Windows? We’re talking about a Rails stack, aren’t we?).

1. Generate a key
In any directory, run "sudo openssl genrsa -des3 -out your_website_name.key 1024" or "sudo openssl genrsa -out your_website_name.key 1024"

The difference between the first and second command: the first one generates your key in such a way that it will require a password to be entered every time it is read (= every time Apache starts). It is possible to run a program to output this passphrase (Google "SSLPassPhraseDialog"), so you can still start Apache automatically on system reboot, but various websites I read suggested that using one of these programs is generally no more secure than just generating a key without a passphrase (which is the second option). Which you choose depends on whether you can have someone around your server whenever it needs rebooting, and how paranoid you are that someone will be able to log onto your box and steal your key (meaning that their porn site is now indistinguishable from your web site (at least from the standpoint of the CArtel)).

If you’re running a version of Ubuntu that doesn’t already have SSL installed, you should just be able to install it with “sudo apt-get install openssl”
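
If you generate a passphrase-protected key and later change your mind, you don't have to start over: openssl can strip the passphrase from an existing key. A quick sketch of both options side by side (the file names and passphrase below are placeholders; -passout is used only to keep the example non-interactive):

```shell
# Option 1: passphrase-protected key (Apache will prompt for it on every start)
openssl genrsa -des3 -passout pass:changeme -out with_pass.key 1024

# Option 2: key with no passphrase
openssl genrsa -out no_pass.key 1024

# Changed your mind? Strip the passphrase from an existing key:
openssl rsa -in with_pass.key -passin pass:changeme -out stripped.key
```

Either way, keep the key files readable only by root (chmod 600), since anyone who can read the key can impersonate your site.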

2. Create a CSR
In the directory that you created your key, run: openssl req -new -key your_website_name.key -out your_website_name.csr

And that’s that.
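
One gotcha worth flagging: of the fields the req command prompts for, the one your CArtel actually checks is the Common Name (CN), which must exactly match the domain you're securing (e.g. www.bonanzle.com, not bonanzle.com). If you'd rather skip the interactive prompts entirely, you can pass the whole subject on the command line with -subj (all values below are placeholders):

```shell
# Generate a key and a CSR non-interactively; the CN must match your domain
openssl genrsa -out example.key 1024
openssl req -new -key example.key -out example.csr \
  -subj "/C=US/ST=Washington/L=Seattle/O=Example Inc/CN=www.example.com"

# Double-check the CSR before sending it off to the CArtel:
openssl req -in example.csr -noout -subject
```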

Step 3: Pick a CArtel, any CArtel! They got stuff you need, and they know it.

Here’s where you get to decide if you want to pay $500 for a mediocre service (Verisign), $20 for a pitiful one (GoDaddy), or something in between for a broken one (GeoTrust, $250). I tried all three of those. My brief summary is thus: Verisign is the biggest name in CArtels, and they charge like it. You can get the exact same certificate VeriSign charges $500 for at GoDaddy for $20. I started setting up a trial certificate at Verisign, but gave up when I realized that they were going to remind every user on my site that our certificate was a trial one until we forked over the $500. I suppose that maybe $500 is worth it for some sites, since you get to put their flashy Verisign logo at the bottom of your SSL pages, but I couldn’t bring myself to subsidize the pork. It feels too much like the Title Insurance company that charges me $1000 when I buy a house for them to run a query in their database.

I actually signed up for a GoDaddy certificate. Their interface is esoteric, clearly designed by an unfed engineer who needed to get home in time for his WoW mission. But, after signing up, then waiting about a day while they verified something-or-other, the GoDaddy certificate worked. It did require a small extra step of downloading an "intermediate certificate" in addition to my signed certificate, but for the price, it was alright.
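
If your CArtel hands you an intermediate certificate like GoDaddy's, Apache needs to be told about it explicitly. The directive goes in the same <VirtualHost *:443> block as the rest of the SSL configuration shown in Step 4 (the bundle file name below is a placeholder — use whatever your CArtel calls theirs):

```apache
SSLCertificateChainFile /etc/ssl/certs/gd_bundle.crt
```

Without this line, browsers that don't already know the intermediate certificate will throw warnings even though your own certificate is fine.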

GeoTrust simply annoys me. I've gone through their signup process twice, and been rejected by their phone verification system as many times. Luckily, that verification system seems to be "just for fun," because, despite failing that mandatory step, they still mailed me a certificate. Their UI is similarly awful to GoDaddy's, and their site appears to be unmanned (clicking on the "login with certificate" button gives a fatal certificate error). But they work, and their logo is fairly well recognized.

I didn’t spend too much time on Thawte or Comodo, but I can’t imagine they’d be much worse than the above.

The ultimate reward for powering through the CArtel process is that they mail you a long string that you paste into a .crt file. In our case, I pasted the certificate given to me by our CArtel into a file I created and named "bonanzle.crt".

Step 4: Copy the key and CSR files + setup your Apache config file (aka: the hard part)

In Gutsy with Apache2, the recommendation given by this helpful page is to copy your files as follows:

sudo cp your_server_name.crt /etc/ssl/certs

sudo cp your_server_name.key /etc/ssl/private

Where “your_server_name.crt = the file you created in the last step” and “your_server_name.key = the file you created in step 2.1”.

Next, you'll need to make sure you have the Apache "SSL" mod enabled (for SSL, duh), and the "headers" mod enabled (for Rails to recognize the SSL). In Apache2, this is accomplished by running "sudo a2enmod ssl" and "sudo a2enmod headers". I did it in the Apache directory (/etc/apache2), but I'm not sure if that mattered.

Finally, you set up your Apache config file. I think this could probably be put equally well in either /etc/apache2/apache2.conf, /etc/apache2/httpd.conf, or /etc/apache2/sites-available/default. All of those files are read on Apache startup. I chose the last of those because it was recommended by the Ubuntu Gutsy SSL setup page (probably just for separation's sake). Here is what I added to my /etc/apache2/sites-available/default file:

<VirtualHost *:443>

# SSL requests should proxy just like normal ones... this is the same
# code I use in my "VirtualHost *:80" block to forward http requests
# to my mongrel cluster... If you have different proxying code,
# you'd paste that here.

RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/system/maintenance.html -f
RewriteCond %{SCRIPT_FILENAME} !maintenance.html
RewriteRule ^.*$ /system/maintenance.html [L]
RewriteRule ^/$ /index.html [QSA]
RewriteRule ^([^.]+)$ $1.html [QSA]


RewriteCond %{DOCUMENT_ROOT}/%{REQUEST_FILENAME} !-f
RewriteRule ^/(.*)$ balancer://railsapp%{REQUEST_URI} [P,QSA,L]

# The actual SSL stuff: make sure the engine is on, enable some options
# that the Gutsy page said I should, and tell Apache where my key and
# certificate files are
SSLEngine on
SSLOptions +FakeBasicAuth +ExportCertData +StrictRequire
SSLCertificateFile /etc/ssl/certs/bonanzle.crt
SSLCertificateKeyFile /etc/ssl/private/bonanzle.key

# Used by Rails. Mentioned in all the Rails SSL tutorials.
RequestHeader set X_FORWARDED_PROTO "https"
</VirtualHost>

Depending on your CArtel, you may also need to ensure that your ServerName and ServerAlias match the “common name” field that you specified. I know I had to do this. This was accomplished by adding the following lines inside my <VirtualHost *:80> block (I did it at the top, but that almost certainly doesn’t matter):

ServerName www.bonanzle.com
ServerAlias www.bonanzle.com

After that, save the file, reload Apache, and cross your fingers. If your Apache restarts successfully, you have completed the most difficult part of the process! If not, visit /var/log/apache2/error.log and get to Googling.

Step 5: Setup Rails with SSL (aka the easy part)

Here’s the part you can find in many existing Rails tutorials that copy/paste the README in the ssl_requirement plugin. Firstly, you install the ssl_requirement plugin (“ruby script/plugin install ssl_requirement”). Then, add near the top of your application controller the line “include SslRequirement”. Then add the following block to your application controller, so your development server will continue working:

def ssl_required?
  return false if local_request? || RAILS_ENV == 'test'
  super
end

Then, any action that should go through SSL just needs to have a line added in its controller, “ssl_required :action_name”. You can also do “ssl_allowed :action_name” if the action can be used either way.
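
Under the hood, all the plugin's before_filter really does is compare whether the request arrived over SSL with whether the action demands it, and redirect across protocols when the two disagree. Here's a simplified, standalone sketch of that decision (illustrative only — not the plugin's actual source, and the method name is made up):

```ruby
# Returns the URL the request should be redirected to, or nil if the
# current protocol already matches what the action requires.
# (Simplified sketch of ssl_requirement's logic, not its real source.)
def redirect_url_for(url, request_is_ssl, ssl_required)
  return nil if request_is_ssl == ssl_required
  if ssl_required
    url.sub(%r{\Ahttp://}, "https://")
  else
    url.sub(%r{\Ahttps://}, "http://")
  end
end

redirect_url_for("http://www.example.com/login", false, true)
# => "https://www.example.com/login"
redirect_url_for("https://www.example.com/login", true, true)
# => nil (already on SSL, no redirect needed)
```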

You should probably also know…
Most CArtels will only validate one domain (validating two domains would mean an extra query for their database) for your X hundred dollars, so you need to make sure that all actions that will use SSL can only go through one address. Specifically, you need to make sure that any possible subdomains map to one address if you don't want to buy multiple certificates. You're probably already doing this if you have a production site, because if you allowed stunts like "bonanzle.com" and "www.bonanzle.com", then users' sessions could also get messed up. However, if you haven't yet set up a redirect from subdomainless URLs to www.* URLs, here's the code I used to do it:

# Redirect all bonanzle.com/ requests to www.bonanzle.com
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^bonanzle\.com
RewriteRule ^(.*)$ http://www.bonanzle.com$1 [R=301,L]

Conclusion
Hopefully this ultimate guide will save you minutes or hours relative to Googling a bajillion sites to put all this data together. Leave a comment if you've gone through this and found it to work or not work. I'd also be curious if anyone else has found a CArtel they would recommend (e.g., one where a UI-minded person or SSL novice could go through the signup process without gasping in horror). Thanks for any feedback.

Quick Book Plug: Advanced Rails Recipes

By far the best book I've gotten to help with all the not-well-documented aspects of getting a site deployed. Beginner Rails help articles are a dime a dozen, but this book is chock-full of example-laden recipes for deployment. You are a fool if you deploy Rails apps and you don't own this book.

Rails Internet Explorer Integration Guide

After about nine months of blissful Firefox-only development, Bonanzle finally started down the long road to Internet Explorer-compatibility a couple months ago. Though I've met few web developers who like the process of supporting IE, browser statistics show that well over 50% of users are still on some version of it (IE 6 has about 30%, IE 7 around 25%), so it's something we have to deal with. Having just about wrapped up our backporting, I thought I'd share a few observations and tips on the process.

First of all, for background, I had never really touched web development of any sort until about 9 months ago. At the beginning of the backport, we had barely opened IE to see how our site would fare in it. As you might guess, the answer was "not well." Because Bonanzle is rife with rich Javascript, and we use CSS-based layouts, few of our 30-ish pages were IE-compatible at the start of our backport. Many of our most substantial pages could not even render in IE without spewing 10+ JS errors.

But with a couple tools and rules, the process of moving toward IE compatibility ended up becoming relatively straightforward, and even our most complex and nuanced functionality has now been coerced into IE compliance.

The most important lesson I would impart to aspiring web developers: use a Javascript library. Like all young Rails sites, we started with Prototype. Once we were ready to take the training wheels off, we started using jQuery. Our backport revealed that only about half of our handwritten JS code worked in IE without modification. Some but not all of our Prototype worked. And almost all the jQuery did.

There are plenty of pages already out on the web dedicated to the comparison of jQuery to Prototype, but suffice to say for our purposes, my relationship with Javascript was an antagonistic one until I met jQuery. Now, I almost look forward to writing JS. Being able to batch select elements using pure CSS selectors, not needing to check for nulls when accessing selectors, and being able to concisely make complex behaviors happen with concatenated method calls are all big reasons. The rich plugin architecture is an even bigger reason. There seemed to be no task too large or small for a cross-browser jQuery plugin. Some of our heavily utilized plugins included the drag and drop ui-*.js, the jqModal plugin, and the jquery delegate plugin. We also worked with a contractor who custom-wrote some jQuery plugins for us that, like all jQuery I’ve encountered, “just worked” in IE (well, after they worked in Firefox, but that’s easy to make happen with Firebug).

If you do choose to go the jQuery route in writing your site, do yourself a favor and look into the JS QueueSpring plugin — it’s discussed in the previous blog, and is ideal for binding jQuery behaviors with your DOM elements in a clean and fast-loading way.
As far as CSS goes, there is no easy way to sum up how to write CSS that is IE6/7-compatible. I think that for all but the most experienced web developers, it is an iterative process. What I can recommend are some tools to speed up your iterations.

First of all, if you're on Windows, you'll need to be able to install the version of IE you don't have (6 or 7). This is most easily done by downloading the IE virtual machines Microsoft provides on their site. These provide an out-of-the-box solution for running IE6 and IE7 side-by-side on your machine (not otherwise possible). They also come bundled with some of the best tools available for figuring out what you're looking at in IE: the Web Developer toolbar and the Script Debugger. The former is basically a wussy version of Firebug that allows you to mouse over elements and see their properties, but not modify those properties dynamically the way Firebug allows. The latter is a fairly lame way to see what's going on in your JS when IE encounters errors. For both IE6 and IE7, you'll need to ensure that you allow script debugging, which is under Tools -> Internet Options -> Advanced.

If you’ve got a big project on your hands, you’ll probably find the Script Debugger to be too barren… from what I’ve seen, it doesn’t allow you to set breakpoints in an arbitrary file, it has no watch window, and you can’t edit code from within it. A better choice is to install Visual Studio and use that as your JS debugger. If you don’t have it, you can download a free, “text only” version of Visual Studio with Ruby in Steel. You can then uninstall the RiS if it’s not your cup of tea (though it should be), leaving Visual Studio installed.

That's the basic framework of what we've used to get our site from IE crashfest to lovable huggable puppy dog. Hopefully this may start you other intrepid cross-browser souls on your journey as well. May you be strong, and repeat after me… "only 12-20 months until IE6 is obsolete."

Rails Plugin: Javascript Unqueue Spring, all Javascript output at page bottom

I’m releasing my second plugin, a simple library for being able to declare Javascript functions or executable code from anywhere within a template or partial file, and have it all output together in a lump at the bottom of your page.

Why?

It offers the following advantages over simply using javascript_tag:

  • Most (all?) browsers pause rendering when they encounter inline JS, so having JS (especially complicated JS) that is scattered throughout your code means your page won’t be visible until said code runs. That’s no fun.
  • Race conditions. If you declare code inside a javascript_tag that refers to a DOM element that isn’t yet instantiated, you’re in for some trouble. Using Javascript queuing, your JS code is assured not to run until the document is ready (i.e., all page elements have been loaded)
  • Easier to find JS code in Firebug if it’s all in one place at the end of the page
  • Best practices. It’s commonly accepted that inline JS is bad bad mojo. Ask Google if you don’t believe me.

Usage

Pretty damn simple.

<% queue_js do
<<-JAVASCRIPT_CODE
// Run an alert to show we exist:
alert("I go at the bottom of the page and run automatically!");

// Do some other senseless thing
$("a").click(function() { alert('An anchor was clicked!'); });
JAVASCRIPT_CODE
end %>

Using this plugin, this would be output by default (i.e., without changing JS to run using Prototype) as follows:

<body>
... (all the HTML rendered during your render pass) ...
</body>
<script>
//<![CDATA[
jQuery(function(){
alert("I go at the bottom of the page and run automatically!");
$("a").click(function() { alert('An anchor was clicked!'); });
});
//]]>
</script>

You can open and reopen (i.e., call the queue_js method) as often as you like during your render call. All JS added during all calls will be queued until the render pass finishes.

If you change to the Prototype option, the JS will be wrapped in an equivalent wrapper that Prototype uses to run JS after the DOM has been fully loaded. For specifics, see js_queue_spring.rb in the source code (hint: it uses Event.observe(window, ‘load’)).

Options

If you want to just add code that isn’t going to be automatically run, you can use the :flat parameter like so:

<% queue_js(:flat) do
<<-JAVASCRIPT_CODE
function someFunc() { alert("Doing nothin until you tell me to!"); }
JAVASCRIPT_CODE
end %>

You can also look in the /lib/js_queue_spring directory to specify whether you want to use jQuery (tested) or Prototype (untested) syntax to wrap your code that should run when the page has finished loading.

Limitations

Caching pages with queued JS is a problem. Since the JS is not output inline with the page, if you cache a partial with JS that is queued, that JS will not be re-rendered when the cached version of the partial is used. For partials that have cached content, you’re probably better off using javascript_tag.

Get it

From your base directory,

svn export http://jsqueuespring.googlecode.com/svn/trunk/ vendor/plugins/js_queue_spring

And that's that. The plugin is configured by default to use jQuery's "jQuery(function(){ })" to run the code after the document has loaded. If your site doesn't use jQuery (well, it probably should, but) you can go into /lib/js_queue_spring and change SPRING_WRAPPER_TYPE to :prototype. Fair warning: I haven't experimented much with using this with Prototype. If anyone uses this option, please report as to whether it works, and ideally, how you made it work.

A bit more info is available at the Google Code project. A bit more instruction is also available in the plugin's readme.

Rails Update Test Fixtures Upon Migration

Ever get the nagging feeling that everyone knows how to do some obvious thing except for you? This is the feeling I've long had with updating my Rails test fixtures. All self-respecting Rails developers know that a strong test infrastructure is a key aspect of any Rails application. And test fixtures are generally seen as the key component that drives test cases. But no Google query I've come up with has shown me a good way to keep my fixtures updated as my database changes. I'm going to discuss our current working solution here.

What are fixtures? (aka: newb background information)

For newbs that might not have started their testing yet: test fixtures represent each table in your database as a single file (usually in YAML format) that contains specific, known records you can refer to in your tests.

For example, we have an items fixture that loads in a bunch of sample item records to which our tests refer. A single record in this items YAML file looks something like:

items_not_valid_not_committed_missing_price:
  shipping_price:
  price:
  title: "dummy item"
  quantity:
  shipping_id:
  id: "71"
  item_status_id:
  category_id:
  committed:
  description:
  seller_id: "3"
  image_id:

Fixtures are usually created initially by exporting the data from one of your real databases (development or production) by running rake db:extract_fixtures. This will create a fixture file for every table, with the records in every table labeled something like "item_001", "item_002", etc. As you can see in the example above, my tendency is to rename these records so that they are more semantic. This makes it easier to remember which fixture record is which when loading them in my tests, with syntax like item = items('items_not_valid_not_committed_missing_price'). It's a lot easier to remember what that item represents later on than having it named item = items('item_001')… and then six months later asking "What was the item number 1 record again? Oh yeah! The not valid item, because it was missing price! Of course."

Sounds fine to me. So what’s the problem?

The problem is what happens when your database changes. Especially for an application in the midst of development, the database might change weekly. However, if you re-export your fixtures from the database, you'll lose all of your custom-named fixture records. You'll also lose any special cases you might have set up. In our case, we have records where we set stuff like "item_expires_at: <%= 1.day.from_now %>", and that ain't gonna fly if you re-export the database.

If you don’t re-export the database, though, then you are left with hundreds of records that are missing (or have extra) fields after your migration. What’s a developer to do?

The partial answer: The Rails Fixture Migration Plugin

This plugin, developed by Josh and Jake, is a good start. After installing it, you can run "rake db:fixtures:migrate" and automatically have the fields that were added or removed in your migrations correspondingly added or removed from your fixtures. But there are a number of caveats:

* The code breaks if it finds empty fixtures. This can be fixed by wrapping lines 14-16 in migrate_fixtures.rb with an if so that they only try to load non-empty fixtures.
* The code evaluates and replaces any inline Ruby in your fixtures. So the previously mentioned "item_expires_at: <%= 1.day.from_now %>" will become "item_expires_at: March 29, 2008" after migration. I've just worked around this by manually replacing those substitutions after running the migration. A bit more annoying is if you have code that does any loops in your fixtures. For example, we have a loop that creates 50 similar items in one of our fixtures, and the fixture migrator simply doesn't understand what this code is doing, giving an exception when it tries to run it. For the time being, I just remove this loop before migration and re-add it after migration. A pain, surely, but less of a pain than adding new fields to the other 100 records in our items fixture.
* The fixture migration fails with an ambiguous error code if any of the fixtures it wants to migrate already exist. The fixture migration uses a schema_info.yml file in your /test/fixtures directory to keep track of the fixture migration. If you, for example, create a new scaffolding that creates both a migration file and a new fixture, the fixture migration tool will break when it gets to the migration that creates the new table, because the fixture for the new table already exists in your directory.
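
For reference, the kind of ERB loop that trips up the migrator looks something like this (names and fields here are illustrative, not our actual fixture):

```yaml
# Generates 50 near-identical item records at fixture load time;
# the fixture migrator tries to eval this and chokes on it.
<% 50.times do |i| %>
bulk_item_<%= i %>:
  id: <%= 500 + i %>
  title: "bulk item <%= i %>"
  seller_id: "3"
<% end %>
```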

All in all, they are surmountable obstacles for us at this point, given the lack of other options.

The “better” answer: Uhh…

I'll be curious to hear whether other developers have a more elegant way to deal with the problem of writing tests that stay relevant as your database changes. One school of thought is that one could just write methods that create the records you need programmatically, avoiding fixtures altogether. This has crossed my mind, but fixtures are nice since they come pre-generated, and I think they are generally easier to read through than a hash that creates a new record. A bigger benefit of fixtures is that they are automatically placed in the test database, so you get all the interconnections between your tables loaded at once. It seems to me like it would be quite a headache to create an entire database of data programmatically.
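
For anyone weighing the programmatic route, the idea is roughly one helper per scenario that merges per-test overrides into a set of sane defaults (a bare-bones sketch with made-up names; in a real test suite the hash would feed Item.create! rather than just being returned):

```ruby
# Minimal sketch of a fixture-replacement helper: known-good defaults,
# with per-test overrides merged on top.
def invalid_item_attributes(overrides = {})
  { "title"     => "dummy item",
    "seller_id" => 3,
    "price"     => nil }.merge(overrides)
end

invalid_item_attributes                   # the missing-price case
invalid_item_attributes("price" => 1500)  # same record, but priced
```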

So, slightly painful fixture migrations it is, so far.

You can get your own copy of the slightly painful fixture migration plugin here.

Rails Optimize Google Maps Load Times

Ain’t it grand when you get to something on your list that has been getting put off for months, only to discover that somebody did it for you a few days ago?

Such was the case with us and our Google Maps. We had long intended to cache Google Maps (accessed through the YM4R plugin) on account of the 1-2 seconds per page load that the maps cost (!). Most of that time is spent running JS — both from Google (that grabs the map tiles and sets up zooming, scrolling, etc.) and from YM4R (don’t know what that JS is doing, but it’s costly).

Anyway, if you are putting maps on your page, and you don’t need the user to be able to move the map around, don’t be a fool: use static Google Maps and take your load time from 2 seconds to .02 seconds.

And being that we are in the midst of a Rails revolution, it should surprise no one that within 48 hours of Google unveiling the static map service (in late February), someone (John Wulff) had already written a gem to use it in Ruby. Unfortunately, gems suck, especially gems like this one with multiple gem dependencies. But there is no reason you can't just get the functionality as a Rails plugin. Here is the file. All you have to do is put it in your /vendor/plugins directory under something like vendor/plugins/static_gmaps/lib and you're golden. Usage for the static map is like so:

@map = StaticGmaps::Map.new(:center   => [ @lat, @lng ],
                            :zoom     => 10,          # gmaps zoom level
                            :size     => [ 120, 75 ], # pixel width, height
                            :map_type => :mobile,     # :mobile or :roadmap; :mobile is a bit more distinct for small maps
                            :key      => YOUR_GMAPS_API_KEY)

Then, you put it in an image tag:

image_tag(@map.url)

Now, I do say, that is an easy way to cut out 2 seconds per page load.

Update: Almost forgot to mention one painful lesson I learned when implementing the static maps, which is that, unlike with the YM4R maps, Google does check your HTTP_REFERER when using static maps. So, to all you localhosts: don't forget to go to Google Maps and grab an API key that works for the IP address in your HTTP_REFERER string. If your referer string doesn't match your API key, all you'll get is a blank image and no explanation.

Setup Mongrel Cluster Options

Damn Google didn’t tell me what I wanted to know when I wanted to know it. I wanted to know what options were available to put in my mongrel_cluster.yml file. Eventually I found them. If Google smiles upon this blog entry, you will find them more quickly than I did.

Usage: mongrel_rails <command> [options]
    -e, --environment ENV       Rails environment to run as
    -d, --daemonize             Whether to run in the background or not
    -p, --port PORT             Which port to bind to
    -a, --address ADDR          Address to bind to
    -l, --log FILE              Where to write log messages
    -P, --pid FILE              Where to write the PID
    -n, --num-procs INT         Number of processors active before clients denied
    -t, --timeout TIME          Timeout all requests after 100th seconds time
    -m, --mime PATH             A YAML file that lists additional MIME types
    -c, --chdir PATH            Change to dir before starting (will be expanded)
    -r, --root PATH             Set the document root (default 'public')
    -B, --debug                 Enable debugging mode
    -C, --config PATH           Use a config file
    -S, --script PATH           Load the given file as an extra config script.
    -G, --generate CONFIG       Generate a config file for -C
        --user USER             User to run as
        --group GROUP           Group to run as
        --prefix PATH           URL prefix for Rails app
    -h, --help                  Show this message
        --version               Show version

These options are the same in the mongrel_cluster.yml file, or if you pass these options through the command line to mongrel_rails. The only exception I’ve found to this rule is that if you want to specify a script to your mongrel_cluster.yml, your line starts with “config_script:” instead of just “script:”. Don’t ask me why.
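
For concreteness, here's roughly what a mongrel_cluster.yml using several of those options looks like (paths and values below are placeholders; note that the cluster-specific `servers` key tells mongrel_cluster how many mongrels to spawn, with `port` as the first port of the range):

```yaml
cwd: /var/www/myapp/current
environment: production
address: 127.0.0.1
port: 8000              # first mongrel listens here, the rest count up
servers: 3              # mongrels on 8000, 8001, 8002
pid_file: tmp/pids/mongrel.pid
config_script: config/mongrel_custom.conf   # note: config_script, not script
```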

Engines Plugin Breaks Rails Exception Notifier No More

Courtesy of one Mr. Andrew Roth (via the Engines mailing list, helpful guys there), here is how to fix errors between Exception Notifier and the Engines plugin:

It was easily fixed by making the exception_notifier an engine by making use of an app folder to play nice with engines (or should we say so engines would play nice with it).

So a find|grep -v svn in my vendor/plugins/exception_notification folder
gives

./app
./app/helpers
./app/helpers/exception_notifier_helper.rb
./app/models
./app/models/exception_notifier.rb
./app/views
./app/views/exception_notifier
./app/views/exception_notifier/exception_notification.rhtml
./app/views/exception_notifier/_backtrace.rhtml
./app/views/exception_notifier/_environment.rhtml
./app/views/exception_notifier/_inspect_model.rhtml
./app/views/exception_notifier/_request.rhtml
./app/views/exception_notifier/_session.rhtml
./app/views/exception_notifier/_title.rhtml
./init.rb
./lib
./lib/exception_notifiable.rb
./README
./test
./test/exception_notifier_helper_test.rb
./test/test_helper.rb

and cat init.rb is just

require "action_mailer"

Rails Engines Performance

Since installing the Engines plugin as part of the Savage Beast setup process, I’ve been burning with curiosity about what kind of performance impact Engines would have on our production app. The cause for my concern was primarily the 100 or so lines that Engines spits out to my debug console for each and every helper method lookup. Surely, scanning through every plugin on every call to any helper can’t be the most efficient thing for us to be doing?

Tonight I ran some load tests with Apache AB to see if there was any substance to these concerns. Each test did 500 requests on a sample of three random pages from our site (running in production mode). Here were my results:

Looks like Engines is a far different beast in production vs. development (as James said). The difference between integrating SB into the app directly vs. integrating it via engines is well within the margin of error for these tests. Thus, I am no longer concerned about the marginal performance impact of Engines for the time being.

Now, on to fixing its conflict with the Exception Notifier.