There has been a lot of buzz since RubyMotion launched last week. Some Ruby developers seem extremely excited about it, and some Objective-C developers seem to be dismissing it. Since I think both Ruby and Objective-C are good languages, I wanted to explain why I’ve decided to wait on buying RubyMotion. Disclaimer: since I don’t own RubyMotion and haven’t been able to try it, some of these points may turn out to be invalid.
RubyMotion is essentially a version of MacRuby tailored to run on iOS. It uses LLVM to compile the developer’s Ruby code ahead of time into code that runs on the device. Because MacRuby is built on the Objective-C runtime, there is no extra layer of abstraction between RubyMotion and the iOS APIs. Unlike MonoTouch, for example, you call the native iOS APIs directly, so when a new version of iOS is released, you should be able to use the new APIs from RubyMotion right away.
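To give a feel for what that direct mapping looks like (the class and string here are my own illustrative choices, but the selector-style method calls are how MacRuby-style code addresses Cocoa Touch), a minimal view controller is plain Ruby sending Objective-C messages:

```ruby
# A sketch of a RubyMotion view controller. UIKit classes and selectors
# are called directly; no wrapper layer translates the names.
class HelloViewController < UIViewController
  def viewDidLoad
    super
    view.backgroundColor = UIColor.whiteColor

    # alloc.initWithFrame is the same Objective-C initializer chain,
    # written in Ruby method-call syntax.
    label = UILabel.alloc.initWithFrame(CGRectMake(20, 60, 280, 40))
    label.text = "Hello from RubyMotion"
    view.addSubview(label)
  end
end
```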
Some people feel that MacRuby is cleaner than Objective-C. I think that’s largely a matter of personal preference, so I won’t argue either way. The big thing to keep in mind is that you must still use Cocoa Touch to write your iOS apps, which means you still have to learn Cocoa Touch to have a shot at writing a good iOS app. This isn’t going to be Cocoa Touch on Rails. And because Cocoa Touch is tailored to Objective-C, if you’re learning it for the first time from inside MacRuby, you’ll occasionally have to do extra work to make an API usable (Pointer objects, for example).
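As a concrete example of that extra work: APIs that take out-parameters like `NSError **` have no direct Ruby equivalent, so MacRuby (and thus RubyMotion) makes you allocate a `Pointer` yourself. A sketch of the pattern, with a made-up file path:

```ruby
# Reading a file with an NSError out-parameter. In Objective-C this
# would be &error; here you create a Pointer and index into it.
error_ptr = Pointer.new(:object)
data = NSData.alloc.initWithContentsOfFile("/tmp/example.txt",
                                           options: NSDataReadingUncached,
                                           error: error_ptr)
if data.nil?
  # The error object ends up at index 0 of the pointer.
  NSLog("Read failed: %@", error_ptr[0].localizedDescription)
end
```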
The workflow is perhaps the most appealing part of RubyMotion to me. While I think Xcode is an okay IDE, I sometimes would rather do my work from a text editor. Again, this is a matter of personal preference; I don’t think an IDE or a non-IDE is better in general. The big gain in the workflow seems to be the additions to the iPhone Simulator. RubyMotion runs a REPL alongside the simulator, and Command-clicking any view inside the simulator makes that view “self” inside the REPL. That would be really handy for debugging. I should probably do a better job of learning LLDB, but this still looks a lot simpler than debugging currently is.
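To make that concrete (I haven’t run this myself, so treat it as a sketch of the interaction rather than a verified session): after Command-clicking a label in the simulator, that label becomes `self` in the REPL, and you can restyle it live without recompiling:

```ruby
# Typed into the RubyMotion REPL after Command-clicking a UILabel
# in the simulator; each line updates the running app immediately.
self.text = "Changed live"
self.backgroundColor = UIColor.yellowColor
self.frame = CGRectMake(20, 100, 280, 40)
```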
However, gaining this kind of workflow does come with trade-offs. There appears to be no debugger for RubyMotion. The REPL can help you track down a lot of UI problems, but breakpoints are still extremely useful, and the REPL and simulator features won’t help you debug something running on another thread. I also haven’t seen anything about how well RubyMotion integrates with Instruments. You could probably attach to the running app and profile it, but I have to imagine the stack traces wouldn’t be nearly as useful as they would be with Objective-C.
To me this is the real killer for RubyMotion. The main draw of using Ruby on iOS, for me, would be access to all the great open source libraries Ruby has to offer. But RubyMotion cannot use gems unless they are designed to support it; since RubyMotion statically compiles your code and ships only a subset of the Ruby standard library, gems that depend on the full standard library or on C extensions won’t work as-is. While many gems will probably add support eventually, right now most gems can’t be used.
I wish there were a cheaper way to try RubyMotion than buying it outright. For now I’m going to wait. The gains don’t yet seem to outweigh the trade-offs, but it definitely seems like a project worth keeping an eye on.