Saturday, 14 November 2009

Workaround for an issue with

I've developed a widget to retrieve and update a rating in an application that resides on another domain. Due to this cross-domain requirement, I had to resort to the iframe hack, which the Dojo Toolkit implements in the module.
It works perfectly in Firefox, but I ran into a problem with IE 7 and Safari 4: the update succeeded only the first time the widget was used; after that, no further requests were sent to the target application.

In the end I understood that I need to create the communication iframe anew for every request; this is my workaround:

if ((dojo.isIE || dojo.isSafari) &&["_frame"]) {
  var frameName =;["_frame"] = window[frameName] = null;
  if (window.frames) {
    window.frames[frameName] = null;
  }
}

Check my post to the dojo-interest group for any updates on this topic.


Posted by Nicola Piccinini at 5:52 PM CET in devel/

Wednesday, 3 June 2009

Html view plugin for dijit.Editor

I searched a bit for a dijit.Editor plugin able to provide an HTML view of the content, but I didn't find anything, so I thought it was a good occasion to try building a plugin myself.

Here's the code; you'll probably want to change the module name if you try it:



dojo.provide('pc.widget.editor_html_view');

dojo.require('dijit._editor._Plugin');
dojo.require('dijit.Dialog');
dojo.require('dojo.string');
dojo.require('dojo.i18n');
dojo.requireLocalization('pc.widget', 'editor');

dojo.declare('pc.widget.editor_html_view', dijit._editor._Plugin, {
    // Override _Plugin.useDefaultCommand... processing is handled by this plugin, not by dijit.Editor.
    useDefaultCommand: false,

    _initButton: function() {
      // Override _Plugin._initButton() to set up a listener on button click.
      this.command = "htmlView";
      this.editor.commands[this.command] = "HTML";
      this.inherited("_initButton", arguments);
      delete this.command;

      this.connect(this.button, "onClick", this._htmlView);
    },

    _htmlView: function() {
      this._initDialog();
      this.textarea.value = this.editor.getValue();;
    },

    _initDialog: function () {
      if (!this.dialog) {
        this.dialog = new dijit.Dialog({}, dojo.doc.createElement('div'));
        var messages = dojo.i18n.getLocalization('pc.widget', 'editor', this.lang);
        //var buttonsHtml = '<button>Set</button><button>Set and continue</button>';
        var buttonsHtml = dojo.string.substitute('<button>${0}</button><button>${1}</button>', [
          messages['Set'], messages['Set_and_continue']
        ]);
        this.dialog.attr('content', '<div class="html-view"><div><textarea></textarea></div><div class="buttons">' + buttonsHtml + '</div></div>');

        this.textarea = dojo.query("textarea", this.dialog.domNode).pop();
        var buttons = dojo.query("button", this.dialog.domNode);
        this.setBtn = buttons.pop();
        this.setAndHideBtn = buttons.pop();
        this.connect(this.setAndHideBtn, 'onclick', this._replaceValueAndHide);
        this.connect(this.setBtn, 'onclick', this._replaceValue);
      }
    },

    _replaceValue: function () {
      // Copy the HTML from the textarea back into the editor.
      this.editor.setValue(this.textarea.value);
    },

    _replaceValueAndHide: function () {
      this._replaceValue();
      this.dialog.hide();
    }
});

// Register this plugin.
dojo.subscribe(dijit._scopeName + ".Editor.getPlugin", null, function(o) {
  if (o.plugin) { return; }
  switch ( {
    case "htmlView":
      o.plugin = new pc.widget.editor_html_view({command:});
  }
});

Here you can see it in action:
and here you can list and download all files:

Posted by Nicola Piccinini at 5:47 PM CEST in devel/

Friday, 5 December 2008

Posting a file with ActiveResource

Recently I had to PUT or POST files to a web service using ActiveResource. I thought it was a common task but, after searching a bit, I couldn't find any documentation on how to do it, so I ended up implementing my own dirty solution.

Luckily, Rails already knows how to parse a file parameter, so I only had to produce XML conforming to what that parameter parser expects. If you, like me, are receiving the file from a multipart/form-data form in a Rails application, then you have an UploadedStringIO instance to play with:

module ActionController
  class UploadedStringIO
    def to_xml(options = {})
      options[:indent] ||= 2
      xml = options[:builder] ||= => options[:indent])
      xml.instruct! unless options[:skip_instruct]
      dasherize = !options.has_key?(:dasherize) || options[:dasherize]
      _root = options[:root] || 'file'
      root_tag = dasherize ? _root.to_s.dasherize : _root.to_s
      xml.tag!(root_tag, ActiveSupport::Base64.encode64(string), :type => 'file', :name => original_path,
        (dasherize ? 'content-type' : 'content_type') => content_type)
    end
  end
end
With that, if you have a Tournament model with a flier attribute and you assign a text file to it (file name: flier.txt, file content: abc), then ActiveResource will POST/PUT something like:

<?xml version="1.0" encoding="UTF-8"?>
  <flier type="file" name="flier.txt" content-type="text/plain">YWJj
  </flier>

that is what Rails wants.


Posted by Nicola Piccinini at 6:03 PM CET in devel/

Sunday, 30 November 2008

Relay mail via Google SMTP or AuthSMTP with Postfix

A well-known problem with EC2 instances is sending mail reliably: the dynamic nature of IP addresses in the cloud makes them suspect to most spam countermeasures. The common workaround is to have an external SMTP server through which every email message is relayed.

A natural choice is to use Google Apps for your domain and exploit their SMTP service, which is free up to 2000 messages per day and comes with all the Gmail goodies.
Jules Szemere has a good post (1) about how to do that with Postfix (note for Ubuntu users: the script is in /usr/lib/ssl/misc).

Unfortunately, in my experience (and not only mine), even messages from Google's servers are sometimes flagged as spam. Perhaps switching to the premium edition would solve that; anyway, I followed the recommendation (2) of Paul Dowman (author of EC2 on Rails) and I'm using AuthSMTP. The minimum fee is less than the Google Apps premium edition (though the cost per message is not) and it's actually reliable.

What I miss most with AuthSMTP is a copy of each message in the sent-mail folder; I could always add a BCC field, but that wastes the service quota.
The ideal solution is to use Google for safe addresses that are unlikely to drop the message (especially those in BCC) and AuthSMTP otherwise. In practice, we have to combine configurations (1) and (2) in a smart way. So, thanks to the suggestions of my trusty system administrator, in /etc/postfix/

transport_maps = hash:/etc/postfix/transport

# auth
smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd

# tls
smtp_tls_policy_maps = hash:/etc/postfix/tls_policy
smtp_sasl_security_options = noanonymous
smtp_sasl_tls_security_options = noanonymous
smtp_tls_note_starttls_offer = yes
tls_random_source = dev:/dev/urandom
smtp_tls_scert_verifydepth = 5
smtpd_tls_ask_ccert = yes
smtpd_tls_req_ccert = no
smtp_tls_enforce_peername = no

Two maps need to be specified. transport_maps defines which relay server to use depending on the destination address:

  :[]:587
  :[]:587
*           :[]

while smtp_tls_policy_maps defines the policy to use with the relaying server:

[]             none
[]:587            may

Finally, in smtp_sasl_password_maps we specify both the Google and AuthSMTP credentials: username:password
Posted by Nicola Piccinini at 5:19 PM CET in devel/

Wednesday, 12 March 2008

mongrel_rails service and $stdout problem on Windows server


First, if you have the choice, ditch Windows; it's not worth it. However, if, like me, you are forced to use Windows for a project, then your solution is a simple derivative of what you will find here.

I'm forced to work with a Windows server too, and I was experiencing the same issue. Charlie put me on the right track, and I solved it by adding the following at the beginning of #{RAILS_ROOT}/config/boot.rb:

  $stdout.write("checking $stdout -> ")
  $stdout.flush # really try to write!

Since $stdout and $stderr are redirected only when IO#write raises an exception, the workaround takes effect only when Mongrel runs as a service: you can still launch it from a terminal or use a console without any problem.

Posted by Nicola Piccinini at 12:57 AM CET in devel/

Wednesday, 22 August 2007

dojo 0.4.x to 0.9 porting

I've just ported a bunch of code from dojo 0.4.x to 0.9 and I'd like to share some brief considerations, in the hope that they may be useful to somebody:

  • most of the job consists of searching for and replacing some common patterns, for example: widget declaration and creation, the second argument of the topic publishing method (which is now an array), etc. (see the porting guide). It's boring but easy;
  • exotic usage of the DOM node used in widget creation can cause problems, because dojo 0.9 seems to substitute it with a new source DOM node. In such cases a slight refactoring is usually needed;
  • what I initially missed most was debugAtAllCosts and dojo.hostenv.writeIncludes(), but luckily the same problem (debugging evaled code) is solved in a better way by Fireclipse;
  • some features are missing in dojo 0.9, but usually there is another way to achieve the same effect, or one can copy and adapt the right functions from 0.4.x.

After (to be honest) a lot of work, I've completely refactored my application, which consists of approximately 15 widgets and 25 other JavaScript source files (divided into two modules).

Unfortunately, I can't yet precisely evaluate the performance gain, because I'm having problems with the build system and the flattening of localization bundles and I'm not able to build a working release (I believe it's a bug in the build system, but I can't prove it ;-). Anyway:

  • the (gzipped) size of the release is about 25% smaller than with dojo 0.4,
  • rendering seems a lot faster in 0.9 than in 0.4 (no numbers, just feelings).

In conclusion, it seems that porting to dojo 0.9 has improved my application's performance, but the gain isn't that impressive. In my opinion it's better to follow the evolution of the library promptly, so I believe the upgrade is worth it anyway.


Posted by Nicola Piccinini at 1:02 AM CEST in devel/

Saturday, 23 June 2007

Connect to Microsoft SQL Server from Debian lenny GNU/Linux

Life is unjust and one can't always use open source technologies ;-) .

There are many guides about how to do this, but they usually require compiling the drivers from source. Since I'm using Debian lenny, I'd prefer a solution compatible with its package system; Debian has more than 18000 packages, so the ones I need should be there! And indeed they are. The first step is to install these packages:

# apt-get install tdsodbc libiodbc2 unixodbc odbcinst1debian1
  • tdsodbc is the package with the precompiled FreeTDS drivers,
  • odbcinst1debian1 contains the utility we'll use to configure the drivers.

The second step is to inform the system that we now have the FreeTDS drivers:

# odbcinst -i -d -f /usr/share/tdsodbc/odbcinst.ini

This adds in /etc/odbcinst.ini the following section:

[FreeTDS]
Description     = TDS driver (Sybase/MS SQL)
Driver          = /usr/lib/odbc/
Setup           = /usr/lib/odbc/
CPTimeout       =
CPReuse         =
UsageCount      = 1

Third step is to configure your ODBC Data Source Names:

$ odbcinst -i -s -r
[YourDsn]
Description     = Your description
Driver          = FreeTDS
Database        = CatalogName
Server          =
Port            = 1433
odbcinst: Sections and Entries from stdin have been added to ODBC.INI

Note that if you execute the last command as a normal user, the section is added to ~/.odbc.ini instead.

That's all, try it:

$ isql YourDsn UID PWD
+---------------------------------------+
| Connected!                            |
|                                       |
| sql-statement                         |
| help [tablename]                      |
| quit                                  |
|                                       |
+---------------------------------------+
SQL> select * from table_name ;
SQLRowCount returns 4
4 rows fetched
Now, if you want to use the drivers with Ruby and Rails, there are some additional steps.
First, install the libdbd-odbc-ruby package too (your system may need more packages ...), then in database.yml:
  adapter: sqlserver
  username: your_user_name
  password: your_password
  dsn: YourDsn
  mode: ODBC

Everything works fine except for an annoying problem with table creation: most columns get an unwanted NOT NULL constraint, and that causes a lot of trouble. If somebody could suggest a remedy, it would be much appreciated :-) .
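For what it's worth, the sqlserver adapter in ODBC mode ultimately talks to the DSN through Ruby DBI (that's what libdbd-odbc-ruby provides), where the connection URL has the form dbi:ODBC:&lt;dsn&gt;. A tiny sketch (the helper name is mine; DSN and credentials are the placeholders from above):

```ruby
# Build a Ruby DBI connection URL for an ODBC data source name.
# (Helper name is illustrative.)
def dbi_odbc_url(dsn)
  "dbi:ODBC:#{dsn}"
end

# Roughly what happens under the hood:
#   require 'dbi'
#   dbh = DBI.connect(dbi_odbc_url('YourDsn'), 'your_user_name', 'your_password')
#   rows = dbh.select_all('SELECT * FROM table_name')
```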

Posted by Nicola Piccinini at 11:21 PM CEST in devel/

Friday, 8 December 2006

Selenium on Rails and Edge, again

It seems that the patch wasn't complete; in fact there was also an issue with the test:acceptance task. A description of the problem and its solution can be found here:

Moreover, if you are wondering why you can't find any result file in selenium-on-rails/log after having run the task (as I was), the reason is that they are removed unless the tests take more time than a configurable limit (see selenium-on-rails/lib/selenium-on-rails/acceptance_test_runner.rb).

Did you save another 5 minutes? ;-)

2006-Dec-09 CET: modified to correct my wrong assertion about why log files aren't preserved after task execution. By the way, I'd prefer it if these files weren't removed when there are failures, as I initially thought they weren't.


Posted by Nicola Piccinini at 2:30 AM CET in devel/

Monday, 13 November 2006

Selenium on Rails patch for Edge

The problem with Selenium on Rails and Edge is described here:

You can download the patch here:

Nothing special, but maybe it could save someone 5 minutes.


Posted by Nicola Piccinini at 1:31 AM CET in devel/

Tuesday, 7 November 2006

FCKeditor in Streamlined

In this post to the Streamlined group:
my recipe for integrating FCKeditor into Streamlined.

It uses the Ruby FCKeditor Plugin for Rails, maintained by Scott Rutherford.

Posted by Nicola Piccinini at 1:21 AM CET in devel/

Saturday, 23 September 2006

Not using a C compiler makes me happy

I was looking at geospatial matters again when I had to reinstall PostGIS. Unlike the previous time, life was easy:

stakhanov:~# apt-get install postgresql-8.1-postgis
Reading package lists... Done
Building dependency tree... Done
The following extra packages will be installed:
  libgeos-c1 libgeos2c2a postgis proj



Posted by Nicola Piccinini at 6:03 PM CEST in devel/

Sunday, 25 June 2006

Strange interaction between Mozilla and WEBrick

If you are wondering why WEBrick responds to your XMLHttpRequest from Mozilla 1.7 with:
ERROR bad Request-Line `'
then read this message by Prototype's author Sam Stephenson:

Not surprisingly, Prototype includes the workaround while, unfortunately (for me), dojo doesn't.
I post this in the hope that it takes you less time than it took me to discover it.

Posted by Nicola Piccinini at 5:38 PM CEST in devel/

Wednesday, 21 June 2006

How to register a new MIME type in Rails

It's almost summer, and playing volleyball under the sun is a lot better than writing about technical stuff. Anyway, I have a blog and I must feed it, so I'll try to add a post every one or two months. Hence, this one is for June and July :-) .

The REST web-service support in Rails 1.1 is nice and powerful, but how do you add handling for a new MIME type, specifically text/json [1]? I haven't found any documentation about this, so I dug into the source code looking for a solution. I ended up with the following (after more than a few troubles):

require 'json'

# register a new Mime::Type
Mime::JSON ='text/json', :json)
Mime::LOOKUP[Mime::JSON.to_str] = Mime::JSON

# its default handler in the responder
class ActionController::MimeResponds::Responder
  DEFAULT_BLOCKS[:json] = %q{ do
      render(:action => "#{action_name}.rjson", :content_type => Mime::JSON, :layout => false)
    end
  }

  for mime_type in %w( json )
    eval <<-EOT
      def #{mime_type}(&block)
        custom(Mime::#{mime_type.upcase}, &block)
      end
    EOT
  end
end

# its param parser
ActionController::Base.param_parsers[Mime::JSON] = do |data|
  { :resource => JSON.parse(data) }
end

The require 'json' is for the json library by Florian Frank.

Note that inserting the new Mime::Type into the LOOKUP Hash is fundamental, because otherwise:

  1. one can't register the param parser using the constant but has to write ActionController::Base.param_parsers[Mime::Type.lookup('text/json')] = ... ;
  2. far more subtly, registering the param parser with this last instruction creates a completely new Mime::Type object o, and o.hash == Mime::JSON.hash is false; consequently, the responder isn't able to find a match!
    In fact, the class Mime::Type doesn't redefine the Object#hash method accordingly, though it modifies Object#eql? to make o.eql?(Mime::JSON) true. I don't think this is appropriate: [...] must have the property that a.eql?(b) implies a.hash == b.hash [...] (from the Ruby core RDoc documentation)

Now in controllers you can write something like:

  respond_to do |wants|
    wants.json
  end

Moreover, if your request has the right content type, you'll find in params[:resource] a Ruby Hash corresponding to your JSON object.

So far, so good but I wonder if there isn't a cleaner way to do this ...

[1] actually, the official MIME type for JSON is application/json, but dojo recognizes text/json. Anyway, my troubles with dojo are material for another post that maybe I'll write in August ;-) .

Posted by Nicola Piccinini at 8:32 AM CEST in devel/

Thursday, 4 May 2006

eclipse 3.2

After some plugins messed up eclipse 3.1.2, I installed a release candidate of version 3.2, intrigued by the Callisto project. It promises to eliminate uncertainty about project version numbers; it is aimed at product producers, but I believe I'll benefit from this stability too.

The first impression is that it's faster and a lot more stable than the latest official release :-D .


Posted by Nicola Piccinini at 11:16 PM CEST in devel/

Tuesday, 2 May 2006

Tips from my rails installation

After my useless lucubrations and subsequent emendation, it's time to write something that may be helpful. Unfortunately, I first have to introduce the subject, and hence I need to lucubrate a little more.

I developed my Ruby on Rails project on stakhanov (system configuration) and deployed it on stratagemma (system configuration). Debian and Ubuntu have a nice packaging tool that makes installing Rails painless. Nonetheless, I chose to get it using RubyGems, because it's just as easy and, this way, I have more control:

  • I can keep the development and production environments as similar as possible, which avoids unexpected and annoying problems. In my case, luckily, I could have installed the same Rails version (1.0.0) on both stakhanov and stratagemma, but who assures me that this will always be possible?
    The same reasoning applies to the development team: it is preferable to have an environment as uniform as possible, so one ignores the various exotic packaging systems and goes directly to the source.
  • I can upgrade Rails without waiting for the distribution to provide a suitable package. Considering the recent release of Rails 1.1 (while Debian etch and Ubuntu breezy are stuck at 1.0), that's not so unlikely.

Anyway, apt is still fine and convenient, and I used it to install ruby1.8, the DBMS, apache2 and every other necessary tool on stratagemma. The problem is how to integrate all these packages with what comes from RubyGems, particularly when, to avoid incidental conflicts with the distribution's packages, you choose to install it and its gems outside the standard system directories. In that case various executables and libraries can't be found without setting specific environment variables:

# ruby
export RUBY_HOME=/path/ruby 
export RUBYLIB=$RUBY_HOME/local/lib/site_ruby/1.8

# ruby gem
export GEM_HOME="$RUBY_HOME/rubygems"
export RUBYOPT=rubygems

export PATH=$RUBY_HOME/bin:$GEM_HOME/bin/:$PATH

This solves the problem on the command line but, to make your Rails application work, you have to pass the same values to the web server. For apache2:

  DocumentRoot /path/to/the/public/directory/in/the/rails/app

  SetEnv RUBY_HOME /path/ruby
  SetEnv RUBYLIB /path/ruby/local/lib/site_ruby/1.8
  SetEnv GEM_HOME /path/ruby/rubygems
  SetEnv RUBYOPT rubygems

That's fine for the CGI dispatcher but doesn't suffice for the FastCGI one; you have to add the corresponding directives. For libapache2-mod-fcgid (the Ubuntu package for mod_fcgid):

  DefaultInitEnv RUBY_HOME /path/ruby
  DefaultInitEnv RUBYLIB /path/ruby/local/lib/site_ruby/1.8
  DefaultInitEnv GEM_HOME /path/ruby/rubygems
  DefaultInitEnv RUBYOPT rubygems

These are my tips :-) . On the web there are numerous tutorials with detailed descriptions of each step of a Rails installation; for a configuration similar to mine, see for example Ruby, Rails, Apache2, and Ubuntu Breezy (5.10) on

Posted by Nicola Piccinini at 11:42 PM CEST in devel/

Wednesday, 7 December 2005

A contribution to JSON-RPC-Java

It seems that I'm regarded as a contributor to JSON-RPC-Java, thanks to my work on handling cases where a method with a generic signature (such as java.lang.Object) is overloaded by one with a more specific signature. I'm truly honored :-) !

Posted by Nicola Piccinini at 6:51 PM CET in devel/

Tuesday, 22 November 2005

Apache XML-RPC over HTTPS

Some scattered notes about using Apache XML-RPC over HTTPS.

Example code:

import java.net.MalformedURLException;
import java.net.URL;
import java.util.Vector;

import org.apache.xmlrpc.CommonsXmlRpcTransport;
import org.apache.xmlrpc.XmlRpcRequest;
import org.apache.xmlrpc.secure.SecureXmlRpcClient;
import org.apache.xmlrpc.secure.SecurityTool;

public class XMLRPCClientSample {

    public static void main(String[] args) {
        try {
            String serverLocation = "";

/*m1*/      System.setProperty(
                "", "/keystore.path/");
/*m2*/      System.setProperty(
                "", "keystore.password");

/*m3*/      SecurityTool.setKeyStore("/keystore.path/");
/*m4*/      SecurityTool.setKeyStorePassword("keystore.password");

/*m5*/      SecureXmlRpcClient client = new SecureXmlRpcClient(serverLocation);

/*m6*/      CommonsXmlRpcTransport transport =
                new CommonsXmlRpcTransport(new URL(serverLocation));
/*m7*/      transport.setBasicAuthentication("", "user.password");

            Vector params = new Vector();

            XmlRpcRequest request = new XmlRpcRequest("method", params);

            Object o = client.execute(request, transport);
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}


If the key of the server you are connecting to is not signed by a trusted certificate authority, the certificate is not automatically trusted. In order to force the certificate to be trusted, you must:

  1. import it into a keystore file (cacerts) using the keytool program;
  2. add the instructions marked /*m1*/, /*m2*/, /*m3*/ and /*m4*/ to specify which keystore to use. Of course, rather than hard coding these properties' values in instructions /*m1*/ and /*m2*/, you could pass them as options when invoking your executable.
    If you omit instructions /*m1*/ and /*m2*/, the exception below is thrown: the trustAnchors parameter must be non-empty
    If you instead omit instructions /*m3*/ and /*m4*/, the exception below is thrown: Default SSL context init failed: null
    Anyway, this double pointer to the same information seems a little redundant to me :-( ;
  3. to download the server certificate (in order to import it into the keystore) you can use Microsoft Internet Explorer, which is good for this stuff. Presently, that isn't possible (AFAIK) with Mozilla or Firefox!
    If you don't have Microsoft Internet Explorer at hand (like me), you can use openssl:
    pic@stakhanov:~$ openssl s_client -connect -showcerts
    Thanks to Emanuele Vicentini (my favorite system administrator ;-) who suggested it;
  4. the authentication parameters could be set using the setBasicAuthentication method in org.apache.xmlrpc.XmlRpcClient, but it's deprecated. You have to use the homonymous method in CommonsXmlRpcTransport instead, as in the instruction marked /*m7*/ (see the Apache XML-RPC javadocs).
    This way one has to specify the same server location in two places, instructions /*m5*/ and /*m6*/. Again, this seems redundant and error-prone to me;
  5. the example is useful only if you can configure HTTPS globally, and that isn't always the case.
Posted by Nicola Piccinini at 2:32 AM CET in devel/