28
Sep 08

YourKit profiler

Profiling is probably one of the most joyous activities for me these days. Besides the fact that it feels good that you've actually gotten to the point of having something to profile, you now get to see how the software you thought was so well designed actually performs. You basically get to look into the guts of your software, forget all the abstractions that were provided to you and all the ones you've layered on top, and see how things collaborate to execute the instructions. The best part: you actually get to see what goes on in all the libraries that you thought were so great (or maybe not so great after your look under the hood). See this JIRA bug, which was a result of profiling.

Well, I've tried a few profilers for Java, and one that really stands out is YourKit. Although most profilers have pretty much the same capabilities, one thing that separates them is how intuitive and well polished the interface is. YourKit stands out in this category.

This weekend, I had to profile a few modules that weren't performing well. Although I have a free license of YourKit that we use on the Alex Build project (more on that later), I decided to download the version 8 EAP trial to use for a project not related to Alex Build. I'll most likely purchase a personal copy of YourKit once this trial runs out, as there are many projects I'd like to use it for, and at the very reasonable price of $499 you can't go wrong.

So I was profiling a TCP/IP application built on Apache MINA, with a thread model defined using Java SE 5 executors. Outside of MINA, the application was scaling far beyond the throughput it achieved with MINA. MINA, being a very lightweight layer over Java's NIO, should not bog down performance and scalability this much. I saw an exponential performance decrease as throughput requirements increased. So I fired up YourKit and started pushing data into the application. At some point, I took a snapshot and then dove in.

Once you have a snapshot, you pretty much have all the details about the application runtime at the particular point in time you took it. I noticed that although the thread model was configured with a fixed pool of 20 threads, only one worker thread was being used to process the payloads. This is very easy to see using the "Call stack (view by thread)" interface. It revealed a bigger problem with MINA 1.3, which basically serialized operations even though it claimed to support a thread model, whatever that means :-). MINA 2.0 changed this to support truly concurrent IoHandlers, so I migrated to the new API. The next profile showed that everything worked as expected and the thread pool was fully utilized.
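As a side note, the fixed-pool behavior described above can be sketched with plain java.util.concurrent executors. This is a standalone illustration, not the actual MINA wiring; PayloadPool and its trivial doubling "payload" are made up for the example:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PayloadPool {

    // Stand-in for real payload processing.
    private static int process(int payload) {
        return payload * 2;
    }

    // Submit n payloads to a fixed pool of 20 workers (mirroring the
    // thread model above) and sum the results. With a healthy pool the
    // tasks are spread concurrently across all 20 worker threads.
    static int processAll(int n) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(20);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int i = 0; i < n; i++) {
                final int payload = i;
                futures.add(pool.submit(() -> process(payload)));
            }
            int sum = 0;
            for (Future<Integer> f : futures) {
                sum += f.get();
            }
            return sum;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(processAll(100)); // sum of 0..99, doubled: 9900
    }
}
```

Executors.newFixedThreadPool(20) matches the configuration described above: at most 20 workers pulling tasks off a shared queue, which is exactly the utilization pattern YourKit's thread view makes visible.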

[YourKit screenshot: call stack, view by thread]

The call stack also revealed that 99% of the time was spent in IO, which is not something we can tune any further at this point, so we're probably getting the best throughput we're going to get.

[YourKit screenshot: call stack showing time spent in IO]

NetBeans has a very decent profiler as well, but I use IntelliJ and don't plan on switching to NetBeans if I don't have to, even if it's just for profiling. YourKit provides plugins for most IDEs, including IntelliJ and Eclipse.

One other thing I wanted to point out: unlike other commercial profilers, YourKit offers free open source licenses to qualified projects. They were very quick and courteous in the process. Thanks, YourKit; you make the profiling process very enjoyable.


16
Sep 08

New Java Build System

So I was intrigued by Joe Ottinger’s call for requirements for a new Java build system. I’ve had these thoughts for years now, especially after gruesome experiences with ant and maven.

I must admit, I like maven2. But I’ve experienced a rather steep learning curve using it. I now use it on all my projects. It definitely makes life easier, especially getting started. But I still find it rather difficult at times to find plugins or alternative ways of accomplishing something outside of maven’s conventions. Mostly I’ve adapted to maven’s conventions, but there are times when we just can’t escape the rest of the world. I was using GigaSpaces for a project, and because of its OSGi-based build/deployment system, found it rather hard to build and deploy the necessary artifacts from the standard maven build structure. After days of battling the obvious, I settled on a hybrid (maven/ant) approach: I build everything in maven, then package jars into GigaSpaces deployable units using ant. I also use ant for the rather custom deployment of processing units (PUs).

How much easier would that have been had maven offered straightforward scripting plugins: just plug a script into any lifecycle event as AOP advice, with before/after lifecycle pointcuts.

Everything in maven requires a module. It’s pretty simple to incorporate and build a module, but you then have to maintain it outside of your build file, and an average Joe (not Ottinger of course, he’s no average Joe :-) will most likely give up or find some rather dense and superfluous way of accomplishing a simple task.

I think an answer to this is a simple, extensible, DSL-based build system that would incorporate the best of both worlds: convention and flexibility. We don’t have to sacrifice either one to accomplish what we want. Maven gives us convention, ant gives us great flexibility; Alex will give us both.

If you’re interested, the project is just starting up. You can get more details here…

http://code.google.com/p/alexbuild/


20
Jan 07

Generic polymorphism using wildcards

This morning I started thinking about advanced usage of generics to implement polymorphic types. The idea came from thoughts of doing dynamic runtime binding to a generic collection of types that are bound from some XML representation and not known until runtime. I found a few pieces of information on the web, but they did not provide very clear examples, so I thought I’d put together a small use case. I’m currently very busy with something I must finish this weekend, so I won’t provide a very thorough explanation here. I’m posting code with comments and will provide a thorough examination of how this is implemented sometime next week. This code also doesn’t demonstrate dynamic binding, which I’ll implement with JiBX.

The domain model is a rudimentary encapsulation of the ODM data model, but is only used for demonstration purposes.

Well here it goes…
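The code listing itself appears to have gone missing here, so the following is only a minimal sketch in the same spirit: generic polymorphism over an upper-bounded wildcard, with ClinicalData/Subject/StudyEvent as hypothetical stand-ins rather than the actual ODM model:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-ins for an ODM-like domain model.
abstract class ClinicalData {
    abstract String label();
}

class Subject extends ClinicalData {
    String label() { return "Subject"; }
}

class StudyEvent extends ClinicalData {
    String label() { return "StudyEvent"; }
}

public class WildcardDemo {

    // The upper-bounded wildcard accepts List<Subject>, List<StudyEvent>,
    // or List<ClinicalData>, so one method can process collections whose
    // exact element type is only decided at runtime.
    static List<String> labels(List<? extends ClinicalData> items) {
        List<String> out = new ArrayList<>();
        for (ClinicalData item : items) {
            out.add(item.label());
        }
        return out;
    }

    public static void main(String[] args) {
        List<Subject> subjects = Arrays.asList(new Subject());
        List<ClinicalData> mixed = Arrays.asList(new Subject(), new StudyEvent());
        System.out.println(labels(subjects)); // [Subject]
        System.out.println(labels(mixed));    // [Subject, StudyEvent]
    }
}
```

The same labels method handles both the homogeneous List&lt;Subject&gt; and the mixed List&lt;ClinicalData&gt;, which is the crux of the wildcard.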


13
Dec 06

Javapolis XQOM Quickie

Frank Cohen is presenting XQOM in a Quickie session at Javapolis today. He will post the presentation slides a bit later in the day. He’s also going to send over any questions that he receives, so that I can address them here…

Update:

The slides are now available here…
http://downloads.pushtotest.com/200612/XQOM_Quickie_JavaPolis06_final.pdf


10
Dec 06

Great APIs can somewhat make up for bad language design

We all have our preferences about which language(s) we use. Some languages are cleaner than others, some are more verbose, and some are more expressive. Either way, languages are personal, and we get used to various idioms and patterns. With enough use and experience, we can become proficient in any language and/or API. Some languages make the learning curve a lot steeper and, even once mastered, can still make it quite difficult to accomplish your objective.

JavaScript, I think, is one of those languages. Unfortunately we’re stuck (at least for the time being) using this immaturely designed and in many ways incomplete language. It tries to resemble some OO features, but is inherently a very procedural language. Though I’m not a big fan of procedural programming, I’ve used languages like Perl and C for many years with great success. Perl did a better job of resembling OO capabilities, though it still fell short in many ways. C, though it has no OO facilities, did in time develop patterns for accomplishing some OO idioms in various ways. JavaScript, on the other hand, does a poor job of being procedural as well. Procedural languages are all about data and operations on that data; with that in mind, they should contain features and idioms that make such operations easier. Though JavaScript provides an OK API for binding to the DOM and other browser facilities, its inability to provide a flexible, extensible programming environment for general-purpose programming is affecting the quality of Web 2.0 software development.

Recently I watched a presentation by Joshua Bloch, "How to design a good API and why it matters". Though the content of the presentation was pretty simple, it actually opened my eyes to what such simple conventions as good naming, a focus on usability, and not overwhelming the API with an abundance of patterns can do to make an API easier and more enjoyable to use.

Although I learned a great deal from the presentation and vowed to use some of these conventions in my future work, the importance didn’t really strike me until this week. We’re currently redesigning our interface in a more Web 2.0 way, utilizing a lot of Ajax with asynchronous REST calls, event queuing, etc. This of course requires writing a great deal of JavaScript, employing the available features to implement the algorithms and patterns that support the interface. In keeping with the tradition of not reinventing the wheel, I started looking for a framework to support the features we need, on top of which we could provide an abstraction layer for our custom API needs. Though I found quite a few libraries out there, many of which exploited JavaScript to its fullest abilities (or I’d rather call them limitations), none struck me like the mootools framework. Mootools is a compact, modular JavaScript framework that allows you to pick and choose the features you would like to use. Besides the modularity of the features, at its core is a framework that the rest of the mootools APIs are built upon. The core mootools framework is an API that allows you to work with JavaScript in a more OO-friendly way. It provides better extensibility, even for those who are not familiar with JavaScript’s horrific OO idioms.

My initial abstraction was the Ajax API. Besides the fact that I had to build something on top of the mootools Ajax framework, the API in its current form did not provide the capabilities needed to process REST-based requests. The onFailure property wasn’t available, so we couldn’t process any non-200 HTTP responses. In a RESTful world, we use the non-200 response codes for various server-side exceptions, also appending a custom message header that is used to display an error message. So the first job was to extend the current Ajax class. This is very easily accomplished in mootools using the Class API. The Class has an ‘implement’ function, which basically allows you to redefine any class properties/functions. Here is a small example of redefining the Ajax ‘request’ function.

Ajax.implement({
  request: function(){
      this.transport.open(this.options.method, this.url, this.options.async);
      this.transport.onreadystatechange = this.onStateChange.bind(this);
      if (this.options.method == 'post'){
        // Honor a caller-supplied contentType, falling back to the form default.
        if (this.options.contentType == null || this.options.contentType == '') {
          this.transport.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
        }
        else {
          this.transport.setRequestHeader('Content-type', this.options.contentType);
        }
        if (this.transport.overrideMimeType) this.transport.setRequestHeader('Connection', 'close');
      }
      switch($type(this.options.postBody)){
          case 'element': this.options.postBody = $(this.options.postBody).toQueryString(); break;
          case 'object': this.options.postBody = Object.toQueryString(this.options.postBody);
      }
      if ($type(this.options.postBody) == 'string') this.transport.send(this.options.postBody);
      else this.transport.send(null);
      return this;
  }
});

In the above redefinition, I add the ability to specify a contentType attribute, which is then used to set the content type of the request. The default mootools Ajax implementation uses the application/x-www-form-urlencoded content type for every Ajax request. This is not sufficient for a RESTful API, as the content type is part of the request definition and can be used to identify and process requests on the server side based on the type of data being submitted.

You can also add properties and functions in the same way to any class that is built with mootools’ core Class functionality.

The next step was to create an abstraction class to better define the API needed for accessing our REST services. I decided to use mootools to define such a class, while also giving it the flexibility to be extended later as needed. There are other benefits to using the mootools Class API: it allows for standardized object initialization, object property definition, and an abundance of other features. Though the main attraction was the beauty of its API.

Here is a small snippet of the new class definition…

var RestRequest = new Class({

    setOptions: function(options){
        this.options = {
            uriContext: 'rest',
            contentType: 'json',
            onComplete: Class.empty,
            onFailure: Class.empty,
            clazz: null,
            method: null,
            params: null,
            content: null,
            url: null
        };
        Object.extend(this.options, options || {});
    },

    initialize: function(options) {
        this.setOptions(options);
        if (this.options.contentType == 'json') {
            this.options.contentType = 'application/javascript';
        }
        else if (this.options.contentType == 'xml') {
            this.options.contentType = 'application/xml';
        }
        this.options.url =
            this.constructUri(this.options.clazz, this.options.method, this.options.params);
    },

    postRequest: function(content) {
        if ($type(content) == 'string') {
            this.options.content = content;
        }
        var myAjax = new Ajax(this.options.url, {
            method: 'post',
            contentType: this.options.contentType,
            postBody: this.options.content,
            onComplete: this.options.onComplete,
            onFailure: this.options.onFailure
        });
        myAjax.request();
    }
});

Two common functions used above, setOptions and initialize, are used by the Class framework to allow for construction and initialization of the object you define. You define the setOptions function to specify the options/attributes (object state) and override any options defined in the super class. (Note: there is no true concept of inheritance in JavaScript; rather, objects are redefined at runtime with new data/functions.)

initialize is then used as the object constructor. It is called by the Class framework to instantiate the object, and it in turn calls setOptions to set the option defaults and override them with any options provided to the constructor.

The custom method postRequest is then defined, which basically abstracts the Ajax request for a particular REST-based API.

You can then use the above API to make REST requests as follows…

var restRequest = new RestRequest({
    clazz: 'SomeClass',
    method: 'some_method',
    contentType: 'xml',
    params: {
        id: '1234',
        some_param: 'some_param_data'
    },
    onComplete: function(responseText) { alert(responseText); },
    onFailure: function(response) { alert(response.status + "\n" + response.statusText + "\n" + response.message); }
});
restRequest.postRequest('<xml><message>hello</message></xml>');

I hope the above shows how great API design can make working with a language we dislike a lot more enjoyable. I really advise you to view Joshua’s presentation. It can be found here (http://www.infoq.com/presentations/effective-api-design).

