Tuesday, November 5, 2013

Automating IntelliJ IDEA Plug-in Development

In the previous post I wrote about managing the dependencies for IntelliJ IDEA plugin development and generating the project files. This was pretty simple and here's the full script:


apply plugin: "java"
apply plugin: "idea"

repositories {
    jcenter()
    maven {
        credentials {
            username 'username'
            password 'password'
        }
        url "http://my.repository.url
    }
}

dependencies {
    compile('org.zeroturnaround:jr-sdk:5.4.1') {
        transitive = false
    }
    compile 'org.ow2.asm:asm-all:4.1'
    compile 'org.slf4j:slf4j-nop:1.6.3'
    compile 'org.apache.commons:commons-compress:1.2'
}

defaultTasks 'setup'

idea.project.ipr {
    beforeMerged { project ->
        project.modulePaths.clear()
    }
}

idea.module.iml {
    withXml {
        it.node.@type = "PLUGIN_MODULE"
    }
}

task setup {
   description = "crete .idea based project structure"
   dependsOn ideaModule, ideaProject
   doLast {
     copy {
        from '.'
        into '.idea/'
        include '*.ipr'
        rename { "modules.xml" }
     }
     project.delete "${project.name}.ipr"
   }
}

task wrapper(type: Wrapper) {
    gradleVersion = '1.8'
}

The script is limited to pulling down the dependencies from a binary repository and generating the project files. However, one really big thing is still missing here: the SDK setup. Adding a new SDK for plugin development is simple enough, but I still have a good reason to automate it. My plugin depends on some SDK artifacts that are not included in the SDK classpath by default, so adding the dependencies one by one into the SDK via the UI is a bit tedious and not really encouraging for team work.

It would be cool if the script could register the IntelliJ Platform Plugin SDK and add it to the project files. It turns out that this should be doable, as the SDK definitions are stored in the options/jdk.table.xml file in IntelliJ IDEA's configuration directory.
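
Just to get a feel for that file, here's a minimal sketch (my addition, not part of the actual setup) of a Gradle task that reads jdk.table.xml and lists the SDKs already registered there. The configuration directory location and the ProjectJdkTable/<jdk> layout are assumptions about a typical IDEA 12/13 installation, so treat this as a starting point rather than a recipe:

  // Sketch only: list the SDK entries found in IDEA's jdk.table.xml.
  // Both the config directory name (.IdeaIC12) and the XML layout
  // (component "ProjectJdkTable" containing <jdk> elements) are assumptions.
  task listIdeaSdks {
      description = "Print the SDKs registered in IDEA's options/jdk.table.xml"
      doLast {
          def configDir = new File(System.getProperty('user.home'), '.IdeaIC12/config') // assumed path
          def jdkTable = new File(configDir, 'options/jdk.table.xml')
          if (!jdkTable.exists()) {
              println "No jdk.table.xml found at ${jdkTable}"
              return
          }
          def root = new XmlParser().parse(jdkTable)
          def table = root.component.find { it.@name == 'ProjectJdkTable' }
          table?.children()?.findAll { it instanceof Node }?.each { jdk ->
              def name = jdk.children().find { it instanceof Node && it.name() == 'name' }?.attribute('value')
              def type = jdk.children().find { it instanceof Node && it.name() == 'type' }?.attribute('value')
              println "SDK: ${name} (type: ${type})"
          }
      }
  }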

It might be a bit brittle to tinker with the file directly, so maybe the JPS API could be used to add a new entry for the SDK instead (just an idea).

Those would be the steps for automating the setup:

  1. Fetch the dependencies and generate the .idea directory-based project
  2. Generate a Platform SDK entry in options/jdk.table.xml and add the reference to the project's *.iml file and .idea/misc.xml file (see the sketch below)
  3. Optionally, add more dependencies to the Platform SDK definition
  4. Finally, add the build and release logic to the script
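
For step 2, here's a rough sketch of how the generated .iml could be pointed at the SDK by name, using the same withXml hook as in the script above. The orderEntry attributes (type "jdk", jdkName, jdkType "IDEA JDK") and the SDK name itself are my assumptions about the .iml format rather than anything the idea plugin documents:

  // Sketch only: reference a named IntelliJ Platform Plugin SDK from the module.
  // The attribute names and the SDK name are assumptions about the .iml format.
  idea.module.iml {
      withXml {
          def module = it.asNode()
          def rootManager = module.component.find { it.@name == 'NewModuleRootManager' }
          // replace the inherited-JDK entry (if any) with a reference to the plugin SDK
          rootManager.orderEntry.findAll { it.@type == 'inheritedJdk' }.each { rootManager.remove(it) }
          rootManager.appendNode('orderEntry',
                  [type: 'jdk', jdkName: 'IDEA Plugin SDK', jdkType: 'IDEA JDK'])
      }
  }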

Now, a little bit of dreaming :-)

In the ideal world, the steps could be described as properties in build.gradle file:

  1. idea.module.type="PLUGIN_MODULE" - resolve as plugin module
  2. idea.project.sdk=auto (and idea.module.sdk=auto) - auto-generate the SDK entry (with good defaults) and add it to the project and module
  3. The extra SDK dependencies probably need special care. I don't know what the best workaround would be, but maybe something like this:
    idea.project.sdk.dependencies {
       lib1.jar
       lib2.jar
       lib3.jar
    }
  4. The build script would basically mimic the autogenerated Ant build file, and the release step actually requires a bit more than just changing the version of the artifact: the version in META-INF/plugin.xml has to be updated as well (see the sketch below).
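
For instance, a minimal version-bump task might look like this (my addition; the plugin.xml location under src/main/resources is an assumption about the project layout):

  // Sketch only: keep the plugin descriptor's version in sync with project.version.
  // Assumes plugin.xml lives under src/main/resources/META-INF/ and declares
  // its version in a <version>...</version> element.
  task updatePluginVersion {
      description = "Sync META-INF/plugin.xml version with project.version"
      doLast {
          def pluginXml = file('src/main/resources/META-INF/plugin.xml')
          pluginXml.text = pluginXml.text.replaceFirst(
                  /<version>[^<]*<\/version>/, "<version>${project.version}</version>")
      }
  }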

Dreaming aside, I think that the steps described above could be automated with a reasonable effort.

Sunday, October 27, 2013

Managing Dependencies for IntelliJ IDEA Plug-in Development

Probably the most painful part in developing plugins for IntelliJ IDEA is the dependency management of the libraries that your plugin might depend on. There is a multitude of problems that one may encounter:

  • IDEA binaries are not hosted publicly in Maven Central, JCenter, or any other repository.
  • Plugins cannot be built with common tools like Maven or Gradle without setting your hair on fire.

When your plugin depends on just a few external libraries, and the plugin build itself doesn't require customization, then everything is simple: put the dependencies into a lib/ (or whatever) folder in the project and generate the Ant build script through the Build -> Generate Ant Build... action. Simple. Well, not really: the generated build script will require a few JARs from IDEA, referenced by pointing to the IDEA installation directory. So the simplest thing to do in this case is to extract the full IntelliJ IDEA distribution into some directory on the machine where the continuous integration server runs.

Simple, ugly, but it works. Why not make things a bit more kosher?

In my world, any developer in the team should be able to clone the project, open it in the IDE(A), and ideally, launch the project without any additional tuning.

So what could we do in the case of IntelliJ IDEA plugin projects? Store the *.iml files in VCS so that when the project is opened by another developer, they would automatically have all the dependencies attached to the module? Not kosher.

For me, clearly, the dependencies should be managed by a dependency management tool. The options: Maven, Ivy, Gradle. IDEA provides very good Maven support, but not for its own plugin modules. In fact, for plugin modules I wouldn't even try to use Maven, as once IDEA recognizes the project as a Maven project, it will erase the information about its plugin origin.

The other options: Ivy and Gradle. Ivy is awesome. It works. There's also a nice plugin for IDEA that will automatically import the dependencies if it locates ivy.xml in the project directory. In that case you could still use the autogenerated Ant build script - just alter it a bit so that it calls the Ivy task to fetch the dependencies and incorporate them into the build classpath.

Sounds a bit too hardcore to me. There's a high chance that the project import will not be as smooth as you would like it to be, especially if the project structure isn't trivial.

Gradle to the rescue!

Gradle is awesome when it comes to non-trivial project structures. Its Ant-like flexibility, along with the nice Groovy syntax and all the Maven-like goodies, is just great!

First of all, managing dependencies is very easy. For instance:

  repositories {
    jcenter()
  }

  dependencies {
    compile('org.zeroturnaround:jr-sdk:5.4.1') {
        transitive = false
    }
    compile 'org.ow2.asm:asm-all:4.1'
    compile 'org.slf4j:slf4j-nop:1.6.3'
    compile 'org.apache.commons:commons-compress:1.2'
  }

The cool little bonus is that JetGradle (in IDEA 12) will also automatically resolve and add the dependencies to the project.

This is cool, but it is not enough. First of all, JetGradle cannot import the project as an IDEA plugin module. Secondly, there's no JetGradle in IDEA 13, as the Gradle integration is getting a major overhaul. OK, back to the drawing board.

There's an 'idea' plugin for Gradle, how cool is that?

Just run gradle idea and it will generate the project files, incorporating the references to the required dependencies. Good. But the generated module type is a Java module, and I need a Plugin module. Have no fear, Gradle's here! We can easily customize the build script to adjust the XML project descriptor to our requirements.

  apply plugin: "idea"

  repositories { ... }
  dependencies { ... }

  idea.project.ipr {
    beforeMerged { project ->
      project.modulePaths.clear()
    }
  }

  idea.module.iml {
    withXml {
      it.node.@type = "PLUGIN_MODULE"
    }
  }

The withXml hook does the magic here: instead of the auto-generated 'JAVA_MODULE', the final descriptor will contain 'PLUGIN_MODULE', which makes IDEA treat the module as an IntelliJ IDEA plugin.

The only little problem that bothered me: the generated project descriptor is an *.ipr file, which is kind of deprecated. It would have been much better if the plugin generated directory-based project metadata. The feature request was filed long ago but still isn't resolved.

After experimenting with the project descriptors for a bit, I actually found a simple workaround for this minor annoyance. The .idea directory structure requires only a modules.xml file, which is identical to the *.ipr file that the idea plugin generates. So I could just use this simple task to create the directory-based structure:

  task setup {
     dependsOn ideaModule, ideaProject
     doLast {
       copy {
          from '.'
          into '.idea/'
          include '*.ipr'
          rename { "modules.xml" }
       }
       project.delete "${project.name}.ipr"
     }
  }
[hate mode="on"]Just writing delete "*.ipr" would not work. That's annoying.[hate mode="off"]
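
For what it's worth, a pattern-based delete can be expressed by handing delete a file tree instead of a bare string; a tiny sketch:

  // this form does resolve the pattern (unlike delete "*.ipr"):
  project.delete fileTree(dir: '.', include: '*.ipr')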

So now I could just execute gradle setup and it would generate the directory-based project structure, with the correct references to the required dependencies, and I could import the project into IDEA without any hassle.

This is all good. However, this is not the end of the story! There's more to do and more issues to resolve:

  • Get the required IDEA internal artifacts into a local repository, so that these dependencies can also be downloaded via the Gradle dependency manager (sketched below).
  • Migrate the auto-generated Ant build script into a Gradle script.
  • Adapt the Gradle script to be able to manage the releases of the plugin.
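
For the first item, once the IDEA jars are deployed to an internal repository, the dependency declarations could look something like this - a sketch with a made-up repository URL and made-up coordinates:

  repositories {
    maven { url "http://repo.mycompany.com/idea" }   // hypothetical internal repository
  }

  dependencies {
    // hypothetical coordinates for the IDEA artifacts the plugin compiles against
    compile 'com.intellij:openapi:13.0'
    compile 'com.intellij:idea-util:13.0'
  }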

However, [hate mode="on"]none of this hassle would be needed if IntelliJ IDEA provided a sane way of building the plugins and managing the dependencies.[hate mode="off"]

Sunday, August 4, 2013

Java Build Tools Survey - Results

Here come the results of last week's Java build tools survey. 675 people responded - thanks to everyone who contributed!

The answers were not mutually exclusive and a good portion of respondents selected several answers. I interpret that as "I use Maven at work and Gradle at home". Or maybe people are working on multiple projects and the old projects haven't been migrated from Ant to Maven/Gradle.

Anyways, here are the results:

480 out of 675 people indicated that they use Maven, which makes 71.1% of all respondents. Actually, this number roughly matches my personal assumption about Maven's share. Jason also confirmed that it is in line with previous similar surveys.

Gradle was used by 235 people out of 675, which makes 34.8%. I was quite surprised by that number and I think it is because the survey was mostly published via Twitter - I have some followers from the Groovy/Gradle community. While I do believe that Gradle is gaining momentum as a build tool, I really doubt that its share is this high in Java enterprise/legacy projects.

Ant (+Ivy) was mentioned by 114 respondents and takes third place with 16.9%. I think this number would be much higher if we took only the legacy/enterprise projects.

Maven vs Gradle vs Ant

I think that there's no Maven vs Ant debate any more. Maven has won. Tooling support is (mostly) excellent and setting up a Maven project in all the major Java IDEs is (mostly) a no-brainer. There's definitely a Maven vs Gradle debate - there are plenty of details to argue about regarding which is better (at the moment of writing this post). But yet again, tooling support is the main argument for why Maven is better. I think that without good IDE support Gradle won't take over the build tools' market share, period.

BTW, I'll keep the survey open for some time - maybe it will get more responses.

P.S. There's a good summary about the build tools at Rebel Labs, if you're interested.

UPDATE: SBT came 4th in the standings: 2.8% of respondents indicated that they use SBT.

Monday, July 29, 2013

Java Build Tools Survey

Just curious what the share of the various Java build tools might be. I will share the results as soon as there are enough answers for this survey to make sense.
P.S. If the browser doesn't display the embedded survey frame, here's the direct link: http://www.surveymonkey.com/s/WLGLW2K
P.S.2 BTW, I'm looking for volunteers to give the new JRebel Beta a try. If you're interested, please sign up here - we will start publishing the binaries soon.

UPDATE: I've published the results here

Friday, July 26, 2013

JRebel 6.0 Beta Registration Started

It’s been more than a year since JRebel 5 was released. During that year there have been a few minor releases and a lot of things were improved – remoting, debugger integration, support for more frameworks, IDE plugins, etc.

Now, JRebel 6 is on the horizon. There will be several big changes (not revealing yet), which I hope will improve the UX very much. JRebel will be able to do things that it wasn't able to do before - that's very cool!

Since it is a major release and there are a lot of changes that we introduced to the product, I'm looking for volunteers to try out the new beta version on real-life Java projects.

You can register for the beta program here.

If all goes as planned the first builds will land late August / early September.

Friday, July 19, 2013

Java EE 8: Coherence Is The Key

Java EE 7 was released just recently, and now it seems to be time to prepare the design of Java EE 8. There are already some wishlists published in the blogs.

I might be wrong, but I don't remember seeing (or hearing) from anyone what the ultimate goal of all these improvements is. IMO, Java EE should take a look at Ruby on Rails or Play! and try to draw inspiration from the coherent nature of those frameworks. Currently, my feeling is that Java EE looks like a collection of different specs that integrate with each other to a certain degree. But to actually be productive, one shouldn't have to care about the names of the specs (CDI, JAX-RS, EJB, etc), but instead should "just write" the code. Java EE could really align itself better with RoR here.

Monday, July 1, 2013

Rebel Labs: Java 8 - lambdas, default methods, bulk data operations

I've spent some time composing an overview of the new features in Java 8. It covers lambda expressions in Java 8 and the accompanying interesting features - default methods and bulk data operations for Java collections. I hope you will find the text interesting.

Tuesday, June 25, 2013

Speaking at JavaOne SF 2013

I've got 4 talks to deliver at JavaOne SF this year:
  • UGF10388 - NetBeans Community Tools with JRebel, Jelastic and others. I'll be talking about our NetBeans plugin for JRebel - how it's made and what it does.
  • CON2585 - Embedding JVM Scripting Languages. This talk has been brewing in my head for a long time now. It started at a time when I was investigating the applicability of business rule engines at my previous job. Amazingly, after some time I realized that in order to implement a BRMS, you actually have to have a very very very good reason. In most cases a rule engine can be as simple as a dynamic script that follows some conventions. Scripting languages on the JVM allow just that! I'll be talking about embedding scripting into a Java app and assessing different options for the implementation. I will cover Groovy, JRuby and a bit of JavaScript too.
  • CON2578 - Taming Java Agents. Since JRebel stands on the shoulders of the Instrumentation API and bootstraps via the -javaagent VM argument, there's a good portion of interesting things that I could talk about. Thankfully, Nikita will assist me as he's involved with Plumbr, which is also based largely on the Instrumentation API, though it includes some native code as well. Combined, it will be an interesting overview of the underlying technology that enables us to develop awesome tools for the Java platform.
  • CON3477 - Apples and Oranges: The Highlights of Eclipse, IntelliJ IDEA, and NetBeans IDE. This will be a showcase of all 3 major IDEs - NetBeans IDE, IntelliJ IDEA, and Eclipse (with the JBoss Tools plugin). Geertjan suggested we do it together, so there will be 3 speakers, one for each IDE: Geertjan for NetBeans IDE, Max Andersen for Eclipse, and myself for IntelliJ IDEA.

See You At JavaOne!

Thursday, April 11, 2013

GeekOUT 2013, Tallinn, June 12-14

The GeekOUT conference started in 2011 as a local Java event. 2013 will be the 3rd edition of GeekOUT, and the conference agenda is quite outstanding!

There are 3 things about the conference that stand out:

Agenda - 15 carefully selected talks. It is all about Java, JVM and JVM programming languages and tooling.
Workshops - a dedicated day for hands-on workshops.
DemoGrounds - a number of very cool vendors will be exhibiting: Hazelcast, CloudBees, Atlassian and more!

One more interesting thing about the conference is that it takes place in Tallinn. I can tell you for sure - Tallinn is absolutely awesome in June! Well, it usually is :) So if you are a Java developer and haven't been to Tallinn yet - now you have a good reason to visit the city!

Monday, March 18, 2013

Back from 33rd Degree

The 33rd Degree conference has changed location this year. This time the action took place in Warsaw. It seems that 33rd Degree has outgrown its venue in Krakow and the organizers decided to pursue the opportunity to expand the conference.

The venue

I realized that the venue "configuration" counts for a lot. This time the conference was spread equally across 2 floors of the Gromada Airport hotel. Actually it was a fair separation for both vendors and speakers, but there's a caveat. If the fancier sponsors are on the top floor and the most famous speakers are there as well, then the vendors located in the "bunker" should be worried. Actually, the speakers who get to speak on the other floor should be worried as well. Literally, on the first day when I arrived, the line for a cup of tea at the top floor was too long to wait in, while at the same time the bottom floor was dead empty. This is just how the crowd moves, and it is very hard to fix.

My talks

My first talk, Do you really get your IDE?, happened to take place on the bottom floor at the same time as a beer party was starting on the top floor. I was almost sure that no one would be interested in coming to the talk and my bet was that only 5-10 people would show up. I was totally wrong. The room was full. Very surprising. Since it was a BOF format and a lot of time was consumed by conversations in the middle of the talk, I didn't really cover all the cool tricks that I wanted to, but I would count the talk as pretty successful anyway - it was fun and entertaining. I hope I can "sharpen the saw" in delivering this talk a bit more, since people really enjoy learning the tools.

My second talk was about JRebel and how it can be used for updating Java applications. Not so many people came this time, probably because JRebel is already a well-known tool, and Baruch had dragged all the attendees into his talk instead.

Other talks

I usually don't attend many sessions at conferences since I know a lot of the speakers personally and can learn from them directly in offline conversations. However, this time I decided to go, listen, and learn some cool stuff at the sessions.

Leading the technical change by Nathaniel Schutta. The talk title and abstract did not fire up my curiosity; I just know that Nathaniel has a distinct style of delivering presentations and I wanted to go and learn from his performance. The idea is actually very simple: instead of bullet points, Nathaniel creates a slide per bullet point and makes it look attractive, so that when the next slide comes up he can see it on his laptop screen and knows what to say and how to say it. That is why the delivery is so smooth, no matter which topic he presents. Of course Nathaniel didn't skip the book he co-authored, Presentation Patterns - a very useful book for everyone who wants to present at conferences.

Being Honest - Rethinking Enterprise Java by Adam Bien. Adam is a very energetic speaker and I actually like his presentations about Java EE very much. But this time the technical aspects of the presentation let him down. The large room was full, but the screen was way too small for everyone to see the code. Plus the mic wasn't really working and it was quite hard to listen, so I couldn't help but leave the room for some other talk.

So I made it to the talk about Kotlin by Hadi Hariri. Kotlin's ecosystem is making great strides forward. Unfortunately I didn't see the whole talk. In the part that I did catch, Hadi was talking about the features that allow creating DSLs, and he presented his own project, Spek, the specification framework for Kotlin.

Scripted: Embracing Eclipse Orion by Martin Lippert. This was very unfortunate: a very nice talk about a very nice tool, with an almost empty room. I call it "bad marketing". Martin is a good speaker and he talks about interesting topics, but it seems the title of the talk turned off the crowd. Who cares about Eclipse Orion at this kind of conference? Besides, nothing in the talk was really about Eclipse Orion. It was about Scripted - a kick-ass browser-based JavaScript editor, a very interesting R&D project developed at SpringSource/VMware.

Programming with Lambda Expressions in Java by Venkat Subramaniam. Venkat's talks are so polished that it is hard to get a seat in the room. Attendees usually occupy the room well before the talk starts, and those who are late steal chairs from other rooms in order to get a seat. There was nothing really advanced about the topic, but Venkat presents it with passion. Very entertaining.

How we took our server-side application to the Cloud and liked what we got by Baruch Sadogursky. A very interesting talk for those who want to learn the basics of multi-tenancy and implementation approaches for SaaS. Baruch talked about the solution they chose for the hosted version of Artifactory - what the challenges and pitfalls were.

Overall

33rd Degree was definitely a success for the organizers, but there are still plenty of details to improve: technical equipment would be first on the list. Something also needs to be done about managing the crowd - most of the time people were late for the sessions by a lot (with the exception of Venkat's talks). This is very distracting for the speakers, I think, even if they say it is not.

The great thing about 33rd Degree is that Grzegorz works super-hard to organize it all: getting the great (NFJS) speakers, the cool vendors (Atlassian, Plumbr, JetBrains, etc), and the venue, which is actually very nice - plenty of space, great food, very close to the airport. Plus, the price of the conference is still very affordable.

Saturday, March 9, 2013

Talking at 33rd Degree conference

Once again, I'm back at the 33rd Degree conference, taking place in Warsaw on March 13-15.

I'm going to have 2 talks there:

Do you really get your IDE?, which is actually a BOF. This is my first experience running a BOF, so I'm not really sure how it will turn out. I want to discuss with the folks how developers actually use their IDEs. I will play a bit with the code in IntelliJ IDEA and probably jump into other IDEs as well.

Reloading Java applications like a Pro. At many conferences I've been talking about the very root of the turnaround problem in Java, its reasons and pitfalls. But this time I decided that it would be much more fun to showcase what JRebel can do for different types of Java applications. I plan to talk about the mechanics of the updates and what is happening inside a Java application when JRebel is doing its work. Hopefully I can fit several demo scenarios into the talk: for a Spring-based application, for Java EE, and maybe something for non-conventional apps and desktop apps (e.g. JavaFX).

I'm glad that the 33rd Degree organizers allowed me to talk about JRebel directly. At many conferences the organizers are quite hesitant to accept any talks about commercial products. However, every time I give a talk on some technical topic, the attendees are actually eager to ask questions about JRebel rather than about the talk topic itself. It means that JRebel is more interesting than, for instance, Java bytecode. So why worry about the commercial side of it?

Tuesday, January 15, 2013

Welcome to RebelLabs!

At ZeroTurnaround we just launched a new page with awesome content collected in one place. RebelLabs hosts very nicely designed documents with interesting content produced by ZT engineers.
My favorites so far are the report on Scala adoption and the Developer Productivity Report from 2012.

Tuesday, January 8, 2013

Java EE 7 Public Draft was published. I demand Java EE Light Profile!

On December 20, 2012, a public draft of Java EE 7 was published. At first sight, the new spec is mostly an incremental improvement of the constituent specs from Java EE 6. For instance, I really like the Web Profile idea, but it is a shame that JAX-RS wasn't a part of the Java EE 6 Web Profile.
The Web Profile is targeted at developers of modern web applications
IMO, most modern web applications make use of REST. Or at least that is my perception. In the Rails world, AFAIK, violating REST principles is grounds for brutal prosecution by one's colleagues :) Luckily, Java EE 7 fixes that mistake and the JAX-RS specification is now a part of the Web Profile.
Targeting “modern” web applications then implies offering a reasonably complete stack, composed of standard APIs, and capable out-of-the-box of addressing the needs of a large class of web applications.

OK, now you can really develop "modern" web apps with Web Profile, but...

In terms of completeness, the Web Profile offers a complete stack, with technologies addressing presentation and state management (JavaServer Faces, JavaServer Pages), core web container functionality (Servlet), business logic (Enterprise JavaBeans Lite), transactions (Java Transaction API), persistence (Java Persistence API) and more.

Sounds like redundancy to me. For instance, why would you need EJBs there? If CDI supported interceptors properly there wouldn't be a need for EJBs in that sense. Or, JSF? Well, I'm just not a fan of that.

What I'm trying to say here is that since, for compatibility reasons, it isn't possible to drop specs from the Web Profile, maybe it is now time to create a "Light Profile"? A minimalistic set of Java EE specs that would be sufficient for building modern web applications.

Of course the term is a bit foggy - what should we consider a modern web application? These days it is a combination of a REST backend and UI technologies such as HTML5 and JavaScript. My logic says that since Java EE doesn't specify a UI technology, the main specification required is JAX-RS, plus the complementary specifications to support transactions (JTA/JTS), persistence (JPA), and dependency injection (CDI). Of course, there are some nice complementary specifications such as Bean Validation and the Java API for JSON Processing. But I would definitely drop JSF and EJB for sure.

This would bring containers like Tomcat and Jetty even closer to the spec, and who knows, maybe one day we will have a Java EE "Jetty Profile", why not :)
