April 22, 2014

SWT on JavaFX update-site available

Just for those who want to give SWT on JavaFX a try: you can now grab the nightly bits from http://download.eclipse.org/efxclipse/experimental-nightly/site_assembly.zip.

To develop a plain SWT/JavaFX app (no OSGi!) you need the following jars:

  • org.eclipse.fx.runtime.swt_1.0.0.$buildtimestamp.jar: this is the real SWT implementation
  • org.eclipse.fx.runtime.swtutil_1.0.0.$buildtimestamp.jar: this is a set of utility classes that bridge the gap for things hard to implement in SWT on JavaFX, e.g. opening a blocking shell – it works on the current SWT ports (win32, cocoa, gtk) as well as on SWT on JavaFX
  • org.eclipse.fx.ui.controls_1.0.0.$buildtimestamp.jar: JavaFX controls implemented in e(fx)clipse and reused by the SWT port

and add them to your build-path.

A simple hello world application would look like this:

package sample;

import org.eclipse.fx.runtime.swtutil.SWTUtil;
import org.eclipse.fx.runtime.swtutil.SWTUtil.BlockCondition;
import org.eclipse.fx.runtime.swtutil.SWTUtil.SWTAppStart;
import org.eclipse.swt.SWT;
import org.eclipse.swt.layout.GridData;
import org.eclipse.swt.layout.GridLayout;
import org.eclipse.swt.widgets.Button;
import org.eclipse.swt.widgets.Display;
import org.eclipse.swt.widgets.Shell;

public class HelloWorldSWT implements SWTAppStart {

	public BlockCondition createApp(Display display) {
		Shell shell = new Shell(display);
		shell.setText("Is it SWT or JavaFX?");
		shell.setLayout(new GridLayout());
		Button button = new Button(shell, SWT.PUSH);
		button.setText("Hello SWT on JavaFX");
		button.addListener(SWT.Selection, (e) -> System.out.println("SWT on JavaFX is great!"));
		button.setLayoutData(new GridData(GridData.CENTER, GridData.CENTER, true, true));
		shell.open();
		return null;
	}

	public static void main(String[] args) {
		SWTUtil.getInstance().bootstrap(new HelloWorldSWT());
	}
}

(If you only use org.eclipse.fx.runtime.swtutil_1.0.0.$buildtimestamp.jar and the current stable SWT ports, the above code works equally well!)

I think there are only 2 things that need explanation:

  • The bootstrapping or creation of the Display instance is abstracted, because JavaFX requires applications to be bootstrapped in a very special way, so creating a Display yourself is discouraged
  • Blocking until a condition is met, e.g. until a Shell is closed. JavaFX does not directly expose the event loop. In SWT you stop the workflow of your application with code like this:
    while(<condition>) {
      if(!display.readAndDispatch()) {
        display.sleep();
      }
    }
    Event loops are used, e.g., to open a Shell and block until it is closed, because the SWT Shell only has a non-blocking method named open(). The JavaFX counterpart to Shell is Stage, which unlike Shell has two methods, show() and showAndWait(). While SWT on JavaFX can emulate the SWT event-loop spinning with the original API, it is much more efficient to call out to showAndWait() when you want to open a blocking shell.
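To make the "blocking until a condition is met" idea concrete without any UI, here is a toy sketch of such a condition object. The class and method names are my own illustration, not the swtutil BlockCondition API: one thread waits until another signals that the condition (e.g. "shell closed") holds.

```java
import java.util.concurrent.CountDownLatch;

// Toy model of a "block condition": a waiter blocks until another
// thread signals that the condition is met. Illustration only -- the
// real org.eclipse.fx.runtime.swtutil API may differ.
public class SimpleBlockCondition {
    private final CountDownLatch latch = new CountDownLatch(1);

    /** Signal that the condition is met, releasing any waiter. */
    public void release() {
        latch.countDown();
    }

    /** Block the calling thread until release() is invoked. */
    public void waitUntilReleased() {
        try {
            latch.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

On a classic SWT port such waiting is implemented by spinning the event loop; on SWT on JavaFX it can delegate to Stage.showAndWait() instead.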

Running the above code should open an application like this

Tabris 1.4 Preview: Google Analytics

On June 26th we will release Tabris 1.4. Currently we are working on the new features for this release. One of the Killer-Features, a tracking API for the Tabris UI framework, was finished last week. The API comes with a ready-to-use Google Analytics integration. All you need to do is set your tracking ID and activate it. In your code this might look like this:

UIConfiguration configuration = new UIConfiguration();
Tracking tracking = new Tracking( new GoogleAnalyticsTracker( "YOUR_TRACKING_ID", "Your App Name" ) );
tracking.start( configuration );

As you can see, the new Tracking type needs a Tracker. In our case, a GoogleAnalyticsTracker. To start tracking, you just need to call start and pass your Tabris UIConfiguration. Everything else is done for you by the Tracking framework. In the background, tracking takes note of every Tabris UI interaction. This includes:

  • Page Transitions: Every time a user opens a page.
  • Action Executions: Every time a user presses an Action.
  • Searches: Every time a user executes a search.

In Google Analytics you can see the events being tracked in realtime, and you can do any kind of analysis you may know from websites.

(Screenshots: Google Analytics overview, realtime and search views.)

As you can see, besides the events, the GoogleAnalyticsTracker also tracks device information, location information and much more.

One more thing…

The tracking API we provide is generic. This means you can also implement your own Tracker to send events to your favourite tracking system, a log, etc. We will have the documentation ready when the release comes out. For all of you who don’t want to wait: you can try the tracking right now using the latest build from our staging p2 repository: http://download.eclipsesource.com/technology/tabris/downloads/staging or you can check out the sources.
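To sketch what such a custom Tracker could look like, here is a self-contained toy version. Note that the Tracker interface and TrackingEvent type below are stand-ins I defined myself so the example compiles on its own; the real types live in the Tabris tracking API and their signatures may differ.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the Tabris Tracker interface (illustration only).
interface Tracker {
    void handleEvent(TrackingEvent event);
}

// Stand-in for a tracking event carrying a type and some detail.
class TrackingEvent {
    final String type;
    final String detail;

    TrackingEvent(String type, String detail) {
        this.type = type;
        this.detail = detail;
    }
}

// A tracker that records events, e.g. to forward them to a log
// or to a custom analytics backend instead of Google Analytics.
class LoggingTracker implements Tracker {
    final List<String> log = new ArrayList<>();

    @Override
    public void handleEvent(TrackingEvent event) {
        log.add(event.type + ": " + event.detail);
    }
}
```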

Please note: the feature described above is a preview of the upcoming Tabris 1.4 features. Tabris 1.4 will be available on June 26th, 2014. If you want to try it out before the release, just send us a quick request.



Eclipse and the Ubuntu team working together – Solving the Eclipse menu issue under Ubuntu

I’m very happy to report that the Eclipse platform team worked together with the Canonical Ubuntu team to solve the broken menu issue of Eclipse under Ubuntu. This fix is available for Ubuntu 14.04.

It used to be the case that the menu of the Eclipse IDE did not work out of the box under Ubuntu, and you had to use several tricks to make Eclipse work. This was hard to discover for new users of Eclipse under Ubuntu and raised the barrier to using Eclipse.

We (the Eclipse platform team) reached out to the Ubuntu team, and William Hua from Canonical solved this issue. I would like to thank William and the whole Ubuntu team for that effort; they could have pushed the issue back to the Eclipse side, but they went through the extra effort to solve it on the Ubuntu side. Kudos to them!

Thanks to William for doing most of the work, and also a big thanks to all involved parties, including Jono Bacon, Jason Warner from Canonical and Aleksandar Kurtakov, Paul Webster, Ian Skerrett and Mike Milinkovich from the Eclipse side. Especially thanks to Aleksandar Kurtakov from Red Hat for answering GTK3 SWT questions.

See the Ubuntu Launchpad bug report for details.

April 21, 2014

Happy Easter with Egglipse!

Can you find the Easter egg in Eclipse? Granted, it’s not a real Easter egg, but an egg, or rather a non-circular ellipse. The Eclipse ellipse can be found in the program and window icons (not in the splash screen).

For instance, in eclipse256.png the Eclipse marble is 230 pixels wide and 223 (instead of 230) pixels high:


I’m curious how the new Egglipse Eclipse logo will look by Christmas. ;-)


Tutorial – A Simple Approach to Writing JavaFX Eclipse RCP Apps

I love the architecture of JavaFX. It has a simple and self-consistent API that lets you build scene graphs of your UI layout and send them off to the GPU to render. That should give our apps snappier performance and allow for some pretty cool transitions and effects. I can see it replacing both Swing and SWT one day. Hardware-accelerated user interfaces are where we need to go to make apps feel alive.

However in it’s current state, it’s tricky to set up your development environment in Eclipse, at least it is today. Tom Schindl does have his e(fx)clipse tooling, but I’ve come up with a simpler way that uses the standard Eclipse Java 8 support. For me, it’s always important to have a development environment that’s easy to set up and understand. Being able to use the base Eclipse tools for building JavaFX RCP apps lowers the barrier of entry for those who want to try out this new world.

This tutorial shows a fairly simple way to build an Eclipse RCP application using JavaFX for its user interface. It’s a simple Hello World, similar to the standard JavaFX Hello World tutorial. To make it Eclipse-y, I created an extension point that provides JavaFX buttons. In the end, there are two plug-ins: the main one implements the entry point, and the second one adds the Hello World button. I also show how to build an Eclipse product for all this so you can use Maven and Tycho to create a distribution you can hand out to people.


public class EFXApplication extends Application implements IApplication {
    @Override
    public Object start(IApplicationContext context) throws Exception {
        launch();
        return IApplication.EXIT_OK;
    }

    @Override
    public void start(Stage primaryStage) throws Exception {
        primaryStage.setTitle("Hello World!");
        StackPane root = new StackPane();

        IConfigurationElement[] elements
            = Platform.getExtensionRegistry().getConfigurationElementsFor(Activator.PLUGIN_ID, "buttonProvider");
        for (IConfigurationElement element : elements) {
            Object o = element.createExecutableExtension("class");
            if (o instanceof IButtonProvider) {
                Button btn = ((IButtonProvider) o).createButton();
                root.getChildren().add(btn);
            }
        }
        primaryStage.setScene(new Scene(root, 300, 250));
        primaryStage.show();
    }

    @Override
    public void stop() {
    }
}

I won’t go into all the gory details. You can clone my git repo up on github, http://github.com/dschaefer/tut-efx, to see what I’ve done. But there are a couple of important points I want to highlight and maybe spark discussion on how the approach could be made better.

To start, you need an Eclipse with Java 8 support. I’m focusing on the Oracle Java 8 VM and am leaving the Java 7 JavaFX solution behind. Java 8 is easier to set up, although as you’ll see it’s still not optimal. Get an Eclipse distribution that also has EGit and Maven so you can check out the github repo and use the Maven launch (why are Maven builds launches?) to build the product.

The biggest hurdle we need to get over is that JavaFX isn’t part of the Java 8 standard; it’s only available in the Oracle VM. So the Java 8 support in Eclipse doesn’t work for JavaFX out of the box, or at least it doesn’t do anything automatically to help you set it up. There are two issues we need to deal with. First, the JDT doesn’t make the javafx packages visible, which I guess is justified because the Equinox runtime doesn’t expose the javafx packages either. JavaFX is on the extension classpath, so unless you enable that, you can’t build against it. Luckily it’s pretty easy to do, although after a while it becomes a bit of a pain.

The first step is to make the javafx packages visible to the JDT compiler. Of course, make sure the Execution Environment is set to JavaSE-8 and that it’s linked to the Oracle Java 8 JRE. In the project properties under Java Build Path, click on the Libraries tab and expand the JRE system library. Edit the access rules and add one that makes javafx/** Accessible. That’s it: now you can access all the goodies in the javafx packages from your plug-in. Unfortunately, you have to do that for every plug-in, but it gets us going.


The next step is to enable the OSGi runtime to load your javafx classes from the extension classpath. This is done at runtime by changing the default parent classloader for bundles using a handy system property. Add this to your launch configuration and any other places you will be launching your app from:

-Dorg.osgi.framework.bundle.parent=ext

Now, I’ve been told this is dangerous because it gives access to all the classes in your extension classpath, even the ones you don’t plan on using and may conflict somewhere, or something like that. But since you’re using JavaFX, you’re tied to the Oracle Java 8 VM for now anyway, so I’m not sure it’s really that dangerous. I’d love to hear more if I’m wrong about that.

The third thing comes at releng time. If you’re using Tycho to build your product, then there are a couple of tricks you need there. The first is that you need the latest JDT compiler that supports Java 8. At the time of this writing, Tycho only has a Java 7 JDT bundled with it, but that’s easy enough to fix with a little pom magic in your parent pom.



The other thing is that you can’t use the SWT-based binary launchers. They somehow set things up so that the JavaFX application framework hangs on startup. I haven’t figured out why, but avoiding them works out best anyway. So I’ve written script-based launchers and bundled them as rootfiles in my project. I then set the ws property on all my environments to ‘javafx’. Some day we’ll need binary launchers that support this configuration, but for now you need to turn off binary launchers in the .product file. Works like a peach. I’ve even configured the Mac build to use bundling so it produces a MyApp.app folder that runs like a normal Mac app. Just make sure you have the Java 8 JRE set up as your default JRE.

# Mac launcher                                                                    
appRoot=`dirname $0`/../..
launcher=`ls -rt $appRoot/plugins/org.eclipse.equinox.launcher_*.jar | tail -1`
java \
    -Dorg.osgi.framework.bundle.parent=ext \
    -Xdock:icon=$appRoot/Contents/Resources/Eclipse.icns \
    -jar $launcher

And that’s it. No special tools required and you can get started having fun with JavaFX and Eclipse today. In my next tutorial, I’ll show how to take this example and grow it into an Eclipse IDE on JavaFX. LOL, I kid. But that’s the direction I’m heading. JavaFX gives us a great user interface framework to carry the Eclipse IDE into the next generation. We have some very hard problems to solve with backwards compatibility with an SWT port on JavaFX, but you got to start somewhere. Hopefully this little tutorial will get you excited about the possibilities.

New sandbox servers for IoT developers

The most difficult aspect of developing an IoT solution is probably that one has to deal with the –embedded– hardware that will eventually run the application that talks to the sensors, thermostats, car entertainment systems, etc. Embedded is definitely complex: the resources are limited, the development tooling usually sets you back a decade, etc.

But when you have figured out your hardware, you need your devices to actually “talk” to the cloud. And there comes the not-so-trivial problem of finding an open and publicly accessible server to talk to. This is why Eclipse IoT has been proposing an MQTT broker and an M3DA server for a while. Many people use them to connect their prototype solutions before moving to a more production-ready environment.

Last week, we deployed two new open IoT servers at iot.eclipse.org.

CoAP server

CoAP is a UDP-based protocol that mimics the REST paradigm in an IoT context. It provides an elegant and efficient way to do sensor monitoring on very constrained networks.
A CoAP server exposing test resources is now available at coap://iot.eclipse.org:5683 and is open for anyone to interact with, whether it’s for learning more about CoAP’s key principles or for actually testing a client implementation against it.
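If you want to poke at the wire format before pointing a full client library at the sandbox, the fixed 4-byte CoAP header is easy to build by hand. The sketch below encodes the header of a confirmable GET per RFC 7252 (options and payload are omitted, and the helper name is mine):

```java
// Minimal sketch of the fixed 4-byte CoAP header (RFC 7252, section 3),
// e.g. for a confirmable GET request to a server such as the
// iot.eclipse.org sandbox. Options and payload are omitted.
public class CoapHeader {
    public static byte[] confirmableGet(int messageId) {
        byte[] header = new byte[4];
        header[0] = (byte) 0x40;               // version=1, type=CON(0), token length=0
        header[1] = (byte) 0x01;               // code 0.01 = GET
        header[2] = (byte) (messageId >> 8);   // message ID, high byte
        header[3] = (byte) (messageId & 0xFF); // message ID, low byte
        return header;
    }
}
```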

Lightweight M2M

OMA Lightweight M2M aims at proposing standard resources and workflows on top of CoAP for device management. The freshly deployed LWM2M sandbox at Eclipse provides a nice way to connect LWM2M-capable devices and monitor them using a web UI and a REST API.
It runs the Leshan open-source LWM2M server.

If you’re actually interested in learning more about CoAP and LWM2M, you probably want to check out this presentation from Julien Vermillard at EclipseCon 2014.

Header image is Creative Commons License splorp via Compfight.

April 19, 2014

Switching to Xcore in your Xtext language

This is a follow-up to my previous post, Switching from an inferred Ecore model to an imported one in your Xtext grammar. The rationale for switching to a manually maintained metamodel can be found in that post. This time, instead of using an Ecore file, we will use Xcore.

Xcore is an extended concrete syntax for Ecore that, in combination with Xbase, transforms it into a fully fledged programming language with high quality tools reminiscent of the Java Development Tools. You can use it not only to specify the structure of your model, but also the behavior of your operations and derived features as well as the conversion logic of your data types. It eliminates the dividing line between modeling and programming, combining the advantages of each.

I took inspiration from Jan Köhnlein’s blog post; after switching to a manually maintained Ecore in Xsemantics, I felt the need to switch further to Xcore, since I had started writing many operation implementations in the metamodel, and while you can do that in Ecore, it is much easier with Xcore :) In my case, though, I was starting from an existing language, not to mention the use of Xbase (not covered in Jan’s post). Things were not easy, but once the procedure works it is easily reproducible, and I’ll detail it for a smaller example.

So first of all, let’s create an Xtext project, org.xtext.example.hellocustomxcore (you can find the sources of this example online at https://github.com/LorenzoBettini/Xtext2-experiments); the grammar of the DSL is not important: this is just an example. We will first start developing the DSL using the automatic Ecore model inference and later switch to Xcore.

(the language is basically the same of the previous post).

The grammar of this example is as follows:

grammar org.xtext.example.helloxcore.HelloXcore with org.eclipse.xtext.xbase.Xbase

generate helloxcore "http://www.xtext.org/example/hellocustomxcore/HelloXcore"

Model:
    hellos+=Hello*
    greetings+=Greeting*;

Hello:
    'Hello' name=ID '!';

Greeting:
    'Greeting' name=ID expression=XExpression;

and we run the MWE2 generator.

To have something working, we also write an inferrer:

def dispatch void infer(Model element, IJvmDeclaredTypeAcceptor acceptor, boolean isPreIndexingPhase) {
	acceptor.accept(element.toClass("generated." + element.eResource.URI.lastSegment.split("\\.").head.toFirstUpper)) [
		for (hello : element.hellos) {
			members += hello.toMethod("say" + hello.name.toFirstUpper, hello.newTypeRef(String)) [
				body = '''return "Hello «hello.name»";'''
			]
		}
		for (greeting : element.greetings) {
			members += greeting.toMethod("say" + greeting.name.toFirstUpper, greeting.newTypeRef(String)) [
				body = greeting.expression
			]
		}
	]
}

With this DSL we can write programs of this shape (nothing interesting, this is just an example):

Hello foo!

Greeting bar {
    sayFoo() + "bar"
}
Now, let’s say we want to check in the validator that there are no elements with the same name; since both “Hello” and “Greeting” have the feature name, we can introduce in the metamodel a common interface with the method getName(). OK, we could achieve this also by introducing a fake rule in the Xtext grammar, but let’s do that with Xcore.

Switching to Xcore

Of course, first of all, you need to install Xcore in your Eclipse.

Before we use the export wizard, we must make sure we can open the generated .genmodel with the “EMF Generator” editor (otherwise the export will fail). If you get an error about resolving a proxy to JavaVMTypes.ecore when opening the editor, like in the following screenshot…


..then we must tweak the generated .genmodel and add a reference to JavaVMTypes.genmodel: open HelloXcore.genmodel with the text editor, and search for the part (only the relevant part of the line is shown)


and add the reference to the JavaVMTypes.genmodel:

usedGenPackages="../../../org.eclipse.xtext.common.types/model/JavaVMTypes.genmodel#//types ../../../org.eclipse.xtext.xbase/model/Xbase.genmodel#//xbase"

Since we’re editing the .genmodel file, we also take the chance to modify the output folder for the model files to emf-gen (see also later in this section for adding emf-gen as a source folder):


And we remove the properties that relate to the edit and the editor plug-ins (since we don’t want to generate them anyway):


Now save the edited file, refresh it in the workspace by selecting it and pressing F5 (yes, this operation also seems to be necessary), and this time you should be able to open it with the “EMF Generator” editor. We can now go on to exporting the Xcore file.

We want the files generated by Xcore to go into an emf-gen source folder; so we add a new source folder emf-gen to our project, where all the EMF classes will be generated; we also make sure to include that folder in the build.properties file.

First, we create an .xcore file starting from the generated .genmodel file:

  • navigate to the HelloXcore.genmodel file (it is in the directory model/generated)
  • right click on it and select “Export Model…”
  • in the dialog select “Xcore”
  • The next page should already present you with the right directory URI
  • In the next page select the package corresponding to our DSL, org.xtext.example.helloxcore.helloxcore (and choose the file name for the exported .xcore file, e.g. Helloxcore.xcore)
  • Then press Finish
  • If you get an error about a missing EObjectDescription, remove the generated (empty) Helloxcore.xcore file and repeat the export procedure from the start; the second time it should hopefully work


The second time, the procedure should terminate successfully with the following result:

  • The Xcore file, Helloxcore.xcore, has been generated in the same directory as the .genmodel file (and is also opened in the Xcore editor)
  • A dependency on org.eclipse.emf.ecore.xcore.lib has been added to the MANIFEST.MF
  • The new source folder emf-gen is full of compilation errors


Remember that the model files will be automatically generated when you modify the .xcore file (one of the nice things of Xcore is indeed the automatic building).

Fixing the Compilation Errors

These compilation errors are expected, since the Java files for the model are present both in the src-gen and in the emf-gen folder. So let’s remove the ones in the src-gen folder (we simply delete the corresponding packages):


After that, everything compiles fine!

Now, you can move the Helloxcore.xcore file in the “model” directory, and remove the “model/generated” directory.

Modifying the mwe2 workflow

In the Xtext grammar, HelloXcore.xtext, we replace the generate statement with an import:

//generate helloxcore "http://www.xtext.org/example/helloxcore/HelloXcore"
import "http://www.xtext.org/example/helloxcore/HelloXcore"

The DirectoryCleaner fragment related to the “model” directory should be removed (otherwise it will remove our Helloxcore.xcore file as well); we don’t need it anymore now that we have manually removed the generated folder with the generated .ecore and .genmodel files.

Then, in the language part, you need to loadResource the XcoreLang.xcore file, the Xbase and Ecore .ecore and .genmodel files, and finally the Xcore file you just exported, Helloxcore.xcore.

We can comment out the ecore.EMFGeneratorFragment (since we manually maintain the metamodel from now on).

The MWE2 file is now as follows (the modifications are marked with “switch to xcore” comments):

Workflow {
    bean = StandaloneSetup {
        scanClassPath = true
        platformUri = "${runtimeProject}/.."
        // The following two lines can be removed, if Xbase is not used.
        registerGeneratedEPackage = "org.eclipse.xtext.xbase.XbasePackage"
        registerGenModelFile = "platform:/resource/org.eclipse.xtext.xbase/model/Xbase.genmodel"
    }

    component = DirectoryCleaner {
        directory = "${runtimeProject}/src-gen"
    }

    // switch to xcore
//    component = DirectoryCleaner {
//        directory = "${runtimeProject}/model"
//    }

    component = DirectoryCleaner {
        directory = "${runtimeProject}.ui/src-gen"
    }

    component = DirectoryCleaner {
        directory = "${runtimeProject}.tests/src-gen"
    }

    component = Generator {
        pathRtProject = runtimeProject
        pathUiProject = "${runtimeProject}.ui"
        pathTestProject = "${runtimeProject}.tests"
        projectNameRt = projectName
        projectNameUi = "${projectName}.ui"
        encoding = encoding
        language = auto-inject {
            // switch to xcore
            loadedResource = "platform:/resource/org.eclipse.emf.ecore.xcore.lib/model/XcoreLang.xcore"
            loadedResource = "classpath:/model/Xbase.ecore"
            loadedResource = "classpath:/model/Xbase.genmodel"
            loadedResource = "classpath:/model/Ecore.ecore"
            loadedResource = "classpath:/model/Ecore.genmodel"
            loadedResource = "platform:/resource/${projectName}/model/Helloxcore.xcore"

            uri = grammarURI

            // Java API to access grammar elements (required by several other fragments)
            fragment = grammarAccess.GrammarAccessFragment auto-inject {}

            // generates Java API for the generated EPackages
            // switch to xcore
            // fragment = ecore.EMFGeneratorFragment auto-inject {}

            // ... (the remaining fragments are unchanged)
        }
    }
}

Before running the workflow, you also need to add org.eclipse.emf.ecore.xcore as a dependency in your MANIFEST.MF.

We can now run the mwe2 workflow, which should terminate successfully.

We must now modify the plugin.xml (note that there’s no plugin.xml_gen anymore), so that the org.eclipse.emf.ecore.generated_package extension point contains the reference to our Xcore file:

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.0"?>
<plugin>

  <extension point="org.eclipse.emf.ecore.generated_package">
    <package
       uri="http://www.xtext.org/example/helloxcore/HelloXcore"
       class="org.xtext.example.helloxcore.helloxcore.HelloxcorePackage"
       genModel="model/Helloxcore.xcore" />
  </extension>

</plugin>



Fixing JUnit test problems

As we saw in the previous post, JUnit tests no longer work, failing with errors of this shape:

org.eclipse.xtext.parser.ParseException: java.lang.IllegalStateException: Unresolved proxy http://www.xtext.org/example/helloxcore/HelloXcore#//Hello. Make sure the EPackage has been registered.

All we need to do is modify the StandaloneSetup in the src folder (NOT the generated one, since that will be overwritten by subsequent MWE2 workflow runs) and override the register method so that it performs the registration of the EPackage (as it used to do before):

public class HelloXcoreStandaloneSetup extends HelloXcoreStandaloneSetupGenerated {

	public static void doSetup() {
		new HelloXcoreStandaloneSetup().createInjectorAndDoEMFRegistration();
	}

	@Override
	public void register(Injector injector) {
		// added after switching to Xcore
		if (!EPackage.Registry.INSTANCE.containsKey("http://www.xtext.org/example/helloxcore/HelloXcore")) {
			EPackage.Registry.INSTANCE.put("http://www.xtext.org/example/helloxcore/HelloXcore",
					org.xtext.example.helloxcore.helloxcore.HelloxcorePackage.eINSTANCE);
		}
		super.register(injector);
	}
}


And now the JUnit tests run again.

Modifying the metamodel with Xcore

We can now customize our metamodel, using the Xcore editor.

For example, we add the interface Element with the method getName(), and we make both Hello and Greeting implement this interface (they both have getName(), so the implementation of the interface is automatic):

interface Element {
	op String getName()
}

class Hello extends Element {
	String name
}

class Greeting extends Element {
	String name
	contains XExpression expression
}

Using the Xcore editor is easy, and you have content assist; as soon as you save, the Java files are automatically regenerated:


We also add a method getElements() to the Model class returning an Iterable<Element> (containing both the Hello and the Greeting objects). This time, with Xcore, it is really easy to do (compare that with the procedure of the previous post, which required the use of an EAnnotation in the Ecore file), since Xcore uses the Xbase expression syntax for defining the body of operations (with full content assist, not to mention automatic import statement insertion). See also the generated Java code on the right:
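Conceptually, the getElements() operation just concatenates the two containment lists. A plain-Java equivalent of that operation body (simplified, with a helper name of my own) would be:

```java
import java.util.ArrayList;
import java.util.List;

public class Elements {
    // Plain-Java equivalent of a getElements() op: returns the hellos
    // followed by the greetings as one Iterable.
    public static <T> Iterable<T> concat(List<? extends T> hellos, List<? extends T> greetings) {
        List<T> all = new ArrayList<>(hellos);
        all.addAll(greetings);
        return all;
    }
}
```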


And now we can implement the validator method that checks for duplicates, using the new getElements() method and the fact that both Hello and Greeting now implement Element:

package org.xtext.example.helloxcore.validation

import org.eclipse.xtext.validation.Check
import org.eclipse.xtext.xbase.typesystem.util.Multimaps2
import org.xtext.example.helloxcore.helloxcore.Element
import org.xtext.example.helloxcore.helloxcore.HelloxcorePackage
import org.xtext.example.helloxcore.helloxcore.Model

class HelloXcoreValidator extends AbstractHelloXcoreValidator {

	public static val DUPLICATE_NAME = 'HelloXcoreDuplicateName'

	@Check
	def checkDuplicateElements(Model model) {
		val nameMap = <String, Element>Multimaps2.newLinkedHashListMultimap
		for (e : model.elements)
			nameMap.put(e.name, e)
		for (entry : nameMap.asMap.entrySet) {
			val duplicates = entry.value
			if (duplicates.size > 1) {
				for (d : duplicates)
					error(
						"Duplicate name '" + entry.key + "' (" + d.eClass.name + ")",
						d, HelloxcorePackage.Literals.ELEMENT__NAME, DUPLICATE_NAME)
			}
		}
	}
}

That’s all! I hope you found this tutorial useful :)


April 18, 2014

Web resource optimization and Maven Profiles

Some time ago, I described how to perform CSS and JS minification using Eclipse and WRO4J, thanks to m2e-wro4j.

In this article, we’ll deploy a Java EE 6 RESTful application with an HTML5 front-end on a Wildfly application server. We’ll see how, using m2e-wro4j, m2e-wtp and the JBoss Tools Maven Profile Management UI, you can easily switch between minified and regular build profiles.

Setup your Eclipse-based IDE

First you’ll need to install a couple things into an Eclipse Java EE installation. I recommend you use Red Hat JBoss Developer Studio, as it comes with m2e, m2e-wtp, the Maven Profile Manager, the Wildfly server adapter and JBoss Central out-of-the-box, but any Eclipse based installation (Kepler) will do, provided you install the proper plugins.

  • You can download and install Red Hat JBoss Developer Studio 7.1.1 from here

  • or install it over an existing Eclipse (Kepler) installation from the Eclipse Marketplace

Make sure you install the following JBoss Tools features:

  • Maven Profiles Management

  • JBossAS Server

  • JBoss Central

m2e-wro4j can then be installed from JBoss Central’s Software/Update tab:

m2e wro4j installation

Alternatively, you can find and install m2e-wro4j from the Eclipse Marketplace too.

Also make sure you have a Wildfly server installed on your machine.

About m2e-wro4j

The m2e-wro4j connector allows wro4j-maven-plugin to be invoked when .js, .css, .coffee, .less, .sass, .scss, .json, .template or pom.xml files are modified.

Given the following configuration:


When m2e-wtp is present, m2e-wro4j automatically translates ${project.build.directory}/${project.build.finalName}/* output directories (the default values used by wro4j-maven-plugin) to ${project.build.directory}/m2e-wtp/web-resources/. This gives you on-the-fly redeployment of optimized resources on WTP controlled servers.

In order to use wro4j-maven-plugin, you need a wro.xxx descriptor (that would be src/main/webapp/WEB-INF/wro.xml by default) and a wro.properties file. Read https://code.google.com/p/wro4j/wiki/MavenPlugin for more details.

Create an HTML 5 project

Now let’s get to work. From the JBoss Central Getting Started tab, click on the HTML 5 project icon to create a Maven based web application with a JAX-RS back-end and an HTML 5 front-end.

html5 project jboss central

I used kitchensink as a project name.

This project already contains some wro4j-maven-plugin configuration we can use. All we need to do is make the switch between the regular and minified versions of the build more user-friendly.

Enable minification

First, we need to remove an XML snippet from the pom.xml which prevents wro4j-maven-plugin from running during Eclipse builds, thus cancelling m2e-wro4j’s efforts. Go to the minify profile and delete:

              <!--This plugin's configuration is used to store Eclipse m2e settings only. It has no influence on the Maven build itself.-->

In order to use the minified versions of javascript and css files in the app, the original index.html file requires you to (un)comment several lines like :

yep: {
          //assign labeled callbacks for later execution after script loads.
          //we are on mobile device so load appropriate CSS
          "jqmcss": "css/jquery.mobile-1.3.1.min.css",
          // For minification, uncomment this line
          //"mcss": "css/m.screen.min.css"
          // For minification, comment out this line
          "mcss": "css/m.screen.css"
      },
      nope: {
          //we are on desktop
          // For minification, uncomment this line
          //"scss": "css/d.screen.min.css"
          // For minification, comment out this line
          "scss": "css/d.screen.css"
      }
This is clearly not practical if you want to be able to quickly switch between minified and original versions of your files.

Fortunately, we can take advantage of some Maven black magic, also known as web resource filtering, to dynamically use the proper (non-minified) version of these files, depending on the profile we’re building with.

We have three things to do:

  1. define a maven property for the minified file extension currently in use

  2. enable web resource filtering in the maven-war-plugin configuration

  3. modify index.html so it uses the aforementioned maven property

Setup the maven property

In the main <properties> block, at the top of your pom.xml, add the following property:

<!-- By default, the original filename will be used -->

Now add a <properties> block to the minify profile:


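Sketched out, the two blocks could look as follows; the property name min.suffix is an assumption, the quickstart may use a different one:

```xml
<!-- top-level properties: empty suffix, so the original files are referenced -->
<properties>
    <min.suffix></min.suffix>
</properties>

<!-- properties inside the minify profile: switch to the .min variants -->
<properties>
    <min.suffix>.min</min.suffix>
</properties>
```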
Enable web resource filtering

Modify the default maven-war-plugin configuration to enable filtering on HTML files:

          <!-- Java EE 6 doesn't require web.xml, Maven needs to catch up! -->
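The resulting plugin section might look roughly like this (a sketch; the quickstart's actual configuration may differ slightly):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <configuration>
        <!-- Java EE 6 doesn't require web.xml -->
        <failOnMissingWebXml>false</failOnMissingWebXml>
        <webResources>
            <resource>
                <directory>src/main/webapp</directory>
                <!-- filter HTML files so ${...} placeholders get substituted -->
                <filtering>true</filtering>
                <includes>
                    <include>index.html</include>
                </includes>
            </resource>
        </webResources>
    </configuration>
</plugin>
```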

Use the maven property

In index.html, replace at least the references to css/m.screen.css, css/d.screen.css and js/app.js so that the minified-extension property is interpolated into each file name.
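With filtering in place, the resource declarations in index.html can interpolate the property; assuming it is named min.suffix, the references from the earlier snippet would become:

```
"mcss": "css/m.screen${min.suffix}.css",
"scss": "css/d.screen${min.suffix}.css"
```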
You can also apply similar changes to lodash and jquery if you want.

At this point, you should see that the filtered target/m2e-wtp/web-resources/index.html references the original resources, as the minified extension enabled by default is an empty string.

Switch between minified and regular profiles

Let’s see what happens when enabling the minify profile. Ctrl+Alt+P brings up the Maven Profile Management UI. Just check/uncheck profiles to enable/disable them:

profile selection

Once the minify profile is active, you’ll see that :

  • css/m.screen.min.css, css/d.screen.min.css and js/app.min.js are generated under target/m2e-wtp/web-resources/

  • target/m2e-wtp/web-resources/index.html now references the minified versions of the resources

minified resources

Deploy the application on WildFly

  • Right click on your project and select Run As > Run on Server …

  • Create a new WildFly server if necessary, pointing at your locally installed server

  • Once the server is created and deployed, a browser window should open :

deployed application1

If you’re on a desktop, modify the color of the h1 class in src/main/webapp/css/d.screen.css and save. This triggers wro4j-maven-plugin:run, which regenerates the minified version under target/m2e-wtp/web-resources/css/d.screen.min.css, which in turn is deployed to WildFly by the server adapter.

Reloading the page (after the build is complete) will show the changes directly :

deployed application2

Now you can switch back to the regular, non-minified version by hitting Ctrl+Alt+P in the workbench, unselecting the minify profile and waiting for the build to complete. After you reload your browser page, you’ll notice in the page source that the minified versions are no longer referenced.

The minified files still exist under target/m2e-wtp/web-resources/ and are deployed as such. They’re unused, so they’re harmless, but you’d need to perform a clean build to remove them, if necessary.


Dealing with optimized resources in Eclipse is made really easy with a combination of the right tools. Just focus on code, save your work, see the changes instantly. WRO4J and wro4j-maven-plugin can give you much more than what we just saw here (resource aggregation, obfuscation, less processing …). Hopefully you’ll want to give it a try after reading this article, and if you do, don’t forget to give us your feedback.

Issues with m2e-wro4j can be opened at https://github.com/jbosstools/m2e-wro4j/issues.

Issues with the Maven Profile Manager can be opened at:

As always, for all your WRO4J or wro4j-maven-plugin specific issues, I strongly encourage you to:

Have fun!

Next Beta for Luna

We are ready with JBoss Tools 4.2 Beta1 and Red Hat JBoss Developer Studio 8 Beta1.

jbosstools jbdevstudio blog header

This announcement is a bit special since it is on our new website, with a cleaner and more consistent structure.

The version-specific download is available from JBoss Tools 4.2.0.Beta1.

Then there is the Luna download page, which shows the latest stable, development and nightly downloads for Eclipse Luna - available from Luna releases.

From either of these you can get to the downloads, update sites and What’s New!


JBoss Developer Studio comes with everything pre-bundled in its installer. Simply download it from our JBoss Products page and run it like this:

java -jar jbdevstudio-<installername>.jar

JBoss Tools or Bring-Your-Own-Eclipse (BYOE) JBoss Developer Studio requires a bit more:

This release requires at least Eclipse 4.4 (Luna) M6 but we recommend using the Eclipse 4.4 JEE Bundle since then you get most of the dependencies preinstalled.

Once you have installed Eclipse, you can find us on the Eclipse Marketplace under "JBoss Tools (Luna)" or "JBoss Developer Studio (Luna)".

For JBoss Tools you can also use our update site directly if you are up for it.


Note: Integration Stack tooling will become available from JBoss Central at a later date.

What is new?

As always there is more than can be covered in a single blog but here are some of my favorites. You can see everything in the What’s New section for this release.

Refactored and improved Server Tools

The server adapters for JBoss AS, EAP and WildFly have been refactored to support…

  1. runtime-less servers (i.e. no need to have a server locally installed)

  2. deployment over the management API (i.e. no need for local or SSH-based access anymore, just the management API)

  3. start/stop scripts for deploy only servers

  4. Profiles for easier setup


These improvements required significant changes in the server adapter core API and UI. The meat of the code is still the same - just split out to be more reusable and composable.

We would really like to hear from you whether the UI is good, bad, better or worse, and in any case how we can make it even better. We’re obviously interested in knowing whether these features work for you.

You can see more in the Servers What’s New.

OpenShift Downloadable Cartridges

OpenShift has been offering support for custom-defined cartridges - now the OpenShift tools support this natively too.

This means you can create or use an existing cartridge by choosing the "Code Anything" option for either Applications or Cartridges.

Downloadable Cartridges

This allows you to, for example, use a cartridge like WildFly 8 or the recent OpenShift on JRebel experiment without leaving the IDE.

More support for multiple Cordova versions

The Cordova runtime support is now extended to also support locally downloaded runtimes, making it possible to use custom builds or distributions that are not directly available from Cordova repositories.


There are also various other fixes which can be viewed at Aerogear What’s New.

p.s. we are contributing these tools to eclipse.org under Project Thym.

The Cordova Simulator also now understands the notion of multiple Cordova runtimes, allowing you to test against multiple versions.


Arquillian XML Editor

The Arquillian tooling (for now only available in JBoss Tools) added a Sapphire based editor for the arquillian.xml file format.

For now it is a simple structured editor but we plan on using it for experimenting with Sapphire to provide better XML oriented editors.


This Arquillian editor was based on ideas and initial contribution from Masao Kunii working for Nippon Telegraph and Telephone Corporation (NTT) - thank you!

Forge 2 Goodies

Recent versions of Forge 1 and 2 are now distributed with the tools, and especially the latter brings an interesting feature.

Forge 2 connection profiles will now consider Eclipse Data Tools Platform (DTP) connections, meaning you no longer have to configure your database settings in multiple places.

Connection Profiles

This feature is called connection profiles in Forge 2.

Usage tracking

We have learned a lot from our last 3+ years of usage tracking and continue to be amazed by how many nationalities, countries and operating system distributions the tools are used in - it has been and continues to be very informative and helpful in guiding our tooling support.

In Beta1 we have gone a step further, learning not only how many people start JBoss Tools, but also which features are being used.

In Beta1 we now collect info about which server types are used and which JBoss Central installs are being done to be able to see how much and how often these features are used.

We will use that in the future to decide on new development and maintenance work - so if you love a feature in JBoss Tools, please say yes to usage tracking and use the feature, and we’ll notice.

Note: We are only collecting aggregated summaries of e.g. server types and installs - meaning we cannot use any of this info to identify you. It is truly anonymous usage statistics - we are not the NSA!

Next steps

While we wait for feedback on Beta1, we are already working on what will become Beta2. Some of the things that are moving here are:

  1. Looking at doing radical changes to how the visual page editor works since XULRunner is not maintained anymore and we need HTML5 support

  2. Getting better (read: much better) javascript content assist with help from tern.java

  3. Improve JavaEE 6 and JavaEE 7 support

  4. Full Java8 support

  5. …and more!

Hope you enjoy it and remember…

Have fun!

Max Rydahl Andersen

JBoss Tools Integration Stack 4.1.4.Final / JBoss Developer Studio Integration Stack 7.0.1.GA

Fuse Tooling goes Final in the Stack!

jbosstools jbdevstudio blog header

The Integration Stack for JBoss Tools Developer Studio is a set of plugins for Eclipse that provides tooling for the following frameworks:

  • BPEL Designer - Orchestrating your business processes.

  • BPMN2 Modeler - A graphical modeling tool which allows creation and editing of Business Process Modeling Notation diagrams using Graphiti.

  • Drools - A Business Logic integration Platform which provides a unified and integrated platform for Rules, Workflow and Event Processing.

  • JBoss ESB - An enterprise service bus for connecting enterprise applications and services.

  • Fuse Apache Camel Tooling - A graphical tool for integrating software components that works with Apache ServiceMix, Apache ActiveMQ, Apache Camel and the FuseSource distributions.

  • jBPM3 - A flexible Business Process Management (BPM) Suite - JBoss Enterprise SOA Platform 5.3.x compatible version.

  • ModeShape - A distributed, hierarchical, transactional and consistent data store with support for queries, full-text search, events, versioning, references, and flexible and dynamic schemas. It is very fast, highly available, extremely scalable, and it is 100% open source.

  • Savara (JBoss Tools only) - A tool for ensuring artifacts defined at different stages of the software development lifecycle are valid against each other, and remain valid through the evolution of the system.

  • SwitchYard - A lightweight service delivery framework providing full lifecycle support for developing, deploying, and managing service-oriented applications.

  • Teiid Designer - A visual tool that enables rapid, model-driven definition, integration, management and testing of data services without programming using the Teiid runtime framework.

All of these components have been verified to work with the same dependencies as JBoss Tools 4.1 and Developer Studio 7, so installation is easy.


To install the Integration Stack tools, first install JBoss Developer Studio from the all-in-one installer, bundled and configured out of the box with everything you need to get started. Alternatively, if you already have eclipse-jee-kepler installed, you can install JBoss Developer Studio or JBoss Tools from the Eclipse Marketplace via Help > Eclipse Marketplace…


Once Developer Studio is installed, restart Eclipse and select the Software/Update tab in the JBoss Central view and look for the JBoss Developer Studio Integration Stack installation section. Select the items you’d like to install:


If you want to try out Savara you will need to use the JBoss Tools Integration Stack URL instead:

Note: If you installed into your own Eclipse you should bump up the launch resource parameters:

--launcher.XXMaxPermSize 256m --launcher.appendVmargs -vmargs -Dosgi.requiredJavaVersion=1.6 -XX:MaxPermSize=256m -Xms512m -Xmx1024m

What’s New?

Fuse Apache Camel Tooling has gone .Final

The previous release of the JBDS Integration Stack had a candidate release of Fuse Apache Camel Tooling. The support is now complete and ready to run with the Fuse Active MQ 6.1 release.

Updated Components

Fix release versions of BPMN2 Modeler, SwitchYard and Teiid Designer are also available in this release.

The new JBoss Tools home is live

Don’t miss the new Features tab for up to date information on your favorite Integration Stack component: /features

April 17, 2014

EMFStore 1.2.0 released

We just promoted the 1.2.0 RC to become the EMFStore 1.2.0 release. EMFStore now includes extension points and API which allow you to more easily replace the storage technology on the client and server side, based on the EMF resource set abstraction. As a proof of concept we have implemented a MongoDB back-end based on Mongo-EMF.

To try EMFStore and get started please see the updated Getting Started with EMFStore tutorial. Enjoy!




April 16, 2014

Welcome Richard

I’m very happy to introduce the Eclipse Foundation’s newest staff member, Richard Burcher.

Richard will be helping me with “project stuff”. If you take a quick look at Richard’s blog or Twitter feed, you’ll notice that he’s got an affinity for location/geo-sorts of things in particular and open source software in general (along with an apparent disdain for capital letters). As part of that, Richard has a bunch of experience working with and growing communities. I intend to fully exploit that experience.

We’ve already begun Richard’s immersion into all things Eclipse. It’s been a fascinating and valuable experience for me to explain how things work around here. Assembling materials for the Eclipse Committer Bootcamp helped me illuminate potential areas for improvement in our process. Working with Richard over the last few days has had an even bigger effect. I’m excited that we’ve already identified potential ways that we can improve and streamline our implementation of the Eclipse Development Process, while honoring its fundamental underlying principles.

I’m excited by the potential for us to do more to help our open source projects evangelize and court their communities.

More on this later. But for now, please join me in warmly welcoming Richard aboard.

How to manage Git Submodules with JGit

For a larger project with Git you may find yourself wanting to share code among multiple repositories, whether it is a shared library between projects or perhaps templates and such used among multiple different products. The Git built-in answer to this problem is submodules. They allow putting a clone of another repository as a subdirectory within a parent repository (sometimes also referred to as the superproject). A submodule is a repository in its own right. You can commit, branch, rebase, etc. from inside it, just as with any other repository.

JGit offers an API that implements most of the Git submodule commands, and it is this API that I would like to introduce you to.

The Setup

The code snippets used throughout this article are written as learning tests [1]. Simple tests can help to understand how third-party code works and to adopt new APIs. They can be viewed as controlled experiments that allow you to discover exactly how the third-party code behaves.

A helpful side effect is that, if you keep the tests, they can help you to verify new releases of the third-party code. If your tests cover how you use the library, then incompatible changes in the third-party code will show themselves early on.

Back to the topic at hand: all tests share the same setup. See the full source code for details. There is an empty repository called parent. Next to it there is a library repository. The tests will add this as a submodule to the parent. The library repository has an initial commit with a file named readme.txt in it. A setUp method creates both repositories like so:

Git git = Git.init().setDirectory( "/tmp/path/to/repo" ).call();

The repositories are represented through the fields parent and library of type Git. This class wraps a repository and gives access to all commands available in JGit. As I explained here earlier, each command class corresponds to a native Git porcelain command. To invoke a command, the builder pattern is used. For example, the result of the Git.commit() method is actually a CommitCommand. After providing any necessary arguments you can invoke its call() method.

Add a Submodule

The first and obvious step is to add a submodule to an existing repository. Using the setup outlined above, the library repository should be added as a submodule in the modules/library directory of the parent repository.

public void testAddSubmodule() throws Exception {
  String uri 
    = library.getRepository().getDirectory().getCanonicalPath();
  SubmoduleAddCommand addCommand = parent.submoduleAdd();
  addCommand.setURI( uri );
  addCommand.setPath( "modules/library" );
  Repository repository = addCommand.call();
  File workDir = parent.getRepository().getWorkTree();
  File readme = new File( workDir, "modules/library/readme.txt" );
  File gitmodules = new File( workDir, ".gitmodules" );
  assertTrue( readme.isFile() );
  assertTrue( gitmodules.isFile() );
}

The two things the SubmoduleAddCommand needs to know are from where the submodule should be cloned and where it should be stored. The URI (shouldn’t it be called URL?) attribute denotes the location of the repository to clone from, as it would be given to the clone command. And the path attribute specifies in which directory – relative to the parent repository’s work directory root – the submodule should be placed. After the command was run, the work directory of the parent repository looks like this:

The library repository is placed in the modules/library directory and its work tree is checked out. call() returns a Repository object that you can use like a regular repository. This also means that you have to explicitly close the returned repository to avoid leaking file handles.

The image reveals that the SubmoduleAddCommand did one more thing. It created a .gitmodules file in the root of the parent repository work directory and added it to the index.

[submodule "modules/library"]
path = modules/library
url = git@example.com:path/to/lib.git

If you ever looked into a Git config file you will recognize the syntax. The file lists all the submodules that are referenced from this repository. For each submodule it stores the mapping between the repository’s URL and the local directory it was pulled into. Once this file is committed and pushed, everyone who clones the repository knows where to get the submodules from (more on that later).


Once we have added a submodule we may want to know that it is actually known by the parent repository. The first test did a naive check in that it verified that certain files and directories existed. But there is also an API to list the submodules of a repository. This is what the code below does:

public void testListSubmodules() throws Exception {
  Map<String,SubmoduleStatus> submodules 
    = parent.submoduleStatus().call();
  assertEquals( 1, submodules.size() );
  SubmoduleStatus status = submodules.get( "modules/library" );
  assertEquals( INITIALIZED, status.getType() );
}
The SubmoduleStatusCommand returns a map of all the submodules in the repository, where the key is the path to the submodule and the value is a SubmoduleStatus. With the above code we can verify that the just-added submodule is actually there and INITIALIZED. The command also allows adding one or more paths to limit the status reporting to them.

Speaking of status, JGit’s StatusCommand isn’t at the same level as native Git. Submodules are always treated as if the command was run with --ignore-submodules=dirty: changes to the work directory of submodules are ignored.

Updating a Submodule

Submodules always point to a specific commit of the repository that they represent. Someone who clones the parent repository sometime in the future will get the exact same submodule state, although the submodule may have new commits upstream.

In order to change the revision, you must explicitly update a submodule, as outlined here:

public void testUpdateSubmodule() throws Exception {
  ObjectId newHead = library.commit().setMessage( "msg" ).call();
  File workDir = parent.getRepository().getWorkTree();
  Git libModule = Git.open( new File( workDir, "modules/library" ) );
  libModule.pull().call();
  parent.add().addFilepattern( "modules/library" ).call();
  parent.commit().setMessage( "Update submodule" ).call();
  assertEquals( newHead, getSubmoduleHead( "modules/library" ) );
}

This rather lengthy snippet first commits something to the library repository and then pulls that commit into the library submodule of the parent repository.

To make the update permanent, the submodule change must be committed in the parent repository. The commit stores the updated commit-id of the submodule under its name (modules/library in this example). Finally you usually want to push the changes to make them available to others.

Updating Changes to Submodules in the Parent Repository

Fetching commits from upstream into the parent repository may also change the submodule configuration. The submodules themselves, however, are not updated automatically.

This is what the SubmoduleUpdateCommand solves. Using the command without further parametrization will update all registered submodules. The command will clone missing submodules and checkout the commit specified in the configuration. Like with other submodule commands, there is an addPath() method to only update submodules within the given paths.
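The native counterpart of what the SubmoduleUpdateCommand does is git submodule update --init. The following self-contained sketch (all repository names and paths are illustrative) shows the effect:

```shell
#!/bin/sh
set -e
dir=$(mktemp -d) && cd "$dir"

# a library repository with one file, and a parent that embeds it as submodule
git init -q library
echo "hello" > library/readme.txt
git -C library add readme.txt
git -C library -c user.email=t@t -c user.name=t commit -q -m "init"
git init -q parent
cd parent
git -c protocol.file.allow=always submodule add ../library modules/library
git -c user.email=t@t -c user.name=t commit -q -m "Add submodule"
cd ..

# a plain clone knows about the submodule but does not populate it ...
git clone -q parent clone
cd clone
test ! -f modules/library/readme.txt
# ... until the registered submodules are initialized and updated
git -c protocol.file.allow=always submodule update --init
test -f modules/library/readme.txt && echo "submodule populated"
```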

Cloning a Repository with Submodules

You have probably spotted the pattern by now: everything to do with submodules is manual labor. Cloning a repository that has a submodule configuration does not clone the submodules by default. But the CloneCommand has a cloneSubmodules attribute and setting this to true, well, also clones the configured submodules. Internally the SubmoduleInitCommand and SubmoduleUpdateCommand are executed recursively after the (parent) repository was cloned and its work directory was checked out.
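For comparison, here is the same flow with native Git: a self-contained sketch (repository names and paths are illustrative) that builds a throwaway parent/library pair and then clones it with and without submodules:

```shell
#!/bin/sh
set -e
dir=$(mktemp -d) && cd "$dir"

# a library repository with one file, and a parent that embeds it as submodule
git init -q library
echo "hello" > library/readme.txt
git -C library add readme.txt
git -C library -c user.email=t@t -c user.name=t commit -q -m "init"
git init -q parent
cd parent
git -c protocol.file.allow=always submodule add ../library modules/library
git -c user.email=t@t -c user.name=t commit -q -m "Add submodule"
cd ..

# a plain clone leaves the submodule directory empty ...
git clone -q parent plain
test ! -f plain/modules/library/readme.txt && echo "plain clone: submodule not checked out"

# ... while --recurse-submodules also clones and checks out the submodule
git -c protocol.file.allow=always clone -q --recurse-submodules parent full
test -f full/modules/library/readme.txt && echo "recursive clone: submodule checked out"
```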

Removing a Submodule

To remove a submodule you would expect to write something like
git.submoduleRm().setPath( ... ).call();
Unfortunately, neither native Git nor JGit has a built-in command to remove submodules. Hopefully this will be resolved in the future. Until then we must manually remove submodules. If you scroll down to the removeSubmodule() method you will see that it is no rocket science.

First, the respective submodule section is removed from the .gitmodules and .git/config files. Then the submodule entry in the index is also removed. Finally the changes – .gitmodules and the removed submodule in the index – are committed and the submodule content is deleted from the work directory.

For-Each Submodule

Native Git offers the git submodule foreach command to execute a shell command for each submodule. While JGit doesn’t exactly support such a command, it offers the SubmoduleWalk. This class can be used to iterate over the submodules in a repository. The following example fetches upstream commits for all submodules.

public void testSubmoduleWalk() throws Exception {
  int submoduleCount = 0;
  Repository parentRepository = parent.getRepository();
  SubmoduleWalk walk = SubmoduleWalk.forIndex( parentRepository );
  while( walk.next() ) {
    Repository submoduleRepository = walk.getRepository();
    Git.wrap( submoduleRepository ).fetch().call();
    submoduleRepository.close();
    submoduleCount++;
  }
  walk.release();
  assertEquals( 1, submoduleCount );
}

With next() the walk can be advanced to the next submodule. The method returns false if there are no more submodules. When done with a SubmoduleWalk, its allocated resources should be freed by calling release(). Again, if you obtain a Repository instance for a submodule do not forget to close it.

The SubmoduleWalk can also be used to gather detailed information about submodules.
Most of its getters relate to properties of the current submodule like path, head, remote URL, etc.

Sync Remote URLs

We have seen before that submodule configurations are stored in the .gitmodules file at the root of the repository work directory. Well, at least the remote URL can be overridden in .git/config. And then there is the config file of the submodule itself. This in turn can have yet another remote URL. The SubmoduleSyncCommand can be used to reset all remote URLs to the settings in .gitmodules.
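A quick native-Git illustration of what sync does; the overriding URL /somewhere/else is made up, and the repository paths are illustrative:

```shell
#!/bin/sh
set -e
dir=$(mktemp -d) && cd "$dir"

# a parent repository with one registered submodule
git init -q library
git -C library -c user.email=t@t -c user.name=t commit -q --allow-empty -m "init"
git init -q parent
cd parent
git -c protocol.file.allow=always submodule add ../library modules/library
git -c user.email=t@t -c user.name=t commit -q -m "Add submodule"

# override the submodule URL locally in .git/config ...
git config submodule.modules/library.url /somewhere/else
# ... then reset it from the committed .gitmodules settings
git submodule sync
git config submodule.modules/library.url
```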

As you can see, the support for submodules in JGit is almost on par with native Git. Most of its commands are implemented or can be emulated with little effort. And if you find that something is not working or missing you can always ask the friendly and helpful JGit community for assistance.

  1. The term is taken from the section on ‘Exploring and Learning Boundaries’ in Clean Code by Robert C. Martin
