As Docker gathers more interest from the big IT companies, it looks like a perfect way to
deliver applications to the production environment. If until now the Maven artifact was the
jar or the war, it now seems natural for the artifact to be the Docker image. In fact this strategy
is not new: Netflix was, and still is, delivering its applications as AMI images tagged with
the version and other build information.
With Docker this approach becomes easier, so that everyone can adopt it without the big Netflix
infrastructure. The question now is: how do we make our build generate the Docker image for us? If
we are using Maven, the Spotify docker-maven-plugin
comes in handy here. It is able to build an image, tag it and push it to a public or private registry.
It provides three goals: build, tag and push.
It is also possible to push from the tag or the build goal, and to tag from
the build goal. Under the hood it uses the Spotify docker-client library, backed by the Jersey client.
Let's take a look at how to apply this plugin to the Maven lifecycle. The phases we need are three:
package, install and deploy.
First of all, we have to add the plugin block under our build/plugins block:
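A minimal sketch of the plugin declaration (using version 0.1.2, the release mentioned later in this post):

```xml
<plugin>
  <groupId>com.spotify</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <version>0.1.2</version>
</plugin>
```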
Similarly to what happens for the jar, we would like our artifact to be built as part of the package
phase, so let's bind an execution to it. Assuming that the build generates a packaged
application myapp.tar.gz and that our Dockerfile is in src/main/docker:
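A hedged sketch of what this execution might look like (myuser/myapp, the paths and the tar.gz name are placeholders taken from the example above):

```xml
<execution>
  <id>build-image</id>
  <phase>package</phase>
  <goals>
    <goal>build</goal>
  </goals>
  <configuration>
    <imageName>myuser/myapp</imageName>
    <dockerDirectory>${project.basedir}/src/main/docker</dockerDirectory>
    <resources>
      <resource>
        <targetPath>/</targetPath>
        <directory>${project.build.directory}</directory>
        <include>myapp.tar.gz</include>
      </resource>
    </resources>
  </configuration>
</execution>
```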
The resources block can be used to add all the resources we need in our Docker image. There is
also an alternative way to build the image, using XML to define the Docker commands, but I
prefer using an external Dockerfile, as it looks more familiar to people accustomed to Docker.
At the end of the package phase we will have the myuser/myapp:latest image built on Docker.
However, we want our image tagged with the project version, don't we? We could achieve it in the
same build goal, but in my opinion the install phase is a better fit for that.
Before the install phase, we can actually use the generated image for our functional or
integration tests, maybe using the Spotify docker-client library. What does the install phase
look like? Here it is:
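A sketch of the tag execution bound to the install phase (the image names follow the example above):

```xml
<execution>
  <id>tag-image</id>
  <phase>install</phase>
  <goals>
    <goal>tag</goal>
  </goals>
  <configuration>
    <image>myuser/myapp</image>
    <newName>myuser/myapp:${project.version}</newName>
  </configuration>
</execution>
```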
The image parameter should match the imageName parameter of the build goal; if the tag
is omitted, latest is assumed.
At the end of the install phase, we will have our image built and tagged in the Docker host.
Now we can decide whether we want to push it to a remote Docker registry. To push to a remote
registry we actually need to rename the image, because we need to add the registry namespace
in front of it. So we are going to reuse the tag goal again:
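A sketch of this last step, bound to the deploy phase; registry.example.com is a placeholder for your registry host:

```xml
<execution>
  <id>push-image</id>
  <phase>deploy</phase>
  <goals>
    <goal>tag</goal>
  </goals>
  <configuration>
    <image>myuser/myapp:${project.version}</image>
    <newName>registry.example.com/myuser/myapp:${project.version}</newName>
    <pushImage>true</pushImage>
  </configuration>
</execution>
```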
Since version 0.1.2, released today, it is possible to specify the remote registry
credentials. At the moment this can be done only through Maven properties, but they are
flexible enough to let you use, for example, system environment variables on your CI server.
I must say Spotify has done a great job delivering this plugin, in perfect accordance
with the open source spirit.
This is a solution I wrote because a colleague of mine was struggling to write
code that collects one element from each list in an ordered way, while
accepting lists of different sizes.
The problem was collecting 50 images from a third-party supplier. Each record
has a set of albums, each an ordered list of images, and a set of featured images picked
from the albums. We want to collect 50 images, starting from the featured ones and then
going through the albums in a round-robin fashion (the first image of each album, then the
second, and so on). The albums may have different sizes.
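A minimal sketch of this logic in Java; the names, the String image type and the choice to deduplicate featured images when they reappear inside an album are my assumptions, not the original code:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class ImageCollector {

    // Collect up to `limit` images: the featured ones first, then a
    // round-robin pass over the albums (the first image of each album,
    // then the second, and so on). Albums may have different sizes:
    // an exhausted album is simply skipped in later rounds. A featured
    // image is not collected twice when met again inside an album.
    static List<String> collect(List<String> featured,
                                List<List<String>> albums,
                                int limit) {
        List<String> result = new ArrayList<>();
        Set<String> seen = new LinkedHashSet<>();
        for (String img : featured) {
            if (result.size() == limit) return result;
            if (seen.add(img)) result.add(img);
        }
        int round = 0;
        boolean anyLeft = true;
        while (anyLeft && result.size() < limit) {
            anyLeft = false;
            for (List<String> album : albums) {
                if (round >= album.size()) continue; // this album is exhausted
                anyLeft = true;
                String img = album.get(round);
                if (result.size() < limit && seen.add(img)) result.add(img);
            }
            round++;
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> images = collect(
                Arrays.asList("f1", "f2"),
                Arrays.asList(
                        Arrays.asList("f1", "a1", "a2", "a3"),
                        Arrays.asList("b1", "b2")),
                5);
        System.out.println(images); // [f1, f2, b1, a1, b2]
    }
}
```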
I started to develop a web crawler as part of a bigger project, and I had to
choose which HTML parser library to use. I had used NekoHTML
in the past and it was pretty good, but it doesn't have any helpers to select
DOM elements: you have to use XPath, which is very flexible but not so easy.
I found JSoup to be a very cool library: its code is
well written and clean, and the interface is powerful. I love it. But I was writing
a Scala crawler, and although the JSoup interface is pretty
cool, it is very Java-ish; I preferred a better integration with Scala, so
I wrote my first Pimp My Library pattern.
The code has been uploaded to the SSoup repository on GitHub.
The Google Web Toolkit (GWT) allows you to write plain Java code and have it translated
to JavaScript for the client side. This promise is amazing, but it isn't the whole truth: using
GWT without being aware of the transformation can produce poor performance and,
worse, an unmaintainable project, with disastrous results for your business.
I would like to spend some time writing a post about the best practices I
have learnt using GWT and the common pitfalls to avoid.
When I started to use GWT, the first problem I encountered was the
development of the RPC services:
They need two interfaces.
They must be implemented by a Java Servlet.
The serialization of objects is difficult to manage.
Their interfaces should not follow the common Java best practices.
The 1st point is solvable using the maven-gwt-plugin: it will generate the
Async interface as well as the Servlet mapping in the web.xml descriptor
(it doesn't work well with generics, actually).
The drawback of the 2nd point is that, if you are using Spring, the Servlets
are instantiated outside the Spring context, so you cannot apply any aspect to them
(e.g. they can't be transactional). If you want the GWT service to be managed
by the Spring context, you need to create two instances of the same interface:
A Spring bean
A Servlet that delegates all method implementations to the Spring bean.
A possible implementation is the following: if we need to expose a GWT service, we
write a Spring bean that implements our service interface. Then we write a Servlet
that extends RemoteServiceServlet and implements the same interface, delegating
all method implementations to the Spring bean.
This is an example for such an interface:
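A sketch of the pattern with a hypothetical GreetingService; the GWT types are stubbed here so the example is self-contained, while in a real project the Servlet extends com.google.gwt.user.server.rpc.RemoteServiceServlet and the interface extends com.google.gwt.user.client.rpc.RemoteService:

```java
// Stand-ins for the real GWT types, so this sketch compiles on its own.
interface RemoteService {}
class RemoteServiceServlet {}

// The GWT service interface (GreetingService is a hypothetical example).
interface GreetingService extends RemoteService {
    String greet(String name);
}

// The Spring bean with the real implementation: aspects, transactions
// and the rest of the Spring machinery apply here.
class GreetingServiceBean implements GreetingService {
    public String greet(String name) { return "Hello, " + name; }
}

// The Servlet exposed to GWT: it only delegates to the Spring bean.
class GreetingServiceServlet extends RemoteServiceServlet implements GreetingService {
    private final GreetingService delegate;

    GreetingServiceServlet(GreetingService delegate) { this.delegate = delegate; }

    public String greet(String name) { return delegate.greet(name); }
}

public class GwtDelegationDemo {
    public static void main(String[] args) {
        GreetingService service = new GreetingServiceServlet(new GreetingServiceBean());
        System.out.println(service.greet("Bob")); // Hello, Bob
    }
}
```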
Writing this code for each service is a very boring job of course, but your IDE
can do it for you.
As you may notice, we expose the concrete type
instead of the interface. This is about the 4th point: exposing an interface makes
GWT create one snippet of code for each possible implementation of it. This
would generate an oversized codebase and a longer compilation time. Indeed it is against
any common Java best practice, but you have to keep in mind that GWT is not Java.
About the serialization of objects:
They must implement Serializable or IsSerializable.
They must have an empty default constructor.
All transient or final (so crazy!) fields will be ignored.
All dependent objects must follow the same rules.
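For example, a minimal DTO following these rules might look like this (UserDto is a hypothetical name):

```java
import java.io.Serializable;
import java.util.ArrayList;

// A hypothetical DTO following the rules above: it implements Serializable,
// has an empty default constructor, avoids final and transient fields, and
// only uses field types (String, ArrayList) that GWT can emulate.
public class UserDto implements Serializable {
    private String name;                                 // not final, not transient
    private ArrayList<String> roles = new ArrayList<>(); // concrete serializable type

    public UserDto() {}                                  // empty constructor required by GWT

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public ArrayList<String> getRoles() { return roles; }
}
```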
Of course GWT must know the source code of the type of each non-transient, non-final
dependent field. The trick is to put the sources as a dependency of the Maven module containing
the GWT plugin. However this is not enough, because not all standard Java classes are supported
(no Calendar, for example), so even if you include a library's source code, it is unlikely to be
supported. Some libraries offer a separate module for GWT support, as Google Guava does, but they
are not so common. Alternatively, you can write the serialization for a particular class yourself;
however, it is not so trivial.
As we have seen, there is no simple solution for the 3rd point: having a serializable object is
complex and unlikely to integrate well with our domain. If, for example, we are using Hibernate as
ORM, serializing a Hibernate entity is the worst thing you can do: in the worst case it will
serialize your whole DB. In many cases you also have to deep copy the object, because GWT is not
able to serialize the Hibernate custom collections, and usually you don't need them on the client side anyway.
In the end, using GWT RPC requires writing a GWT service layer with its own domain
objects. They will likely contain already-preprocessed data, like formatted dates
(no Joda-Time on GWT, nor Calendar). At the end of the day, the promise of using
your domain objects on the client side is a lie.
In the next post I will describe a way to use a JSON REST api with GWT.