OpenShift, JDK 8, Gradle 2.3, Working Together


Recently I rolled out a JDK 8, Gradle 2.3-built service in Heroku’s free tier (see Continuous Delivery, Java to Heroku Via Travis CI). It worked well enough, but I noticed that its response time wasn’t great; in particular, it seemed to need to spin back up after a period of inactivity. So I decided to compare another PaaS’s performance and chose the OpenShift free tier, pitting one Heroku dyno against one OpenShift gear. What I found was that OpenShift was generally more performant and didn’t exhibit the spin-up issue.

Easier Said Than Done

As it turned out, the OpenShift deployment process is a bit more involved than Heroku’s. Both work via a git push coupled with configuration, so they are conceptually similar, but Heroku uses your git repo and a single YAML file, whereas OpenShift expects to work from its own git infrastructure and an entire file tree of configuration scripts. I got it all sorted, but it meant that re-homing the service wasn’t simply plug and play.


The characteristics of a project, Java 8, Gradle, Tomcat, etc., are supported in OpenShift by adding what they refer to as cartridges. There’s a wide variety of cartridges out there, but I couldn’t find any for the key facets of my project:

  • Java 8
  • Gradle 2.3
  • Embedded Jetty server

Solution: Do It Yourself

OpenShift does have a DIY cartridge, which does little more than provide a rough template of an application. So I went with that. I created a DIY app, and then scrubbed the template cleaner still. This left me with an “app” that was an empty container, but with the .openshift control script templates in place. I deployed that app and OpenShift spun up my gear, which didn’t do anything. However, OpenShift does allow you to SSH to your gear, so I did that and poked around.

Java 8

The first thing I confirmed was that, as someone had posted elsewhere, Java 8 is available in every gear even though it isn’t the default. Sure enough it was, so I knew that by setting JAVA_HOME and adding its bin directory to my PATH I’d be good.

Gradle 2.3

Gradle itself provides sandboxing: running gradle wrapper creates a gradlew script and its supporting files, so all I needed to do was see if I could get that sandbox working on OpenShift.


Some searching turned up that you can run any app, and as long as it binds to ${OPENSHIFT_DIY_IP}:8080 it will work. So now it was just a matter of configuring my embedded Jetty instance accordingly.

Pulling It All Together

With those discoveries, here’s how I pulled it all together.

The OpenShift Basics

First, get the rhc tools installed. A warning as we go forward: OpenShift will try to deploy your app on every push, so pushes may appear to hang as we check in partial solutions that OpenShift deploys and fails. Wait them out – it’s fine. Let’s start on the setup:

rhc app create example diy-0.1 -l you@example.com
cd example
git rm -r diy misc
touch .openshift/action_hooks/build
chmod a+x .openshift/action_hooks/build
git add -f .openshift/action_hooks/build
git commit -m "Cleaning up"
git push

Note: with the -l flag in the create command, use the email you registered at OpenShift. OK, now you’ve got the basic control skeleton; let’s flesh it out a bit, starting with the control scripts. To get the gradle build ready to go, edit .openshift/action_hooks/build to look as follows:


#!/bin/bash
export JAVA_HOME=/etc/alternatives/java_sdk_1.8.0
export PATH=${JAVA_HOME}/bin:${PATH}
export GRADLE_USER_HOME=${OPENSHIFT_DATA_DIR}gradle

./gradlew -q stage

The GRADLE_USER_HOME variable tells gradle where to write its files; you need this because the OpenShift environment locks down all but a specific writable area (OPENSHIFT_DATA_DIR). The stage task we will define in the gradle build file later. Here’s a basic .openshift/action_hooks/start example:


#!/bin/bash
export JAVA_HOME=/etc/alternatives/java_sdk_1.8.0
export PATH=${JAVA_HOME}/bin:${PATH}

java -cp build/staging:build/staging/* your.Main ${OPENSHIFT_DIY_IP} 8080 |& /usr/bin/logshifter &

The final line has a couple of noteworthy aspects. First, the build/staging path is the result of the gradle stage task previously mentioned; again, I’ll cover that below. Second, ${OPENSHIFT_DIY_IP} and 8080 are the address and port your app must bind to for your service to run successfully in OpenShift, so I’m passing them as arguments since your Java will need them.

Finally, let’s deal with .openshift/action_hooks/stop to stop your service:


#!/bin/bash
if [[ -n `ps -A | grep java` ]]; then
	pkill -SIGTERM java
fi
exit 0

The Gradle Basics

At this point you want to get going with gradle. For this example, to get a basic skeleton, I’m going to let gradle do the work. I have gradle 2.3 installed locally, so:

gradle init --type java-library
echo build >> .gitignore
git add .
git commit -m "Gradle skeleton"
git push

This creates a basic build file, the gradlew wrapper that will pull in gradle 2.3 as needed, and a template src folder.

And the last bit of special sauce, add this task to your build.gradle:

task stage(type: Copy) {
    from sourceSets.main.runtimeClasspath
    into 'build/staging'
}

stage.dependsOn build

What this does is pull your generated classes and all your jar dependencies together into one directory, giving OpenShift a single directory from which to run your app.

Let’s test this locally:

./gradlew stage
ls -1  build/staging

Good, so gradlew is able to build and stage the code. Git commit and push.

The Rest Is Up to You

At this point we have:

  1. App in OpenShift
  2. Basic control scripts
  3. Gradle skeleton
  4. All using JDK 8 & Gradle 2.3

What’s left? Well, the sample Java code is not a web service and does not bind to the OpenShift ${OPENSHIFT_DIY_IP}:8080 convention, so it doesn’t run. Change src/main/java to be a proper web service that binds to the right address and port, and on your very next push your app will be live!
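To make that last step concrete, here’s a minimal sketch of such a service. It uses the JDK’s built-in com.sun.net.httpserver (so the snippet stays dependency-free) rather than the embedded Jetty the project actually used, and the class name matches the your.Main placeholder from the start hook; adapt both to your real project.

```java
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// A minimal sketch of a service that honors the OpenShift convention:
// bind to the address and port the start hook passes in. The JDK's
// built-in HttpServer stands in for embedded Jetty so the example is
// dependency-free; the class name is a placeholder.
public class Main {

    // Start an HTTP server on the given address/port, answering every
    // request with a plain "OK".
    static HttpServer start(String host, int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(host, port), 0);
        server.createContext("/", exchange -> {
            byte[] body = "OK".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws IOException {
        // args[0] is ${OPENSHIFT_DIY_IP} and args[1] is 8080, per the start hook.
        start(args[0], Integer.parseInt(args[1]));
    }
}
```

Run locally with java -cp build/staging:build/staging/* Main 127.0.0.1 8080 to mimic what the start hook does on the gear.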


Continuous Delivery, Java to Heroku via Travis CI

I’ve been using the Heroku app Maven-Badges to show the latest release of my projects for some time. I like it, but as I started to use JCenter and Gradle Plugins to release projects I went looking for similar tools for those. I didn’t find anything that was a good fit. Maven-Badges is written in Ruby, and my Ruby skills are basic, so offering to enhance it didn’t seem like the way to go. So, having never used Heroku, I took it as an opportunity to learn something new and decided to write one.


First I laid out my objectives:

  • Keep it simple and lightweight
  • Make it extensible and easy to maintain
  • Initially support: Maven Central, JCenter, Gradle Plugins
  • Try out one of the micro RESTful frameworks for Java
  • Deploy it to Heroku

How It All Came Together in a Couple of Days

After a quick first day’s iteration, I landed on a tool chain I am pleased and impressed with.

  • Java 8.
  • JUnit and Mockito for testing.
  • Spark Framework for the RESTful server.
  • Gradle as the build tool.
  • GitHub as the source repository.
  • Travis CI for continuous integration and deployment.
  • Jacoco and Coveralls for code coverage.
  • Heroku for app hosting.

So basically, I coded a really simple little service in Java and Spark, using Gradle to build it. The source is on GitHub. When I push to GitHub, Travis picks it up, builds it, runs my tests, and has Coveralls deal with the code coverage. If the tests and coverage are clean, Travis deploys a release to Heroku. None of this is original; all I did was cobble together various clear examples. No muss. No fuss. All automated. Pretty damn slick.


I’m not going to walk through many of the details, because as I said, mostly I just cobbled together other examples. You can look at my results in GitHub or target the repo-redirect app directly.

About the only bit of unique work I ended up doing was figuring out how to implement the gradle stage task that Heroku uses to deploy a gradle-based project. I took a dead simple approach, adding the following to my build.gradle:

task stage(type: Copy) {
    from sourceSets.main.runtimeClasspath
    into 'build/dependency'
}

stage.dependsOn build

What that did was have gradle stage first build the project and then collect all the dependencies into build/dependency. With that done, Heroku’s Procfile could read:

web: java -cp build/dependency:build/dependency/* com.github.nwillc.shields.Shields --port $PORT
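The --port handling in that entry point is just simple argument parsing. Here’s a sketch of what it might look like; the PortFlag class name and the 4567 fallback (Spark’s default port) are my illustrations, not the actual Shields code.

```java
// Sketch of picking up the --port flag Heroku passes via the Procfile.
// Class name and fallback are illustrative; the real entry point is
// com.github.nwillc.shields.Shields, which hands the port to Spark.
public class PortFlag {

    // Return the value following "--port", or the fallback if absent.
    static int port(String[] args, int fallback) {
        for (int i = 0; i < args.length - 1; i++) {
            if ("--port".equals(args[i])) {
                return Integer.parseInt(args[i + 1]);
            }
        }
        return fallback;
    }

    public static void main(String[] args) {
        // 4567 is Spark's default port, used here as a local fallback
        // so the app still runs outside Heroku.
        System.out.println("port = " + port(args, 4567));
    }
}
```

With this, `heroku local` or a plain local run still works, while Heroku’s dynamically assigned $PORT wins in production.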

All the other info you’ll need can be found as follows: