An introduction to Spark, your next REST Framework for Java

This is a post I wrote for the Java Advent. It was initially published here.

Today we’re going to look into a refreshing, simple, nice and pragmatic framework for writing REST applications in Java. It will be so simple, it won’t even seem like Java at all.

We’re going to look into the Spark web framework. No, it’s not related to Apache Spark. Yes, it’s unfortunate that they share the same name.

I think the best way to understand this framework is to build a simple application, so we’ll build a simple service to perform mathematical operations.

We could use it like this:

[Screenshot: the service answering a GET request for /10/add/8]

Note that the service is running on localhost at port 4567 and the resource requested is “/10/add/8”.

Set up the Project Using Gradle (what’s Gradle?)

Now we can run:

  • ./gradlew idea to generate an IntelliJ IDEA project
  • ./gradlew test to run tests
  • ./gradlew assemble to build the project
  • ./gradlew launch to start our service
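The build file itself would look something along these lines (a sketch, not the post's actual build.gradle: the dependency versions are illustrative, and the launch task body and main class name are assumptions made to match the `./gradlew launch` command above):

```groovy
apply plugin: 'java'
apply plugin: 'idea'

repositories {
    mavenCentral()
}

dependencies {
    // versions are illustrative, not taken from the post
    compile 'com.sparkjava:spark-core:2.3'
    compile 'io.javaslang:javaslang:2.0.2'
    testCompile 'junit:junit:4.12'
}

// Custom task backing "./gradlew launch" (task name from the post, body assumed)
task launch(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    main = 'Calculator'  // hypothetical main class
}
```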

Great. Now, Let’s Meet Spark

Do you think we can write a fully functional web service that performs basic mathematical operations in less than 25 lines of Java code? No way? Well, think again:

In our main method we just say that when we get a request which contains three parts (separated by slashes) we should use the Calculator route, which is our only route. A route in Spark is the unit which takes a request, processes it, and produces a response.
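The idea can be sketched like this (an illustration, not the post's exact code: operation names other than "add" are assumptions, and the JDK's LongBinaryOperator stands in for Javaslang's Function2 so the block stays dependency-free; the Spark wiring is shown in comments):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.LongBinaryOperator;

public class Calculator {

    // One entry per supported operation, keyed by the "operatorName" path parameter.
    private static final Map<String, LongBinaryOperator> OPERATIONS = new HashMap<>();
    static {
        OPERATIONS.put("add", (a, b) -> a + b);
        OPERATIONS.put("subtract", (a, b) -> a - b);
        OPERATIONS.put("multiply", (a, b) -> a * b);
        OPERATIONS.put("divide", (a, b) -> a / b);
    }

    public static long calculate(String operatorName, long left, long right) {
        LongBinaryOperator operation = OPERATIONS.get(operatorName);
        if (operation == null) {
            throw new IllegalArgumentException("Unknown operation: " + operatorName);
        }
        return operation.applyAsLong(left, right);
    }

    // Spark wiring (sketch). In the real service the route is registered in main:
    //
    //   import static spark.Spark.get;
    //
    //   get("/:left/:operatorName/:right", (req, res) ->
    //       calculate(req.params(":operatorName"),
    //                 Long.parseLong(req.params(":left")),
    //                 Long.parseLong(req.params(":right"))));
}
```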

Our calculator is where the magic happens. It looks in the request for the parameters “left”, “operatorName” and “right”. Left and right are parsed as long values, while the operatorName is used to find the operation. For each operation we have a function (a Function2&lt;Long, Long, Long&gt;) which we then apply to our values (left and right). Cool, eh?

Function2 is an interface which comes from the Javaslang project.

You can now start the service (./gradlew launch, remember?) and play around.

The last time I checked, Java was more verbose, redundant, slow… well, it is healing now.

Ok, but what about tests?

So Java can actually be quite concise, and as a Software Engineer I celebrate that for a minute or two, but shortly after I start to feel uneasy… this stuff has no tests! Worse than that, it doesn’t look testable at all. The logic is in our calculator class, but it takes a Request and produces a Response. I don’t want to instantiate a Request just to check if my Calculator works as intended. Let’s refactor a little:

We just separate the plumbing (taking the values out of the request) from the logic, which moves into its own method: calculate. Now we can test calculate.
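To illustrate, here is a dependency-free sketch of the extracted method together with a plain-assertion test (the post's real tests run via ./gradlew test and presumably use JUnit; the names here are assumptions):

```java
public class CalculateTest {

    // Stand-in for the extracted method: pure logic, no Request/Response in sight.
    static long calculate(String operatorName, long left, long right) {
        switch (operatorName) {
            case "add":      return left + right;
            case "subtract": return left - right;
            case "multiply": return left * right;
            case "divide":   return left / right;
            default: throw new IllegalArgumentException("Unknown operation: " + operatorName);
        }
    }

    public static void main(String[] args) {
        // No Request objects needed: the logic is tested directly.
        if (calculate("add", 10, 8) != 18) throw new AssertionError("add");
        if (calculate("divide", 10, 2) != 5) throw new AssertionError("divide");

        // Division by zero still escapes as an ArithmeticException,
        // which is handled later in the post.
        boolean threw = false;
        try {
            calculate("divide", 1, 0);
        } catch (ArithmeticException expected) {
            threw = true;
        }
        if (!threw) throw new AssertionError("expected ArithmeticException");
    }
}
```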

I feel better now: our tests prove that this stuff works. Sure, it will throw an exception if we try to divide by zero, but that’s how it is.

What does that mean for the user, though?

[Screenshot: the server answering with a 500 Internal Server Error]

It means this: a 500. And what happens if the user tries to use an operation which does not exist?

[Screenshot: the response for a non-existent operation]

What if the values are not proper numbers?

[Screenshot: the response for values that are not numbers]

Ok, this doesn’t seem very professional. Let’s fix it.

Error handling, functional style

To fix two of the cases we just have to use one feature of Spark: we can map specific exceptions to specific handlers. Our handlers will produce a meaningful HTTP status code and a proper message.
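The mapping itself can be sketched as a pure function (the concrete exception-to-status mapping is an assumption based on the codes quoted at the end of the post: 400 for bad inputs, 404 for unknown operations or values); the Spark registration appears in comments:

```java
public class ErrorMapping {

    // Assumed mapping: unparseable values -> 404, other bad input -> 400.
    static int statusFor(Exception e) {
        if (e instanceof NumberFormatException) return 404;    // "8x" is not a number
        if (e instanceof IllegalArgumentException) return 400; // otherwise malformed input
        return 500;                                            // anything unexpected
    }

    // Spark wiring (sketch): Spark lets you register a handler per exception type.
    //
    //   import static spark.Spark.exception;
    //
    //   exception(NumberFormatException.class, (e, request, response) -> {
    //       response.status(404);
    //       response.body("Invalid part in the URL: " + e.getMessage());
    //   });
}
```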

We still have to handle the case of a non-existent operation, and this is something we are going to do in ReallyTestableCalculator.

To do so we’ll use a typical functional pattern: we’ll return an Either. An Either is a container which holds either a left or a right value. The left typically carries some sort of information about an error, like an error code or an error message. If nothing goes wrong the Either contains a right value, which could be all sorts of things. In our case we will return an Error (a class we defined) if the operation cannot be executed; otherwise we will return the result of the operation as a Long. So we will return an Either&lt;Error, Long&gt;.
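A sketch of this approach (Javaslang provides the real Either; a minimal stand-in is defined inline here so the block stays dependency-free, and the Error class is just a message wrapper, since the post only says it is “a class we defined”):

```java
public class SafeCalculator {

    // Minimal stand-in for Javaslang's Either, just enough for the sketch
    // (the real project would use javaslang.control.Either).
    static final class Either<L, R> {
        private final L left;
        private final R right;
        private Either(L left, R right) { this.left = left; this.right = right; }
        static <L, R> Either<L, R> left(L value)  { return new Either<>(value, null); }
        static <L, R> Either<L, R> right(R value) { return new Either<>(null, value); }
        boolean isRight() { return right != null; }
        L getLeft() { return left; }
        R get()     { return right; }
    }

    // "A class we defined": here just a message wrapper.
    static final class Error {
        final String message;
        Error(String message) { this.message = message; }
    }

    // Unknown operations no longer throw: they come back as a left value.
    static Either<Error, Long> calculate(String operatorName, long left, long right) {
        switch (operatorName) {
            case "add":      return Either.right(left + right);
            case "subtract": return Either.right(left - right);
            case "multiply": return Either.right(left * right);
            case "divide":   return Either.right(left / right);
            default:         return Either.left(new Error("Unknown operation: " + operatorName));
        }
    }
}
```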

Let’s test this:

The result

We ended up with a service that can be easily tested. It performs mathematical operations. It supports the four basic operations, but it could easily be extended to support more. Errors are handled and the appropriate HTTP status codes are used: 400 for bad inputs and 404 for unknown operations or values.


When I first saw Java 8 I was happy about the new features, but not very excited. However, after a few months I am seeing new frameworks come up which are based on these new features and have the potential to really change how we program in Java. Stuff like Spark and Javaslang is making the difference. I think that now Java can remain simple and solid while becoming much more agile and productive.

You can find many more tutorials on the Spark tutorials website.

Interview with David Åse from the Spark web framework project

I think that there are a lot of people looking for ways to get involved in Open-Source projects. I thought I could help by collecting a few stories from people who already started giving back to the community. A few weeks ago I talked with Luca Barbato and today I am going to talk with David Åse.

How David and I met

Recently I started using the Spark web framework, and I wrote a tutorial on it: Getting started with Spark: it is possible to create lightweight RESTful applications also in Java. David saw that post and contacted me. After a few emails, we decided to work together on a series of tutorials for Spark to be published on sparktutorials. While talking with David I learned more about his role in the Spark project and I thought it would be interesting to share.

So let’s get started with the questions:

Hi David, tell us a bit about yourself

Hi! My name is David. I work as a Software Engineer in the UX/UI division of a global telecommunications company, where I’m allowed to do things like create Lemmings-based analytics visualizations, or build a device lab made from LEGOs. When I’m not playing with Lemmings or LEGOs, I do design and web programming with a strong focus on delivering high performance services (~1 second perceived load time for GPRS connections). I hold a Master’s Degree in Computer Science from the Norwegian University of Science and Technology, but I studied music in high school and my parents are both artists.

Is Spark the first open-source project you got involved in?

The first serious one, yes. My master’s thesis was an open source project which someone else took over, and I created some free mIRC scripts when I was a kid, but Spark is the first project I’ve worked on that’s being used by thousands of people every day.

How did you find out about Spark?

I was looking for a simple Java framework to set up a prototype at work. I had previously worked with Spring, JAX-RS and Play Framework, but I wanted something lighter and simpler. I was googling for lightweight Java web frameworks when I saw Spark. At first I dismissed the project as outdated/dead due to how the website looked, and googled some more. After a little while I came back to Spark again, and I decided to give it a shot when I noticed the website said the project was recently rewritten for Java 8.

How did you get involved?

After having worked with Spark for a day, I was very impressed with how easy everything was and how right it felt. I was worried that other people would (like I did) judge Spark by its cover and miss out. So, I sent Per (note: Per refers to Per Wendel, the creator and maintainer of Spark) the following email:


Three very intense days later, this commit showed up on GitHub:


How did you help?

I completely redesigned and reimplemented the website, then tried to promote it.

For the design part I focused on eliminating unneeded content, only leaving the most important bits. I created a massive banner for the index page to really grab the attention of our visitors, communicating what I think are the main selling points of Spark: Java 8 and “minimal effort”. For the other pages I wanted it to be very clean, so I left everything white. It’s as minimalist as Spark itself.

For the implementation part I focused on writing search engine optimized content and following best practices regarding optimization and accessibility. The page scores 100/100 in mobile usability and 87-94/100 in speed using Google PageSpeed Insights, which makes Google like us more and places us higher up in the search results (we didn’t have to worry about Mobilegeddon!). Note: Mobilegeddon refers to the abrupt downgrade Google gave to websites because of their poor performance on mobile usability; read here for details.

After I was pleased with the look and performance of the website, I tried to spread the word online. This was the hard part. I created social media accounts and posted to various Java forums online. The most successful was a post to reddit, which I think got us about a thousand visitors in a few days (which is a lot for a Java web framework).

What were the effects of rewriting the website?

It’s hard to say since we did not have analytics on the old page, but I’ve used Alexa and Ahrefs to estimate the past website traffic. When I joined Spark, its popularity had fallen from rank 800,000 to about 1,200,000 on Alexa, and it was losing more backlinks than it was gaining. Since then we’ve been on a steady climb up. We’re currently hovering around rank 400,000, and the number of referencing pages/domains has doubled. The number of visitors to our webpage increased by 30% comparing Q4 2014 to Q1 2015, so it looks like everything is going the right way. We’ve also improved our Google search position a lot, which is important since about 65% of our traffic comes from Google.

How do you get feedback?

I rely a lot on my friends, colleagues and my girlfriend. I appreciate brutally honest feedback, which can be hard to get from strangers. Other than that I use analytics data a lot to see how the site is performing and how users are behaving, and make changes accordingly.

What plans do you have in the future?

We are currently evaluating if we can establish a dedicated Spark team with paid developers. We recently ran a user survey which gave us a pretty good understanding of who uses Spark and for what, and if our users would be willing to sponsor the project in return for extended support.

If Per decides to go that way, I will work part-time on the project, expanding the webpage functionality in order to provide better documentation, migration guides and tutorials. If not, I will contribute when I have the time, as I do now.

Are there any other projects which you find interesting?

Of the lesser known projects, I am a big fan of Intercooler. While I do like the concept behind Angular and the like, I just don’t think we’re quite there yet. Especially considering low end devices in emerging markets, going full JavaScript is just too slow.

How was your experience giving back to the community? Did it help you in any way?

I learned a lot about the importance and benefits of analytics and having an “online presence”, which I think a lot of smaller open source projects could be better at. There seems to be sort of a “if we build it, they will come” mentality, but people are usually set in their ways and they need to be convinced that your project is worth looking into.

Federico: I fully agree with this. I think everyone is very busy and we have to help them find out immediately what we are providing, and Spark is doing a great job in this respect. “A tiny Sinatra inspired framework for creating web applications in Java 8 with minimal effort” is a clear and effective description of Spark.

What suggestions would you give to people who want to contribute to Open-source, but don’t know where to start?

As I started frequenting reddit (while trying to build Spark’s online presence), I noticed that people sometimes post about wanting to contribute to open source projects in programming language subreddits. These threads usually rank pretty high for a while, so I would just suggest doing that. If you have Java skills and you want to contribute, just go to /r/java and ask for project suggestions. Otherwise, if you already use open source software, there’s almost always a “Contact” or “Contribute” tab you can click on their webpage.

Federico: I should probably start adding a “Contribute” section to my projects, or maybe a dedicated file, as several projects are starting to do.

P.S. In the last few days David has released a new project called j2html: it is a library for building HTML pages programmatically, and the source is available on GitHub. I find it quite useful when I have to throw in some snippets of HTML for which adding a template engine is not worth the hassle. Give it a try!


I found David’s story very interesting because it shows us how complex the Open-Source world is, and how many different things we can do to contribute. He is a technical person and rewrote the Spark website making it amazing, but he also focused on promoting the framework, finding different channels and communicating on all of them, finding ways to monitor the improvements he was doing and recruiting other volunteers (like me :D).

I also very much like the fact that he found ways to contribute by focusing on aspects that the maintainer did not consider. I think this is what is great about having many people involved in one project: everyone contributes according to his/her own specific skills and the result is so much more than the sum of the parts.

As an encouragement to you: There are many different ways to help Open-Source projects, you just have to find one that aligns with your skillset!

A tutorial on using Sql2o with Spark and other updates

A few weeks ago I wrote a tutorial on getting started with Spark (the Java web framework). A few readers appreciated it, and it was linked by the JetBrains blog and republished by both DZone and the new Spark tutorials blog.

After that, David Åse and I chatted a bit and we decided to work together on a few tutorials to publish on the Spark tutorials blog. So today we publish the first of hopefully a long list: Spark and Databases: Configuring Spark to work with Sql2o in a testable way.

Content of the tutorial on Sql2o + Spark

  • when to use an ORM (and when not to)
  • how to organize the code that accesses the database and how to integrate it with the controllers
  • how to use Sql2o
  • how to put everything together, improving the BlogService we started in the first post on Spark

At the end we will have something like this:


Plans for the future

David is a great guy who, among other things, rewrote the Spark website (it does look cool, eh?). I asked him how he got involved in Spark and we are working on a short interview, similar to the one I had with Luca Barbato: I think it is always inspiring to learn how people started giving back to the open-source community.

Reviewing, reviewing, reviewing

For the rest of the week I have been fairly busy doing technical reviews for two books from the Pragmatic Bookshelf (did I mention already that I love their books?). It required a fair amount of effort, but I learned a few things on topics I would normally not find the time to explore, so I am fairly happy.

Getting started with Spark: it is possible to create lightweight RESTful applications also in Java

Recently I have been writing a RESTful service using Spark, a web framework for Java (which is not related to Apache Spark). When we planned to write this I was ready for the unavoidable Javaesque avalanche of interfaces, boilerplate code and deep hierarchies. I was very surprised to find out that an alternative world exists also for developers confined to Java.

In this post we are going to see how to build a RESTful application for a blog, using JSON to transfer data. We will see:

  • how to create a simple Hello world in Spark
  • how to specify the layout of the JSON object expected in the request
  • how to send a post request to create a new post
  • how to send a get request to retrieve the list of posts

We are not going to see how to insert this data in a DB. We will just keep the list in memory (in my real service I have been using sql2o).

Note: I wrote a bunch of other tutorials on Spark. Take a look at the Spark tutorials website.

A few dependencies

We will be using Maven, so I will start by creating a new pom.xml and throwing in a few things. Basically:

  • Spark
  • Jackson
  • Lombok
  • Guava
  • Easymock (used only in tests, not presented in this post)
  • Gson
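A dependencies section along these lines covers the list above (a sketch: the version numbers are illustrative, not taken from the post):

```xml
<!-- sketch of the pom.xml dependencies section; versions are illustrative -->
<dependencies>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.3</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.5.1</version>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <version>1.16.4</version>
    </dependency>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
    </dependency>
    <dependency>
        <groupId>org.easymock</groupId>
        <artifactId>easymock</artifactId>
        <version>3.3</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>com.google.code.gson</groupId>
        <artifactId>gson</artifactId>
        <version>2.3.1</version>
    </dependency>
</dependencies>
```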

Spark hello world

Do you have all of this? Cool, let’s write some code then.
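The hello world itself is just a few lines (this sketch needs the spark-core dependency above on the classpath; the route path and message are, of course, up to you):

```java
import static spark.Spark.get;

public class HelloWorld {
    public static void main(String[] args) {
        // Registers a route and starts the embedded server on the default port 4567
        get("/hello", (request, response) -> "Hello World");
    }
}
```

Run the main method and point your browser at http://localhost:4567/hello.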

And now we can run it with something like:

Let’s open a browser and visit http://localhost:4567/posts. Here we want to do a simple GET. For performing POSTs you may want to use the Postman plugin for your browser or just run curl. Whatever works for you.

Using Jackson and Lombok for awesome descriptive exchange objects

In a typical RESTful application we expect to receive POST requests with JSON objects as part of the payload. Our job will be to check that the payload is well-formed JSON, that it corresponds to the expected structure, that the values are in valid ranges, etc. Kind of boring and repetitive. We could do that in different ways. The most basic one is to use Gson:

We probably do not want to do that.

A more declarative way to specify what structure we expect is creating a specific class.
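For example (a sketch: the field names here are assumptions based on the rest of the tutorial, which posts a title, some categories and content; the Jackson call in the comment shows how the class would then be used):

```java
import java.util.List;

// Plain exchange object describing the JSON we expect in the request body.
public class NewPostPayload {
    private String title;
    private List<String> categories;
    private String content;

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }
    public List<String> getCategories() { return categories; }
    public void setCategories(List<String> categories) { this.categories = categories; }
    public String getContent() { return content; }
    public void setContent(String content) { this.content = content; }

    // With Jackson (sketch), deserialization is then a single call:
    //   NewPostPayload payload =
    //       new ObjectMapper().readValue(request.body(), NewPostPayload.class);
}
```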

And then we could use Jackson:

In this way Jackson automatically checks for us whether the payload has the expected structure. We may also want to verify that additional constraints are respected. For example, we could check that the title is not empty and that at least one category is specified. We could create an interface just for validation:
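A sketch of such an interface (the interface and method names are assumptions), enforcing exactly the two checks just mentioned:

```java
import java.util.List;

// Payload objects that know how to validate themselves (interface name assumed).
interface Validable {
    boolean isValid();
}

public class NewPostPayload implements Validable {
    private String title;
    private List<String> categories;

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }
    public List<String> getCategories() { return categories; }
    public void setCategories(List<String> categories) { this.categories = categories; }

    @Override
    public boolean isValid() {
        // A post needs a non-empty title and at least one category.
        return title != null && !title.isEmpty()
                && categories != null && !categories.isEmpty();
    }
}
```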

Still we have a bunch of boring getters and setters. They are not very informative and just pollute the code. We can get rid of them using Lombok. Lombok is an annotation processor that adds repetitive methods for you (getters, setters, equals, hashCode, etc.). You can think of it as a plugin for your compiler that looks for annotations (like @Data) and generates methods based on them. If you add it to your dependencies Maven will be fine, but your IDE may not give you auto-completion for the methods that Lombok adds. You may want to install a plugin. For IntelliJ IDEA I am using the Lombok Plugin version 0.9.1 and it works great.

Now you can revise the class NewPostPayload as:

Much nicer, eh?

A complete example

We need to do basically two things:

  1. insert a new post
  2. retrieve the whole list of posts

The first operation should be implemented as a POST (it has side effects), while the second one as a GET. Both of them are operations on the posts collection, so we will use the endpoint /posts.
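Since the posts live in memory, the storage shared by both endpoints can be as simple as this sketch (class and member names are assumptions; the POST handler hands back the generated ID, which is what we will see in PostMan):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.UUID;

// In-memory stand-in for a database: holds the posts created so far.
public class Model {

    public static final class Post {
        public final UUID id;
        public final String title;
        public final String content;
        Post(UUID id, String title, String content) {
            this.id = id;
            this.title = title;
            this.content = content;
        }
    }

    private final List<Post> posts = new ArrayList<>();

    // Backs POST /posts: store the post and return its generated ID.
    public UUID createPost(String title, String content) {
        UUID id = UUID.randomUUID();
        posts.add(new Post(id, title, content));
        return id;
    }

    // Backs GET /posts: the whole collection.
    public List<Post> getAllPosts() {
        return Collections.unmodifiableList(posts);
    }
}
```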

Let’s start by inserting a post. First of all we will parse the payload:

And then let’s see how to retrieve all the posts:

And the final code is:


Using PostMan to try the application

You may want to use curl instead if you prefer the command line. I like not having to escape my JSON and having a basic editor, so I use PostMan (a Chrome plugin).

Let’s insert a post. We specify all the fields as part of a JSON object in the body of the request. We get back the ID of the created post.

[Screenshot: the POST request in PostMan, returning the ID of the created post]

Then we can get the list of posts. In this case we use a GET (no body in the request) and we get back the data of all the posts (just the one we inserted above).

[Screenshot: the GET request in PostMan, returning the list of posts]


I have to say that I was positively surprised by this project. I was ready for the worst: this is the kind of application that requires basic logic and a lot of plumbing. I have found that Python, Clojure and Ruby all do a great job with these kinds of problems, while the times I wrote simple web applications in Java the logic was drowned in boilerplate code. Well, things can be different. The combination of Spark, Lombok, Jackson and Java 8 is really tempting. I am very grateful to the authors of these pieces of software; they are really improving the life of Java developers. I consider it also a lesson: great frameworks can frequently improve things much more than we think.

Edit: I received a suggestion to improve one of the example from the good folks on reddit. Thanks! Please keep the good suggestions coming!