Gradle Tip: ear packaging

Do you have trouble creating the ear layout you’d like to see? Maybe this little tip will help.

Let’s assume we have an ejb.jar which should go into the ear root. Its dependencies should be placed in the lib folder of the ear.

app.ear
|-- ejb.jar
\-- lib
    |-- dep.jar
    |-- ...

The ear plugin has two configurations that control what goes where in the ear.

  1. first is the deploy configuration. All dependencies will be placed into the ear root. deploy is not transitive, i.e. if we use deploy project(':ejb'), ejb.jar will be placed into the root of the ear and all dependencies of project(':ejb') will be ignored:
    app.ear
    \-- ejb.jar
    
  2. second is the earlib configuration. All dependencies will be placed into the lib folder of the ear. earlib is transitive, i.e. if we use earlib project(':ejb'), ejb.jar and all dependencies of project(':ejb') will be placed into lib:
    app.ear
    \-- lib
        |-- ejb.jar
        \-- dep.jar
    

It looks like neither does what we need.

The trick is to use both: deploy to put project(':ejb') into the root and earlib to put its dependencies into lib:

dependencies {
    deploy project(':ejb')
    earlib project(':ejb')
}

The result will look like this:

app.ear
|-- ejb.jar
\-- lib
    |-- ejb.jar  // oops...!
    |-- dep.jar
    |-- ...

Better but still not what we want. We now have an ejb.jar in root and one in lib. How do we get rid of the duplicate ejb.jar?

The trick is to explicitly select the compile configuration of project(':ejb').

The compile configuration contains only the dependencies and not the ejb.jar artifact itself: exactly what should go into the lib folder (we can’t use the runtime configuration because it does include the ejb.jar; after all we need it at runtime.. :-)):

dependencies {
    deploy project(':ejb')
    earlib project(path: ':ejb', configuration: 'compile')
}

This creates the ear we want:

app.ear
|-- ejb.jar
\-- lib
    |-- dep.jar
    |-- ...

Finally!

Gradle: sub-project test dependencies in multi-project builds

I’m currently working on a project where we try to move an ant based build to gradle. One task is to make the tests build and run from gradle. The build creates multiple artifacts and there are test dependencies between the sub-projects, which gradle does not handle out of the box.

Let’s say we have a multi-project build with Project B depending on Project A. B does not only have a compile dependency on A but also a test dependency. The tests in B depend on a couple of test helper classes from A.

Handling this with gradle was not as straightforward as I had hoped. In the end it wasn’t very difficult but it took me some effort to understand the details. This may be completely obvious to you. :-)

There are a couple of possibilities:

the naive approach

The dependency for building of B is easy:

build.gradle:

dependencies {
    compile project (':A')
}

This will add the jar artifact from A as a dependency to project B. We can confirm this by using the following snippet to print the compile configuration:

configurations.compile.each {
    println "compile: $it"
}  

which will print:

compile: <path>/ProjectA/build/libs/ProjectA.jar

So the naive approach is to change the dependencies to:

build.gradle:

dependencies {
    compile project (':A')
    testCompile project (':A')
}

But this does not work. Printing testCompile with:

configurations.testCompile.each {
    println "testCompile: $it"
}  

still prints:

testCompile: <path>/ProjectA/build/libs/ProjectA.jar
...

only. Adding testCompile didn’t add anything to the dependencies: when we print the testCompile dependencies without the testCompile line we get the same output. testCompile extends compile, and it doesn’t add an artifact we could depend on.
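We can make this relationship visible with a small snippet for B’s build.gradle (a sketch; it assumes the java plugin is applied, which provides the testCompile configuration):

```groovy
// print the configurations that testCompile extends from (it includes compile)
configurations.testCompile.extendsFrom.each {
    println "testCompile extends: ${it.name}"
}
```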

I found two solutions on stackoverflow.

a simple solution, depending on the test output paths

The simpler solution adds a dependency on the test sourceset output:

dependencies {
    ...
    testCompile project(':A').sourceSets.test.output
}

Ok ….? What’s this?

We use sourceSets to group source files. In this case we depend on the output of the test sources group (by default: src/test/java and src/test/resources). The output contains the paths to the compiled test classes and resource files.

Let’s see what we get printed for testCompile now:

testCompile: <path>/ProjectA/build/classes/test
testCompile: <path>/ProjectA/build/resources/test
testCompile: <path>/ProjectA/build/libs/ProjectA.jar

Looks good and it works. :-)

What I dislike about this solution is that we explicitly depend on some internals of A. It would be nice if we could hide this detail inside A. Another issue is handling transitive test dependencies of A. We do not want to handle those in B. When we depend on the test code from A, A should take care of its transitive test dependencies, as it does for a normal jar artifact.

improved solution, using a configuration

We can hide the internals of A by creating a new configuration in A

build.gradle in A:

configurations {
    testOutput
}

dependencies {
    testOutput sourceSets.test.output
}

… and adding the dependency on this configuration in B:

build.gradle in B:

dependencies {
    ...
    testCompile project(path: ':A', configuration: 'testOutput')
}

A configuration is simply a group of dependencies. We create a new one using the configurations object and can then add dependencies to it. To depend on it in B we pass a configuration parameter to the project method.

I like this one because using the testOutput configuration we have created something like an interface to the test output. The configuration is also used in the next solution.

improved solution, depending on a test jar

The second solution from stackoverflow creates a test jar from the test sources and uses a configuration to handle the dependencies.

build.gradle in A:

task jarTest (type: Jar) {
    from sourceSets.test.output
    classifier = 'test'
}

configurations {
    testOutput
}

artifacts {
    testOutput jarTest
}

The jarTest task builds the jar from the test outputs, adding ‘test’ as a classifier to the jar name (there are more properties to customize the jar name). Then we create the configuration, which we have seen in the previous solution. The last step is to add the jar artifact to the configuration.

When someone asks for testOutput gradle will run the jarTest task.

The dependency configuration in B does not change:

build.gradle in B:

dependencies {
    ...
    testCompile project(path: ':A', configuration: 'testOutput')
}

When we now look at the testCompile output from our snippet we see the production and the test jar:

testCompile: <path>/ProjectA/build/libs/ProjectA.jar
testCompile: <path>/ProjectA/build/libs/ProjectA-test.jar

improving jarTest

Although gradle seems to always build the test code before creating the jar (the from triggers building the test code?), there is no explicit dependency on a test build task. Looking at the task dependencies of the java plugin, jar depends on classes, so it is probably a good idea to add a dependsOn testClasses to jarTest:

task jarTest (type: Jar, dependsOn: testClasses) {
    from sourceSets.test.output
    classifier = 'test'
}

handling transitive dependencies

Next let us assume that ProjectA-test.jar depends on another project: T. With the current solution running the tests of B will fail because the testRuntime classpath won’t contain T.

First we need T to build A. That’s easy, we just add a testCompile dependency:

    dependencies {
        testCompile project (':T')
    }

But this doesn’t help at runtime, B depends on the testOutput configuration which doesn’t know anything about T. So let’s add the dependency on T:

    dependencies {
        testCompile project (':T')
        testOutput project (':T')
    }

testOutput is a dependency configuration like testCompile, so we can add our dependency on T to it just as we did for testCompile.

Printing testRuntime when we run B

configurations.testRuntime.each {
    println "testRuntime: $it"
}  

we can check that it contains:

testRuntime: <path>/ProjectT/build/libs/ProjectT.jar

.. and it doesn’t fail anymore. Nice :-)

cleaning up duplication

There is just one small issue left: by adding the T dependency to testCompile and testOutput we have created a little bit of duplication.

Since testOutput represents the test code it would be helpful to tell gradle that testOutput should re-use the dependencies from testCompile. Then we wouldn’t need to explicitly list the T dependency on testOutput.

We can do this by defining testOutput like this:

configurations {
    testOutput.extendsFrom (testCompile)
}

We can now remove the explicit T dependency on testOutput (removing the duplication) because we will get it from testCompile.

final solution

Our final solution needs more code than just adding a dependency on the test output paths, but it makes it easier to handle the test dependencies and their transitive dependencies in a multi-project build.

Here is the final code:

build.gradle in A:

configurations {
    testOutput.extendsFrom (testCompile)
}

dependencies {
    testCompile project (':T')
}

task jarTest (type: Jar, dependsOn: testClasses) {
    from sourceSets.test.output
    classifier = 'test'
}

artifacts {
    testOutput jarTest
}


build.gradle in B:

dependencies {
    ...
    testCompile project(path: ':A', configuration: 'testOutput')
}   

conclusion

After playing around with this for a while it doesn’t look as magical as at first sight, but there are still a few questions…

Gradle is really simple for builds that don’t need anything special, but it is not so simple anymore if you want to do something a little bit different. Interestingly, the final solution does not have much code. That’s nice, but it is still not very obvious to implement it like this (.. at least to me).

My next step is to make a plugin from it so we do not have to duplicate that code into all our sub-projects :-)

Update: Nov 2015

Working code is available from my github page (it is slightly different from the version presented above, which may be the reason for the problems mentioned in the comments). It is a gradle project that creates a gradle plugin. I have also created a simple example project that I have run with gradle 2.2 up to 2.8 without problems. Hope that helps!

Update: Dec 2015

This is now available as a plugin from plugins.gradle.org.

Grails: Named Marshaller

a small particle of knowledge in the Grails universe… :-)

I like to strip the data marshalled to an api consumer down to a bare minimum. For example, the default Grails marshaller adds the class name and version of a domain class to the json. This is usually not information that will be used on the client side of an application.
We can easily change the default and remove all unnecessary information by registering a new standard marshaller:

    JSON.registerObjectMarshaller(Foo) {
        [description:'this is a standard foo!']
    }

The standard marshaller is still a bit limiting: we have just one representation for all api calls. If we have multiple api calls using the same domain objects, we have to add everything required by the request that works on the biggest data set. The other requests will receive more data than they need. Not what we want.

Grails supports named marshallers, which remove this limitation. We can register multiple marshallers for a single domain class by giving them names, and we can register marshallers for different domain objects using the same name. That means we can group them by feature, user rights or whatever we like.

To register a named marshaller we use the following code:

JSON.createNamedConfig ('feature') { DefaultConverterConfiguration<JSON> cfg ->
    cfg.registerObjectMarshaller (Foo) { Foo it, JSON json ->
        [description: 'this is a named foo!']
    }
}

To use the named marshaller we wrap the as JSON conversion in JSON.use():

JSON fooConverter  = JSON.use ('feature') {
    foo as JSON
}

Calling

foo as JSON

will still use the standard marshaller.
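In a controller action, returning the named representation then looks like this (a minimal sketch; the show action and the Foo lookup are hypothetical):

```groovy
def show () {
    def foo = Foo.get (params.id)

    // render foo using the 'feature' marshaller instead of the standard one
    render JSON.use ('feature') {
        foo as JSON
    }
}
```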

So simple. :-)

This is described in the renderers section of the Grails documentation.

Grails: Using fork mode with grails-cucumber plugin

Introduction

The good news is that we can finally run grails-cucumber in the forked mode introduced in grails 2.3, with the preferred testing style described in Grails: Cucumber with HTTPBuilder & RemoteControl. :-)

It was a long road making grails-cucumber compatible with forked mode. There were a few things that had to be changed in the plugin itself, then there were a couple of issues in grails functional testing setup and finally the remote-control plugin did not work in forked mode because of a reloading issue in spring-loaded.

To summarize, we will need at least the following versions to run the cucumber features in non-forked & forked mode:

mode         grails-cucumber      remote-control   grails
not forked   >= 0.11.0-SNAPSHOT   >= 1.4           >= 2.3.8
forked       >= 0.11.0-SNAPSHOT   >= 1.5           >= 2.4.0

(The latest version of grails-cucumber is 1.0.0.)

Now let’s take a look at running the example from Grails: Cucumber with HTTPBuilder & RemoteControl in forked mode.

Running cucumber features in forked mode

What follows is more or less a generic description about running functional tests in forked mode and not specific to (grails-)cucumber.

There are two reasons why we would want to use the forked mode:

  • isolation of the build path from the runtime/test paths
  • quicker roundtrips in development because we do not have to wait for jvm/grails startup

On my machine (late 2011; surprise, sooo old already ;-)) it takes about 25 seconds to run grails test-app functional:cucumber for the two scenarios in the example in non-forked mode. Most of that time is used to start up the jvm and grails.

The first step to use the forked mode is to add the fork configuration to BuildConfig.groovy. If you created your project with grails 2.3 or above it will already exist.

grails.project.fork = [
    // configure settings for the run-app JVM
    run: [maxMemory: 768, minMemory: 64, debug: false, maxPerm: 256],
    // configure settings for the test-app JVM, uses the daemon by default
    test: [maxMemory: 768, minMemory: 64, debug: false, maxPerm: 256, daemon:true]
]

Next we run grails test to start the interactive grails console. I’m running the console in the test environment so that the remote-control plugin gets enabled in the running grails application (by default it will only be enabled in the test environment).

If you run ps -ef | grep java in another window you will see two grails java processes running.

We can run our application by simply entering run-app from the grails console:

grails> run-app
| Server running. Browse to http://localhost:8080/Books_remote
| Application loaded in interactive mode. Type 'stop-app' to shutdown.

Running ps again we see four(!) java processes. I expected to see three. Not sure why there are four.

To run the features we can now simply call

test-app functional:cucumber -baseUrl=http://localhost:8080/Books_remote/

.. just taking a few seconds now… since it does not have to start grails anymore.

Note that we have to pass the baseUrl parameter to test-app because test-app does not know where the application is running (baseUrl was broken in grails 2.3 until grails 2.3.8).

Make sure you have the slash at the end. In the test code I pass the url to HTTPBuilder, and without the slash it will drop the Books_remote from the url path.

We receive the usual test-app output from grails and cucumber:

grails> test-app functional:cucumber --stacktrace --verbose -baseUrl=http://localhost:8080/Books_remote/
| Running 2 cucumber tests...
2 Scenarios (2 passed)
6 Steps (6 passed)
0m0.185s
| Completed 2 cucumber tests, 0 failed in 0m 0s
| Tests PASSED - view reports in /Users/hauner/Development/Grails/grails-cucumber.git/test/projects/Books_remote/target/test-reports

As in development we can change the application code and the running grails application will reload the changes.

That’s it. Happy forking :)

Grails: Cucumber with HTTPBuilder & RemoteControl

Here is (another) simple Book example using Cucumber with Grails.

Introduction

As an alternative to testing against the client (browser) ui (using geb) I will test against the server ui (or api). What’s interesting about this example? Here is a quick overview:

  • it is using HTTPBuilder to talk to the server api
  • it is using the remote-control plugin to setup & cleanup test data
  • it is using a cucumber ‘World’ object
  • step definitions are minimal

This is also an update to my blog Automating Specification with Cucumber & Grails where I directly called grails controllers, services etc. in the integration test style.

With Grails 2.3 and the new fork mode this style of testing is deprecated. Functional test code and the application should run in separate processes to get a more production-like environment. The old style will only work in non-forked mode.

To prepare and clean up test data we have to choose another solution: the remote-control plugin. It lets us write closures (in our test code) that are executed in another process (the application), and we can use it to run test setup and tear down code using normal grails code.

Note: the remote-control plugin does not currently (grails 2.3.4 2.3.8) work with automatic reloading (reloading issue) in forked mode (serialVersionUID mismatch error). We will have to run grails with the -noreloading option. This is not necessary if we use non-forked mode. Update (27.4.’14): Unfortunately I don’t know how to disable reloading for the forked jvm’s, so we can only run this in non-forked mode for now.

Ok, so let us take a look at the code!

The code

Here is the list of the files. In comparison to the older examples there is a new subfolder world with a few new files that contain all the nice stuff I listed above.

test
  \-- functional
        |-- data
        |     Data.groovy
        |-- hooks
        |     env.groovy
        |-- steps
        |     BookSteps.groovy
        |-- world
        |     Books.groovy
        |     Requests.groovy
        |     World.groovy
        |-- ListBooks.feature
        \-- NewBook.feature

The features & step definitions

Nothing new in my sample features:

NewBook.feature

Feature: new book entry
    As a book owner
    I want to add books I own to the book tracker
    so that I do not have to remember them by myself

Scenario: new book
   Given I open the book tracker
    When I add "Specification by Example"
    Then I see "Specification by Example"s details

ListBooks.feature

Feature: list owned books
    As a book owner
    I want to list my books
    so that I can look up which books I own

Scenario: list existing books
   Given I have already added "Specification by Example"
    When I view the book list
    Then my book list contains "Specification by Example"

.. but the step definitions have changed:

steps/BookSteps.groovy

package steps

import static cucumber.api.groovy.EN.*


Given (~'^I open the book tracker$') { ->
    // nop
}

When (~'^I add "([^"]*)"$') { String bookTitle ->
    requestAddBook (bookTitle)
}

Then (~'^I see "([^"]*)"s details$') { String bookTitle ->
    assertAddBook (bookTitle)
}

Given (~'^I have already added "([^"]*)"$') { String bookTitle ->
    bookId = setupBook (bookTitle)
}

When (~'^I view the book list$') { ->
    requestAllBooks ()
}

Then (~'^my book list contains "([^"]*)"$') { String bookTitle ->
    assertAllBooksContains (bookId, bookTitle)
}

As you can see, there is not much code anymore. Of course it is still there, it just has moved to a better place. The nice thing about keeping the step definitions light is that it makes them really cheap and simple. After all, it is called glue code: you wouldn’t use a 1 cm layer of glue to stick two pieces together.

The big advantage is that you don’t need to care whether you need a step with a slightly changed wording or whether there is already a step that has the code you need. Simply create a new one and use a one-liner to call your test api. We don’t need to care if there is a little bit of duplication, because all the heavy lifting takes place in the test api.

The test api

Forcing ourselves to move most of the code out of the steps has another advantage. In the more usual code environment (without the step definition “noise”) it is easier to follow our normal implementation habits like removing duplication, creating clean code and so on. Hope this makes sense to you. :-)

Here is the test api code for Book. I have moved setup, action and assertion methods together because I prefer grouping by topic (not necessarily in a single file of course but here it is small enough). If I want to know anything about the Books test api I just have to look here.

world/Books.groovy

package world

import data.Data
import grails.plugin.remotecontrol.RemoteControl
import static javax.servlet.http.HttpServletResponse.*


class Books {
    def booksRequestData

    def getBooksResponse () {
        booksRequestData.response
    }

    def getBooksResponseData () {
        booksRequestData.data
    }

    Long setupBook (String title) {
        def remote = new RemoteControl ()

        def book = Data.findByTitle (title)
        Long id = remote {
            ctx.bookService.add (book)?.id
        } as Long

        assert id
        id
    }

setupBook is the setup code used to prepare a single book for the list existing books scenario. It is using the remote-control plugin to create the test data.

It looks up the book details by the given title and then calls remote to execute the closure in the running grails application. The closure itself uses the BookService to add the book. The same service is used by the server api to add a book.

The ctx variable is provided by the remote-control plugin so we can easily get at the application artifacts. There is not much more to say about it. Take a look at its documentation for the rest; it is a quick read.

    void requestAddBook (String title) {
        def newBook = Data.findByTitle (title)
        booksRequestData = post ('book/add', newBook)
        assert booksResponse.status == SC_OK
    }

requestAddBook adds a new book by calling the server api. It is used in the new book scenario. It simply sends a normal post request to the application, calling the BookController’s add action, remembering the response information and checking that we got a 200 OK.

In this simple example we could have used it to set up the book as well. If we needed multiple books as test data, though, we would waste a lot of time running a single request for each book. Looping in the remote-control closure will be a lot faster.
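Such a setup method could look like the following sketch (it is not part of Books.groovy above; it re-uses Data and the bookService the way setupBook does):

```groovy
def setupBooks (List<String> titles) {
    def remote = new RemoteControl ()
    def books = titles.collect { Data.findByTitle (it) }

    // a single remote call that adds all books inside the application process
    def ids = remote {
        books.collect { ctx.bookService.add (it)?.id }
    }

    assert ids.every { it != null }
    ids
}
```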

    void assertAddBook (String title) {
        def expected = Data.findByTitle (title)
        def actual = booksResponseData

        assert actual.id
        assert actual.title  == expected.title
        assert actual.author == expected.author
    }

This method simply checks that the response data corresponds to the requested book.

    void requestAllBooks () {
        booksRequestData = getJson ('book/all')
        assert booksResponse.status == SC_OK
    }

This one is used to get the list of books as json calling the all action of BookController.

    void assertAllBooksContains (Long id, String title) {
        def expected = Data.findByTitle (title)
        def actual = booksResponseData.first ()

        assert actual.id     == id
        assert actual.title  == expected.title
        assert actual.author == expected.author
    }
}

Finally, another assertion method checks that the previously requested book list contains the expected book.

post & getJson requests

The two network calls post and getJson used in the test api are implemented in the next file. There is no magic here, just two simple HTTPBuilder calls.

world/Requests.groovy

package world

import groovyx.net.http.ContentType
import groovyx.net.http.HTTPBuilder
import groovyx.net.http.Method


class Requests {

    def defaultSuccess = { response, data ->
        [response: response, data: data]
    }

    def defaultFailure = { response, data ->
        [response: response, data: data]
        assert false
    }

    def getJson (String targetUri, Closure success = null, Closure failure = null) {
        def http = new HTTPBuilder(binding.functionalBaseUrl)

        def result = http.request (Method.GET, ContentType.JSON) {
            uri.path = targetUri
//            headers.'X-Requested-With' = 'XMLHttpRequest'
//            headers.'Cookie' = cookies.join (';')
            response.success = success ?: defaultSuccess
            response.failure = failure ?: defaultFailure
        }
        result
    }

    def post (String targetUri, Map params, Closure success = null, Closure failure = null) {
        def http = new HTTPBuilder(binding.functionalBaseUrl)

        def result = http.request (Method.POST, ContentType.JSON) {
            uri.path = targetUri
//            headers.'X-Requested-With' = 'XMLHttpRequest'
//            headers.'Cookie' = cookies.join(';')
            requestContentType = ContentType.URLENC
            body = params
            response.success = success ?: defaultSuccess
            response.failure = failure ?: defaultFailure
        }
        result
    }
}

The only special thing is the line def http = new HTTPBuilder(binding.functionalBaseUrl) in both methods. functionalBaseUrl is the url the application is running at and is provided by grails (you can also provide it via the baseUrl command line option).

Done? Not yet :-)

If you have read this far you may wonder how the step definitions can call the test api, how the test api finds the request methods and where the binding is coming from.

That is where the World comes into play…

The World, putting everything together

The World is simply an object we can use to provide some additional stuff to the step definitions via cucumber’s World hook. We can also use it to share state between the steps of a single scenario. A new world is created for each running scenario.

world/World.groovy

package world

import grails.plugin.remotecontrol.RemoteControl
import static cucumber.api.groovy.Hooks.World


class BookWorld {
    def binding

    BookWorld (def binding) {
        this.binding = binding
    }

    void resetDatabase () {
        def remote = new RemoteControl ()

        boolean success = remote {
            ctx.databaseService.reset ()
            true
        }
        assert success
    }
}

World () {
    def world = new BookWorld (binding)
    world.metaClass.mixin Requests
    world.metaClass.mixin Books
    world
}

Here our World object is of type BookWorld with the Books code and Requests code mixed in. This is a groovy trick to add additional methods to the World object. Because they are all on the World object they can call each other.

This file is a groovy script, and when it is executed we pass the script’s binding to our world so we can use it later to get the functionalBaseUrl.

Resetting the Database

To run the scenarios independently of each other we have to reset our test data in the database. Both scenarios add the same book to the database and we would get a constraint violation if we did not clean up before the next scenario runs.

That is what the resetDatabase method in the World is supposed to do. It is simply called from a Before hook like this:

hooks/env.groovy

package hooks

import static cucumber.api.groovy.Hooks.Before


Before () {
    resetDatabase ()
}

resetDatabase uses the remote-control plugin to call a special service that takes care of resetting the database by running a sql script. In this case it is very simple, it just clears the ‘book’ table (see below).

DatabaseService.groovy

package test

import groovy.sql.Sql

import javax.sql.DataSource


class DatabaseService {
    DataSource dataSource

    void reset () {
        def script = new File ('sql/reset.sql').text

        Sql sql = new Sql (dataSource)

        sql.withTransaction { def c ->
            sql.execute (script)
            sql.commit()
        }
        sql.close ()
    }
}

sql/reset.sql

-- clear tables for functional tests

delete from book;

That’s it.

The full code of this example is available on my github page in the repository of the grails-cucumber plugin.

Thanks for reading :-)

Grails: Injecting Config Parameters

Assume we have a configuration value of type String and we want to use it in multiple places, e.g. in grails services and controllers.

The standard way seems to be an entry in Config.groovy:

a.deeply.nested.value = "a.deeply.nested value!"

and then this ugly piece of code to access the configuration:

class Ugly(Controller|Service) {
    def grailsApplication

    String NESTED_VALUE = grailsApplication.config.a.deeply.nested.value

    // ...
}

Which is already better than using grailsApplication.config.... spread around the controller.

I don’t like this very much because of the extra dependency (grailsApplication) and the config object we may have to setup just to write a simple test for our code. Each additional dependency makes testing harder. And this just because of a simple configuration value.

Is there a better way? Let’s google….

I found a couple of different solutions that don’t need grailsApplication.

 

Spring’s @Value annotation

 

I found this here.

Using the @Value annotation works out of the box (using grails 2.2.3):

class LessUglyController {
    @Value('${a.deeply.nested.value}')
    String NESTED_VALUE

    // ...
}

This looks better. We get rid of grailsApplication and we can strip grailsApplication.config from our configuration path. Testing gets easier without the grailsApplication dependency.
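A plain unit test can now simply set the value as a property; no grailsApplication or config object is needed (a stripped-down sketch of the controller, with a hypothetical describe method that uses the value):

```groovy
// stripped-down sketch: the config value is just a property on the bean
class LessUglyController {
    String NESTED_VALUE

    String describe () {
        "value is $NESTED_VALUE"
    }
}

def controller = new LessUglyController (NESTED_VALUE: 'test value')
assert controller.describe () == 'value is test value'
```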

But.. personally I’m not happy with the annotation “noise” and having the config value in a string. IntelliJ doesn’t do auto completion here ;-)

 

using resources.groovy

 

This is standard grails stuff. Adding an entry for a service works without problems:

resources.groovy:

someService(SomeService) {
    NESTED_VALUE = application.config.a.deeply.nested.value
}

and the service

class SomeService {
    String NESTED_VALUE

    // ...
}

But I didn’t get it working for a controller until I found an enlightening answer to a question on stackoverflow.

The trick is that we have to specify the full class name, including the package, to override a controller bean.

resources.groovy:

'com.wordpress.softnoise.SomeController'(SomeController) {
    NESTED_VALUE = application.config.a.deeply.nested.value
}

Note the quotes around the canonical name.

The controller looks like the service above:

class SomeController {
    String NESTED_VALUE
    // ...
}

Both versions with no noise at all :-) Not too bad.

Now, there is still a better version, using Config.groovy.

 

using beans in Config.groovy

 

Uhh, using Config.groovy to inject a value from Config.groovy into a spring bean? Yes, and it is even documented in the grails documentation here.

We can simply put this into Config.groovy to inject the value into the service bean.

beans {
    someService {
        NESTED_VALUE = a.deeply.nested.value
    }
}

This also works for controllers, we just have to use the same trick as in resources.groovy:

beans {
    'com.wordpress.softnoise.SomeController' {
            NESTED_VALUE = a.deeply.nested.value
    }
}

That is nice, no extra noise in the bean and the config path doesn’t leave the Config.groovy file.

 

conclusion

 

I think the easiest and best solution is the beans configuration in Config.groovy.

No more grailsApplication.config. :-)

Intellij IDEA, Cucumber and German Spell Checking

Now that Intellij IDEA (12.1.3 in my case) supports auto completion & syntax highlighting for cucumber features not only in English but also in any other language that is available for gherkin, it would be nice to have native spell checking as well.

To use your native language with cucumber you just have to place a language comment at the first line of your feature file. For example see this super useful feature description:

# language: de

Funktionalität: deutsche Feature-Beschreibungen
  Um cucumber in unserer Muttersprache zu benutzten
  möchte ich als Szenario-Schreiber
  die deutschen Schlüsselwörter benutzen

  Szenario: deutsche Schlüsselwörter & Steps
    Angenommen ich schreibe eine Feature-Beschreibung
    Wenn ich die deutschen Gherkin-Schlüsselwörter benutze
    Dann werden die deutschen Steps aufgerufen

To get spell checking for an additional language in IntelliJ we need to add a dictionary for that language. This is done in a few simple steps:

  • first, we need a dictionary for our language. This is a plain text file with a lot of words, each on a single line. I found a german dictionary on sourceforge.
  • second, we need to make sure it is encoded in utf-8. The german.dic file was encoded in latin-1. If it is not encoded in utf-8, use your text editor of choice (e.g. Notepad++ or TextWrangler or …) and convert it to utf-8 (no BOM).
  • third, create a folder (e.g. dictionaries) where you want to save the dic file
  • fourth, tell IntelliJ about the dictionary folder following the documentation, which is in short:
    1. open the Settings dialog
    2. type ‘spell’ into the search box and select Spelling
    3. switch to the Dictionaries tab
    4. and add the folder to the Custom Dictionaries Folder list
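The encoding conversion from the second step can also be done on the command line with iconv; here with a tiny example file standing in for the downloaded dictionary (the file names are just examples):

```shell
# create a small latin-1 encoded example dictionary ("für", "möchte")
printf 'f\374r\nm\366chte\n' > german.dic

# convert it to utf-8 (iconv does not add a BOM)
iconv -f ISO-8859-1 -t UTF-8 german.dic > german-utf8.dic
```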

You should now see the dictionary under Dictionaries as a user dictionary, with the checkbox enabled.

That’s it, no more typos in the features :-)