Maven repository on Bitbucket

There are a number of cases where a standalone repository management tool like Nexus is overkill and all you need is a simple service for serving a few Maven libraries. I have used Bitbucket private Git repositories for this purpose for several years. The setup is pretty easy: all you have to do is modify ~/.m2/settings.xml like below:

<profiles>
    <profile>
        <id>some-project</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <repositories>
            <repository>
                <id>some-project-maven-repo</id>
                <url>https://api.bitbucket.org/1.0/repositories/username/git-repo/raw/releases</url>
            </repository>
        </repositories>
    </profile>
</profiles>

<servers>
    <server>
        <id>some-project-maven-repo</id>
        <username>user@exitcode.net</username>
        <password>normal_or_app_password</password>
    </server>
</servers>

This configuration accesses the repository through the Bitbucket REST API v1.0, where:

  • username is your Bitbucket username or team name
  • git-repo is the name of the Git repository created for hosting the Maven libraries
  • releases is the name of the Git branch the libraries are stored in
  • user@exitcode.net is the email of a user that has access to the Git project (if you use an App password to protect your main Bitbucket password, you must use the username instead of the email)

This works pretty well because the Bitbucket Cloud API 1.0 challenges authentication with an HTTP 401 Unauthorized response when secured content, like a private repository, is accessed without credentials. Once challenged, the Maven Wagon HTTP provider sends the credentials stored in settings.xml to the server in the form of an Authorization header.

API 2.0

However, Bitbucket recently announced that Cloud API 1.0 is deprecated as of June 30 and, what's worse, all version 1.0 APIs will be removed permanently on 31 December 2018. So all users are forced to switch to API 2.0. To access the same Git repository with Maven artifacts, you have to use the following configuration:

<profiles>
    <profile>
        <id>some-project</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <repositories>
            <repository>
                <id>some-project-maven-repo</id>
                <url>https://api.bitbucket.org/2.0/repositories/username/git-repo/src/releases</url>
            </repository>
        </repositories>
    </profile>
</profiles>

<servers>
    <server>
        <id>some-project-maven-repo</id>
        <configuration>
            <httpHeaders>
                <property>
                    <name>Authorization</name>
                    <value>Basic dXNlckBleGl0Y29kZS5uZXQ6cGFzc3dvcmQ=</value>
                </property>
            </httpHeaders>
        </configuration>
    </server>
</servers>

The tricky part is the authentication. You have probably noticed that the Authorization header with Basic authentication credentials (the value user:password encoded in Base64) is passed to the server directly. This is because API 2.0 no longer challenges authentication and responds directly with HTTP 403 Forbidden if it does not find credentials in the request. With this configuration the Maven Wagon HTTP provider (actually the underlying Apache HttpClient) won't wait for an authentication challenge that never comes and sends the credentials immediately.
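The Basic value is just user:password encoded in Base64, so you can generate it on the command line. The credentials below are the placeholder ones from this example, not real ones:

```shell
# Encode the placeholder credentials used above. Note printf, not echo,
# so no trailing newline sneaks into the encoded value.
printf 'user@exitcode.net:password' | base64
# dXNlckBleGl0Y29kZS5uZXQ6cGFzc3dvcmQ=
```

Paste the resulting string after "Basic " in the header value.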

Maven HTTP communication logs

While I was debugging what's going on under the hood, I discovered that the Maven Wagon HTTP provider bundles the entire Apache HttpClient, together with other Apache libraries like Commons, into one fat JAR. This is crucial information if you want to enable trace logs of the HTTP client, because the bundled client lives in a different Java package than the original! So, if you want to enable trace logs of the HTTP communication Maven makes, add the following line to ${MAVEN_HOME}/conf/logging/simplelogger.properties (create the file if it does not exist):

org.slf4j.simpleLogger.log.org.apache.maven.wagon=TRACE

At first, I started from the information that Maven since v3.0.4 uses Apache HttpClient 4, so I tried to enable HTTP trace logs on the org.apache.http logger as described in the official documentation. That did not work, so I decompiled wagon-http-2.9-shaded.jar from the Maven 3.5.4 installation, found the bundled client there, and after going through some of the sources eventually enabled tracing on the right logger.

Put HDD into standby after wake from suspend

I use suspend on my laptop all the time, as it works perfectly on Ubuntu with integrated Intel graphics. After I bought a second classical (spinning) HDD in addition to the SSD, I started thinking about reducing power consumption and noise, because this second HDD spins up on every system start / resume from suspend. Since it serves only as media storage, it should run only a couple of minutes per day and stay powered down the rest of the time.

So my requirements were clear:

  • after start/resume, the second HDD should be put into standby mode immediately (HDD not spinning)
  • when the second HDD has been inactive for 10 minutes, it should be put into standby mode as well

Spin-down and other HDD settings are controlled with the hdparm utility. This is what my current hdparm config file looks like:

$ cat /etc/hdparm.conf

/dev/sdb {
    apm = 255
    apm_battery = 255
    # 120 * 5 = 600 seconds = 10 minutes
    spindown_time = 120
    poweron_standby = off
    standby
}
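The spindown_time encoding is non-obvious: per the hdparm man page, values from 1 to 240 are multiples of 5 seconds (timeouts from 5 seconds up to 20 minutes), so converting a desired idle timeout is a small calculation:

```shell
# desired idle timeout in minutes -> hdparm spindown_time value
# (this formula only holds for timeouts up to 20 minutes, where
# one unit equals 5 seconds)
minutes=10
echo $(( minutes * 60 / 5 ))
# 120
```

That is where the value 120 in the config above comes from.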

Note: If you want to get your HDD into standby mode, the APM level should in general be lower than 128. Failing to set this is a common mistake, and the reason why people wonder why the HDD won't stop spinning after spindown_time elapses. However, on my Samsung SpinPoint M9T, APM values lower than 128 cause standby after only a few seconds of idleness, so spindown_time is ignored. For that reason I disabled APM completely (value 255), and now the HDD is brought into standby properly after the configured spindown_time. If you are experiencing the same issue, try disabling APM; it might help you too.

This hdparm configuration works fine, except that the HDD is not brought into standby mode automatically after wake from suspend. I found out that only the apm, apm_battery and spindown_time settings are re-applied after resume, not standby. This is done by the script

/usr/lib/pm-utils/power.d/95hdparm-apm

which gets called from

/lib/systemd/system-sleep/hdparm (systemd hook)

once the laptop is suspended or resumed. No other hdparm settings are re-applied after resume. So, in order to spin down the HDD after resume, there are two options:

  1. create a systemd service in /etc/systemd/system/suspend.target.wants
  2. modify the /lib/systemd/system-sleep/hdparm script (or create a new one in the same directory) and suspend the disk manually when the post event occurs

Until Ubuntu 15.04 there was also a third option: place a standby-invoking script into the /etc/pm/sleep.d directory. As of 15.10, systemd has replaced Upstart and pm-utils scripts are no longer invoked automatically!

I took the first approach and created the following systemd service, which spins down the HDD immediately after resume:

$ cat /etc/systemd/system/suspend.target.wants/standby-hdd.service

[Unit]
Description=Turn off power of the media hdd after resume
After=suspend.target

[Service]
ExecStart=/sbin/hdparm -y /dev/sdb

[Install]
WantedBy=suspend.target

This works well for me.

Ubuntu 14.04 – broken bracketed paste mode in Gnome Terminal

After upgrading to Ubuntu 14.04 LTS (Trusty Tahr) from 13.10, I ran into a very annoying issue when copy-pasting commands into GNOME Terminal. Each command pasted from the clipboard has the characters 0~ at the beginning and 1~ at the end of the text, so it looks like:

$ 0~ls -la1~

This happens only after I switch to the subshell from Midnight Commander and back a few times using the Ctrl+o key.

At first I thought it was an MC bug, but after a bit of googling I discovered that the problem is caused by a bug in the bracketed paste mode implementation in the VTE library. The VTE library is used by most GTK-based terminal emulators, such as GNOME Terminal, XFCE Terminal, etc., so this problem affects all of them. There is already a patch for the development version of libvte (v0.36), but it's not clear whether this fix will be backported to libvte v0.34 used in Trusty, or whether the Ubuntu team will upgrade libvte to at least v0.36 in the LTS. Until this is resolved, you can disable bracketed paste mode manually once the issue appears, as Conrad Irwin suggested in his great blog post. Just enter the following command in the terminal:

$ printf "\e[?2004l"

and the issue should go away. I tried adding this command to ~/.bashrc in order to disable bracketed paste mode whenever a terminal is opened, but that does not seem to work. You can re-enable bracketed paste mode at any time with a similar command:

$ printf "\e[?2004h"

Bracketed paste mode is generally a good idea. Its main purpose is to inform the shell, or applications running in it, that text has been pasted from the clipboard rather than typed manually. This can improve security: when you paste a command with a newline character (\n) at the end, bash (and most other shells) executes that command immediately. You can find some examples of malicious commands here.
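As a sketch of where the stray characters come from: the terminal wraps a bracketed paste in the markers ESC[200~ and ESC[201~, and when a confused shell consumes the leading ESC[20 characters as a partial key sequence, exactly the 0~ and 1~ tails are left on the command line:

```shell
# the two markers a terminal sends around a bracketed paste
start=$(printf '\033[200~')   # ESC [ 2 0 0 ~
end=$(printf '\033[201~')     # ESC [ 2 0 1 ~
# stripping the first four characters shows what remains visible
# when the escape prefix is swallowed
echo "${start#????}"   # 0~
echo "${end#????}"     # 1~
```

Which matches the garbage seen around pasted commands above.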

Edit 10/2015: It seems this issue is finally resolved in Ubuntu. The VTE patch has been backported to libvte v0.34 used in Trusty. After upgrading, I am no longer able to reproduce the issue.

OutOfMemoryError email alert

When you are profiling an application or hunting a memory leak, taking a heap dump is essential. There are multiple ways to do it manually, for instance directly with the jmap command or with a monitoring tool like VisualVM, JConsole, JMC, etc. However, when bad things happen in production, it's useful to be alerted and have a heap dump stored automatically. For such a scenario the JDK offers several useful command line options:

  • -XX:+HeapDumpOnOutOfMemoryError
  • -XX:HeapDumpPath
  • -XX:OnOutOfMemoryError

which can be used in the following manner:

-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/heapdumps -XX:OnOutOfMemoryError=/opt/app/send_alert_email.sh

I think the above configuration is self-explanatory. When an OutOfMemoryError occurs, the JVM automatically saves a heap dump of the application into the /tmp/heapdumps folder. Note that if the heapdumps directory doesn't exist, it won't be created and the JVM stores the dump into a new file called heapdumps instead. Because that file gets overwritten every time the application throws java.lang.OutOfMemoryError, you should create the target directory manually, otherwise the dump from a previous crash would be lost forever. In addition, the shell script send_alert_email.sh is called, which actually sends the email notification.
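Since the JVM won't create the dump directory for you, it's easiest to create it up front as part of provisioning; a minimal sketch using the path from this example:

```shell
# create the dump directory ahead of time; -p makes the command
# idempotent, so it is safe to run from a startup script
mkdir -p /tmp/heapdumps
```

On systems where /tmp is wiped on reboot, recreate the directory at boot, or pick a persistent path instead.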

Below is the notification script I normally use for this purpose. It sends an empty email with the subject 'Java App - OutOfMemory alert!' to recipient@domain.com, using a pre-configured SMTP server with authentication.

#!/bin/sh
mailx -v -s 'Java App - OutOfMemory alert!' -S smtp=10.0.0.1:25 -S from=app@domain.com -S smtp-auth=login -S smtp-auth-user="smtp-user-name" -S smtp-auth-password="smtp-user-password" recipient@domain.com < /dev/null
echo "Alert email sent"

The script must be executable by the user the Java application runs under. The above configuration is easy to test with a memory-leaking application. You can use this simple one:

public class OutOfMemoryTest {
    public static void main(String[] args) {
        StringBuilder sb = new StringBuilder();
        while (true) {
            sb.append(new String("test"));
        }
    }
}

Save the class into an OutOfMemoryTest.java file, create the /tmp/heapdumps folder, compile the class and run the test application:

$ javac OutOfMemoryTest.java
$ java -Xmx5m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/heapdumps -XX:OnOutOfMemoryError=./send_alert_email.sh -cp . OutOfMemoryTest
java.lang.OutOfMemoryError: Java heap space
Dumping heap to /tmp/heapdumps/java_pid25781.hprof ...
Heap dump file created [3101545 bytes in 0.023 secs]
#
# java.lang.OutOfMemoryError: Java heap space
# -XX:OnOutOfMemoryError="./send_alert_email.sh"
#   Executing /bin/sh -c "./send_alert_email.sh"...
Alert email sent
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
	at java.util.Arrays.copyOf(Arrays.java:2367)
	at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
	at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
	at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415)
	at java.lang.StringBuilder.append(StringBuilder.java:132)
	at OutOfMemoryTest.main(OutOfMemoryTest.java:6)
$ ls /tmp/heapdumps
java_pid25781.hprof

You should now have an email about the OutOfMemoryError in your mailbox.

Motorola Atrix 4G – unlock and install custom ROM

The Motorola Atrix 4G is a great phone. I have owned it for more than a year and a half and I am very satisfied with its overall performance, battery life and Android stability. Although the Atrix was officially released in early 2011, there are still not many devices on the market that are substantially more powerful. Unfortunately, Motorola decided not to provide any further Android OS updates for this device, so it's basically stuck on Gingerbread (2.3.x). On the other hand, Android is OSS and the community luckily came up with several custom ROMs you can use to bring new life to this excellent device.

In this post I will show you how to:

All steps listed above except the ROM backup are necessary in order to install a custom ROM and can't be skipped.

Warning

Be aware that this approach will most likely void your warranty! Although the procedure has been tested many times by many users on XDA, there is always a slight chance that some step won't work for you and you will brick your device. You do this at your own risk and I can't be held responsible for any damage to your device!

Consider making applications backup

Please consider rooting your original Gingerbread ROM and backing up all installed applications with their data, using Titanium Backup for instance. The thing is, all Android phones implement a security mechanism which completely erases the /data partition when you try to unlock the bootloader, so it basically performs a factory reset. This is generally a good idea and prevents unknown persons from accessing your private data when the device is stolen or lost. The data partition contains apps + data, SMS, contacts, etc., and you can't back them up normally, since the /data partition is not accessible without root permission.

If you don't care about the data of your installed applications, and stuff like contacts is already synchronized to Google's servers, then rooting your original ROM is not needed. You can unlock the bootloader and flash a custom recovery entirely without rooting! None of the procedures mentioned in this post will touch files stored on the /mnt/sdcard partition (internal 16GB memory) or /mnt/sdcard-ext (SD card).

Pre-requirements

Before proceeding, please make sure the fastboot tool is available on your computer. The easiest way to get it is to install the Android SDK, although there are also alternative methods that don't require installing the entire SDK. This guide presumes you are running a Linux-based operating system.

1. Unlock bootloader

The Atrix comes with a locked bootloader, just as most other Android phones do. What does that mean? The bootloader is a program responsible for starting either the Android OS or the recovery software. The Atrix bootloader only starts original ROMs signed with Motorola's digital signature, so it's locked to software provided by the manufacturer. An unlocked bootloader allows you to install and boot a custom Android recovery, which in turn lets you install (flash) a custom ROM and boot into it.

Before the bootloader can be unlocked, we have to flash a so-called pudding image. It's a system binary file (SBF) that contains some tweaks for the bootloader and makes the unlock possible. There are two versions of it:

so make sure you have picked the right version for your device. You also need to download a (Linux-only) utility for flashing SBF files. Do not attempt to flash the wrong image; you will very likely brick your device. You should be able to find out which version you have on the About page, accessible through System settings on your device. Since I bought my Atrix in Germany from the original distribution, I used the international version, which worked fine.

1.1 Flash pudding image:

Start your phone while holding the Volume Up key and the Power button. It will boot into RSD Protocol mode. Once you see Starting RSD protocol support on the screen, you can release the keys. Plug the phone into your computer with a USB cable and perform the flash:

root@thinkpad:/home/marlly/Android/Moto4G# ./sbf_flash intl-fix-try1.sbf
SBF FLASH 1.23 (mbm)
http://opticaldelusion.org
=== intl-fix-try1.sbf ===
Index[5]: Unexpected chip 16
Index[6]: Unexpected chip 16
00: RDL03 0x00000000-0x002FFFFF 7F75 AP
01: RDL01 0x00800000-0x008407FF 3556 BP
02: CG02 0x00000010-0x0000580F 4615 AP
03: CG03 0x000000A0-0x0008009F 2135 AP
04: CG42 0x00000020-0x0030001F F03C AP
05: CG44 0x00000050-0x0030004F 0C66 AP
06: CG47 0x00000070-0x0008006F E7CB AP
>> waiting for phone: Connected.
>> uploading RDL03: 100.0%
-- OK
>> verifying ramloader
-- OK
>> executing ramloader
-- OK
>> waiting for phone: Connected.
>> sending erase
-- OK
>> uploading CG02: 100.0%
-- OK
>> uploading CG03: 100.0%
-- OK
>> uploading CG42: 100.0%
-- OK
>> uploading CG44: 100.0%
-- OK
>> uploading CG47: 100.0%
-- OK
>> rebooting

After the restart, you need to shut down your device and take the battery out. Now that the pudding is in place, you can proceed to the actual unlock.

1.2 Boot into fastboot and issue unlock

Turn the phone on again while holding the Volume Down key until the text Fastboot appears. You are now in the bootloader menu, where you navigate through the menu items with the Volume Down key and enter the selected item with Volume Up. In our case, just press Volume Up to select Fastboot mode. Now run the command fastboot devices on your computer, while the device is still connected via the USB cable, to make sure the phone is recognized.

root@thinkpad:/home/marlly/Android/SDK/platform-tools# ./fastboot devices
TA744097OL fastboot

The device is connected, so now we can proceed to the actual unlock.

root@thinkpad:/home/marlly/Android/SDK/platform-tools# ./fastboot oem unlock
(bootloader) Unlocking your device can permanently VOID your warranty.
(bootloader) This process cannot be reversed. If you wish to proceed,
(bootloader) reissue the unlock OEM command containing the unique ID
(bootloader) of your device: 02804088433FD3D7
OKAY [ 0.001s]
finished. total time: 0.001s

The bootloader raised a warning about the warranty and asked for confirmation. Just copy the device ID and run unlock again:

root@thinkpad:/home/marlly/Android/SDK/platform-tools# ./fastboot oem unlock 02804088433FD3D7
(bootloader) Device is now unlocked
OKAY [ 6.160s]
finished. total time: 6.160s

Congrats, the bootloader on your device is now unlocked. You can reboot using the fastboot command to avoid taking the battery out to issue a restart:

root@thinkpad:/home/marlly/Android/SDK/platform-tools# ./fastboot reboot
rebooting...
finished. total time: 0.000s

After the restart you should see an Unlocked label in the top left corner of the screen while the device is booting.

2. Flash Clockworkmod recovery

ClockworkMod is a custom Android recovery that lets you take a full backup of a ROM with all its data (a nandroid backup), restore it, install a new ROM or kernel, mount a partition, etc. These things can't be done with the stock recovery.

To flash ClockworkMod, first download the version for the Atrix and boot into fastboot again. Just follow the steps in section 1.2 above and make sure your device is recognized in fastboot. Then run this command:

root@thinkpad:/home/marlly/Android/SDK/platform-tools# ./fastboot flash recovery /home/marlly/Android/Moto4G/recovery-clockwork-5.0.2.0-olympus.img
sending 'recovery' (4824 KB)...
OKAY [ 0.261s]
writing 'recovery'...
OKAY [ 0.760s]
finished. total time: 1.021s

Now when you restart the phone (by taking the battery out or via fastboot reboot) and hold Volume Up while booting, you should be able to access ClockworkMod from the Android Recovery menu item.

3. Backup of original Gingerbread ROM

As we are going to flash a new ROM, it is a very good idea to back up the stock ROM and create a so-called nandroid backup. When things go wrong, you have a safety net. With no backup of the original software, you would have to find and download the same version of the stock ROM from the internet, risking bricking your device. So take a backup now!

In the previous step we ended up with a freshly installed and booted ClockworkMod recovery. Navigate to backup and restore and select backup. CWM will automatically create a clockworkmod folder on the sdcard and put a timestamped backup into it. Restore is very similar: just select restore instead of backup, pick the right backup and confirm the restore.

4. Install CyanogenMod ROM

There are several custom ROMs available for the Atrix 4G at this time. CyanogenMod ROMs:

the MIUI v5 ROM and a few others. I have tried all of them and the only one stable enough for full-day use was CM7. Unfortunately, it's Gingerbread only, so quite old today. The others had a deep-sleep bug, probably due to the 3.1.x kernel, and I also experienced several restarts per day on my device. However, there are plenty of users on XDA satisfied with the mentioned ROMs and no such problems, so I suggest you try them yourself if you have enough time to play, and you will see.

Save the downloaded ROM to the sdcard and boot the device into CWM recovery. From the main menu, navigate to wipe data and perform a factory reset. After that, pick install zip from sdcard, select the ZIP file with the ROM and confirm. The ROM will be installed onto the /system partition. Once the installation has finished, reboot the device from the main menu. That's it, done. Your Atrix will now boot the custom ROM. Enjoy 🙂

EDIT 09/2015: since most download links seem to be dead now, I have put all the files I downloaded back in 2013 on Dropbox.

Mapping namespace-less query parameters using Spring MVC in Liferay

When you work with portlet applications, all request parameters must be namespaced, otherwise you won't be able to access their values through the Portlet API in Liferay. However, in some cases your portlet has to process a URL that was not constructed using the Portlet API. In this post I will show you how to map un-namespaced query parameters declaratively using the Spring MVC portlet framework.

Imagine you have a single portlet deployed on a portal page, and that portlet displays the content of an article whose ID is passed as a query parameter, for instance:

https://news.portal.my/?id=123

That's a pretty standard requirement. A URL like that is clean and easily bookmarkable. However, since the parameter id is not namespaced, your portlet won't have access to it through the Portlet API. So the following code prints null into the log:

@Controller
public class NewsController {

    private static final Logger LOG = Logger.getLogger(NewsController.class);

    @RenderMapping
    public void render(@RequestParam(value = "id", required = false) Integer id) {
        LOG.debug(id);
    }
}

In a portal, the namespace is unique for each portlet and is used as a prefix for all of the portlet's parameters. This mechanism helps avoid naming collisions when several portlets are deployed on the same page. For that reason you would have to put the portlet's namespace into the URL to make the prior code work:

https://news.portal.my/?p_p_id=news_WAR_portal&p_p_lifecycle=0&p_p_state=normal&p_p_mode=view&p_p_col_id=column-2&p_p_col_count=1&_news_WAR_portal_id=123

Now the URL is no longer short and clean. You could use Liferay's friendly URL feature to simplify it, although that would require additional XML configuration. However, you also have a second option: you can access the id parameter using the Liferay API.

@Controller
public class NewsController {

    private static final Logger LOG = Logger.getLogger(NewsController.class);

    @RenderMapping
    public void view(RenderRequest req) {
        HttpServletRequest liferayServletReq = PortalUtil.getHttpServletRequest(req);
        String paramId = PortalUtil.getOriginalServletRequest(liferayServletReq).getParameter("id");

        LOG.debug(paramId);
    }
}

That worked fine; the log revealed the correct article number, 123. But we can do better. We can wrap the above code in a special WebArgumentResolver implementation and create an additional annotation, similar to @RequestParam, to map un-namespaced query parameters to controller method arguments. Let's call it @QueryParam. With this approach we can utilise Spring's great annotation-based programming model.

@Target(ElementType.PARAMETER)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface QueryParam {

    String value() default "";

    boolean required() default true;

    String defaultValue() default ValueConstants.DEFAULT_NONE;
}

Notice that, except for the annotation name, the code is exactly the same as @RequestParam's and it works almost identically. Now the WebArgumentResolver implementation:

public class QueryParamResolver implements WebArgumentResolver {

    @Override
    public Object resolveArgument(MethodParameter param, NativeWebRequest request) throws Exception {
        Assert.isInstanceOf(PortletRequest.class, request.getNativeRequest(),
                "You can use @QueryParam only in application running within a portlet container!");
        if (!param.hasParameterAnnotation(QueryParam.class)) {
            return UNRESOLVED;
        }

        return mapQueryParamToObject(param, (PortletRequest) request.getNativeRequest());
    }

    private Object mapQueryParamToObject(MethodParameter param, PortletRequest portletReq) {
        QueryParam queryAnnot = param.getParameterAnnotation(QueryParam.class);
        String queryParamName = queryAnnot.value();
        String queryParamValue = getServletRequest(portletReq).getParameter(queryParamName);
        if (queryParamValue == null) {
            Class<?> paramType = param.getParameterType();
            if (queryAnnot.required()) {
                throw new IllegalStateException("Missing parameter '" + queryParamName + "' of type ["
                        + paramType.getName() + "]");
            }
            if (boolean.class.equals(paramType)) {
                return Boolean.FALSE;
            }
            if (paramType.isPrimitive()) {
                throw new IllegalStateException(
                        "Optional "
                                + paramType
                                + " parameter '"
                                + queryParamName
                                + "' is not present but cannot be translated into a null value due to being declared as a "
                                + "primitive type. Consider declaring it as object wrapper for the corresponding primitive type.");
            }
        }

        WebDataBinder binder = new WebRequestDataBinder(null, queryParamName);
        return binder.convertIfNecessary(queryParamValue, param.getParameterType(), param);
    }

    private HttpServletRequest getServletRequest(PortletRequest request) {
        return PortalUtil.getOriginalServletRequest(PortalUtil.getHttpServletRequest(request));
    }
}

Only one step is left. In order to map an un-namespaced query parameter to a render method parameter marked with the @QueryParam annotation, we need to register the resolver in our Spring context.

    <bean id="annotationMethodHandlerAdapter" class="org.springframework.web.portlet.mvc.annotation.AnnotationMethodHandlerAdapter">
        <property name="customArgumentResolver">
            <bean class="org.exitcode.liferay.QueryParamResolver" />
        </property>
    </bean>

Finally, we can place our new annotation into controller:

@Controller
public class NewsController {

    private static final Logger LOG = Logger.getLogger(NewsController.class);

    @RenderMapping
    public void render(@QueryParam(value = "id", required = false) Integer id) {
        LOG.debug(id);
    }
}

With the declarative approach our code is now much more readable, thanks to the clear separation between the part that describes the mapping (@QueryParam) and the part that actually performs it (QueryParamResolver).

Why Eclipse shows warning about synthetic accessor method

Recently one of my colleagues came to me with an interesting question about the builder pattern he saw in Effective Java, which I had lent him. He implemented the pattern in the same way Joshua Bloch does in his book, but no matter what, Eclipse displayed a weird warning message about a synthetic accessor. I explained the theory behind synthetic constructs to him, but I was not quite sure why Eclipse can raise a compiler warning in such cases. In this post I would like to explore that topic and find out why Eclipse does this.

The source code of the mentioned builder looks like the following:

public class NutritionFacts {

    private final int servingSize;
    private final int servings;
    private final int calories;
    private final int fat;
    private final int sodium;
    private final int carbohydrate;

    public static class Builder {

        private int servingSize;
        private int servings;
        private int calories;
        private int fat;
        private int sodium;
        private int carbohydrate;

        public Builder servingSize(int servingSize) {
            this.servingSize = servingSize;
            return this;
        }

        public Builder servings(int servings) {
            this.servings = servings;
            return this;
        }

        public Builder calories(int calories) {
            this.calories = calories;
            return this;
        }

        public Builder fat(int fat) {
            this.fat = fat;
            return this;
        }

        public Builder sodium(int sodium) {
            this.sodium = sodium;
            return this;
        }
        
        public Builder carbohydrate(int carbohydrate) {
            this.carbohydrate = carbohydrate;
            return this;
        }

        public NutritionFacts build() {
            return new NutritionFacts(this);
        }
    }
    
    private NutritionFacts(Builder builder) {
        this.servingSize = builder.servingSize;
        this.servings = builder.servings;
        this.calories = builder.calories;
        this.fat = builder.fat;
        this.sodium = builder.sodium;
        this.carbohydrate = builder.carbohydrate;
    }
}

This is the most typical builder implementation. The immutable object NutritionFacts is created by invoking its private constructor from the builder's build() method, passing the builder instance as a parameter. So it's a kind of copy constructor which copies data from the builder into the created instance, which can't be modified afterwards.

Problem

If you change the compiler severity level from ignore (the default) to warning for:

Access to a non-accessible member of an enclosing type

in Eclipse IDE (Preferences->Java->Compiler->Errors/Warnings->Code Style), you will see following warnings for builder above:

Access to enclosing constructor is emulated by a synthetic accessor method (line 26)
Read access to enclosing field is emulated by a synthetic accessor method (lines 31 – 36)

Syntethic constructs

Let's recall why the Java compiler must generate synthetic accessor methods. As you certainly know, private fields/methods are not visible outside a class. The same principle applies to nested classes as well, because at the JVM level there are no nested classes. The concept of nested classes is known only to the Java language at the source code level; bytecode has no support for them. After compilation, they end up as standalone classes in their own files. Therefore, on the JVM, private fields/methods of the enclosing class are not accessible to nested classes, and the enclosing class can't access private members of its nested classes (no matter whether they are static or inner). However, according to the JLS, Java does allow access to private fields/methods of a nested class. So there are two opposing sides: the Java language, whose syntax allows it, and the JVM, which refuses to execute it. For that reason the Java compiler must generate package-private synthetic methods, through which private members of the enclosing class can be accessed from nested classes and vice versa, without any change to the source code. Let's look at what the generated synthetic constructor looks like:

/*synthetic*/ NutritionFacts(NutritionFacts.Builder x0, NutritionFacts$1 x1) {
    this(x0);
}

So the generated synthetic package-private constructor calls the original private constructor. But how does the new synthetic constructor get called? That's again the work of the Java compiler, which rewrites the body of the build() method inside the builder into the following form:

public NutritionFacts build() {
    return new NutritionFacts(this, null);
}

As the second constructor parameter is useless, the compiler can safely put a null value in it. Similar generation is also done for the private builder fields that are read within the private NutritionFacts constructor. The synthetic accessor methods look as follows:

    /*synthetic*/ static int access$600(NutritionFacts$Builder x0) {
        return x0.carbohydrate;
    }

    /*synthetic*/ static int access$500(NutritionFacts$Builder x0) {
        return x0.sodium;
    }

    /*synthetic*/ static int access$400(NutritionFacts$Builder x0) {
        return x0.fat;
    }

    /*synthetic*/ static int access$300(NutritionFacts$Builder x0) {
        return x0.calories;
    }

    /*synthetic*/ static int access$200(NutritionFacts$Builder x0) {
        return x0.servings;
    }

    /*synthetic*/ static int access$100(NutritionFacts$Builder x0) {
        return x0.servingSize;
    }

Of course, the Java compiler also needs to modify the private constructor, which must use the generated synthetic methods to gain access to the private fields of the builder:

    private NutritionFacts(NutritionFacts$Builder builder) {
        super();
        this.servingSize = NutritionFacts.Builder.access$100(builder);
        this.servings = NutritionFacts.Builder.access$200(builder);
        this.calories = NutritionFacts.Builder.access$300(builder);
        this.fat = NutritionFacts.Builder.access$400(builder);
        this.sodium = NutritionFacts.Builder.access$500(builder);
        this.carbohydrate = NutritionFacts.Builder.access$600(builder);
    }

If you are interested in how I obtained the generated code above, please look at Sundararajan's post that explains some hidden options of the Java compiler. You simply need to invoke the compiler with the -XD-printflat option:

javac -XD-printflat NutritionFacts.java -d generated-src

With this command, the Java compiler emits source code that has already passed through some internal transformations (synthetic methods, assertions etc.) into the generated-src directory. Note that you must create this directory before you run the compiler.

Possible warning reasons

So now that you have mastered the basics of synthetic constructs, let's get back to the Eclipse warnings. What are the possible unwanted implications of using synthetic constructs, and why does Eclipse offer such alerts?

  • Performance penalty – almost negligible on a modern JVM if you look at the synthetic code the compiler generates
  • Security breach – the generated package-private synthetic constructor can be accessed via reflection, allowing creation of an instance of a class that was never intended to be instantiable from the outside at source code level. You have to take this into account when you model your class and rely on synthetic accessors
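The accessor situation can be probed directly with reflection. Below is a minimal, self-contained sketch (the Secret/Builder names are mine, not from the post): it lists every constructor the compiler produced for a private-constructor class that is instantiated only through its nested builder. Keep in mind that on JDK 11 and newer, nest-based access control makes these accessors unnecessary, so the extra synthetic constructor appears only with older compilers.

```java
import java.lang.reflect.Constructor;

public class SyntheticProbe {

    static class Secret {
        private Secret() { }

        static class Builder {
            // Calling the private constructor from a nested class is what
            // forces older javac versions to emit a synthetic accessor.
            Secret build() { return new Secret(); }
        }
    }

    public static void main(String[] args) {
        // Print every constructor javac produced for Secret; with pre-11
        // compilers an extra package-private synthetic one shows up here.
        for (Constructor<?> c : Secret.class.getDeclaredConstructors()) {
            System.out.println(c.toGenericString() + " | synthetic: " + c.isSynthetic());
        }
    }
}
```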

Conclusion

On the builder example I tried to demonstrate how the Java compiler generates synthetic constructs. Fortunately, Eclipse doesn't enforce such warnings and ignores them by default, until you state otherwise. Personally, I would continue to take advantage of synthetic methods for nested classes, as they reduce the amount of code that would otherwise need to be written. The second reason is that your private members stay private and inaccessible without resorting to reflection.

Interface vs annotation driven events

A few days ago I was at a crossroads, having to design and implement a simple notification system for one project. I had decided to go with the standard interface-driven setup, the common approach in Java since the beginning. However, after some googling I came across another very interesting solution, EventBus. It's a simple, annotation-driven event utility that's part of the Google Guava library.

EventBus allows publish-subscribe-style communication between components without requiring the components to explicitly register with one another (and thus be aware of each other). It is designed exclusively to replace traditional Java in-process event distribution using explicit registration. It is not a general-purpose publish-subscribe system, nor is it intended for interprocess communication.

Interface-driven solution
Before dive into EventBus, let me show you how the typical interface-based notification system could look like. First we need to define a simple immutable object that should be transferred using event from producer to consumers (reused also in EventBus example):

public class Message {

    private final String sender;

    private final String subject;

    private final String text;

    public Message(String sender, String subject, String text) {
        this.sender = sender;
        this.subject = subject;
        this.text = text;
    }

    public String getSender() {
        return sender;
    }

    public String getSubject() {
        return subject;
    }

    public String getText() {
        return text;
    }
}

Then we need to create a listener and an event that carries an object of the above Message type. A single listener must exist for each event (I used a very effective technique proposed by Laurent Simon, where listener and event are bound to each other within the event class):

public interface MessageRecievedEventListener {
    public void messageRecieved(Message msg);
}
public interface SystemEvent<L> {
    public void notify(L listener);
}
public class MessageRecievedEvent implements SystemEvent<MessageRecievedEventListener> {

    private final Message msg;

    public MessageRecievedEvent(Message msg) {
        this.msg = msg;
    }

    @Override
    public void notify(MessageRecievedEventListener listener) {
        listener.messageRecieved(msg);
    }
}

Now when we have an event and corresponding listener, we can create a consumer of above event:

public class MessageReceivedEventConsumer implements MessageRecievedEventListener {

    private static final Logger LOG = LoggerFactory.getLogger(MessageReceivedEventConsumer.class);

    @Override
    public void messageRecieved(Message msg) {
        LOG.info("messageRecieved(), msg: {}", msg);
    }
}

The only thing missing is the dispatcher. The dispatcher is the component responsible for registering consumers to a particular event and firing events. After a specific event is fired, all registered consumers receive the exact event fired. In our implementation the dispatcher is a black box, since it knows nothing about any specific event or listener; it works only with the SystemEvent interface.

public class SystemEventBus {

    // ReentrantReadWriteLock could be used if synchronization has proven to be a bottleneck
    @SuppressWarnings("rawtypes")
    private final Multimap<Class, Object> eventBusListeners = Multimaps.synchronizedMultimap(HashMultimap.<Class, Object> create());

    public <L> void registerListener(Class<? extends SystemEvent<L>> eventClass, L listener) {
        eventBusListeners.put(eventClass, listener);
    }

    @SuppressWarnings("unchecked")
    public <L> void fireEvent(SystemEvent<L> event) {
        Collection<L> eventListeners = (Collection<L>) eventBusListeners.get(event.getClass());
        for (L listener : eventListeners) {
            event.notify(listener);
        }
    }
}

EDIT 08/2015: Sources including unit test are now available on GitHub.
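To see this interface-driven dispatch in action without pulling in Guava, here is a self-contained sketch using only java.util (the GreetingEvent/GreetingListener names are illustrative, not from the post). It mirrors the registerListener/fireEvent flow of SystemEventBus above, just with a plain HashMap of listener lists standing in for the Multimap:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MiniEventBusDemo {

    // Same contract as the SystemEvent interface above
    interface SystemEvent<L> {
        void notify(L listener);
    }

    // Illustrative listener/event pair bound to each other via generics
    interface GreetingListener {
        void greeted(String name);
    }

    static class GreetingEvent implements SystemEvent<GreetingListener> {
        private final String name;

        GreetingEvent(String name) {
            this.name = name;
        }

        @Override
        public void notify(GreetingListener listener) {
            listener.greeted(name);
        }
    }

    // Guava-free dispatcher: listener lists keyed by event class
    static class MiniEventBus {
        private final Map<Class<?>, List<Object>> listeners = new HashMap<>();

        <L> void registerListener(Class<? extends SystemEvent<L>> eventClass, L listener) {
            listeners.computeIfAbsent(eventClass, k -> new ArrayList<>()).add(listener);
        }

        @SuppressWarnings("unchecked")
        <L> void fireEvent(SystemEvent<L> event) {
            for (Object listener : listeners.getOrDefault(event.getClass(), Collections.emptyList())) {
                event.notify((L) listener);
            }
        }
    }

    public static void main(String[] args) {
        MiniEventBus bus = new MiniEventBus();
        // The listener interface is functional, so a lambda works here
        bus.registerListener(GreetingEvent.class, name -> System.out.println("Hello, " + name));
        bus.fireEvent(new GreetingEvent("world")); // prints "Hello, world"
    }
}
```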

Advantages:

  • static typing
  • dispatcher is a black box, it knows nothing about particular event or listener interface

Disadvantages:

  • necessary to create listener interface for each event
  • potential collision of method names in listener interfaces (when subscriber implements multiple listener interfaces)

Annotation-driven solution
On the other side, implementing the preceding example using EventBus is much easier. No specific interfaces are required. The listener only has to define a public method, marked by the Subscribe annotation, with one parameter: the event it wants to capture.

public class MessageRecievedEvent {

    private final Message msg;

    public MessageRecievedEvent(Message msg) {
        this.msg = msg;
    }

    public Message getMsg() {
        return msg;
    }
}
public class MessageRecievedEventConsumer {

    private static final Logger LOG = LoggerFactory.getLogger(MessageRecievedEventConsumer.class);

    @Subscribe
    public void messageRecieved(MessageRecievedEvent e) {
        LOG.info("messageRecieved(), msg: {}", e.getMsg());
    }
}

The listener must be registered with an EventBus instance in order to be notified when an event is fired.

EventBus eventBus = new EventBus();
eventBus.register(new MessageRecievedEventConsumer());

Message msg = new Message("marlly", "Interface vs annotation driven events", "Post about differences between those event architectures");
eventBus.post(new MessageRecievedEvent(msg));

EDIT 08/2015: Sources including unit test are now available on GitHub.

Advantages:

  • less code
  • no specific listener interface for each event
  • can listen to an event supertype and take advantage of inheritance
  • detects events that have no attached listeners (dead events)

Disadvantages:

  • arguable lack of static typing (the register and post methods accept a parameter of type Object)

Conclusion
If you are already utilizing the Google Guava library and are looking for a simple notification system, you should definitely use EventBus. For others, just add the Guava library and use it too :). It's a really simple and effective way to handle events.

Non-ASCII file names in ZIP archive

Recently one of our clients reported a bug regarding the usage of Czech national characters in file names within a ZIP archive. They just didn't display correctly. After some analysis I discovered something I never believed was possible nowadays. Windows 7 has no native support for UTF-8 encoded file names in ZIP archives! Come on, Microsoft, it's 2011 and support for UTF-8 file name characters has been around for at least 5 years (officially introduced in v6.3.0 of the ZIP specification).

The whole problem with file name encoding lies in the fact that the ZIP format by default uses the IBM PC character encoding set, also known as IBM Code Page 437, IBM437 or CP437. Unfortunately, this code page restricts file name characters to those within the original MS-DOS range, so it's quite limited. Therefore, if you want to use most national characters in file names within a ZIP, you basically have two options:

  • Use UTF-8 and set the language encoding flag to instruct the processing tool that characters in file names are encoded in UTF-8
  • Use whatever encoding is native to your specific target platform

First Option
With the first option you can achieve the best interoperability among operating systems. The downside of this approach is that Windows users have to use a third-party application to handle such ZIP archives, because the Compressed Folders tool doesn't display UTF-8 characters correctly. All well-known ZIP processing tools I tried on Windows (WinZip, WinRAR, 7-Zip) were able to display UTF-8 encoded file names properly. 7-Zip on Unix-based systems also displayed such file names correctly. Here is a Java code snippet that creates a ZIP archive containing two empty files with Slovak national characters in each file name.

ZipArchiveOutputStream zipOut = new ZipArchiveOutputStream(new FileOutputStream("/tmp/utf8.zip"));
zipOut.setEncoding("UTF-8");
zipOut.setUseLanguageEncodingFlag(true);
zipOut.putArchiveEntry(new ZipArchiveEntry("1_ľščťžýáíé.txt"));
zipOut.closeArchiveEntry();
zipOut.putArchiveEntry(new ZipArchiveEntry("2_úäôňďúě.txt"));
zipOut.closeArchiveEntry();
zipOut.flush();
zipOut.close();

This example uses the Apache Commons Compress library, which allows you to specify the encoding and set the language flag. If you are lucky and already using Java 7, released last month, you can utilize the classes from the java.util.zip package, which gained new constructors for setting the encoding. In addition, these classes use UTF-8 by default and read/write the language encoding flag. On Java versions <= 1.6, just stay with the commons-compress library.
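For completeness, here is a sketch of the same archive written with plain java.util.zip on Java 7+ (writing to a temp file rather than a hard-coded path). ZipOutputStream defaults to UTF-8 and sets the language encoding flag on its own; the explicit Charset constructor shown below exists on both ZipOutputStream and ZipFile:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class Utf8ZipDemo {
    public static void main(String[] args) throws IOException {
        Path zip = Files.createTempFile("utf8", ".zip");
        // UTF-8 is already the default; passing it explicitly for clarity
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(zip), StandardCharsets.UTF_8)) {
            out.putNextEntry(new ZipEntry("1_ľščťžýáíé.txt"));
            out.closeEntry();
            out.putNextEntry(new ZipEntry("2_úäôňďúě.txt"));
            out.closeEntry();
        }
        // Read the entry names back to verify they survived the round trip
        try (ZipFile zf = new ZipFile(zip.toFile(), StandardCharsets.UTF_8)) {
            zf.stream().forEach(e -> System.out.println(e.getName()));
        }
        Files.delete(zip);
    }
}
```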

Second Option
The second option is the way to go when you target only one operating system using a specific code page (that's our customer's case and the approach I eventually employed). Suppose all your users use Windows with code page 852 (CP852, IBM852 – the standard code page used by Central European countries). In this case you can generate the ZIP archive in almost the same way as above, but this time set the encoding to CP852 and omit the encoding flag.

ZipArchiveOutputStream zipOut = new ZipArchiveOutputStream(new FileOutputStream("/tmp/cp852.zip"));
zipOut.setEncoding("CP852");
zipOut.putArchiveEntry(new ZipArchiveEntry("1_ľščťžýáíé.txt"));
zipOut.closeArchiveEntry();
zipOut.putArchiveEntry(new ZipArchiveEntry("2_úäôňďúě.txt"));
zipOut.closeArchiveEntry();
zipOut.flush();
zipOut.close();
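The same can be sketched with plain java.util.zip on Java 7+ by passing the CP852 charset explicitly. This assumes the JDK's extended charsets (which include IBM852) are available, as they are in a full JDK; the temp-file path is illustrative:

```java
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class Cp852ZipDemo {
    public static void main(String[] args) throws IOException {
        // CP852, the Central European DOS code page
        Charset cp852 = Charset.forName("IBM852");
        Path zip = Files.createTempFile("cp852", ".zip");
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(zip), cp852)) {
            out.putNextEntry(new ZipEntry("1_ľščťžýáíé.txt"));
            out.closeEntry();
        }
        // Reading the names back requires the same charset: java.util.zip
        // defaults to UTF-8, it does not fall back to CP437
        try (ZipFile zf = new ZipFile(zip.toFile(), cp852)) {
            zf.stream().forEach(e -> System.out.println(e.getName()));
        }
        Files.delete(zip);
    }
}
```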

Every tool on a platform using default code page 852 will display national characters from this ZIP file correctly, including the Windows Compressed Folders tool. To find out what code page Windows currently uses, simply navigate to the following node in the registry:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Nls\CodePage

and look for a value named OEMCP.

And remember, there is no such thing as a universal, always-working approach to ZIP file name encoding.