Embedding React into legacy web pages

The product I am enthusiastically working on – Navigo3 – was historically written as a combination of XSLT and JSP. Do you remember XSLT?


Shortly before I joined, it was decided that it should be rewritten in React.js. It was a great decision, but the transition is not that easy.

Because Navigo3 is in production use, we have to migrate from XSLT to React step by step. Some pages are so complex that it would be handy to be able to combine XSLT and React on a single page. This is typically needed on pages that combine many types of information – dashboards: a few graphs, tables, a list of tasks, … It would be hard to rewrite everything at once, but rewriting each piece on its own is quite simple.


At the same time we have new pages which are written completely in React and use hash-based navigation. The main entry file index.js contains the redux-router hierarchy and expects to run on a purely React page. This does not really match the requirement to embed some components into a static JSP/XSLT page with classical URL navigation, right?

But I would like to share React components between brand new React pages and legacy XSLT pages! And at some point simply compose a React page from the existing components and abandon the legacy ones. So how to do that?

  1. Prepare a second entry file (like indexLegacy.js) and a build task for it (we use Webpack).
  2. Insert the resulting bundle via <script src="bundleLegacy.js"></script> into the JSP/XSLT pages right before the </body> element.
  3. In the JSP/XSLT/whatever page, put an empty DIV tag in the place where the React components should be rendered and give it a unique id or class name – like "hack_legacy_tasks_list".
  4. In the new entry file, write logic that basically performs a list of IFs that fill the prepared DIVs – but only if they exist:
    • Check whether the desired element with the unique name is present in the DOM (we use jQuery for that)
    • If it is present, render the desired component into it (using ReactDOM.render(<MyComponent/>, element))
  5. Of course, the list of IFs can be replaced by some metadata and generic code that processes it.
  6. If a component needs some context, you can wrap it. But you have to do it for each of them separately – they do not share a tree hierarchy (React instance).
  7. If all components on a page need to share something, it can be instantiated in the entry file and passed to all of them.
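On the legacy side, the markup from steps 2 and 3 might look roughly like this sketch (bundleLegacy.js and hack_legacy_tasks_list are just the example names used above):

```html
<body>
  <!-- existing JSP/XSLT content stays untouched -->

  <!-- empty placeholder: the legacy bundle renders a React component here if it finds it -->
  <div class="hack_legacy_tasks_list"></div>

  <!-- the second Webpack bundle, included right before </body> -->
  <script src="bundleLegacy.js"></script>
</body>
```

If a placeholder is missing, the bundle simply renders nothing for it, so the same bundle can safely be included on every legacy page.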

That’s it. Let’s sum up some advantages:

  • Components are shared between shiny new React pages and legacy pages.
  • There is a single entry bundle for all the legacy stuff. If your legacy pages include a shared footer file, you can place the reference to the legacy bundle there. Because unique ids/class names are used for binding, it should not harm any existing content.
  • You may place multiple components on a legacy page – just insert multiple DIVs.

React.js FTW!

Here is a sample of indexLegacy.js:

import React from "react"
import {render} from "react-dom"
import moment from "moment"

import jQuery from 'jquery'

import {Utils} from './utils/Utils'

//container for legacy components
import {LegacyApp} from './containers/LegacyApp'

import {ReactDemo} from './quark/ReactDemo'
//...and other 10+ components

//key is CSS selector, value is function that returns React element
const instances = {
  '.hack_react_demo': (elem)=><ReactDemo/>,   //creates the React element for each <div class='hack_react_demo'/> found on the page
  //...and other 10+ components
}

//render reactElement into DOM element wrapped by LegacyApp
function __createReactInstance(element, reactElement) {
  render(<LegacyApp>{reactElement}</LegacyApp>, element)
}

//for each entry in instances object
Object.keys(instances).forEach((selector)=>{
  //for each DOM element found by jQuery
  jQuery(selector).each((i, element)=>{
    //get function for selector and call it (passing DOM element)
    const reactElement = instances[selector](element)

    //create React component
    __createReactInstance(element, reactElement)
  })
})

Selenium: Get rid of logs in Eclipse

Selenium for Java uses the java.util.logging (JUL) infrastructure by default. During development I have seen logs like:

Nov 21, 2016 8:16:32 AM org.openqa.selenium.remote.ProtocolHandshake createSession
INFO: Attempting bi-dialect session, assuming Postel's Law holds true on the remote end
Nov 21, 2016 8:16:34 AM org.openqa.selenium.remote.ProtocolHandshake createSession
INFO: Detected dialect: OSS

I am using log4j2 through slf4j, and just adding a logger to the configuration file did not work. The solution was:

1. Add a Gradle/Maven dependency on log4j-jul
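In Gradle this is a single extra dependency. A sketch – the version below is just an example, pick the one matching your log4j2 version:

```groovy
dependencies {
  //bridges java.util.logging (used by Selenium) into log4j2
  runtime group: 'org.apache.logging.log4j', name: 'log4j-jul', version: '2.7'
}
```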

2. As the documentation says, set the system property "java.util.logging.manager" to "org.apache.logging.log4j.jul.LogManager", either via a -D parameter or in main like:

public static void main(String[] args) {
  System.setProperty("java.util.logging.manager", "org.apache.logging.log4j.jul.LogManager");

  //...rest of main
}

It must be set BEFORE any call to log4j!

3. Define a log4j logger:

<Logger name="org.openqa.selenium" level="warn" additivity="false">
 <AppenderRef ref="console"/>
</Logger>

And the unnecessary logs are gone!

git gui: Error when opening dialogs

I have a 3-monitor configuration and recently I noticed an issue in ‘git gui’. When I tried to revert changes or merge a branch, this dialog appeared:


That caused the window to hang. The following error was visible on the command line:

jarek@lp-jarek:~/Navigo3/master/np$ git gui
bgerror failed to handle background error.
 Original error: bad pad value "3m": must be positive screen distance
 Error in bgerror: bad pad value "3m": must be positive screen distance

After some googling I found the cause (bogus DPI):

jarek@lp-jarek:~$ xdpyinfo | grep -A1 dimen
 dimensions: 5408x1152 pixels (0x0 millimeters)
 resolution: -2147483648x-2147483648 dots per inch

And fixed it via:

jarek@lp-jarek:~$ xrandr --dpi 96
jarek@lp-jarek:~$ xdpyinfo | grep -A1 dimen
 dimensions: 5408x1152 pixels (1430x304 millimeters)
 resolution: 96x96 dots per inch

i-tec USB 3.0 Dual Docking Station & Ubuntu 16.04

[There is an updated article for Ubuntu 17.10]

My desktop looks like:


I wanted to have two external monitors, but my Dell Inspiron 15 (7559) has only one HDMI port. Another issue was that I had to plug/unplug many cables whenever I wanted to go to a meeting room. So I decided to buy a USB 3.0 dock – not an easy task when you love Linux, because most docks support only Windows and OS X.

I purchased the i-tec USB 3.0 Dual Docking Station because they promise Ubuntu 16.04 compatibility. They only “forgot” to mention that only KMS drivers are supported – so forget the official NVIDIA binary driver. Even with the Nouveau driver there were issues, and because I had other problems with it anyway, I switched the dedicated card off completely.

Then it was necessary to install the DisplayLink driver from here. But after a reboot the graphics part of the dock did not work. By running dmesg I found out that the kernel module was not loaded: I am using SecureBoot and their kernel module is not signed. So I had to do the following as root:

cd /root
mkdir signing
cd signing

#generate certificate - do this once, but keep result files - you will have to sign modules for every kernel update
openssl req -new -x509 -newkey rsa:2048 -keyout MOK.priv -outform DER -out MOK.der -nodes -days 36500 -subj "/CN=Descriptive name/"

#import your certificate as trusted into SecureBoot
#remember entered password - you have to fill it after reboot and confirm adding certificate
mokutil --import MOK.der

#sign your module - you have to do this on every kernel update
/usr/src/linux-headers-$(uname -r)/scripts/sign-file sha256 ./MOK.priv ./MOK.der $(modinfo -n evdi)

Then reboot your computer, authorize the new certificate, and the dock should work!

Currently I have one last issue: when playing music through the dock, it resets from time to time – it disconnects the monitors, network and audio…

Dell Inspiron 15 (7559), Ubuntu 16.04 and NVIDIA GeForce GTX 960M

I purchased a Dell Inspiron 15 (7559) when I joined my current job at Navigo3. I was asked to buy a Dell to keep the same brand as the other laptops in the company. I did some research and the only feasible option (SSD, 16 GB RAM, …) was this one. Unfortunately it is based on Intel Skylake, which is not yet fully supported in Ubuntu 16.04. It should get better with the 4.8 kernel, which should be backported in February 2017.

However, the main issue is the dedicated graphics. I have experienced various problems: freezing during boot, broken suspend, trouble setting resolutions on multiple screens, etc. Because I mostly do web development, I don’t need a dedicated graphics card, so I decided to switch it off and keep only the Intel HD Graphics 530. It is quite easy:

Install bbswitch-dkms:

apt install bbswitch-dkms

Switch off the NVIDIA card by appending the following to /etc/modules:

bbswitch load_state=0

Blacklist NVIDIA by appending the following to /etc/modprobe.d/blacklist.conf:

blacklist nouveau
blacklist nvidia


And finally update the initial ramdisk by running:

update-initramfs -u

After a reboot, only the Intel card should be enabled:


Credits: http://askubuntu.com/a/709552/93726 (Sorry, I cannot upvote because I don’t have enough reputation points)

PAC manager: Slow clipboard paste by middle click

I created a new AWS instance and defined a connection to it in PAC manager. Later I realized that pasting via middle mouse click or via the context menu was incredibly slow – almost like typing it on the keyboard. Pasting via CTRL+INSERT was fast.

I suffered with it for two days and did not succeed in googling the issue. Today I found out that the problem was the “Wait 200 millisecs for automated char sending” setting. In the rest of my connections it is set to 0, so I have no idea how it got changed – maybe I set it by accident.


Gradle + Immutables + Eclipse

There is some documentation about IDE integration directly on the Immutables website. Unfortunately, none of the proposed approaches works.

Fortunately there is the Gradle plugin gradle-processors. Usage is dead simple:

plugins {
  id 'org.inferred.processors' version '1.2.3'
}

dependencies {
  processor group: 'org.immutables', name: 'value', version: '2.2.6'
}

(Please note the usage of processor instead of compile in the dependencies block.)

That’s it! Just run gradle eclipse and restart Eclipse.