Java Posts

Creating a REST API Call with a WorkItemHandler on Red Hat Process Automation Manager

Usually we use tasks to represent the steps of a process in Red Hat Process Automation Manager (RHPAM). But quite often we need a customized task, and that's where a WorkItemHandler comes in handy.
For this demo, I'm creating a simple WorkItemHandler that adds a custom task to RHPAM which fetches a response from a third-party API provider.

So here is pretty much the raw concept,

As usual, we start by creating a pom.xml file; it is important to set the dependencies' “scope” to provided.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.edw</groupId>
    <artifactId>AnimeWorkItemHandler</artifactId>
    <version>1.0.4</version>
    <description>a simple rest api to be used within RHPAM</description>
    <packaging>jar</packaging>


    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>

    <distributionManagement>
        <repository>
            <id>repo-custom</id>
            <url>http://nexus:8081/repository/pam/</url>
        </repository>
        <snapshotRepository>
            <id>repo-custom</id>
            <url>http://nexus:8081/repository/pam/</url>
        </snapshotRepository>
    </distributionManagement>

    <dependencies>
        <dependency>
            <groupId>org.jbpm</groupId>
            <artifactId>jbpm-flow</artifactId>
            <version>7.38.0.Final</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-core-asl</artifactId>
            <version>1.9.9</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-mapper-asl</artifactId>
            <version>1.9.9</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-xc</artifactId>
            <version>1.9.9</version>
            <scope>provided</scope>
        </dependency>

    </dependencies>
</project>

and a simple Java class that does a REST API call,

package com.edw;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.util.HashMap;
import java.util.List;

import org.codehaus.jackson.map.ObjectMapper;
import org.drools.core.process.instance.WorkItemHandler;
import org.kie.api.runtime.process.WorkItem;
import org.kie.api.runtime.process.WorkItemManager;

public class AnimeWorkItemHandler implements WorkItemHandler {
    public void abortWorkItem(WorkItem wi, WorkItemManager wim) {
        wim.abortWorkItem(wi.getId());
    }

    // fires a request to the https://jikan.moe/ API
    public void executeWorkItem(WorkItem wi, WorkItemManager wim) {
        String name = (String) wi.getParameter("name");

        String nameResponse = "";
        String imageUrl = "";

        try {
            // api endpoint = https://api.jikan.moe/v3/search/character?q=???&limit=1
            URL url = new URL(String.format("https://api.jikan.moe/v3/search/character?q=%s&limit=1", name));

            URLConnection urlConnection = url.openConnection();
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(
                            urlConnection.getInputStream()));
            String inputLine;
            StringBuffer stringBuffer = new StringBuffer();
            while ((inputLine = in.readLine()) != null) {
                stringBuffer.append(inputLine);
            }

            ObjectMapper objectMapper = new ObjectMapper();
            HashMap jsonResult = objectMapper.readValue(stringBuffer.toString(), HashMap.class);
            List<HashMap> results = (List<HashMap>) jsonResult.get("results");

            nameResponse = (String) results.get(0).get("name");
            imageUrl = (String) results.get(0).get("image_url");

            in.close();
        } catch (Exception e) {
            e.printStackTrace();
        }

        HashMap result = new HashMap();
        result.put("nameResponse", nameResponse);
        result.put("imageUrl", imageUrl);
        wim.completeWorkItem(wi.getId(), result);
    }
}
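Before packaging, the handler can optionally be sanity-checked outside of RHPAM. The class below is purely a hypothetical local check and not part of the actual project: it assumes the provided-scoped jBPM/Drools jars are available on the local classpath, builds a fake work item, and uses a bare-bones WorkItemManager that simply prints whatever the handler returns.

package com.edw;

import java.util.Map;

import org.drools.core.process.instance.impl.WorkItemImpl;
import org.kie.api.runtime.process.WorkItemHandler;
import org.kie.api.runtime.process.WorkItemManager;

// hypothetical throw-away check, run locally before building the jar
public class AnimeWorkItemHandlerLocalCheck {
    public static void main(String[] args) {
        // fake work item carrying the same "name" parameter the process would pass
        WorkItemImpl workItem = new WorkItemImpl();
        workItem.setName("AnimeWorkItemHandler");
        workItem.setParameter("name", "naruto");

        // minimal WorkItemManager that only prints the handler's results
        WorkItemManager manager = new WorkItemManager() {
            @Override
            public void completeWorkItem(long id, Map<String, Object> results) {
                System.out.println("completed work item " + id + " with results " + results);
            }

            @Override
            public void abortWorkItem(long id) {
                System.out.println("aborted work item " + id);
            }

            @Override
            public void registerWorkItemHandler(String workItemName, WorkItemHandler handler) {
                // not needed for this check
            }
        };

        new AnimeWorkItemHandler().executeWorkItem(workItem, manager);
    }
}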

Once created, we need to build it using a Maven command

mvn clean package

After the jar has been built, the next step is to upload it through Business Central,

If the upload succeeds, we can see the uploaded jar file in our Nexus repository

Next, go to the RHPAM dashboard and create a work item definition

And put in these values; make sure the contents of “parameters” and “results” match the ones in your Java class, which in my case are “name”, “nameResponse” and “imageUrl”.

[
  [
    "name" : "AnimeWorkItemHandler", 
    "parameters" : [ 
        "name" : new StringDataType() 
    ], 
    "results" : [ 
        "nameResponse" : new StringDataType(),
        "imageUrl" : new StringDataType() 
    ], 
    "displayName" : "AnimeWorkItemHandler", 
    "icon" : "" 
  ]
]

The next step, and perhaps the most important, is to include our newly created jar file in the RHPAM project. We can do so by going to the “Settings” tab and registering a new Work Item Handler there (typically mapping the handler name, AnimeWorkItemHandler, to an MVEL expression such as new com.edw.AnimeWorkItemHandler()),

And import the needed library and dependency

The last step is adding our new WorkItemHandler to our existing workflow,

So as you can see, it’s not that difficult to create a custom task on RHPAM. The full code for this project can be accessed on my GitHub page,

https://github.com/edwin/anime-work-item-handler

Securing Connection Between Pods in Openshift with SSL

In this post, I'm creating a simple microservices application on top of Openshift 3.11, where the services communicate with each other over a secure connection using a self-signed SSL certificate managed by Openshift.

The reason for letting Openshift manage the SSL certificate through an Openshift Secret is to be able to roll or rotate the certificate for every service from Openshift itself, without having to replace the SSL certificate on each service manually.

First, generate a PKCS12 (p12) keystore using keytool

cert>keytool -genkey -alias edw -keystore edw.p12 -storetype PKCS12 -keyalg RSA -storepass password -validity 730 -keysize 4096
What is your first and last name?
  [Unknown]:  Edwin
What is the name of your organizational unit?
  [Unknown]:  Company 01
What is the name of your organization?
  [Unknown]:  IT
What is the name of your City or Locality?
  [Unknown]:  Jakarta
What is the name of your State or Province?
  [Unknown]:  Jakarta
What is the two-letter country code for this unit?
  [Unknown]:  ID
Is CN=Edwin, OU=Company 01, O=IT, L=Jakarta, ST=Jakarta, C=ID correct?
  [no]:  yes

Next is creating two Java projects which are connected to one another,

https://github.com/edwin/ssl-pods-example
https://github.com/edwin/ssl-pods-example-2

There are several parts of the code that need mentioning,

First is making sure HTTPS is enabled in application.properties, including our p12 certificate and parameterizing the certificate password. This parameter will later be injected as an environment variable on Openshift.

server.ssl.key-store-type=PKCS12
server.ssl.key-store=cert/edw.p12
server.ssl.key-store-password=${SSLPASSWORD}
server.ssl.key-alias=edw

server.port=8443
server.ssl.enabled=true

And next, because we are using a custom certificate, don't forget to include it in the RestTemplate configuration.

import java.io.FileInputStream;
import java.security.KeyStore;

import javax.net.ssl.SSLContext;

import org.apache.http.client.HttpClient;
import org.apache.http.conn.ssl.NoopHostnameVerifier;
import org.apache.http.conn.ssl.SSLConnectionSocketFactory;
import org.apache.http.conn.ssl.TrustSelfSignedStrategy;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.ssl.SSLContextBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

@Configuration
public class MyRestTemplate {

    @Value("${server.ssl.key-store}")
    private String sslKeyStore;

    @Value("${server.ssl.key-store-password}")
    private String sslPassword;

    @Bean
    public RestTemplate restTemplate() throws Exception {
        KeyStore clientStore = KeyStore.getInstance("PKCS12");
        clientStore.load(new FileInputStream(sslKeyStore), sslPassword.toCharArray());

        SSLContext sslContext = SSLContextBuilder
                .create()
                .loadTrustMaterial(clientStore, new TrustSelfSignedStrategy())
                .build();
        SSLConnectionSocketFactory socketFactory = new SSLConnectionSocketFactory(sslContext, NoopHostnameVerifier.INSTANCE);
        HttpClient httpClient = HttpClients.custom()
                .setSSLSocketFactory(socketFactory)
                .build();
        HttpComponentsClientHttpRequestFactory factory = new HttpComponentsClientHttpRequestFactory(httpClient);

        return new RestTemplate(factory);
    }
}
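For completeness, a caller in ssl-pods-example could then use this RestTemplate to reach the second service over HTTPS through its Openshift service name. The controller below is only an illustrative sketch and not code taken from the repositories; the /call endpoint, the ssl-pods-example-2 host name and its /hello path are assumptions.

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
public class HelloCallerController {

    private final RestTemplate restTemplate;

    public HelloCallerController(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }

    @GetMapping("/call")
    public String call() {
        // hypothetical service DNS name and path of the second pod
        return restTemplate.getForObject("https://ssl-pods-example-2:8443/hello", String.class);
    }
}

Because the RestTemplate trusts the self-signed certificate loaded from the shared keystore, this call succeeds even though the second pod does not present a certificate signed by a public CA.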

Deploy those two applications to Openshift,

oc new-app registry.access.redhat.com/openjdk/openjdk-11-rhel7~https://github.com/edwin/ssl-pods-example

oc new-app registry.access.redhat.com/openjdk/openjdk-11-rhel7~https://github.com/edwin/ssl-pods-example-2

Deploy the certificate as an OCP Secret and mount it as a volume in both applications,

oc create secret generic cert --from-file=cert\edw.p12

oc set volume dc ssl-pods-example --add -t secret -m /deployments/cert --name cert --secret-name cert
oc set volume dc ssl-pods-example-2 --add -t secret -m /deployments/cert --name cert --secret-name cert

And store the certificate password as an OCP Secret, injecting it as an environment variable into both applications

oc create secret generic sslpassword --from-literal=SSLPASSWORD=password

oc set env dc ssl-pods-example --from=secret/sslpassword 
oc set env dc ssl-pods-example-2 --from=secret/sslpassword 

After everything is deployed on OCP, next is to create a route for our application. I'm using the re-encrypt method to ensure end-to-end encryption within the app. To do so, we need to include our application's certificate as the route's destination CA certificate. We can export the certificate from the p12 file using this command,

keytool -exportcert -keystore edw.p12 -storetype PKCS12 -storepass password -alias edw -file edw.crt -rfc

And paste the certificate into our route's configuration,

The end result would look like the image below,

And as you can see, we are using certificates end to end to secure our connection.


Deploying a New Application and Building It Using the Openshift S2I Feature and a Custom Base Image

There are lots of ways to deploy apps to Openshift; one of them is the oc new-app command. Here we are creating a new app using that command, but specifying a custom base image for it. For this example, I'm using an OpenJDK 11 on RHEL 7 base image.

The command is quite easy; run it from your code folder

D:\source> oc new-app registry.access.redhat.com/openjdk/openjdk-11-rhel7~. --name=spring-boot-2

D:\source> oc start-build spring-boot-2 --from-dir=.

It will create a BuildConfig with the name of spring-boot-2,

D:\source> oc get bc spring-boot-2
NAME            TYPE      FROM      LATEST
spring-boot-2   Source    Binary    3

We can see the details of our BuildConfig by running this command,

D:\source> oc describe bc spring-boot-2

....
Strategy:       Source
From Image:     ImageStreamTag openjdk-11-rhel7:latest
Output to:      ImageStreamTag spring-boot-2:latest
Binary:         provided on build
....

And if we have some code changes and want to redeploy, we can run this command

D:\source> oc start-build spring-boot-2 --from-dir=.

It will rebuild the whole image using the new code uploaded from the existing source directory.


Creating an Async HTTP Process with Red Hat Fuse on Top of Openshift 3.11

I got a unique requirement yesterday where one process on Red Hat Fuse was taking a very long time, causing lots of timeout errors on the frontend. It happened because a process on Fuse performs a very complicated task, and we could not modify the existing API flow due to business requirements.

For this example, I want to simulate the same slowness in my Fuse + Spring Boot app,

package com.redhat.edw;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

A simple camel route,

package com.redhat.edw;

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.rest.RestBindingMode;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class Routes extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        restConfiguration("servlet")
                .bindingMode(RestBindingMode.json)
        ;

        rest()
                .get("hello")
                .route()
                .setBody(method("helloRouteHandler", "setHelloWithName"))
                .endRest()
        ;
    }
}

A placeholder bean,

package com.redhat.edw;

import lombok.*;

@Data
@ToString
@AllArgsConstructor
@NoArgsConstructor
@Builder
public class HelloResponse {
    private String content;
}

And a Handler class,

package com.redhat.edw;

import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Slf4j
@Component("helloRouteHandler")
public class HelloRouteHandler {

    @Value("${name}")
    private String name;

    public HelloResponse setHelloWithName() throws Exception {

        /*
         * simulate a very long process (10 seconds)
         */
        Thread.sleep(10000);

        /*
         * this process will still get called after 10 seconds, regardless of sync or async
         */
        log.info("calling hello for "+name);

        return HelloResponse.builder().content(name).build();
    }
}

And two simple properties files,

# The Camel context name
camel.springboot.name=FuseHelloWorld

# enable all management endpoints
endpoints.enabled=true
management.security.enabled=false

camel.component.servlet.mapping.contextPath=/api/*

name=Edwin
# OpenShift ConfigMap name
spring.application.name=fuse-hello-world

spring.cloud.kubernetes.reload.enabled=true
spring.cloud.kubernetes.reload.strategy=restart_context
spring.cloud.kubernetes.reload.monitoring-config-maps=true
spring.cloud.kubernetes.reload.monitoring-secrets=true
spring.cloud.kubernetes.reload.mode=polling

Run the project, try calling the /api/hello API, and see how much time is needed to get a response.

As you can see, 10 seconds is not an acceptable response time for a regular web user, and we need to improve it. The workaround is quite simple: Spring has an @Async annotation, which we will utilize in the Handler class,

package com.redhat.edw;

import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Component;

@Slf4j
@Component("helloRouteHandler")
public class HelloRouteHandler {

    @Value("${name}")
    private String name;

    @Async
    public HelloResponse setHelloWithName() throws Exception {

        /*
         * simulate a very long process (10 seconds)
         */
        Thread.sleep(10000);

        /*
         * this process will still get called after 10 seconds, regardless of sync or async
         */
        log.info("calling hello for "+name);

        return HelloResponse.builder().content(name).build();
    }
}

Which will give a faster response,
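One thing worth noting, as general Spring behaviour rather than something specific to this repository: @Async only takes effect when asynchronous processing is enabled, for example with @EnableAsync on a configuration class, and once the method runs asynchronously its return value is no longer handed back to the synchronous caller, so the HTTP response body will effectively be empty. A minimal sketch of enabling it on the Application class shown earlier:

package com.redhat.edw;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableAsync;

@SpringBootApplication
@EnableAsync // without this, the @Async annotation on the handler is silently ignored
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}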

And we can deploy our code to Openshift by using this command,

mvn fabric8:deploy -Pfabric8

A successful deployment will look like this,

And finally, the complete code can be downloaded from this URL,

https://github.com/edwin/fuse-with-async-http

Creating a Jenkins Slave Image with Maven 3.6, Java 11 and Skopeo

Openshift has a default Maven Jenkins slave image, but unfortunately it is built on top of Java 8. On the project I'm currently working on, I need a custom Jenkins slave with Java 11 and the ability to move images between image registries. Therefore I created a custom Dockerfile which contains Skopeo, Maven 3.6.3 and Java 11. Below is the Dockerfile in detail,

FROM openshift/jenkins-slave-base-centos7:v3.11

MAINTAINER Muhammad Edwin < edwin at redhat dot com >


ENV MAVEN_VERSION=3.6.3 \
    PATH=$PATH:/opt/maven/bin

# install skopeo
RUN yum install skopeo -y && yum clean all

# install java
RUN curl -L --output /tmp/jdk.tar.gz https://download.java.net/java/GA/jdk11/9/GPL/openjdk-11.0.2_linux-x64_bin.tar.gz && \
	tar zxf /tmp/jdk.tar.gz -C /usr/lib/jvm && \
	rm /tmp/jdk.tar.gz && \
	update-alternatives --install /usr/bin/java java /usr/lib/jvm/jdk-11.0.2/bin/java 20000 --family java-1.11-openjdk.x86_64 && \
	update-alternatives --set java /usr/lib/jvm/jdk-11.0.2/bin/java
	
# Install Maven
RUN curl -L --output /tmp/apache-maven-bin.zip  https://www-eu.apache.org/dist/maven/maven-3/${MAVEN_VERSION}/binaries/apache-maven-${MAVEN_VERSION}-bin.zip && \
    unzip -q /tmp/apache-maven-bin.zip -d /opt && \
    ln -s /opt/apache-maven-${MAVEN_VERSION} /opt/maven && \
    rm /tmp/apache-maven-bin.zip && \
    mkdir -p $HOME/.m2

RUN chown -R 1001:0 $HOME && chmod -R g+rw $HOME

COPY run-jnlp-client /usr/local/bin/

USER 1001

Build by using this command,

docker build -t jenkins-slave-skopeo-jdk11-new -f skopeo-jdk11.dockerfile .

Import the image into Openshift,

oc import-image docker.io/edwinkun/jenkins-slave-skopeo-jdk11-new --confirm

Register the image on Jenkins as a pod template, using maven as the label,

And try it with a simple pipeline,

node('maven') {
	stage('Clone') {
		sh "git config --global http.sslVerify false"
		sh "git clone https://github.com/edwin/hello-world.git"
	}
	stage('Build') {
		sh "mvn -v"
		sh "mvn clean package -f hello-world/pom.xml"
	}
}

This is the result,

Detailed code can be seen on my GitHub page, https://github.com/edwin/jenkins-slave-maven-jdk11-skopeo
