Thursday, August 8, 2019

Apache Camel DSL Sample That Does A Few Things

The piece below does a few things. It's useful if you are using the Camel Spring DSL like me. The explanation is in the form of comments within the code.

<!-- Bean instantiation of the SQL Server driver, which will be referenced by the DSL below -->
<bean id="sqlServerDS"
    class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="${mssql.driverClassName}" />
    <property name="url" value="${mssql.url}" />
    <property name="username" value="${mssql.username}" />
    <property name="password" value="${mssql.password}" />
</bean>

<!-- Camel endpoint for the XSLT that transforms the received XML into another XML -->
<camel:endpoint id="someXSLT" uri="xslt:file://${xslt.location}/someXSLT.xslt"/>

<!-- File endpoint used to output the generated flat file -->
<camel:endpoint id="InformationFile" uri="file://?fileName=${output.location}/$simple{file:name}&amp;fileExist=Append"/>

<!-- Camel route starts -->
<camel:route id="LalaLand">

    <!-- Listen to someQueue, which has been set up in ActiveMQ -->
    <camel:from uri="tcpActivemq:queue:someQueue"/>

    <!-- Transform the XML picked up from someQueue using someXSLT -->
    <camel:to ref="someXSLT"/>

    <!-- Define the file name for the output file and store it in a header -->
    <camel:setHeader headerName="CamelFileName">
        <camel:xpath resultType="java.lang.String">concat('somefile_', //code, '.txt')</camel:xpath>
    </camel:setHeader>

    <!-- Logging -->
    <camel:log message="SPLIT BEGINS" loggingLevel="DEBUG" logName="lalaland" />

    <!-- Split the XML transformed by someXSLT into individual records -->
    <camel:split>
        <camel:xpath>//FileRecord</camel:xpath>

        <camel:setHeader headerName="ctcRef">
            <camel:xpath resultType="java.lang.String">//contactRef</camel:xpath>
        </camel:setHeader>

        <!-- The payload will be lost in the next section, so store the original message -->
        <camel:setProperty propertyName="oriMessage">
            <camel:simple>${body}</camel:simple>
        </camel:setProperty>

        <!-- If the header ctcRef is not empty, run three SQL queries to finally get the Date Of Birth,
             which is the requirement here. Why three queries? Because named parameters here do not
             work with a JOIN query -->
        <camel:choice>
            <camel:when>
                <camel:simple>${header.ctcRef} != ''</camel:simple>
                <camel:to uri="sql:SELECT ID FROM CONTACTS WHERE CONTACTREF = :#ctcRef;?dataSource=sqlServerDS" />
                <camel:log message="body: ${body[0].get('ID')} " loggingLevel="DEBUG" logName="com.experian" />
                <camel:setHeader headerName="contactId">
                    <camel:simple>${body[0].get('ID')}</camel:simple>
                </camel:setHeader>
                <camel:log message="contactId: ${header.contactId} " loggingLevel="DEBUG" logName="com.experian" />
                <camel:to uri="sql:SELECT ACCOUNTID FROM ACCOUNTCONTACTS WHERE CONTACTTYPEID = 102 AND CONTACTID = :#contactId;?dataSource=sqlServerDS" />
                <camel:setHeader headerName="accountId">
                    <camel:simple>${body[0].get('ACCOUNTID')}</camel:simple>
                </camel:setHeader>
                <camel:log message="accountId: ${header.accountId} " loggingLevel="DEBUG" logName="com.experian" />
                <camel:to uri="sql:SELECT CONVERT(varchar,DATE_OF_BIRTH,23) AS DOB FROM CS_PERSON WHERE ACCOUNTS1 = :#accountId;?dataSource=sqlServerDS" />
                <!-- Store the Date Of Birth in a header -->
                <camel:setHeader headerName="dob">
                    <camel:simple>${body[0].get('DOB')}</camel:simple>
                </camel:setHeader>
                <camel:log message="dob: ${header.dob} " loggingLevel="DEBUG" logName="com.experian" />
            </camel:when>
        </camel:choice>

        <!-- Set the body back to the original payload stored above -->
        <camel:setBody>
            <camel:simple>${property.oriMessage}</camel:simple>
        </camel:setBody>

        <!-- Retrieve field values using XPath, add the dob from the header, and print them as one line -->
        <camel:transform>
            <camel:xpath resultType="java.lang.String">concat(//accountRef, '|', //contactRef, '|', //title, '|', //givenName, '|', //middleName, '|', //familyName, '|', ${header.dob}, '&#xD;&#xA;')</camel:xpath>
        </camel:transform>

        <!-- The block below runs after the last record, checked via CamelSplitComplete from the header -->
        <camel:choice>
            <camel:when>
                <camel:simple>${header.CamelSplitComplete} == true</camel:simple>
                <!-- Append a trailer line and the number of records (CamelSplitSize from the header) to the output flat file -->
                <camel:transform>
                    <camel:simple>adding new line here ${header.CamelSplitSize} </camel:simple>
                </camel:transform>
            </camel:when>
        </camel:choice>

        <!-- Content is written out to the physical file -->
        <camel:to ref="InformationFile"/>
    </camel:split>
</camel:route>
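
For completeness, the route above also expects an ActiveMQ component registered under the id tcpActivemq. The bean below is only a minimal sketch of such a registration; the broker URL is an illustrative placeholder, not the value from the original setup.

<!-- Hypothetical registration of the ActiveMQ component under the id "tcpActivemq" (placeholder broker URL) -->
<bean id="tcpActivemq" class="org.apache.activemq.camel.component.ActiveMQComponent">
    <property name="brokerURL" value="tcp://localhost:61616" />
</bean>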

Thursday, May 23, 2019

Ansible: Setting Dynamic Group in Add_Host Task using Jinja2 Templating

I'm very, very excited that I'm able to assign a dynamic group value programmatically for the add_host task by using Jinja2 templating in an Ansible playbook!

What happened was that I was passing in a long, comma-delimited string of host names from Terraform via my "user_data". I needed to parse this long string and split the host names into different groups by matching a few keywords.

For example, hostnames="web01.xxx.com,web02.xxx.com,app01.yyy.com". My Ansible script splits this string, loops over it and assigns the group value.

Below is my sample code:

  - add_host:
      name: "{{item}}"
      group: >-
          {% set groupy = "default" -%}
          {% if 'web' in item  -%}
          {%   set groupy = "web" -%}
          {% elif 'app' in item -%}
          {%   set groupy = "app" -%}
          {% endif -%}
          {{ groupy }}
    with_items: "{{hostnames.split(',')}}"

Thursday, January 31, 2019

Named Parameters for Apache Camel SQL (Spring DSL) that involves more than 1 table

I was working on an Apache Camel solution where I needed to pick up XML from Apache ActiveMQ and then transform the XML payload into text. During the transformation, I had to enrich the content by querying the DB based on a field read from the same payload.

As per the official documentation for version 2.12.4, named parameters can be used in the endpoint URI, represented by the symbol :#.

However, in more recent versions, Camel users can directly access a header or property in the endpoint URI using :#${property.xxx} or :#${header.xxx}. In my case, the Apache Camel version is 2.12.4, so I have to stick with the 2.12 way of doing things.

Anyway, the named parameters approach works, but only for an SQL query that reads from a single table. It will not work if the SQL query involves multiple tables, such as a joined query. I was getting the error message "Invalid Column Name", but it was never about the column name being wrong; it was about the Camel SQL component failing to recognize the :# symbol and substitute the value.

So what happened to the joined-table query then? I just broke the SQL query into multiple Camel SQL calls, each of them querying only a single table based on the value retrieved from the previous SQL call.

It's worth highlighting that the columns from the result set are always stored in the message body as a Java Map inside a Java ArrayList. Printed to the log, it looks like [{ID=12345}].
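
To make that concrete, here is a minimal sketch of two chained Camel SQL calls in Spring DSL, where the single-table result is pulled out of that List-of-Maps structure with the Simple language and stored in a header that feeds the next named parameter. The table and column names are illustrative, not the ones from the actual project, and a data source bean named sqlServerDS is assumed to be defined elsewhere.

<!-- First call: the named parameter :#custRef is resolved from the message header "custRef" (illustrative names) -->
<camel:to uri="sql:SELECT ID FROM CUSTOMER WHERE CUSTREF = :#custRef?dataSource=sqlServerDS" />

<!-- The result set arrives as a List of Maps, e.g. [{ID=12345}], so grab the first row's ID -->
<camel:setHeader headerName="customerId">
    <camel:simple>${body[0].get('ID')}</camel:simple>
</camel:setHeader>

<!-- Second call: the header set above feeds the next named parameter -->
<camel:to uri="sql:SELECT ORDER_NO FROM ORDERS WHERE CUSTOMER_ID = :#customerId?dataSource=sqlServerDS" />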

Wednesday, November 7, 2018

Application Hung Caused by Corrupted Connection in DBCP Connection Pool

There was an incident where an unplanned network refresh caused the active connection objects in the DBCP connection pool to lose their 'connection' to the corresponding database sessions, and this resulted in the application hanging when getting a connection from the pool.

This case was unusual and not reproducible, as I had no privilege to perform a network refresh intentionally. However, I managed to reproduce it by intentionally killing the database sessions (SQL Server) while having SoapUI fire requests at the application continuously.

Solution
- via DBCP configuration. There are two options (a minimal bean sketch follows this list):
  1. For SQL Server, it was fixed by setting "maxIdle" of Apache DBCP to 0, so the application always establishes a new connection upon a new request.
  2. For Oracle, what I observed when setting "maxIdle=0" locally while connecting to a remote Oracle database was that the application became slow when getting a connection from the DBCP pool. I did not want to compromise application performance, so I tried another method: setting the "maxWait" property. I set it to 500 milliseconds and at the same time had SoapUI fire requests at the application continuously. After killing the corresponding Oracle connection session, I noticed the application was able to create a new connection very quickly, whereas previously it would just hang.
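
For illustration only, a BasicDataSource tuned along those lines might look like the sketch below; the driver, URL and property values are placeholders for an Oracle setup rather than the configuration from the actual incident (note that in Commons DBCP 2 the property is called maxWaitMillis).

<!-- Hypothetical DBCP pool for the Oracle case: bounded maxWait so getConnection() fails fast instead of hanging (placeholder values) -->
<bean id="oracleDS" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
    <property name="driverClassName" value="oracle.jdbc.OracleDriver" />
    <property name="url" value="jdbc:oracle:thin:@dbhost:1521:ORCL" />
    <property name="username" value="${oracle.username}" />
    <property name="password" value="${oracle.password}" />
    <!-- wait at most 500 ms for a connection from the pool before giving up -->
    <property name="maxWait" value="500" />
</bean>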

Friday, September 21, 2018

Tortoise Git SSH Auto-Authorization Issue

I'm using Tortoise Git as the Tortoise client to perform Git actions such as checking modifications, clone, add, commit, push, etc. via the GUI. It's been a pain getting Tortoise Git to be authorized automatically by the Git server.

Background

  1. I've already had a key pair generated via MobaXTerm, with the public key stored in the Git machine's authorized_keys.
  2. Now I want to enable auto-authorization from TortoiseGit to the Git server upon performing actions such as Git Push. Therefore I need to let my Tortoise Git know the whereabouts of the private key.
  3. Here comes the problem: that specific private key has to be converted to PuTTY format. To do that, we use PuTTYgen, which is installed together with Tortoise Git.
    1. Conversions - Import key (look for the specific private key)
    2. Save private key to a location
  4. Next, start Pageant, which has also been installed by the Tortoise Git installer. An icon will appear in the taskbar. Right click -> Add Key. Select the private key that you just created in the previous step.


Thursday, February 8, 2018

Apache DBCP Connection Pool Issue

I'm using Apache DBCP for connection pooling. One thing I noticed is that every time there's a network failure between my application and the database (SQL Server), it generates a socket write failure/error exception, and the subsequent few calls would somehow cause getConnection() of BasicDataSource to freeze.

I troubleshot this by monitoring the processes in Activity Monitor in SQL Server. I saw there was always a connection created each time I fired a request. That connection would just stay; I think my application was reusing this particular connection across multiple requests. If I killed this particular connection and then fired a few more requests, I managed to reproduce the issue where getConnection() gave no response.

After a lot of trial and error, I finally found the resolution: maxIdle of the BasicDataSource. The default maxIdle for BasicDataSource is 8, so I just needed to set maxIdle to zero (a minimal bean sketch follows below). This time, although I fired many, many requests, I no longer saw any connection lingering in the processes of Activity Monitor. No more connection pool freeze!
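
As a rough sketch only, the corresponding BasicDataSource bean with maxIdle set to zero might look like this; the driver and connection details are placeholders for a SQL Server setup, not the actual configuration:

<!-- Hypothetical DBCP pool with maxIdle=0 so no idle connection is kept around and reused after a network failure (placeholder connection details) -->
<bean id="pooledSqlServerDS" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
    <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver" />
    <property name="url" value="jdbc:sqlserver://dbhost:1433;databaseName=mydb" />
    <property name="username" value="${mssql.username}" />
    <property name="password" value="${mssql.password}" />
    <!-- default is 8; 0 means idle connections are not kept in the pool -->
    <property name="maxIdle" value="0" />
</bean>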


Monday, January 22, 2018

Camel-Context (DSL) Only, No Coding! How to Log? How to Handle Errors? How to Configure Retries? How to Work Out the Property Placeholders?

I'm excited to have found a new way in Apache Camel to build a specific feature without writing a single line of code, which has always been my goal due to time constraints and flexibility (to make everything configurable and to handle every exception properly). It was a long, winding process with many rounds of trial and error.

What I'm trying to achieve here is very simple: just read a file, SFTP it to the other side, and back up the original source file! If this had been done in Java/Groovy and registered as a Spring bean, it would have been very quick. However, I chose to do it the Camel way!


  1. In your Camel-Context.xml (Spring DSL), declare a Route.
  2. The <From> should be the File component. Remember to set a "delay" for the file polling interval.
  3. Next, use the SFTP component for the <To> (a minimal route sketch follows this list).
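
Put together, a bare-bones version of that route might look like the sketch below. The directory, delay and SFTP details are placeholder values; the full endpoints with property placeholders are shown further down.

<!-- Minimal sketch of the read-file-then-SFTP-then-backup route (placeholder paths and credentials) -->
<camel:route id="fileToSftp">
    <!-- poll the input directory every 60 seconds and move processed files to a backup folder -->
    <camel:from uri="file://data/input?delay=60000&amp;move=backup/$simple{file:name}.done"/>
    <!-- push the picked-up file to the remote SFTP server -->
    <camel:to uri="sftp://user@remotehost/inbound?password=secret"/>
</camel:route>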


Done. So simple! However...

I want to LOG the entire process, such as:

  1. What files are being picked up?
  2. Whether the SFTP succeeded or failed?
  3. Whether the SFTPed file was moved to a backup folder?

For my own reference, I was using two ways to log:
  1. Camel's Log EIP - to log the route process plus errors to the application-specific log.
  2. Camel's Log component - to log errors to the general error log.
**Errors are written to both logs.

Camel's Log EIP 
<camel:log message="[MyTimer2] SFTP Begins ${file:name}" loggingLevel="DEBUG" logName="myTimer2Logger" />
**${file:name} here is a reserved expression in Camel. It's part of the File Expression Language, and this particular one outputs the name of the picked-up file in the log.

Camel's Log
<camel:to uri="log:error-rollingFileAppender?level=ERROR&amp;showCaughtException=true&amp;showStackTrace=true" />
**The first part of the URI (in this case error-rollingFileAppender) is the logger name, which maps to the appender set up in my application's logback.xml.


Question: where should I place my log-error block? I should only log an error upon an exception, right?

First, I tried the <DoTry>/<DoCatch> way and put my log-error block inside <DoCatch>. However, there is a flaw here! Because the exception is caught, the file:// consumer still regards the exchange as successful upon an SFTP exception and moves the file to the backup folder anyway.

Finally I found the right way: use <OnException>! Just put the log-error block inside your <OnException>, which sits inside the <Route>, as shown below:


<camel:onException>
    <camel:exception>java.lang.Exception</camel:exception>
    <camel:to uri="log:error-rollingFileAppender?level=ERROR&amp;showCaughtException=true&amp;showStackTrace=true" />
    <camel:log message="[MyTimer2] File Transmission Error ${exception.stacktrace}" loggingLevel="ERROR" logName="myTimer2Logger" />
</camel:onException>


Is that all? Nope.

I wanted to retry a fixed number of times upon any kind of exception!

Easy. Just declare the below:

<camel:redeliveryPolicyProfile id="myRedelivery" retryAttemptedLogLevel="ERROR" maximumRedeliveries="${H2MaxRetry}" redeliveryDelay="${H2RetryDelay}" />

Then reference it via the redeliveryPolicyRef attribute, for example on the <OnException> inside your Route, as sketched below.
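
The snippet below is just a sketch of that wiring, reusing the <OnException> block shown earlier.

<!-- Sketch: wire the redelivery profile into the route's exception handling -->
<camel:onException redeliveryPolicyRef="myRedelivery">
    <camel:exception>java.lang.Exception</camel:exception>
    <camel:log message="[MyTimer2] File Transmission Error ${exception.stacktrace}" loggingLevel="ERROR" logName="myTimer2Logger" />
</camel:onException>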

Last but not least, I wanted to set all the properties, such as the input directory, the SFTP host, username, password, etc., in a single properties file.

Here's the trick:
Use <Endpoint> for your File:// and SFTP:// URIs.
Somehow only <Endpoint> allows the Spring property placeholder!

<camel:endpoint id="send_source" uri="file://${app.dir}/filewatcher/anz_output?delay=${myTimer2}&amp;move=${app.dir}/filewatcher/anz_backup/$simple{file:name}.done&amp;moveFailed=${app.dir}/filewatcher/anz_bad"/>
<camel:endpoint id="send_remote" uri="sftp://${H2hostname}${H2Path}?username=${H2user}&amp;password=${H2password}&amp;soTimeout=${H2SocketTimeout}&amp;binary=${H2Binary}"/>

**These property placeholders are resolved from the properties file configured in the Spring property placeholder configurer.
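
For reference, a minimal Spring placeholder configurer along those lines might look like the sketch below; the properties file location is a placeholder, not the actual path.

<!-- Sketch: Spring placeholder configurer that resolves the ${...} tokens in the endpoints above -->
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="location" value="classpath:app.properties" />
</bean>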