16 Nov, 2015

[Article] Get More Value From Your External Applications with the WSO2 Platform

  • Chanuka Dissanayake
  • Software Engineer - WSO2


Introduction

As organizations expand, they tend to adopt external systems for their day-to-day operations. Organizations that are good at finding new applications and markets have been able to adapt to new technical advancements. There are many external applications, such as Salesforce, Pardot, NetSuite, and People-HR, that can be adopted throughout the organization. These systems contain a lot of data related to sales, marketing, HR, and finance processes, among other internal processes, which can be extracted for internal applications or used to build a central database that correlates the data for analysis. The WSO2 integration platform enables organizations to leverage the real value of big data.



WSO2 advantage

Figure 1: WSO2 integration platform

Cloud, mobile, social, and big data are challenging enterprises to keep pace with a rapid evolution in enterprise integration. Pure service-oriented architecture (SOA) and enterprise service bus (ESB) based on-premise integration platforms are making way for APIs and cloud connectivity. The WSO2 platform offers modern integration technologies, empowering enterprises to build an internally and externally connected business by seamlessly connecting cloud and mobile apps, services, multiple data repositories, social media, and on-premise systems. It addresses the demands of integrating disparate applications from various vendors, with heterogeneous protocols connecting services, legacy systems, and cloud systems. This 100% open source, comprehensive, and cohesive platform enables you to be agile and is highly cost effective.



Addressing separate scenarios

There are many ways to integrate a system, and organizations can select the most appropriate approach for their requirements. This article examines two use cases with different integration strategies that use separate technologies.



Use case 1: Salesforce integration

Salesforce is one of the most popular cloud customer relationship management (CRM) solutions and contains multiple cloud products, including Pardot. You would usually want to extract the data in these systems for different purposes. For example, you might need to identify the number of sales-qualified leads per quarter, or the digital marketing expenses and campaigns processed.

In this use case, we will use the Salesforce connector through WSO2 Enterprise Service Bus (WSO2 ESB) and publish large-scale data in batches via data services.



High level architecture

Figure 2: Use case 1- High level architecture diagram

Figure 2 above shows the integration of Salesforce using WSO2 products, which include WSO2 ESB connectors, WSO2 ESB, WSO2 Data Services Server (WSO2 DSS), WSO2 Dashboard Server, WSO2 Identity Server, and WSO2 App Manager.

Here, WSO2 ESB connectors allow interaction with Salesforce APIs and services. The Salesforce connector provides a set of operations; for this scenario, we will use the query operations to retrieve the data. After retrieving the data, we will store it in our central database. To do that, we will use WSO2 DSS and send batch requests. Batch requests help us avoid a large number of service calls and maintain security. The data stored in the central database can be processed and retrieved for presentation in the WSO2 Dashboard Server by calling WSO2 DSS services. To manage different types of dashboards with different permission levels and security features, we use WSO2 App Manager and WSO2 Identity Server connected to a Lightweight Directory Access Protocol (LDAP) user store.
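As a rough sketch of the WSO2 DSS side, a data service with batch requests enabled could look like the following. Note that the data source name, table, and column names here are assumptions for illustration only; setting enableBatchRequests makes DSS generate the corresponding _batch_req wrapper operation that accepts multiple records in a single call.

```xml
<!-- Hypothetical data service; datasource, table and column names are assumptions -->
<data name="SF_Data_Service" enableBatchRequests="true">
    <config id="default">
        <property name="carbon_datasource_name">SF_CENTRAL_DB</property>
    </config>
    <query id="insert_SF_Account_query" useConfig="default">
        <sql>INSERT INTO sf_account (account_id, account_name) VALUES (?, ?)</sql>
        <param name="account_id" sqlType="STRING"/>
        <param name="account_name" sqlType="STRING"/>
    </query>
    <operation name="insert_SF_Account_operation">
        <call-query href="insert_SF_Account_query">
            <with-param name="account_id" query-param="account_id"/>
            <with-param name="account_name" query-param="account_name"/>
        </call-query>
    </operation>
</data>
```

With batch requests enabled, a single call to the generated insert_SF_Account_operation_batch_req operation can carry many insert_SF_Account_operation records, which is what keeps the number of service calls low.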



Low level architecture

Figure 3.1: Use case 1 - Low level architecture diagram of proxy service (Part A)

Using the query operation in the Salesforce connector, we will query the data that we want to retrieve from Salesforce. That data will be sent to a sequence where the data mappings are performed and the batch request is prepared. When there is more data, Salesforce sends it in batches, so we need to iterate through each batch and prepare the batch request. To handle this, we have an iterator after the initial query call. An enrich mediator before the iterate mediator provides the iteration count and the QueryLocator value required to capture the next batch of data. The queryMore operation requests the next batch of Salesforce records.
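The query/queryMore pattern can be sketched as follows. This is a simplified illustration, not the full proxy: the sfconfig local entry and the SOQL query are assumptions, and in a real configuration the loop over remaining batches is driven by the enrich mediator and iterator described above rather than a single filter pass.

```xml
<!-- Sketch: initial query plus one queryMore round trip; "sfconfig" is an
     assumed local entry holding the salesforce.init credentials -->
<salesforce.query configKey="sfconfig">
    <batchSize>200</batchSize>
    <queryString>SELECT Id, Name, Industry FROM Account</queryString>
</salesforce.query>
<sequence key="SF_Data_Sequence"/>
<!-- Salesforce sets done=false in the response while more batches remain -->
<filter xmlns:sf="urn:partner.soap.sforce.com" xpath="//sf:done/text() = 'false'">
    <then>
        <salesforce.queryMore configKey="sfconfig">
            <batchSize>200</batchSize>
        </salesforce.queryMore>
        <sequence key="SF_Data_Sequence"/>
    </then>
</filter>
```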

Figure 3.2: Use case 1 - Low level architecture diagram of proxy service (Part B)

Inside the iterator we have the same sequence again to do the mappings and prepare the batch request. After each queryMore operation, each batch will be sent to that sequence for processing.

Figure 4.1: Use case 1 - Low level architecture diagram of sequence (Part A)

The enrich mediator will contain the context of the data batch sent from the proxy service. Initially, a payloadFactory mediator stores the insert_batch_request payload used to call the WSO2 DSS service. The property mediator next to it stores the batch message payload to be sent. The enrich mediator then saves the batch data to the body context so it can be read by the iterator. After that, each record is iterated over to prepare the batch message payload.

Figure 4.2: Use case 1 - Low level architecture diagram of sequence (Part B)

Each data item will be sent through the data mapper mediator to perform the required mapping and operations. After each record is prepared, it will be stored as a child element of the batch message payload.

Figure 4.3: Use case 1 - Low level architecture diagram of sequence (Part C)

The prepared batch payload will be sent to the WSO2 DSS endpoint.



Sequence: source

<?xml version="1.0" encoding="UTF-8"?>
<sequence name="SF_Data_Sequence" trace="disable" xmlns="http://ws.apache.org/ns/synapse">
    <enrich>
        <source clone="true" type="body"/>
        <target property="INIT_MSG_PAYLOAD" type="property"/>
    </enrich>
    <payloadFactory media-type="xml">
        <format>
            <p:insert_SF_Account_operation_batch_req xmlns:p="http://ws.wso2.org/dataservice"/>
        </format>
        <args/>
    </payloadFactory>
    <!-- Creating a property called DB_BATCH_MSG_PAYLOAD with the root element expression -->
    <property expression="//p:insert_SF_Account_operation_batch_req" name="DB_BATCH_MSG_PAYLOAD" scope="operation" type="OM" xmlns:ns="http://org.apache.synapse/xsd" xmlns:p="http://ws.wso2.org/dataservice"/>
    <enrich>
        <source clone="true" property="INIT_MSG_PAYLOAD" type="property"/>
        <target type="body"/>
    </enrich>
    <!--  Let's iterate through the data, we can iterate through records -->
    <iterate continueParent="true" expression="//rec:records" sequential="true" xmlns:ns="http://org.apache.synapse/xsd" xmlns:rec="urn:partner.soap.sforce.com">
        <target>
            <sequence>
                <property expression="get-property('operation','DB_BATCH_MSG_PAYLOAD')" name="DB_BATCH_MSG_PAYLOAD" scope="default" type="OM"/>
                <datamapper config="gov:datamapper/SFConfig.dmc" inputSchema="gov:datamapper/SFConfig_inputSchema.json" inputType="XML" outputSchema="gov:datamapper/SFConfig_outputSchema.json" outputType="XML"/>
                <enrich>
                    <source clone="true" type="body"/>
                    <target property="DB_MSG_PAYLOAD" type="property"/>
                </enrich>
                <property expression="//p:insert_SF_Account_operation" name="DB_MSG_PAYLOAD" scope="default" type="OM" xmlns:p="http://ws.wso2.org/dataservice"/>
                <enrich>
                    <source clone="true" property="DB_BATCH_MSG_PAYLOAD" type="property"/>
                    <target type="body"/>
                </enrich>
                <enrich>
                    <source clone="true" property="DB_MSG_PAYLOAD" type="property"/>
                    <target action="child" xmlns:p="http://ws.wso2.org/dataservice" xpath="//p:insert_SF_Account_operation_batch_req"/>
                </enrich>
                <property expression="//p:insert_SF_Account_operation_batch_req" name="DB_BATCH_MSG_PAYLOAD" scope="operation" type="OM" xmlns:p="http://ws.wso2.org/dataservice"/>
            </sequence>
        </target>
    </iterate>
    <property expression="get-property('operation','DB_BATCH_MSG_PAYLOAD')" name="DB_BATCH_MSG_PAYLOAD" scope="default" type="OM"/>
    <enrich>
        <source clone="true" property="DB_BATCH_MSG_PAYLOAD" type="property"/>
        <target type="body"/>
    </enrich>
    <header name="Action" scope="default" value="urn:insert_SF_Account_operation_batch_req"/>
    <call>
        <endpoint>
            <address uri="https://localhost:8679/services/SF_Data_Service"/>
        </endpoint>
    </call>
</sequence>



How to implement use case 1

Please refer to this blog for step-by-step instructions on implementing the above solution.



Use case 2: People-HR integration

Most organizations use an external human resources (HR) application to manage their employee data; People-HR is one such application used to manage HR-related processes. When there is a huge amount of data, processing and analyzing it (e.g. how a team member or a team performed last month, which requires correlating data across People-HR, JIRA, Redmine, etc.) is not easy. To perform such analytics, WSO2 Data Analytics Server (WSO2 DAS) provides a powerful platform along with the WSO2 Dashboard Server.

This use case will take People-HR as an example for the external system and demonstrate how to extract the data and present it in a dashboard.
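Before the proxy service can call People-HR, the connector must be initialized. A minimal sketch is shown below, stored as the peopleauth local entry that the proxy service references; the endpoint URL and key are placeholders, and the exact init parameter names should be verified against the People-HR connector documentation.

```xml
<!-- Assumed init configuration for the People-HR connector;
     URL and key are placeholders -->
<localEntry key="peopleauth" xmlns="http://ws.apache.org/ns/synapse">
    <peoplehr.init>
        <apiUrl>https://api.peoplehr.net</apiUrl>
        <apiKey>YOUR_API_KEY</apiKey>
    </peoplehr.init>
</localEntry>
```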



High level architecture

Figure 5: Use case 2 - High level architecture diagram

Figure 5 above shows the integration of People-HR using WSO2 products, which include WSO2 ESB connectors, WSO2 ESB, WSO2 DAS, and WSO2 Internet of Things Server (WSO2 IoT Server).

Data captured through the People-HR connector will be published to WSO2 DAS via WSO2 ESB. That data will be processed and analyzed in WSO2 DAS. Here, Spark queries are used to store the data in an external database. Processed data can be presented using the dashboard in WSO2 DAS. There may be several devices accessing the analytics dashboard, so the WSO2 IoT Server can be used to manage these devices.
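As an illustration of the Spark step, a scheduled DAS script could summarize the received events into an external RDBMS table roughly as follows. The event table name, data source, and columns are assumptions chosen to match the event stream attributes used in the proxy service.

```sql
-- Sketch: expose the DAS event table and an external JDBC table to Spark,
-- then copy the processed columns across (names are assumptions)
CREATE TEMPORARY TABLE PeopleEvents USING CarbonAnalytics
    OPTIONS (tableName "PEOPLE_EVENT_STREAM");

CREATE TEMPORARY TABLE EmployeeSummary USING CarbonJDBC
    OPTIONS (dataSource "HR_DB", tableName "employee_summary");

INSERT OVERWRITE TABLE EmployeeSummary
    SELECT EmployeeId, FirstName, LastName FROM PeopleEvents;
```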

Most of the above implementation and deployment will be done through WSO2 Developer Studio (from the WSO2 ESB configurations to the deployment of a C-App on a remote server). Therefore, this article will also give you a good understanding of using Developer Studio for integration purposes.

At the end of this article you will be able to develop a gadget which will display the data retrieved from the connector. This will be the end user interface that will be used by your organization’s management to make decisions.



Low level architecture

Figure 6: Use case 2 - Low level architecture diagram of proxy service

Once the data is retrieved via the query call to the People-HR operation, each record will be iterated over. Inside the iterate mediator, each record is mapped into specific attributes and published as a stream to WSO2 DAS. Log mediators (optional) help you monitor the incoming and outgoing data through each iteration.



Proxy service: source

<?xml version="1.0" encoding="UTF-8"?>
<proxy name="PeopleHrProxy" startOnLoad="true" trace="disable"
  transports="http https" xmlns="http://ws.apache.org/ns/synapse">
  <target>
    <inSequence onError="faultHandlerSeq">
      <peoplehr.query configKey="peopleauth">
        <queryName>Retrieve Employee Info</queryName>
      </peoplehr.query>
      <iterate continueParent="true" expression="//jsonObject/Result"
        id="MyIterator" sequential="true" xmlns:ns="http://org.apache.synapse/xsd">
        <target>
          <sequence>
            <log level="full"/>
            <publishEvent>
              <eventSink>DAS_EVENT_SINK</eventSink>
              <streamName>PeopleEventStream</streamName>
              <streamVersion>1.0.3</streamVersion>
              <attributes>
                <meta/>
                <correlation/>
                <payload>
                  <attribute defaultValue=""
                    expression="//*[local-name()='Employee Id']/text()"
                    name="EmployeeId" type="STRING"/>
                  <attribute defaultValue=""
                    expression="//*[local-name()='First Name']/text()"
                    name="FirstName" type="STRING"/>
                  <attribute defaultValue=""
                    expression="//*[local-name()='Last Name']/text()"
                    name="LastName" type="STRING"/>
                </payload>
                <arbitrary/>
              </attributes>
            </publishEvent>
            <log>
              <property expression="//*[local-name()='First Name']/text()" name="FirstName"/>
            </log>
          </sequence>
        </target>
      </iterate>
      <respond/>
    </inSequence>
    <outSequence>
      <send/>
    </outSequence>
    <faultSequence/>
  </target>
  <parameter name="ApplicationXMLBuilder.allowDTD">true</parameter>
</proxy>



How to implement use case 2

Please refer to this blog for step-by-step instructions on implementing the above solution.



Conclusion

Use cases 1 and 2 explain the usage of the WSO2 platform when dealing with an organization's big data and analytics. Enterprises need to gain proper insight into how data is used and communicated through each system. This kind of modeling helps organizations predict business opportunities and make decisions that increase revenue and return on investment. It helps to target marketing effectively, improve the customer experience, and avoid internal and external system failures. With this holistic view of your organization, you can analyze the results to overcome existing problems and take steps to avoid making the same mistakes again.



References

  1. https://wso2.com/library/articles/2014/02/cloud-to-rdbms-using-wso2-esb/#setupmysql
  2. https://wso2.com/blogs/thesource/2016/09/keep-your-wso2-products-up-to-date-with-wum/
  3. https://docs.wso2.com/display/DSS322/Creating+Using+Various+Data+Sources
  4. https://nuwanpallewela.wordpress.com/2016/07/16/understanding-wso2-data-mapper-5-0-0/
  5. https://docs.wso2.com/display/ESB500/Data+Mapper+Mediator
  6. https://stackoverflow.com/questions/4205181/insert-into-a-mysql-table-or-update-if-exists
  7. https://docs.wso2.com/display/ESBCONNECTORS/Writing+a+Connector
  8. https://docs.wso2.com/display/ESBCONNECTORS/PeopleHR+Connector
  9. https://docs.wso2.com/display/ESB490/Publish+Event+Mediator
  10. https://docs.wso2.com/display/ESB480/Iterate+Mediator
  11. https://docs.wso2.com/display/ESB490/Working+with+Event+Sinks
  12. https://docs.wso2.com/display/DAS300/Scheduling+Batch+Analytics+Scripts
  13. https://docs.wso2.com/display/DAS300/Understanding+Event+Streams+and+Event+Tables
  14. https://docs.wso2.com/display/DAS300/Configuring+Event+Receivers
  15. https://docs.wso2.com/display/DVS380/Creating+ESB+Artifacts

 

About the Author

  • Chanuka Dissanayake
  • Software Engineer
  • WSO2