WSO2 Venus

Yasassri Ratnayake: How to get rid of GTK3 errors when using Eclipse



When I was trying to use Eclipse on Fedora 26, I ran into many errors related to GTK 3. Following are some of the errors I saw; these were observed in Mars.2, Neon, and Oxygen.

(Eclipse:11437): Gtk-WARNING **: Allocating size to SwtFixed 0x7fef3992f2d0 without calling gtk_widget_get_preferred_width/height(). How does the code know the size to allocate?

(Eclipse:13633): Gtk-WARNING **: Negative content width -1 (allocation 1, extents 1x1) while allocating gadget (node trough, owner GtkProgressBar)

(Eclipse:13795): Gtk-WARNING **: Negative content width -1 (allocation 1, extents 1x1) while allocating gadget (node trough, owner GtkProgressBar)


(Eclipse:13795): Gtk-CRITICAL **: gtk_distribute_natural_allocation: assertion 'extra_space >= 0' failed


All of the above issues are caused by GTK 3, so as a workaround we can force Eclipse to use GTK 2. Following is how you can do this.

To force GTK 2, simply export the following environment variable:


# Export the following
export SWT_GTK3=0
# Start Eclipse using the same terminal session
./eclipse


Note: Make sure you start Eclipse in the same terminal session, so that the exported environment variable is visible to Eclipse.

If you want to force Eclipse to use GTK 3 instead, simply change the variable to SWT_GTK3=1.
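If you would rather not depend on the shell environment, the same switch can also be pinned in eclipse.ini via the launcher's GTK version option. This is a sketch based on the standard Eclipse launcher options; the two lines must appear before the -vmargs line, each on its own line:

```
--launcher.GTK_version
2
```

With this in place Eclipse should pick GTK 2 regardless of which terminal (or desktop launcher) starts it.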

Thanks for reading and please drop a comment if you have any queries. 

Lakshani Gamage: Adding a New Store Theme to Enterprise Mobility Manager (EMM)

A theme consists of UI elements such as logos, images, and background colors. WSO2 EMM Store comes with a default theme.



You can extend the existing theme by writing a new one.

In this blog post I'm going to show how to change styles (background colors, fonts, etc.).

First, create a directory called "carbon.super/themes" inside <EMM_HOME>/repository/deployment/server/jaggeryapps/store/themes/.

Then, create a directory called "css" inside <EMM_HOME>/repository/deployment/server/jaggeryapps/store/themes/carbon.super/themes.
Add the two CSS files below to the newly created "css" directory. You can change their values based on your preferences.
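Sketched as shell commands, the directory setup above looks like this (EMM_HOME below is a stand-in for your actual installation path):

```shell
# Create the theme directory structure; /tmp/emm-home is a placeholder default
EMM_HOME="${EMM_HOME:-/tmp/emm-home}"
THEME_CSS_DIR="$EMM_HOME/repository/deployment/server/jaggeryapps/store/themes/carbon.super/themes/css"
mkdir -p "$THEME_CSS_DIR"
# The two stylesheets described below go into that directory
touch "$THEME_CSS_DIR/appm-left-column-styles.css" \
      "$THEME_CSS_DIR/appm-main-styles.css"
# Lists the two stylesheet files
ls "$THEME_CSS_DIR"
```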

1. appm-left-column-styles.css


/*========== MEDIA ==========*/
@media only screen and (min-width: 768px) {
.page-content-wrapper.fixed {
min-height: calc(100% - 130px);
max-height: calc(100% - 130px);
}
}

.media {
margin-top: 0;
}

.media-left {
padding-right: 0;
}

.media-body {
background-color: #EFEFEF;
}

/**/
/*========== NAVIGATION ==========*/


.section-title {
background-color: #444444;
border: 1px solid #444444;
height: 40px;
padding-top: 5px;
width: 200px;
padding-left: 10px;
font-size: 18px;
color: #fff;
}

/**/
/*========== TAGS ==========*/
.tags {
word-wrap: break-word;
width: 200px;
padding: 5px 5px 5px 5px;
background-color: #ffffff;
display: inline-block;
margin-bottom: 0;
}

.tags > li {
line-height: 20px;
font-weight: 400;
cursor: pointer;
border: 1px solid #E4E3E3;
font-size: 12px;
float: left;
list-style: none;
margin: 5px;
}

.tags > li a {
padding: 3px 6px;
}

.tags > li:hover,
.tags > li.active {
color: #ffffff;
background-color: #7f8c8d;
border: 1px solid #7f8c8d;
}

.tags-more {
float: right;
margin-right: 11px;
}

/**/
/*=========== RECENT APPS ==========*/
.recent-app-items {
list-style: none;
width: 200px;
padding: 5px 0 5px 0;
background-color: #ffffff;
margin-bottom: 10px;
}

.recent-app-items > li {
padding: 6px 6px 6px 6px;
}
.recent-app-items .recent-app-item-thumbnail {
width: 60px;
height: 45px;
line-height: 45px;
float: left;
text-align: center;
}

.recent-app-items .recent-app-item-thumbnail > img {
max-height: 45px;
max-width: 60px;
}

.recent-app-items .recent-app-item-thumbnail > div {
height: 45px;
width: 60px;
color: #ffffff;
font-size: 14px;
}

.recent-app-items .recent-app-item-summery {
background-color: transparent;
padding-left: 3px;

width:127px;
}

.recent-app-items .recent-app-item-summery, .recent-app-items .recent-app-item-summery > h4 {
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}

nav.navigation > ul{
background: #525252;
color: #fff;
position: relative;
-moz-box-shadow: 0 1px 6px rgba(0, 0, 0, 0.1);
-ms-box-shadow: 0 1px 6px rgba(0, 0, 0, 0.1);
-webkit-box-shadow: 0 1px 6px rgba(0, 0, 0, 0.1);
box-shadow: 0 1px 6px rgba(0, 0, 0, 0.1);
-moz-user-select: none;
-webkit-user-select: none;
-ms-user-select: none;
list-style: none;
padding:0px;
margin: 0px;
}

nav.navigation ul li {
min-height: 40px;
color: #fff;
text-decoration: none;
font-size: 16px;
font-weight: 100;
position: relative;
}

nav.navigation a:after{
content: " ";
display: block;
height: 0;
clear: both;
}

nav.navigation ul li a i {
line-height: 100%;
font-size: 21px;
vertical-align: middle;
width: 40px;
height: 40px;
float: left;
text-align: center;
padding: 9px;
}

nav.navigation .left-menu-item {
text-align: left;
vertical-align: middle;
padding-left: 10px;
line-height: 38px;
width: 160px;
height: 40px;
font-size: 14px;
display: table;
margin-left: 40px;
}

nav.navigation .left-menu-item i{
float: none;
position: relative;
left: 0px;
font-size: 10px;
display: table-cell;
}

ul.sublevel-menu {
padding: 0px ;
list-style: none;
margin: 0px;
display: block;
background-color: rgb(108, 108, 108);

}

ul.sublevel-menu li{
line-height: 40px;
}

ul.sublevel-menu li a{
display:block;
font-size: 14px;
text-indent:10px;
}
ul.sublevel-menu li a:hover{
background-color: #626262;
}
nav.navigation ul > li .sublevel-menu li .icon{
background-color: rgb(108, 108, 108);
}
nav.navigation ul > li ul.sublevel-menu li a:hover .icon{
background-color: #626262;
}
ul.sublevel-menu .icon {
background-color: transparent;
font-size: 17px;
padding: 11px;
}

nav a.active .sublevel-menu {
display: block;
}

nav .sublevel-menu {
display: none;
}

nav.navigation.sublevel-menu{
display: none;
}

nav.navigation ul > li.home .icon {
background: #c0392b;
color: white;
}

nav.navigation ul > li.home.active {
background: #c0392b;
color: white;
}

nav.navigation ul > li.home.active > .left-menu-item {
background: #c0392b;
}

nav.navigation ul > li.green .icon {
background: #63771a;
color: white;
}

nav.navigation ul > li.green:hover > .icon {
background: #63771a;
color: white;
}

nav.navigation ul > li.green:hover .left-menu-item, nav.navigation ul > li.green.active .left-menu-item, nav.navigation ul > li.green.hover .left-menu-item {
background: #63771a;
color: white;

}

nav.navigation ul > li.red .icon {
background: #c0392b;
color: white;
}

nav.navigation ul > li.red:hover > .icon {
background: #c0392b;
color: white;
}

nav.navigation ul > li.red:hover .left-menu-item, nav.navigation ul > li.red.active .left-menu-item, nav.navigation ul > li.red.hover .left-menu-item {
background: #c0392b;
color: white;

}

nav.navigation ul > li.orange .icon {
background: #0a4c7f;
color: white;
}

nav.navigation ul > li.orange:hover > .icon {
background: #0a4c7f;
color: white;
}

nav.navigation ul > li.orange:hover .left-menu-item, nav.navigation ul > li.orange.active .left-menu-item, nav.navigation ul > li.orange.hover .left-menu-item {
background: #0a4c7f;
color: white;

}

nav.navigation ul > li.yellow .icon {
background: #f39c12;
color: white;
}

nav.navigation ul > li.yellow:hover > .icon {
background: #f39c12;
color: white;
}

nav.navigation ul > li.yellow:hover .left-menu-item, nav.navigation ul > li.yellow.active .left-menu-item, nav.navigation ul > li.yellow.hover .left-menu-item {
background: #f39c12;
color: white;

}

nav.navigation ul > li.blue .icon {
background: #2980b9;
color: white;
}

nav.navigation ul > li.blue:hover > .icon {
background: #2980b9;
color: white;
}

nav.navigation ul > li.blue:hover .left-menu-item, nav.navigation ul > li.blue.active .left-menu-item, nav.navigation ul > li.blue.hover .left-menu-item {
background: #2980b9;
color: white;

}

nav.navigation ul > li.purple .icon {
background: #766dde;
color: white;
}

nav.navigation ul > li.purple:hover > .icon {
background: #766dde;
color: white;
}

nav.navigation ul > li.purple:hover .left-menu-item, nav.navigation ul > li.purple.active .left-menu-item, nav.navigation ul > li.purple.hover .left-menu-item {
background: #766dde;
color: white;

}

nav.navigation ul > li.grey .icon {
background: #2c3e50;
color: white;
}

nav.navigation ul > li.grey:hover > .icon {
background: #2c3e50;
color: white;
}

nav.navigation ul > li.grey:hover .left-menu-item, nav.navigation ul > li.grey.active .left-menu-item, nav.navigation ul > li.grey.hover .left-menu-item {
background: #2c3e50;
color: white;

}

nav.navigation .second_level {
display: none;
}

nav.navigation .second_level a {
line-height: 20px;
padding: 8px 0 8px 10px;
box-sizing: border-box;
-webkit-box-sizing: border-box;
-moz-box-sizing: border-box;
}

nav.navigation .second_level a:hover {
background-color: rgba(0, 0, 0, 0.05);
}

nav.navigation .second_level > .back {
height: 100%;
padding: 0 3px;
background: #FFF;
vertical-align: middle;
font-size: 13px;
width: 5px;
}

nav.navigation .second_level > .left-menu-item {
padding: 6px 0;
text-align: left;
width: 100%;
vertical-align: middle;
}

@media (min-width: 320px) and (max-width: 991px) {
ul.sublevel-menu li a {
text-indent:0px;
}
}

.page-content-wrapper.fixed .sidebar-wrapper.sidebar-nav,
.page-content-wrapper.fixed .sidebar-wrapper.sidebar-options {
width: 250px;
background: #373e44;
overflow-y: auto;
overflow: visible;
}

.page-content-wrapper.fixed .sidebar-wrapper.sidebar-nav-sub {
height: 100%;
z-index: 1000000;
background: #272c30;
}


.page-content-wrapper.fixed .sidebar-wrapper.sidebar-options {
width: 235px;
max-height: calc(100% - 85px);
}
.sidebar-wrapper.toggled .close-handle.close-sidebar {
display: block;
}

#left-sidebar{
background-color: inherit;
color: inherit;
}

#left-sidebar.sidebar-nav li a{
color:inherit;
}

@media (min-width: 768px){
.visible-side-pane{
position: relative;
left: 0px;
width: initial;
}
}

.mobile-sub-menu-active {
color: #63771a !important;
}

2. appm-main-styles.css

/*========== HEADER ==========*/
header {
background: #242c63;
}

header .header-action {
display: inline-block;
color: #ffffff;
text-align: center;
vertical-align: middle;
line-height: 30px;
padding: 10px 10px 10px 10px;
}

header .header-action:hover,
header .header-action:focus,
header .header-action:active {
background: #4d5461;
}

/**/
/*========== BODY ==========*/
.body-wrapper a:hover {
text-decoration: none;
}

.body-wrapper > hr {
border-top: 1px solid #CECDCD;
margin-top: 50px;
}

/**/
/*=========== nav CLASS ========*/
.actions-bar {
background: #2c313b;
}

.actions-bar .navbar-nav > li a {
line-height: 50px;
}

.actions-bar .navbar-nav > li a:hover,
.actions-bar .navbar-nav > li a:focus,
.actions-bar .navbar-nav > li a:active {
background: #4d5461;
color: #ffffff;
}

.actions-bar .navbar-nav > .active > a,
.actions-bar .navbar-nav > .active > a:hover,
.actions-bar .navbar-nav > .active > a:focus,
.actions-bar .navbar-nav > .active > a:active {
background: #4d5461;
color: #ffffff;
}

.actions-bar .dropdown-menu {
background: #2c313b;
}

.actions-bar .dropdown-menu > li a {
line-height: 30px;
}

.navbar-search, .navbar-search .navbar {
min-height: 40px;
}

.navbar-menu-toggle {
float: left;
height: 40px;
padding: 0;
line-height: 47px;
font-size: 16px;
background:#1A78D8;
color: #ffffff;
}
.navbar-menu-toggle:hover, .navbar-menu-toggle:focus, .navbar-menu-toggle:active {
color: #ffffff;
background: #0F5296;
}
/**/
/*========== SEARCH ==========*/
.search-bar {
background-color: #035A93;
}

.search-box .input-group, .search-box .input-group > input,
.search-box .input-group-btn, .search-box .input-group-btn > button {
min-height: 40px;
border: none;
margin: 0;
background-color: #004079;
color: #ffffff;
}

.search-box .input-group-btn > button {
opacity: 0.8;
}

.search-box .input-group-btn > button:hover,
.search-box .input-group-btn > button:active,
.search-box .input-group-btn > button:focus {
opacity: 1;
}

.search-box .search-field::-webkit-input-placeholder {
/* WebKit, Blink, Edge */
color: #fff;
opacity: 0.8;
font-weight: 100;
}

.search-box .search-field:-moz-placeholder {
/* Mozilla Firefox 4 to 18 */
color: #fff;
opacity: 0.8;
font-weight: 100;
}

.search-box .search-field::-moz-placeholder {
/* Mozilla Firefox 19+ */
color: #fff;
opacity: 0.8;
font-weight: 100;
}

.search-box .search-field:-ms-input-placeholder {
/* Internet Explorer 10-11 */
color: #fff;
opacity: 0.8;
font-weight: 100;
}

.search-field {
padding-left: 10px;
}
.search-box .search-by, .search-box .search-by-dropdown {
background-color: #002760 !important;
color: #fff !important;
}

.search-box .search-by-dropdown {
margin-top: 0;
border: none;
}

.search-box .search-by-dropdown li a {
background-color: #002760;
color: #fff;
}

.search-box .search-by-dropdown li a:hover,
.search-box .search-by-dropdown li a:active,
.search-box .search-by-dropdown li a:focus {
background-color: #004D86 !important;
color: #fff;
}

.search-options {
position: absolute;
top: 100%;
right: 0;
bottom: auto;
left: auto;
float: right;
z-index: 1000;
margin: 0 15px 0 15px;
background-color: #002760;
color: #fff;
}

/**/
/*========== PAGE ==========*/
.page-header {
height: auto;
padding: 10px 0 10px 0;
border-bottom: none;
margin: 0;
}

.page-header:after {
clear: both;
content: " ";
display: block;
height: 0;
}

.page-header .page-title {
margin: 0;
padding-top: 6px;
display: inline-block;
}

.page-header .page-title-setting {
display: inline-block;
margin-left: 5px;
padding-top: 10px;
}

.page-header .page-title-setting > a {
padding: 5px 5px 5px 5px;
opacity: 0.7;
}

.page-header .page-title-setting > a:hover,
.page-header .page-title-setting > a:active,
.page-header .page-title-setting > a:focus,
.page-header .page-title-setting.open > a {
opacity: 1;
background-color: #e4e4e4;
}

.page-header .sorting-options > button {
padding: 0 5px 0 5px;
}

.page-content .page-title {
margin-left: 0px;
}
/**/
/*========== NO APPS ==========*/
.no-apps {
width: 100%;
}

.no-apps, .no-apps div, .no-apps p {
background-color: #ffffff;
text-align: center;
cursor: help;
}

.no-apps p {
cursor: help;
}

/**/
/*========== APP THUMBNAIL ITEMS==========*/
.app-thumbnail-ribbon {
display: block;
position: absolute;
top: 0;
height: 25%;
color: #ffffff;
z-index: 500;
border: 1px solid rgb(255, 255, 255);
border: 1px solid rgba(255, 255, 255, .5);
/* for Safari */
-webkit-background-clip: padding-box;
/* for IE9+, Firefox 4+, Opera, Chrome */
background-clip: padding-box;
border-top-width: 0;
}

.app-thumbnail-type {
display: block;
position: absolute;
bottom: 0;
left: 0;
height: 30%;
color: #ffffff;
z-index: 500;
border: 1px solid rgb(255, 255, 255);
border: 1px solid rgba(255, 255, 255, .5);
/* for Safari */
-webkit-background-clip: padding-box;
/* for IE9+, Firefox 4+, Opera, Chrome */
background-clip: padding-box;
border-left-width: 0;
border-bottom-width: 0;
font-size: 2em;
}

.app-thumbnail-ribbon > span, .app-thumbnail-type > span {
position: absolute;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
-webkit-transform: translate(-50%, -50%);
-moz-transform: translate(-50%, -50%);
-ms-transform: translate(-50%, -50%);
-o-transform: translate(-50%, -50%);
}

/**/
/*========== APP TILE ==========*/
.app-tile {
background-color: #ffffff;
margin-bottom: 20px;
}

.app-tile .summery {
padding: 10px 0 10px 10px;
max-width: 100%;
}

.app-tile .summery > h4 {
margin-top: 5px;
margin-bottom: 0;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
}
.app-tile .summery a h4 {
margin-top: 5px;
margin-bottom: 0;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
}

.app-tile .summery > h5 {
margin-top: 0;
}

.app-tile .summery > h4, .app-tile .summery > h5, .app-tile .summery > p {
text-overflow: ellipsis;
white-space: nowrap;
overflow: hidden;
-ms-text-overflow: ellipsis;
-o-text-overflow: ellipsis;
}

.app-tile .summery > .more-menu {
/*position: relative;*/
}

.app-tile .summery > .more-menu .more-menu-btn {
float: right;
height: auto;
background-color: #F7F7F7;
color: #838383;
padding: 10px;
margin-top: -10px;
}

.app-tile .summery > .more-menu.open .more-menu-btn {
background-color: #D2D2D2;
}

.app-tile .summery > .more-menu .more-menu-btn:hover {
background-color: #e4e4e4;
}

.app-tile .summery > .more-menu .more-menu-items {
margin-top: 0;
}

/**/
/*========== APP DETAILS ==========*/
.app-details {
background-color: #ffffff;
}

.app-details .summery > h4, .app-details .summery > p {
white-space: nowrap;
overflow: hidden;
}

.app-details .summery > .actions {
margin: 10px 0 0 0;
}

.app-details .summery > .actions > a {
margin: 5px 5px 5px 0;
}

.app-details .summery > .actions > a > i {
padding-right: 5px;
}

.app-details-tabs {
padding: 0 15px 0 15px;
}

.app-details-tabs > .nav-tabs > li > a {
border-radius: 0;
}

.app-details-tabs > .nav-tabs > li.active > a,
.app-details-tabs > .nav-tabs > li.active > a:hover,
.app-details-tabs > .nav-tabs > li.active > a:focus,
.app-details-tabs > .nav-tabs > li.active > a:active {
background-color: #fff;
border: 1px solid #fff;
border-radius: 0;
}

.app-details-tabs > .nav-tabs > li > a:hover,
.app-details-tabs > .nav-tabs > li > a:focus,
.app-details-tabs > .nav-tabs > li > a:active {
background-color: #E8E8E8;
border: 1px solid #E8E8E8;
border-radius: 0;
}

.app-details-tabs > .tab-content {
padding: 20px 17px;
background-color: #fff;
}

.app-details-tabs > .tab-content > h3 {
margin-top: 0;
}

/**/
/*========== DEFAULT THUMBNAIL & BANNER ==========*/
.default-thumbnail, .default-banner {
color: #ffffff;
position: absolute;
top: 50%;
left: 50%;
transform: translateX(-50%) translateY(-50%);
-webkit-transform: translate(-50%, -50%);
-moz-transform: translate(-50%, -50%);
-ms-transform: translate(-50%, -50%);
-o-transform: translate(-50%, -50%);
}

/**/
/*========== RATING ==========*/
.rating > .one {
opacity: 1;
}

.rating > .zero {
opacity: 0.3;
}

/**/
/*========== UTILS ==========*/
a.disabled {
cursor: default;
}

.absolute-center {
position: absolute;
top: 50%;
left: 50%;
transform: translateX(-50%) translateY(-50%);
-webkit-transform: translate(-50%, -50%);
-moz-transform: translate(-50%, -50%);
-ms-transform: translate(-50%, -50%);
-o-transform: translate(-50%, -50%);
}

.ratio-responsive-1by1 {
padding: 100% 0 0 0;
}

.ratio-responsive-4by3 {
padding: 75% 0 0 0;
}

.ratio-responsive-16by9 {
padding: 56.25% 0 0 0;
}

.ratio-responsive-1by1, .ratio-responsive-4by3, .ratio-responsive-16by9 {
width: 100%;
position: relative;
}

.ratio-responsive-item {
display: block;
position: absolute;
top: 0;
bottom: 0;
left: 0;
right: 0;
text-align: center;
}

.ratio-responsive-item:after {
content: ' ';
display: inline-block;
vertical-align: middle;
height: 100%;
}

.ratio-responsive-img > img {
display: block;
position: absolute;
max-height: 100%;
max-width: 100%;
left: 0;
right: 0;
top: 0;
bottom: 0;
margin: auto;
}

.hover-overlay {
position: absolute;
bottom: 0;
left: 0;
width: 100%;
height: 100%;
display: none;
color: #FFF;
}

.hover-overlay-container:hover .hover-overlay {
display: block;
background: rgba(0, 0, 0, .6);
cursor: pointer;
}

.hover-overlay-inactive-container:hover .hover-overlay {
display: block;
background: rgba(0, 0, 0, .6);
cursor: not-allowed;
}

/**/
/*========== COLORS ==========*/
/*
focus : background 5% lighter, border 5% darker
hover: background 10% lighter, border 5% darker
active: background 10% lighter, border 5% darker
*/

/* subscribe - main color: #603cba */
.background-color-subscribe {
background-color: #603cba;
}

.background-color-on-hover-subscribe {
background-color: transparent;
}

.background-color-on-hover-subscribe:hover {
background-color: #603cba;
}

.btn-subscribe {
color: #fff;
background-color: #603cba;
border-color: #603cba;
}

.btn-subscribe:focus,
.btn-subscribe.focus {
color: #fff;
background-color: #6D49C7;
border-color: #532FAD;
}

.btn-subscribe:hover,
.btn-subscribe:active,
.btn-subscribe.active {
color: #fff;
background-color: #7A56D4;
border-color: #532FAD;
}

/* favorite - main color: #810847 */
.background-color-favorite {
background-color: #810847;
}

.background-color-on-hover-favorite {
background-color: transparent;
}

.background-color-on-hover-favorite:hover {
background-color: #810847;
}

.btn-favorite {
color: #fff;
background-color: #810847;
border-color: #810847;
}

.btn-favorite:focus,
.btn-favorite.focus {
color: #fff;
background-color: #8E1554;
border-color: #75003B;
}

.btn-favorite:hover,
.btn-favorite:active,
.btn-favorite.active {
color: #fff;
background-color: #9B2261;
border-color: #75003B;
}

/* all apps - main color: #007A5F */
.background-color-all-apps {
background-color: #007A5F;
}

.background-color-on-hover-all-apps {
background-color: transparent;
}

.background-color-on-hover-all-apps:hover {
background-color: #007A5F;
}

/* advertised - main color: #C64700 */
.background-color-ad {
background-color: #C64700;
}

.background-color-inactive {
background-color: #C10D15;
}

.background-color-deprecated {
background-color: #FFCC00;
}

/*========== MOBILE PLATFORM COLORS ========*/
.background-color-android {
background-color: #a4c639;
}

.background-color-apple {
background-color: #CCCCCC;
}

.background-color-windows {
background-color: #00bcf2;
}
.background-color-webapps {
background-color: #32a5f2;
}

/*=============== MOBILE ENTERPRISE INSTALL MODAL =========*/
.ep-install-modal {
background: white !important;
color: black !important;
}

.ep-install-modal .dataTables_filter label {
margin-top: 5px;
margin-bottom: 5px;
}
.ep-install-modal .dataTables_filter label input {
margin: 0 0 0 0 !important;
min-width: 258px !important;
}

.ep-install-modal .dataTables_info {
float: none !important;
}

.ep-install-modal .dataTables_paginate {
float: none !important;
}

.ep-install-modal .dataTables_paginate .paginate_enabled_next{
color: #1b63ff;
margin-left: 5px;
}

.ep-install-modal .dataTables_paginate .paginate_enabled_previous{
color: #1b63ff;
}

.ep-install-modal .dataTables_paginate .paginate_disabled_next{
margin-left: 5px;
}

.ep-install-modal .modal-header button {
color: #000000;
}

#noty_center_layout_container {
z-index: 100000001 !important;
}

/*=================MOBILE DEVICE INSTALL MODAL==============*/
.modal-dialog-devices .pager li>a {
background-color: transparent !important;
}
.modal-dialog-devices .thumbnail {
background-color: transparent !important;
border: none !important;
}
/*---*/

/*===================HOME PAGE SEE MORE OPTION==============*/
.title {
width: auto;
padding: 0 10px;
height: 50px;
border-bottom: 3px solid #3a9ecf;
float: left;
padding-top: 14px;
font-size: 20px;
font-weight: 100;
}

.fav-app-title {
width: auto;
padding: 0 10px;
height: 50px;
border-bottom: 3px solid #3a9ecf;
float: left;
padding-top: 14px;
font-size: 20px;
font-weight: 100;
margin-bottom: 10px;
}

.more {
color:#000;
float:right;
background-image:url(../img/more-icon.png)!important;
background-position:center left;
background-repeat:no-repeat;
text-transform:uppercase;
padding:23px 3px 16px 36px !important
}

a.more:hover {
color:#3a9ecf;
text-decoration:none;
background-image:url(../img/more-icon-hover-blue.png)!important;
background-position:center left;
background-repeat:no-repeat
}

a.more:active {
background-color:transparent
}

a.more:focus {
border:none
}
/*---*/

Finally, refresh the Store. It will look like below.





Nipun Suwandaratna: Data Analytics with WSO2 Analytics Platform

Data Analytics and Visualization is a key requirement for any organization today. Proper analytics and visualization of data helps make better-informed business decisions, reduce losses, and increase profitability.

Data Analytics requirements can vary depending on what kind of data you need to analyze, the input mediums as well as the urgency of when it needs to be analyzed and acted upon.

Today, any organization produces a large amount of data. This data can be complex, scattered, and transmitted through multiple mediums and protocols. Capturing this data and conducting analysis on large sets of structured and unstructured data can be a daunting task.

Furthermore, there are occasions where data needs to be analyzed as it is produced, in real time.

In other cases it is required to predict future events or trends based on historical and current data.

And in all cases, data visualization is a key aspect. Interactive dashboards make it easy for users to explore data with functions such as sorting and filtering, which makes the decision-making process much easier.


What WSO2 offers:

WSO2 offers a complete Analytics Platform that provides solutions for all the aforementioned use-cases. The WSO2 Analytics platform offers the following:

Batch Analytics
Analyze a set of data collected over a period of time.
Suitable for high volumes of data.

Real-Time Analytics
Continuous processing of input data in real time.
Suitable for critical systems where immediate action is required, e.g. flight radar systems

Interactive Analytics
Obtaining fast results on indexed data by executing ad-hoc queries

Predictive Analytics
Predict future events by analyzing historical and current data


Batch Analytics

Let's look at Batch Analytics from the perspective of Big Data.

What is Big Data?

“Big data is a term for data sets that are so large or complex that traditional data processing applications are inadequate to deal with them”    - (Ref: Wikipedia)

Why Analyze Big Data?

  • Make informed Business decisions - make decisions based on patterns emerging from analyzing historic data
  • Improve customer experience - discover customer preferences, purchasing patterns and present the most relevant data
  • Process improvements - identify areas of the business process that need improvement


Example: Better customer experience in airline seat reservation/allocation

Automatically allocate seats to customers based on their previous seat booking preferences by analyzing historic data related to seat reservations.

[Image: Airbus A310-300 seating plan]

img ref: http://staticcontent.transat.com/airtransat/infovoyageurs/content/EN/seating-plan-a310-300(1).png



Real Time Analytics

  • Identify the most meaningful events within an event cloud
  • Analyze their impact
  • Act on them in real time

Example: City Transport Control System - analyzing traffic, monitoring the movement of buses, and generating alerts based on traffic, speed, and route.

[Image: Transport for London analytics dashboard]
img ref: http://wso2.com/library/demonstrations/2015/02/screencast-analyzing-transport-for-london-data-with-wso2-cep/



Predictive Analytics:

Approaches:
  1. Machine Learning
  2. Other approaches such as statistical modeling
Machine learning is the science of getting computers to act without being explicitly programmed - (ref: http://online.stanford.edu/)

Example: e-Commerce sites use predictive analytics to suggest the most relevant merchandise, increasing sales opportunities.

[Image: Amazon.com product recommendations]
img ref: Amazon.com




Evanthika Amarasiri: How to access an ActiveMQ queue secured with a username/password from WSO2 ESB

By default, a queue in ActiveMQ can be accessed without providing any credentials. However, in real-world scenarios, you will have to deal with secured queues. So in this blog, I will explain how to enable security for ActiveMQ and what configuration is required in WSO2 ESB.

Prerequisites - Enable the JMS transport for WSO2 ESB as explained in [1].

Step 1 - Secure the ActiveMQ instance with credentials.

To do this, add the below configuration to the activemq.xml under the <broker> tag and start the server.

<plugins>
    <simpleAuthenticationPlugin anonymousAccessAllowed="true">
        <users>
            <authenticationUser username="system" password="system" groups="users,admins"/>
            <authenticationUser username="admin" password="admin" groups="users,admins"/>
            <authenticationUser username="user" password="user" groups="users"/>
            <authenticationUser username="guest" password="guest" groups="guests"/>
        </users>
    </simpleAuthenticationPlugin>
</plugins>
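Before restarting the broker, it is worth sanity-checking that the edited snippet is still well-formed XML, since a malformed activemq.xml will stop the broker from starting. A small sketch using python3 (the scratch file path and the single trimmed-down user are just examples):

```shell
# Write a trimmed copy of the auth snippet to a scratch file and parse it;
# a parse error here means the edited activemq.xml would break the broker
cat > /tmp/amq-auth-check.xml <<'EOF'
<plugins>
    <simpleAuthenticationPlugin anonymousAccessAllowed="true">
        <users>
            <authenticationUser username="admin" password="admin" groups="users,admins"/>
        </users>
    </simpleAuthenticationPlugin>
</plugins>
EOF
python3 -c "
import xml.etree.ElementTree as ET
root = ET.parse('/tmp/amq-auth-check.xml').getroot()
print(root.tag, [u.get('username') for u in root.iter('authenticationUser')])
"
```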


Step 2 - Enable the JMS listener configuration in <ESB_HOME>/repository/conf/axis2/axis2.xml and configure it as shown below.

    <!--Uncomment this and configure as appropriate for JMS transport support, after setting up your JMS environment (e.g. ActiveMQ)-->
    <transportReceiver name="jms" class="org.apache.axis2.transport.jms.JMSListener">
        <parameter name="myTopicConnectionFactory" locked="false">
                <parameter name="java.naming.factory.initial" locked="false">org.apache.activemq.jndi.ActiveMQInitialContextFactory</parameter>
                <parameter name="java.naming.provider.url" locked="false">tcp://localhost:61616</parameter>
                <parameter name="java.naming.security.principal" locked="false">admin</parameter>
                <parameter name="java.naming.security.credentials" locked="false">admin</parameter>
                <parameter locked="false" name="transport.jms.UserName">admin</parameter>
                <parameter locked="false" name="transport.jms.Password">admin</parameter>
                <parameter name="transport.jms.ConnectionFactoryJNDIName" locked="false">TopicConnectionFactory</parameter>
                <parameter name="transport.jms.ConnectionFactoryType" locked="false">topic</parameter>
        </parameter>

        <parameter name="myQueueConnectionFactory" locked="false">
                <parameter name="java.naming.factory.initial" locked="false">org.apache.activemq.jndi.ActiveMQInitialContextFactory</parameter>
                <parameter name="java.naming.provider.url" locked="false">tcp://localhost:61616</parameter>
                <parameter name="java.naming.security.principal" locked="false">admin</parameter>
                <parameter name="java.naming.security.credentials" locked="false">admin</parameter>
                <parameter locked="false" name="transport.jms.UserName">admin</parameter>
                <parameter locked="false" name="transport.jms.Password">admin</parameter>
                <parameter name="transport.jms.ConnectionFactoryJNDIName" locked="false">QueueConnectionFactory</parameter>
                <parameter name="transport.jms.ConnectionFactoryType" locked="false">queue</parameter>
        </parameter>

        <parameter name="default" locked="false">
                <parameter name="java.naming.factory.initial" locked="false">org.apache.activemq.jndi.ActiveMQInitialContextFactory</parameter>
                <parameter name="java.naming.provider.url" locked="false">tcp://localhost:61616</parameter>
                <parameter name="java.naming.security.principal" locked="false">admin</parameter>
                <parameter name="java.naming.security.credentials" locked="false">admin</parameter>
                <parameter locked="false" name="transport.jms.UserName">admin</parameter>
                <parameter locked="false" name="transport.jms.Password">admin</parameter>
                <parameter name="transport.jms.ConnectionFactoryJNDIName" locked="false">QueueConnectionFactory</parameter>
                <parameter name="transport.jms.ConnectionFactoryType" locked="false">queue</parameter>
        </parameter>
    </transportReceiver>


Step 3 - Create a Proxy service to listen to a JMS queue in ActiveMQ.

Once the ESB server is started, create the Proxy service below and let it listen to the queue created in ActiveMQ.


   <proxy name="StockQuoteProxy1" transports="jms" startOnLoad="true">
      <target>
         <endpoint>
            <address uri="http://localhost:9000/services/SimpleStockQuoteService"/>
         </endpoint>
         <inSequence>
            <property name="OUT_ONLY" value="true"/>
         </inSequence>
         <outSequence>
            <send/>
         </outSequence>
      </target>
      <publishWSDL uri="file:repository/samples/resources/proxy/sample_proxy_1.wsdl"/>
      <parameter name="transport.jms.ContentType">
         <rules>
            <jmsProperty>contentType</jmsProperty>
            <default>application/xml</default>
         </rules>
      </parameter>
   </proxy>

Once the above proxy service is deployed, send a request to the queue and observe how the message is processed and sent to the back end. You can use the sample available in [2] to test this scenario.
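One quick way to publish such a test message is ActiveMQ's bundled producer tool. The sketch below only builds and prints the command to run; the install path is an assumption, and the producer subcommand and options are those of a typical ActiveMQ 5.x distribution.

```shell
# Assumption: ActiveMQ extracted to $AMQ_HOME and the broker is already running.
AMQ_HOME=/opt/activemq            # hypothetical install directory - adjust
QUEUE=StockQuoteProxy1            # the proxy name doubles as the JMS queue name
CMD="$AMQ_HOME/bin/activemq producer --destination queue://$QUEUE --messageCount 1"
echo "$CMD"                       # inspect, then execute against the live broker
```

After sending, the proxy should pick the message off the queue and forward it to the SimpleStockQuoteService back end.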

[1] - https://docs.wso2.com/display/ESB490/Configure+with+ActiveMQ
[2] - https://docs.wso2.com/display/ESB490/Sample+250%3A+Introduction+to+Switching+Transports

Charini NanayakkaraGetting Started with Jenkins on Ubuntu


  1. Install the Jenkins Debian package from the terminal as described here: https://pkg.jenkins.io/debian-stable/
  2. Start Jenkins with the following command, using a port that is not already in use (8081 here): sudo /usr/bin/java -Djava.awt.headless=true -jar /usr/share/jenkins/jenkins.war --webroot=/var/cache/jenkins/war --httpPort=8081
  3. Open http://localhost:8081/ in your browser (replace 8081 with the port you used).
  4. Run the following in your terminal: sudo gedit /root/.jenkins/secrets/initialAdminPassword. This file contains the password you need for the first login to Jenkins.
  5. A few plugins are installed during the initial login. Once that completes, you should be able to work with Jenkins.
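The start command in steps 2 and 3 can be sketched as a tiny script so the port is defined in one place (the paths are the Debian package defaults quoted above):

```shell
# Assumption: Jenkins was installed from the Debian package (step 1).
JENKINS_PORT=8081   # pick any port that is not already in use
JENKINS_WAR=/usr/share/jenkins/jenkins.war
WEBROOT=/var/cache/jenkins/war
START_CMD="sudo /usr/bin/java -Djava.awt.headless=true -jar $JENKINS_WAR --webroot=$WEBROOT --httpPort=$JENKINS_PORT"
echo "$START_CMD"
echo "Then browse to http://localhost:$JENKINS_PORT/"
```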

Lakshani GamageCreate Carbon Application Archive (CAR) File Using WSO2 Developer Studio

1. Start Developer Studio.
2. Go to Developer Studio > Open Dashboard menu.
3. Click on "ESB Config Project" under "Integration Project".
4. If you want to create the ESB project from existing configuration files, select "Point to Existing Synapse-configs Folder". Otherwise, select "New ESB Config Project". Then, click Next.



5. Create a project by giving a name to the project and maven information (Group Id, Artifact Id, Version).


6. Create the APIs, proxy services, sequences, tasks, etc. that you want to include in the ESB Config project.


7. Build your project. (mvn clean install)
8. Go to Developer Studio > Open Dashboard and  Click on "Composite Application Project" under "Distribution".
9. Give a name to your Composite Project and select all the dependencies that should be included in the CAR application. Here, I'm adding the ESB Config project that I created above to the CAR application. Then, click "Next".


10. Give the Maven information (Group Id, Artifact Id, Version) and click "Finish".
11. Click on pom.xml of the created composite application project and go to design view. Select the correct server role of your dependencies.


12. Right click on created composite application project and select "Export Composite Application Project".
13. Give a name, version and export destination to create the CAR app.



14. Now you will find the CAR app in the destination that you gave in the previous step.

You can deploy the created CAR file in WSO2 products as mentioned here.
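As an alternative to uploading through the management console, Carbon servers hot-deploy CAR files dropped into the carbonapps directory. The sketch below just prints the copy command; the product path and CAR file name are hypothetical.

```shell
# Assumption: standard Carbon directory layout; adjust for your product.
PRODUCT_HOME=/opt/wso2esb-5.0.0
CAR_FILE=MyCompositeApplication_1.0.0.car   # the file exported in step 13
TARGET="$PRODUCT_HOME/repository/deployment/server/carbonapps/"
echo "cp $CAR_FILE $TARGET"   # run this, then watch the server log for deployment
```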

Hariprasath ThanarajahDynamic Schema Generation for WSO2 ESB Connector's Dynamic operations

To use the Data Mapper with WSO2 ESB connectors, each connector ships JSON schemas for its (static) operations that define the message formats it expects and responds with. When you integrate connectors in a project, the Connector option searches the workspace and finds the available connectors; when you select a connector operation, the tooling loads the related JSON schema into the Data Mapper.

If the request payload or the response of a particular operation can vary, the schema must be generated dynamically. For example, in the Salesforce SOAP connector the sObjects, and the fields of a particular sObject, differ; the create operation's request carries different types and fields for each sObject type, so the schema cannot be static. This feature was created in the ESB developer tooling to address that.

Refer to the Data Mapper Mediator documentation for more details.

You can find the step-by-step installation guide for the dynamic schema generation feature for the Salesforce SOAP connector here.

I will explain this feature with a simple use case using the Salesforce SOAP connector.

The Use Case


Here I am going to explain the above use-case with ESB dev-tooling.

Requirements


  • WSO2 ESB - version 5.0 and above
  • WSO2 ESB Tooling - version 5.0 and above

First, create an ESB Solution Project in Developer Studio using New -> ESB Solution Project in the project explorer.

Give the name dynamicSchemaDemo, leave the rest as defaults, and click Finish.
(This also creates sub-projects for registry resources, a Composite Application project, and a Connector Exporter project.)

Then create a custom proxy service in which to configure the Data Mapper mediator: right-click dynamicSchemaDemo -> New -> Proxy Service, give the name dynamicDemo, select Custom Proxy as the proxy service type, and leave the rest as defaults.

Then add the connector: right-click dynamicSchemaDemo -> Add or Remove Connector -> select Add Connector -> Next. You can add the connector from your local file system or from the store.

After that, you can see the imported connector in the tool palette. If you click the connector in the palette then you can see the connector operations like in the figure.

After that, you can configure the message flow.

First, you need to create a payload like below to create a record in Salesforce Account sObject.

       <sfdc:sObjects xmlns:sfdc="sfdc" type="Account">
          <sfdc:sObject>
              <sfdc:Name>wso2123</sfdc:Name>
           </sfdc:sObject>
        </sfdc:sObjects>

From the create operation's response we build the input schema. This one is static, so it is already available with the connector zip.

After that, we need to upsert/update the same record by its Id. The request body of the upsert operation differs between sObjects, so the output schema has to be generated dynamically; we take the Id from the input schema and map it to that generated output schema, from which the request for the upsert operation is produced.

Configure the above operations like below

How to configure the PayloadFactory

Click the PayloadFactory; in the Properties tab, click Payload, paste the above payload, and click OK.

Configure the create operation

Click the create operation; in the Properties tab, right-click the create operation and click Load Parameters from Schema. The parameters are then loaded from the schema, and since this is the input side you can define the values as in the image below.
For the sobjects parameter, the value is taken from the payload using XPath, so you need to declare the namespace used in the XPath for sObjects. Specify the namespace for the prefix "sfdc" as follows: in the Namespaces area, add sfdc as both the prefix and the URI, then click OK.

 Configure the Datamapper

Click the Data Mapper, leave the defaults as they are, and click OK.
You will then see two boxes for defining the schemas. In the first box (the input schema), create the schema from the response of the create operation; the response is static, so the schema ships with the Salesforce SOAP connector zip. To do so:

Right-click the input box -> select CONNECTOR as the Resource type -> select the Salesforce connector -> select create as the Operation -> OK. The schema is then loaded on the input side.
On the output side, the schema must be generated dynamically, because the upsert request differs between sObjects.


Right-click the output box -> select CONNECTOR as the Resource type -> select the Salesforce connector -> select upsert as the Operation. A button called Generate Schema then appears next to the selected operation; click it.
A dialog like the one below appears.

Give the username, password, security token and the login URL, and click the Login button.

The SObject combo box then lists the sObject types in Salesforce. Select one (Account in this case) to build the upsert request payload for the Salesforce SOAP connector; from the response, the schema for the upsert operation is created dynamically.

So select Account as the SObject -> OK -> OK.

The schema for the upsert operation is then created on the output side, as below.

After that, map the values from the response of the create operation to the upsert operation, as below.

You also need to supply the required values for the properties in the schema by introducing constants from Common -> Constant in the palette.

To configure a constant value: right-click the constant -> Configure Constant Operator -> specify the constant type and the value -> OK.

In the above case, we mapped the Id from the create response to the Id of the upsert operation, and introduced constants for allOrNone, allowFieldTruncate, externalId, the sObject type, and the new name to update the record: 0, 0, Id, Account, and Hi Hariprasath respectively.


Configure the upsert operation

Click the upsert operation; in the Properties tab, right-click the upsert operation and click Load Parameters from Schema. The parameters are then loaded from the schema. You don't need to supply values for them, because they all come from the schema.

For the sobjects parameter, the value is again taken from the payload using XPath, so declare the namespace for the prefix "sfdc" as before: in the Namespaces area, add sfdc as both the prefix and the URI, then click OK.


Beyond the above, you have to create the init configuration for the Salesforce SOAP connector.

When you click a connector operation you can see its properties, as below.

Selecting the New Config option opens a dialog like the one below. Give the name sf_configuration, enter the login details in the boxes, and click OK.

If you need to use the configuration in another operation, just click Available Configs and select the existing configuration instead of entering the init config again and again.

That completes the use case. After that, include the connector in the project by adding it to the Connector Exporter project from the workspace.

Right click dynamicSchemaDemoConnectorExporter -> New -> Add/Remove Connectors -> Add Connector -> Next -> Click Workspace and select the connector -> OK -> Finish.

After that, create a Composite Application project to deploy to the ESB and run it: right-click dynamicSchemaDemoCompositeApplication -> Export Composite Application Project -> give the export destination -> click Next.

You will see all the artifacts, as below; select all of them to include in the CApp.

So select all the artifacts -> Finish.

Now we have the CApp to deploy as a Carbon Application to WSO2 ESB.

Download ESB 5.0.0 or later, extract it, go to {ESB Location}/bin and run ./wso2server.sh to start the ESB. Then open the management console at https://172.17.0.1:9443/carbon/ and log in with admin as both the username and password.

On the left you can see Carbon Applications. Click the Add button under Carbon Applications, browse to the CAR file you created before, then Open -> Upload.

The artifacts you created earlier are then deployed to the ESB.

Click List under Services. There you can see the proxy service you created earlier in the dev tooling. Click Try this service; you will see a page like below.




After that, just click Send. Then you can analyze the log from the terminal. It will look like below:

[2016-12-05 14:51:44,401] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "Content-Type: text/xml[\r][\n]"
[2016-12-05 14:51:44,401] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "SOAPAction: "urn:partner.soap.sforce.com/Soap/loginRequest"[\r][\n]"
[2016-12-05 14:51:44,401] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "Content-Length: 343[\r][\n]"
[2016-12-05 14:51:44,402] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "Host: login.salesforce.com[\r][\n]"
[2016-12-05 14:51:44,402] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "Connection: Keep-Alive[\r][\n]"
[2016-12-05 14:51:44,402] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "User-Agent: Synapse-PT-HttpComponents-NIO[\r][\n]"
[2016-12-05 14:51:44,402] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "[\r][\n]"
[2016-12-05 14:51:44,402] DEBUG - wire HTTPS-Sender I/O dispatcher-1 << "<?xml version='1.0' encoding='UTF-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:partner.soap.sforce.com"><soapenv:Body><urn:login><urn:username>tharis63@outlook.com</urn:username><urn:password>hariprasath6@THZNCpIbfKWKqXnIdb5qYGopEeo</urn:password></urn:login></soapenv:Body></soapenv:Envelope>"
[2016-12-05 14:51:44,727] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "HTTP/1.1 200 OK[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "Date: Mon, 05 Dec 2016 09:21:44 GMT[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "Strict-Transport-Security: max-age=31536000; includeSubDomains[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "Set-Cookie: BrowserId=4Z_NPJrkR4O_TuxLD5Rmkg;Path=/;Domain=.salesforce.com;Expires=Fri, 03-Feb-2017 09:21:44 GMT[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "Expires: Thu, 01 Jan 1970 00:00:00 GMT[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "Content-Type: text/xml;charset=UTF-8[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "Content-Length: 1705[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "[\r][\n]"
[2016-12-05 14:51:44,728] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "<?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><soapenv:Body><loginResponse><result><metadataServerUrl>https://ap2.salesforce.com/services/Soap/m/37.0/00D280000017q6q</metadataServerUrl><passwordExpired>false</passwordExpired><sandbox>false</sandbox><serverUrl>https://ap2.salesforce.com/services/Soap/u/37.0/00D280000017q6q</serverUrl><sessionId>00D280000017q6q!AQoAQN__fWlhfT0lpmQ95lYzR0JUDcKxfqMmfz8YDcE5PCAof0w5.9vV2UujTU5oOJTD90CHRfCKQtN7P0dFKiq5CzC_zg_V</sessionId><userId>00528000001m5RRAAY</userId><userInfo><accessibilityMode>false</accessibilityMode><currencySymbol>$</currencySymbol><orgAttachmentFileSizeLimit>5242880</orgAttachmentFileSizeLimit><orgDefaultCurrencyIsoCode>USD</orgDefaultCurrencyIsoCode><orgDisallowHtmlAttachments>false</orgDisallowHtmlAttachments><orgHasPersonAccounts>false</orgHasPersonAccounts><organizationId>00D280000017q6qEAA</organ"
[2016-12-05 14:51:44,732] DEBUG - wire HTTPS-Sender I/O dispatcher-1 >> "izationId><organizationMultiCurrency>false</organizationMultiCurrency><organizationName>wso2</organizationName><profileId>00e28000001C79cAAC</profileId><roleId xsi:nil="true"/><sessionSecondsValid>7200</sessionSecondsValid><userDefaultCurrencyIsoCode xsi:nil="true"/><userEmail>tharis63@outlook.com</userEmail><userFullName>Hariprasath Thanarajah</userFullName><userId>00528000001m5RRAAY</userId><userLanguage>en_US</userLanguage><userLocale>en_US</userLocale><userName>tharis63@outlook.com</userName><userTimeZone>America/Los_Angeles</userTimeZone><userType>Standard</userType><userUiSkin>Theme3</userUiSkin></userInfo></result></loginResponse></soapenv:Body></soapenv:Envelope>"
[2016-12-05 14:51:45,387] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "POST /services/Soap/u/37.0/00D280000017q6q HTTP/1.1[\r][\n]"
[2016-12-05 14:51:45,387] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Strict-Transport-Security: max-age=31536000; includeSubDomains[\r][\n]"
[2016-12-05 14:51:45,387] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Expires: Thu, 01 Jan 1970 00:00:00 GMT[\r][\n]"
[2016-12-05 14:51:45,387] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Set-Cookie: BrowserId=4Z_NPJrkR4O_TuxLD5Rmkg;Path=/;Domain=.salesforce.com;Expires=Fri, 03-Feb-2017 09:21:44 GMT[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Content-Type: text/xml[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "SOAPAction: "urn:partner.soap.sforce.com/Soap/createRequest"[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Transfer-Encoding: chunked[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Host: ap2.salesforce.com[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Connection: Keep-Alive[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "User-Agent: Synapse-PT-HttpComponents-NIO[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "337[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "<?xml version='1.0' encoding='UTF-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:partner.soap.sforce.com"><soapenv:Header><urn:AllOrNoneHeader><urn:allOrNone>0</urn:allOrNone></urn:AllOrNoneHeader><urn:AllowFieldTruncationHeader><urn:allowFieldTruncation>0</urn:allowFieldTruncation></urn:AllowFieldTruncationHeader><urn:SessionHeader><urn:sessionId>00D280000017q6q!AQoAQN__fWlhfT0lpmQ95lYzR0JUDcKxfqMmfz8YDcE5PCAof0w5.9vV2UujTU5oOJTD90CHRfCKQtN7P0dFKiq5CzC_zg_V</urn:sessionId></urn:SessionHeader></soapenv:Header><soapenv:Body><urn:create><urn:sObjects><urn1:type xmlns:urn1="urn:sobject.partner.soap.sforce.com">Account</urn1:type><urn1:Name xmlns:urn1="urn:sobject.partner.soap.sforce.com">wso2123</urn1:Name></urn:sObjects></urn:create></soapenv:Body></soapenv:Envelope>[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "0[\r][\n]"
[2016-12-05 14:51:45,388] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "[\r][\n]"
[2016-12-05 14:51:45,676] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "HTTP/1.1 200 OK[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Date: Mon, 05 Dec 2016 09:21:45 GMT[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Set-Cookie: BrowserId=kDZV-HTlTgGKWV_guyv7eQ;Path=/;Domain=.salesforce.com;Expires=Fri, 03-Feb-2017 09:21:45 GMT[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Expires: Thu, 01 Jan 1970 00:00:00 GMT[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Content-Type: text/xml;charset=UTF-8[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Transfer-Encoding: chunked[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "1C6[\r][\n]"
[2016-12-05 14:51:45,677] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "<?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com"><soapenv:Header><LimitInfoHeader><limitInfo><current>8</current><limit>15000</limit><type>API REQUESTS</type></limitInfo></LimitInfoHeader></soapenv:Header><soapenv:Body><createResponse><result><id>00128000013bxBkAAI</id><success>true</success></result></createResponse></soapenv:Body></soapenv:Envelope>[\r][\n]"
[2016-12-05 14:51:45,678] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "0[\r][\n]"
[2016-12-05 14:51:45,678] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "[\r][\n]"
[2016-12-05 14:51:45,680]  INFO - DependencyTracker Local entry : gov:datamapper/NewConfig.dmc was added to the Synapse configuration successfully
[2016-12-05 14:51:45,684]  INFO - DependencyTracker Local entry : gov:datamapper/NewConfig_inputSchema.json was added to the Synapse configuration successfully
[2016-12-05 14:51:45,685]  INFO - DependencyTracker Local entry : gov:datamapper/NewConfig_outputSchema.json was added to the Synapse configuration successfully
[2016-12-05 14:51:45,880] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "POST /services/Soap/u/37.0/00D280000017q6q HTTP/1.1[\r][\n]"
[2016-12-05 14:51:45,880] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Expires: Thu, 01 Jan 1970 00:00:00 GMT[\r][\n]"
[2016-12-05 14:51:45,880] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Set-Cookie: BrowserId=kDZV-HTlTgGKWV_guyv7eQ;Path=/;Domain=.salesforce.com;Expires=Fri, 03-Feb-2017 09:21:45 GMT[\r][\n]"
[2016-12-05 14:51:45,880] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Content-Type: text/xml[\r][\n]"
[2016-12-05 14:51:45,880] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "SOAPAction: "urn:partner.soap.sforce.com/Soap/upsertRequest"[\r][\n]"
[2016-12-05 14:51:45,880] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Transfer-Encoding: chunked[\r][\n]"
[2016-12-05 14:51:45,880] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Host: ap2.salesforce.com[\r][\n]"
[2016-12-05 14:51:45,881] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "Connection: Keep-Alive[\r][\n]"
[2016-12-05 14:51:45,881] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "User-Agent: Synapse-PT-HttpComponents-NIO[\r][\n]"
[2016-12-05 14:51:45,881] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "[\r][\n]"
[2016-12-05 14:51:45,881] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "3c9[\r][\n]"
[2016-12-05 14:51:45,881] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "<?xml version='1.0' encoding='UTF-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:urn="urn:partner.soap.sforce.com"><soapenv:Header><urn:AllOrNoneHeader><urn:allOrNone>0</urn:allOrNone></urn:AllOrNoneHeader><urn:AllowFieldTruncationHeader><urn:allowFieldTruncation>0</urn:allowFieldTruncation></urn:AllowFieldTruncationHeader><urn:SessionHeader><urn:sessionId>00D280000017q6q!AQoAQN__fWlhfT0lpmQ95lYzR0JUDcKxfqMmfz8YDcE5PCAof0w5.9vV2UujTU5oOJTD90CHRfCKQtN7P0dFKiq5CzC_zg_V</urn:sessionId></urn:SessionHeader></soapenv:Header><soapenv:Body><urn:upsert><urn:externalIDFieldName>Id</urn:externalIDFieldName><urn:sObjects><urn1:type xmlns:urn1="urn:sobject.partner.soap.sforce.com">Account</urn1:type><urn1:Id xmlns:urn1="urn:sobject.partner.soap.sforce.com">00128000013bxBkAAI</urn1:Id><urn1:Name xmlns:urn1="urn:sobject.partner.soap.sforce.com">Hi Hariprasath</urn1:Name></urn:sObjects></urn:upsert></soapenv:Body></soapenv:Envelope>[\r][\n]"
[2016-12-05 14:51:45,881] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "0[\r][\n]"
[2016-12-05 14:51:45,881] DEBUG - wire HTTPS-Sender I/O dispatcher-2 << "[\r][\n]"
[2016-12-05 14:51:48,103] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "HTTP/1.1 200 OK[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Date: Mon, 05 Dec 2016 09:21:45 GMT[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Set-Cookie: BrowserId=3p628kfQSK2VnduWHofmXg;Path=/;Domain=.salesforce.com;Expires=Fri, 03-Feb-2017 09:21:46 GMT[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Expires: Thu, 01 Jan 1970 00:00:00 GMT[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Content-Type: text/xml;charset=UTF-8[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "Transfer-Encoding: chunked[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "1DE[\r][\n]"
[2016-12-05 14:51:48,104] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "<?xml version="1.0" encoding="UTF-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com"><soapenv:Header><LimitInfoHeader><limitInfo><current>8</current><limit>15000</limit><type>API REQUESTS</type></limitInfo></LimitInfoHeader></soapenv:Header><soapenv:Body><upsertResponse><result><created>false</created><id>00128000013bxBkAAI</id><success>true</success></result></upsertResponse></soapenv:Body></soapenv:Envelope>[\r][\n]"
[2016-12-05 14:51:48,114] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "HTTP/1.1 200 OK[\r][\n]"
[2016-12-05 14:51:48,116] DEBUG - header << "HTTP/1.1 200 OK[\r][\n]"
[2016-12-05 14:51:48,116] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "Expires: Thu, 01 Jan 1970 00:00:00 GMT[\r][\n]"
[2016-12-05 14:51:48,117] DEBUG - header << "HTTP/1.1 200 OK[\r][\n]"
[2016-12-05 14:51:48,117] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "Set-Cookie: BrowserId=3p628kfQSK2VnduWHofmXg;Path=/;Domain=.salesforce.com;Expires=Fri, 03-Feb-2017 09:21:46 GMT[\r][\n]"
[2016-12-05 14:51:48,117] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "Content-Type: text/xml;charset=UTF-8[\r][\n]"
[2016-12-05 14:51:48,117] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "Date: Mon, 05 Dec 2016 09:21:48 GMT[\r][\n]"
[2016-12-05 14:51:48,117] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "Transfer-Encoding: chunked[\r][\n]"
[2016-12-05 14:51:48,117] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "[\r][\n]"
[2016-12-05 14:51:48,117] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "1de[\r][\n]"
[2016-12-05 14:51:48,118] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "<?xml version='1.0' encoding='UTF-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com"><soapenv:Header><LimitInfoHeader><limitInfo><current>8</current><limit>15000</limit><type>API REQUESTS</type></limitInfo></LimitInfoHeader></soapenv:Header><soapenv:Body><upsertResponse><result><created>false</created><id>00128000013bxBkAAI</id><success>true</success></result></upsertResponse></soapenv:Body></soapenv:Envelope>[\r][\n]"
[2016-12-05 14:51:48,118] DEBUG - header << "Expires: Thu, 01 Jan 1970 00:00:00 GMT[\r][\n]"
[2016-12-05 14:51:48,118] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "0[\r][\n]"
[2016-12-05 14:51:48,118] DEBUG - header << "Set-Cookie: BrowserId=3p628kfQSK2VnduWHofmXg;Path=/;Domain=.salesforce.com;Expires=Fri, 03-Feb-2017 09:21:46 GMT[\r][\n]"
[2016-12-05 14:51:48,119] DEBUG - wire HTTP-Listener I/O dispatcher-1 << "[\r][\n]"
[2016-12-05 14:51:48,119] DEBUG - header << "Content-Type: text/xml;charset=UTF-8[\r][\n]"
[2016-12-05 14:51:48,120] DEBUG - header << "Date: Mon, 05 Dec 2016 09:21:48 GMT[\r][\n]"
[2016-12-05 14:51:48,120] DEBUG - header << "Transfer-Encoding: chunked[\r][\n]"
[2016-12-05 14:51:48,120] DEBUG - header << "[\r][\n]"
[2016-12-05 14:51:48,124] DEBUG - content << "1"
[2016-12-05 14:51:48,124] DEBUG - content << "d"
[2016-12-05 14:51:48,124] DEBUG - content << "e"
[2016-12-05 14:51:48,124] DEBUG - content << "[\r]"
[2016-12-05 14:51:48,124] DEBUG - content << "[\n]"
[2016-12-05 14:51:48,124] DEBUG - content << "<"
[2016-12-05 14:51:48,124] DEBUG - content << "?xm"
[2016-12-05 14:51:48,125] DEBUG - content << "l version='1.0' encoding='UTF-8'?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com"><soapenv:Header><LimitInfoHeader><limitInfo><current>8</current><limit>15000</limit><type>API REQUESTS</type></limitInfo></LimitInfoHeader></soapenv:Header><soapenv:Body><upsertResponse><result><created>false</created><id>00128000013bxBkAAI</id><success>true</success></result></upsertResponse></soapenv:Body></soapenv:Envelope>"
[2016-12-05 14:51:48,125] DEBUG - content << "[\r]"
[2016-12-05 14:51:48,126] DEBUG - content << "[\n]"
[2016-12-05 14:51:48,126] DEBUG - content << "0"
[2016-12-05 14:51:48,126] DEBUG - content << "[\r]"
[2016-12-05 14:51:48,126] DEBUG - content << "[\n]"
[2016-12-05 14:51:48,126] DEBUG - content << "[\r]"
[2016-12-05 14:51:48,126] DEBUG - content << "[\n]"
[2016-12-05 14:51:48,126] DEBUG - header << "[\r][\n]"
[2016-12-05 14:51:48,268] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "0[\r][\n]"
[2016-12-05 14:51:48,269] DEBUG - wire HTTPS-Sender I/O dispatcher-2 >> "[\r][\n]"

You can go to your Salesforce account and see the changes made by the use case, as below.

That's it. You should now understand dynamic schema generation in the dev tooling, having run a simple use case combining a connector, the Data Mapper, and dev tooling on WSO2 ESB.

Sohani Weerasinghe


How to use JAVA Flight Recorder (JFR) with WSO2 Products

This blog post is about using JFR with WSO2 products. JFR collects events from a Java application, from the OS layer through the JVM all the way up to the application itself. The collected events include thread latency events such as sleep, wait, lock contention, I/O, GC, and method profiling. JFR is an effective tool for long-running tests.

  • Enable JFR by adding the parameters below in the wso2server.sh file (under $JAVACMD) located at <PRODUCT_HOME>/bin
          -XX:+UnlockCommercialFeatures \
          -XX:+FlightRecorder \
  • Then start the server
  • Then start JMC (Java Mission Control) by running the executable located at JAVA_HOME/bin/jmc
  • Then you can see org.wso2.carbon.bootstrap.Bootstrap in the UI; right-click it and click Start Flight Recording. Make the relevant changes and click Finish.
  • Then run the load test
  • After the specified time you will get a .jfr file, in which you can see memory growth, CPU usage, etc.
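A recording can also be started without the JMC UI, using jcmd from the same JDK. The sketch below just prints the command to run; the PID is a placeholder, and the JFR.start options assume an Oracle JDK 8 with commercial features unlocked as above.

```shell
# Assumption: the server was started with the two -XX flags above (Oracle JDK 8).
PID=12345   # placeholder - replace with the WSO2 server's Java process id (e.g. from jps)
DURATION=60s
OUTFILE=wso2-recording.jfr
JFR_CMD="jcmd $PID JFR.start duration=$DURATION filename=$OUTFILE"
echo "$JFR_CMD"   # open the resulting .jfr file in Java Mission Control
```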


Hasunie AdikariInstalling Nginx in MAC OS


You can easily install Nginx with Homebrew; see http://brew.sh/ for details.

Installation

  • Install brew.
          Command: /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
          You can copy the command from the brew site and paste it into the terminal.
  • Then run the brew command.
           Command: brew
  • Update brew.
          Command: brew update
  • Install Nginx with brew.
           Command: brew install nginx
  • After installing Nginx, run it with:
           Command: sudo nginx

Testing

       Test Nginx by going to http://localhost:8080 in your browser
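You can run the same check from the terminal. The sketch below just prints a curl command that reports only the HTTP status code (curl's -s/-o/-w options suppress everything else):

```shell
# Assumption: nginx was started with `sudo nginx` and still listens on 8080.
URL=http://localhost:8080
CHECK="curl -s -o /dev/null -w %{http_code} $URL"
echo "$CHECK"   # running this should print 200 while nginx is up
```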
     

Configuration

       After installing with brew, the default location of nginx.conf on Mac is:

       /usr/local/etc/nginx/nginx.conf


       You can change the default port 8080 to 80. First, stop the server if it's already running.
     
        sudo nginx -s stop
     
        vim /usr/local/etc/nginx/nginx.conf

        From:

        server {
            listen       8080;
            server_name  localhost;

            #access_log  logs/host.access.log  main;

            location / {
                root   html;
                index  index.html index.htm;
            }
        }

        To:

        server {
            listen       80;
            server_name  localhost;

            #access_log  logs/host.access.log  main;

            location / {
                root   html;
                index  index.html index.htm;
            }
        }
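After editing nginx.conf, it is safer to validate the file before reloading: `nginx -t` tests the configuration and `nginx -s reload` applies it without a full stop/start. The sketch below just prints the command (the config path is brew's default shown above):

```shell
CONF=/usr/local/etc/nginx/nginx.conf   # brew's default location (see above)
RELOAD="sudo nginx -t -c $CONF && sudo nginx -s reload"
echo "$RELOAD"   # the reload only runs if the syntax check passes
```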


Anupama PathirageHow to Read file Stored in Registry - WSO2 ESB

Sometimes we need to read file content in the mediation flow of WSO2 ESB.

Let's say we have a file named EndPoints.xml, with the content below, at the registry path /_system/config/repository/demo.

File Content :



<EndPointsList xmlns:ns1="http://endpoints">
<EP>www.google.com</EP>
<EP>www.yahoo.com</EP>
</EndPointsList>


Registry Path:



Sample Proxy:

In this proxy service, the file named EndPoints.xml is read and content is printed using log mediator.


<?xml version="1.0" encoding="UTF-8"?>
<proxy xmlns="http://ws.apache.org/ns/synapse"
       name="TestFileReadProxy"
       transports="https,http"
       statistics="disable"
       trace="disable"
       startOnLoad="true">
   <target>
      <inSequence>
         <property name="EndPointList"
                   expression="get-property('registry','conf:/repository/demo/EndPoints.xml')"
                   scope="default"
                   type="OM"/>
         <foreach xmlns:nm="http://endpoints"
                  id="foreach_1"
                  expression="$ctx:EndPointList//EP">
            <sequence>
               <log level="custom">
                  <property name="EP:" expression="//EP"/>
               </log>
            </sequence>
         </foreach>
         <respond/>
      </inSequence>
   </target>
   <description/>
</proxy>



Log Output:



TID: [-1234] [] [2016-12-02 11:58:13,964]  INFO {org.apache.synapse.mediators.builtin.LogMediator} -  EP: = www.google.com {org.apache.synapse.mediators.builtin.LogMediator}
TID: [-1234] [] [2016-12-02 11:58:13,966] INFO {org.apache.synapse.mediators.builtin.LogMediator} - EP: = www.yahoo.com {org.apache.synapse.mediators.builtin.LogMediator}

Anupama PathirageWSO2 ESB 5.0 DB Configuration with ESB Analytics

Following are the databases used with ESB 5.0 and ESB analytics 5.0


  • WSO2_CARBON_DB - Local registry space which is specific to each ESB instance.
  • WSO2UM_DB - User Manager Database which stores information related to users and user roles.
  • WSO2REG_DB - Registry database which is a content store and a metadata repository for SOA artifacts
  • WSO2_ANALYTICS_EVENT_STORE_DB - Analytics Record Store which stores event definitions 
  • WSO2_ANALYTICS_PROCESSED_DATA_STORE_DB - Analytics Record Store which stores processed data
  • WSO2_METRICS_DB - Metrics database which stores Carbon metrics.
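Each of these logical databases is wired up as a datasource entry in <PRODUCT_HOME>/repository/conf/datasources/master-datasources.xml. A sketch of one such entry, assuming MySQL (the URL, database name, and credentials below are placeholders, not values from this post):

```xml
<datasource>
    <name>WSO2UM_DB</name>
    <description>User Manager database</description>
    <jndiConfig>
        <name>jdbc/WSO2UM_DB</name>
    </jndiConfig>
    <definition type="RDBMS">
        <configuration>
            <url>jdbc:mysql://localhost:3306/um_db</url>
            <username>dbuser</username>
            <password>dbpassword</password>
            <driverClassName>com.mysql.jdbc.Driver</driverClassName>
            <maxActive>50</maxActive>
            <testOnBorrow>true</testOnBorrow>
            <validationQuery>SELECT 1</validationQuery>
        </configuration>
    </definition>
</datasource>
```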



Anupama PathirageHow to use WSO2 ESB Enrich Mediator To Remove Elements from Payload.

The Enrich mediator can process a message based on a given source configuration and then perform the specified action on the message by using the target configuration. In this example, it is used to remove elements from the message payload: we find the jsonObject elements that contain jsonArray elements and replace each such jsonObject with its inner jsonArray.

Sample Request:


<root:rootelement xmlns:root="www.test.com">
   <jsonObject xmlns="http://ws.apache.org/ns/synapse">
      <jsonArray>
         <jsonElement>
            <account_name>XYZ</account_name>
            <account_id>20</account_id>
         </jsonElement>
      </jsonArray>
   </jsonObject>
   <jsonObject>
      <account_name>DEF</account_name>
      <account_id>22</account_id>
   </jsonObject>
   <jsonObject xmlns="http://ws.apache.org/ns/synapse">
      <jsonArray>
         <jsonElement>
            <account_name>PQR</account_name>
            <account_id>10</account_id>
         </jsonElement>
         <jsonElement>
            <account_name>JKL</account_name>
            <account_id>11</account_id>
         </jsonElement>
         <jsonElement>
            <account_name>QWE</account_name>
            <account_id>12</account_id>
         </jsonElement>
      </jsonArray>
   </jsonObject>
   <jsonObject>
      <account_name>ABC</account_name>
      <account_id>42</account_id>
   </jsonObject>
</root:rootelement>

Sample Response:


<root:rootelement xmlns:root="www.test.com">
   <jsonArray xmlns="http://ws.apache.org/ns/synapse">
      <jsonElement>
         <account_name>XYZ</account_name>
         <account_id>20</account_id>
      </jsonElement>
   </jsonArray>
   <jsonObject>
      <account_name>DEF</account_name>
      <account_id>22</account_id>
   </jsonObject>
   <jsonArray xmlns="http://ws.apache.org/ns/synapse">
      <jsonElement>
         <account_name>PQR</account_name>
         <account_id>10</account_id>
      </jsonElement>
      <jsonElement>
         <account_name>JKL</account_name>
         <account_id>11</account_id>
      </jsonElement>
      <jsonElement>
         <account_name>QWE</account_name>
         <account_id>12</account_id>
      </jsonElement>
   </jsonArray>
   <jsonObject>
      <account_name>ABC</account_name>
      <account_id>42</account_id>
   </jsonObject>
</root:rootelement>


Example Proxy Service:

<proxy xmlns="http://ws.apache.org/ns/synapse"
       name="TestXPath"
       transports="https,http"
       statistics="disable"
       trace="disable"
       startOnLoad="true">
   <target>
      <inSequence>
         <foreach expression="//*[local-name()='jsonObject']">
            <sequence>
               <filter xpath="boolean(//*[local-name()='jsonObject']/*[name()='jsonArray'])">
                  <then>
                     <enrich>
                        <source clone="true" xpath="//*[local-name()='jsonArray']"/>
                        <target type="body"/>
                     </enrich>
                  </then>
               </filter>
            </sequence>
         </foreach>
         <respond/>
      </inSequence>
      <outSequence>
         <send/>
      </outSequence>
   </target>
   <description/>
</proxy>

Lakshani Gamage[WSO2 App Manager] Favorite Apps

In your app store, there may be a large number of apps. In such a situation, you may need to search for apps by name, provider, or business owner.


But if you use only a few apps from the store frequently, having to search for them every time is an extra effort.

In WSO2 App Manager 1.2.0, there is a new feature called "Favorite Apps". That can be used to set/unset apps as favorite apps.

Now, Let's see how to set an app as a favorite app.


How to set an app as a favorite app?

First, log into the App Store. Then, click the button (with three dots) in the bottom right corner of the app that you want, and click "Add to Favorites". The app then appears in your favorites list. Favorite apps are displayed with a flag, as shown below.


If you navigate to the "Favorites" tab, you will see all your favorite apps, as shown below.



How to remove an app from the favorite list?

Click on the button in the bottom right corner of the app. Then, click on "Remove from Favorites".



How to set the "Favorites" page as home page ?

Navigate to the "Favorites" page. Click on the gear icon shown in the image and select "Set this page as home".


If you want to revert above setting, click on the gear icon and select "Remove from home page".


That's all. Enjoy WSO2 App Manager.  :) 

Dimuthu De Lanerolle


XACML Architecture


1. It's an access control policy language.
2. The Identity Server supports XACML 3.0, which is based on the Balana XACML implementation.
3. The XACML engine of the WSO2 Identity Server has two major components, i.e., the PAP (Policy Administration Point) and the PDP (Policy Decision Point).


Eg: 

MyPolicy.xml
==========

<Policy xmlns="urn:oasis:names:tc:xacml:3.0:core:schema:wd-17" PolicyId="MyPolicy"
        RuleCombiningAlgId="urn:oasis:names:tc:xacml:3.0:rule-combining-algorithm:deny-overrides" Version="1.0">
   <Target>
      <AnyOf>
         <AllOf>
            <Match MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
               <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">read</AttributeValue>
               <AttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id"
                                    Category="urn:oasis:names:tc:xacml:3.0:attribute-category:action"
                                    DataType="http://www.w3.org/2001/XMLSchema#string" MustBePresent="true"/>
            </Match>
         </AllOf>
      </AnyOf>
   </Target>
   <Rule Effect="Permit" RuleId="permit"/>
</Policy>


Request 
=======

https://localhost:9443/entitlement/Decision/pdp

Authorization         Basic YWRtaW46YWRtaW4=
Accept                   application/xml
Content-Type        application/xml


<Request CombinedDecision="false" ReturnPolicyIdList="false" xmlns="urn:oasis:names:tc:xacml:3.0:core:schema:wd-17">
    <Attributes Category="urn:oasis:names:tc:xacml:3.0:attribute-category:action">
        <Attribute AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id" IncludeInResult="false">
            <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">read</AttributeValue>
        </Attribute>
    </Attributes>
    <Attributes Category="urn:oasis:names:tc:xacml:3.0:attribute-category:resource">
        <Attribute AttributeId="urn:oasis:names:tc:xacml:1.0:resource:resource-id" IncludeInResult="false">
            <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">http://127.0.0.1/service/very_secure/</AttributeValue>
        </Attribute>
    </Attributes>
</Request>


Response
========

<Response xmlns="urn:oasis:names:tc:xacml:3.0:core:schema:wd-17">
    <Result>
        <Decision>Permit</Decision>
        <Status>
            <StatusCode Value="urn:oasis:names:tc:xacml:1.0:status:ok"/>
        </Status>
    </Result>
</Response>
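The same request can be sent from the command line with curl. The sketch below only builds and prints the command rather than executing it, since running it needs a live Identity Server on localhost:9443 (the -k flag skips certificate verification for the default self-signed certificate; the request body is abbreviated to the action attribute):

```shell
# Write an abbreviated XACML request to a temp file.
req=$(mktemp)
cat > "$req" <<'EOF'
<Request CombinedDecision="false" ReturnPolicyIdList="false"
         xmlns="urn:oasis:names:tc:xacml:3.0:core:schema:wd-17">
    <Attributes Category="urn:oasis:names:tc:xacml:3.0:attribute-category:action">
        <Attribute AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id" IncludeInResult="false">
            <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">read</AttributeValue>
        </Attribute>
    </Attributes>
</Request>
EOF

# admin:admin base64-encodes to the Authorization header shown above.
cmd="curl -s -k -X POST https://localhost:9443/entitlement/Decision/pdp \
  -H 'Authorization: Basic YWRtaW46YWRtaW4=' \
  -H 'Content-Type: application/xml' \
  -H 'Accept: application/xml' \
  --data @$req"

echo "$cmd"
```

Run the printed command against a running Identity Server to get the <Response> shown above.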

Charini NanayakkaraSVN Partial Checkout: [Helpful when a child pom refers to a parent]

Assume you want a sub-module of a large project structure for a maven build. Checking out the entire project structure is wasteful, since it consumes disk space unnecessarily. However, what if the sub-module's POM refers to a parent POM? What if several sub-modules are needed, but not the entire project? In such instances, an svn partial checkout is helpful, since it allows us to maintain the high-level project structure: only the needed sub-modules are checked out completely, while the skeleton of the rest is retained.

Say you want to check out module x, where the svn location of x appears as follows:

https://svn.abc.com/abc/custom/projects/turing/platform/trunk/components/org.abc.module.mgt/x/

Assume that the POM of x refers to the parent POM in platform. Then you'd want the complete project structure starting from platform. This can be achieved with the following steps:
  1. svn co --depth immediates https://svn.abc.com/abc/custom/projects/turing/platform/
  2. cd platform/trunk
  3. svn up --set-depth immediates 
  4. cd components
  5. svn up --set-depth immediates
  6. cd org.abc.module.mgt
  7. svn up --set-depth immediates
  8. cd x
  9. svn up --set-depth infinity
What happens here is as follows. In step 1, we check out the top-level directory structure of "platform". This creates a folder named "trunk" within "platform" (and any other directories in "platform", if present). If you look inside the "trunk" directory, however, you will see that it's empty. So go into the "trunk" directory and run "svn up --set-depth immediates" to complete the top-level directory structure within "trunk". Continue to fetch the top-level directory structures by visiting the directories sequentially (the directories on the path "platform/trunk/components/org.abc.module.mgt/x/"). You will notice that the POM files are also checked out from svn at each directory level. Finally, when you reach the directory you require (here it's "x"), run the command "svn up --set-depth infinity". This ensures that all the content within "x" is checked out (not just the top-level structure).

Now you can easily run mvn clean install from within the x directory without encountering "cannot find parent POM" issues (you may have to build the project from within the platform directory once).
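The step sequence above is mechanical, so it can be generated. A small dry-run sketch that, given the top directory URL and the nested module path, prints the commands to run (it only prints them; the URL and path here are the hypothetical ones from the example, and you would paste the output into a terminal with svn installed):

```shell
# Print the sparse-checkout command sequence for a nested module path.
# $1 = repo URL of the top directory, $2 = relative path to the module.
sparse_checkout_plan() {
    url=$1; path=$2
    echo "svn co --depth immediates $url"
    # Split the path into components and descend one level at a time.
    set -- $(printf '%s\n' "$path" | tr '/' ' ')
    while [ $# -gt 1 ]; do
        echo "cd $1"
        echo "svn up --set-depth immediates"
        shift
    done
    # The final component gets the full checkout.
    echo "cd $1"
    echo "svn up --set-depth infinity"
}

sparse_checkout_plan \
    "https://svn.abc.com/abc/custom/projects/turing/platform/" \
    "trunk/components/org.abc.module.mgt/x"
```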

Dilshani SubasingheError: unable to write 'random state'

Environment:  Ubuntu 15.10

Situation: Generating Open SSL keys

Error:
 unable to write 'random state'  

Analysis:

This happens because the .rnd file in the home directory is owned by root rather than the current user account. It can be resolved by changing the ownership of the file to the current user.

Solution:

Identify the current user
 echo $USER  

Give permissions
 sudo chown user:user ~/.rnd  

* Replace user with the current user's username
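To confirm the fix worked, compare the file's owner with the current user. A small sketch, demonstrated on a throwaway file since chown on ~/.rnd needs sudo (stat -c is the GNU/Linux form; on macOS use stat -f '%Su' instead):

```shell
# Demonstrated on a temp file; for the real case substitute ~/.rnd for "$f".
f=$(mktemp)

owner=$(stat -c '%U' "$f")   # file owner (GNU stat)
me=$(id -un)                 # current user

if [ "$owner" = "$me" ]; then
    echo "ownership OK - openssl can write its random state"
else
    echo "still owned by $owner - run: sudo chown $me:$me ~/.rnd"
fi
```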





Anupama PathirageConfigure Cipher Suites for WSO2 Products

To configure the required cipher suites, add the ciphers attribute to the HTTPS connector configuration in the catalina-server.xml file. A comma-separated list of the ciphers that the server should support is specified there as follows.

ciphers="<cipher-name>,<cipher-name>"



Following are the recommended cipher suites to use with TLS 1.2

Java 8 with JCE Unlimited Strength Jurisdiction Policy

TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,
TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,
TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384,
TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,
TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,
TLS_DHE_RSA_WITH_AES_128_GCM_SHA256,
TLS_DHE_RSA_WITH_AES_256_GCM_SHA384,
TLS_DHE_RSA_WITH_AES_128_CBC_SHA256,
TLS_DHE_RSA_WITH_AES_256_CBC_SHA256,
TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA,
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA,
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA,
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA,
TLS_DHE_RSA_WITH_AES_128_CBC_SHA,
TLS_DHE_RSA_WITH_AES_256_CBC_SHA

Java 7 with JCE Unlimited Strength Jurisdiction Policy

TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384,
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,
TLS_DHE_RSA_WITH_AES_128_CBC_SHA256,
TLS_DHE_RSA_WITH_AES_256_CBC_SHA256,
TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA,
TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA,
TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA,
TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA,
TLS_DHE_RSA_WITH_AES_128_CBC_SHA,
TLS_DHE_RSA_WITH_AES_256_CBC_SHA

The only difference between the above two groups is that Java 7 doesn't include the GCM-based ciphers, since GCM support was added only in Java 8.
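Putting it together, the ciphers attribute sits on the HTTPS Connector element in catalina-server.xml. A trimmed sketch (other Connector attributes are elided, and the cipher list is abbreviated to two entries from the Java 8 list above for readability):

```xml
<Connector protocol="org.apache.coyote.http11.Http11NioProtocol"
           port="9443"
           scheme="https"
           secure="true"
           sslEnabledProtocols="TLSv1.2"
           ciphers="TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384"/>
```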

References:

[1] https://docs.wso2.com/display/ADMIN44x/Configuring+Transport+Level+Security
[2] https://docs.wso2.com/display/ADMIN44x/Supported+Cipher+Suites

Anupama PathirageEnable TLS 1.2 for WSO2 Services

The following configuration changes need to be done in the service to enable TLS 1.2 only.
  • Enforce TLS 1.2 for the servlet transport, i.e., port 9443. Do the following in the <PRODUCT_HOME>/repository/conf/tomcat/catalina-server.xml file.
    • Find the Connector configuration corresponding to TLS (usually, this connector has the port set to 9443 and the sslProtocol as TLS). Remove the sslProtocol="TLS" attribute and replace it with sslEnabledProtocols="TLSv1.2".
        <Connector protocol="org.apache.coyote.http11.Http11NioProtocol"
                   port="9443"
                   bindOnInit="false"
                   sslEnabledProtocols="TLSv1.2"
  • Enforce TLS 1.2 for the PassThrough transport, i.e., port 8243 (e.g., in ESB). Do the following in the <PRODUCT_HOME>/repository/conf/axis2/axis2.xml file.
    • Add the parameter "HttpsProtocols" under the elements below.

<transportReceiver name="https" class="org.apache.synapse.transport.passthru.PassThroughHttpSSLListener">


<transportSender name="https" class="org.apache.synapse.transport.passthru.PassThroughHttpSSLSender"> 


Parameter:

<parameter name="HttpsProtocols">TLSv1.2</parameter>

Hariprasath ThanarajahHow to keep a value in a static variable and reuse it in a Text field in a Dialog

First, create a class like the one below to hold the static variables shared between the different dialog boxes.


package org.wso2.tooling.connector.dynamic.schema.salesforcesoap;

public class LoginForm {
public static String userName, password, loginUrl, securityToken;

private static LoginForm loginForm = new LoginForm();

private LoginForm(){
}

public static LoginForm getInstance() {
return loginForm;
}

public String getUserName() {
return userName;
}

public void setUserName(String userName) {
LoginForm.userName = userName;
}

public String getPassword() {
return password;
}

public void setPassword(String password) {
LoginForm.password = password;
}

public String getLoginURL() {
return loginUrl;
}

public void setLoginURL(String loginUrl) {
LoginForm.loginUrl = loginUrl;
}

public String getSecurityToken() {
return securityToken;
}

public void setSecurityToken(String securityToken) {
LoginForm.securityToken = securityToken;
}
}

Create the classes that extend the Dialog class (org.eclipse.jface.dialogs.Dialog) to call the Salesforce SOAP API operations. In this case, we need to give the Salesforce username, password, securityToken, and loginUrl for every operation, so in this project we create a separate class for each operation to get the response for that particular operation.

For Example take the query operation,

1. This is the dialog that takes the login details to call the query operation. When you click the Login or OK button, the values from the text boxes of the first four fields are stored in the static variables of the LoginForm class.

figure 1

Figure 2



2. If the user opens the dialog again, the dialog from Figure 1 will be pre-filled with the values entered in Figure 2, as shown below.
Figure 3


The class that calls the query operation is shown below.

package org.wso2.tooling.connector.dynamic.schema.salesforcesoap;

import java.io.StringWriter;
import java.util.List;

import javax.xml.soap.MessageFactory;
import javax.xml.soap.MimeHeaders;
import javax.xml.soap.SOAPBody;
import javax.xml.soap.SOAPConnection;
import javax.xml.soap.SOAPConnectionFactory;
import javax.xml.soap.SOAPElement;
import javax.xml.soap.SOAPEnvelope;
import javax.xml.soap.SOAPHeader;
import javax.xml.soap.SOAPMessage;
import javax.xml.soap.SOAPPart;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;

import org.eclipse.jface.dialogs.Dialog;
import org.eclipse.jface.dialogs.MessageDialog;
import org.eclipse.swt.SWT;
import org.eclipse.swt.events.ModifyEvent;
import org.eclipse.swt.events.ModifyListener;
import org.eclipse.swt.events.SelectionAdapter;
import org.eclipse.swt.events.SelectionEvent;
import org.eclipse.swt.graphics.Point;
import org.eclipse.swt.layout.FillLayout;
import org.eclipse.swt.layout.FormAttachment;
import org.eclipse.swt.layout.FormData;
import org.eclipse.swt.layout.FormLayout;
import org.eclipse.swt.widgets.Button;
import org.eclipse.swt.widgets.Combo;
import org.eclipse.swt.widgets.Composite;
import org.eclipse.swt.widgets.Control;
import org.eclipse.swt.widgets.Group;
import org.eclipse.swt.widgets.Label;
import org.eclipse.swt.widgets.Shell;
import org.eclipse.swt.widgets.Text;
import org.eclipse.ui.PlatformUI;

public class GenerateInputSchemaForQueryOperation extends Dialog {

private Group grpPropertyKey;
private Label lblConnectorSalesforceLoginUserName;
private Label lblConnectorLoginSalesforcePassword;
private Label lblConnectorLoginSalesforceSecurityToken;
private Label lblConnectorLoginSalesforceLoginURL;
private Label lblSObject;
private Label lblQuery;
private static Text connectorLoginSalesforceUserNameTextField;
private static Text connectorLoginSalesforcePasswordTextField;
private static Text connectorLoginSalesforceSecurityTokenTextField;
private static Text connectorLoginSalesforceLoginURLTextField;
private Text queryTextField;
private Button login;
private static Combo cmbSObjectType;
private String value;

private static final String SELECT_CONNECTOR_LOGIN_USERNAME = Messages.SchemaKeyEditorDialog_SelectConnectorLoginUsername;
private static final String SELECT_CONNECTOR_LOGIN_PASSWORD = Messages.SchemaKeyEditorDialog_SelectConnectorLoginPassword;
private static final String SELECT_CONNECTOR_LOGIN_SECURITY_TOKEN = Messages.SchemaKeyEditorDialog_SelectConnectorLoginSecurityToken;
private static final String SELECT_CONNECTOR_LOGIN_LOGIN_URL = Messages.SchemaKeyEditorDialog_SelectConnectorLoginLoginURL;
private static final String SELECT_SALESFORCE_LOGIN = Messages.SchemaKeyEditorDialog_SelectConnectorLogin;
private static final String SELECT_SALESFORCE_SOBJECT = Messages.SchemaKeyEditorDialog_SelectSObject;
private static final String SELECT_SALESFORCE_QUERY = Messages.SchemaKeyEditorDialog_Query;

public GenerateInputSchemaForQueryOperation(Shell parent) {
super(parent);
}

@Override
protected Control createDialogArea(Composite parent) {
Composite container = (Composite) super.createDialogArea(parent);

FillLayout fl_container = new FillLayout(SWT.HORIZONTAL);
fl_container.marginHeight = 5;
fl_container.marginWidth = 5;
fl_container.spacing = 10;
container.setLayout(fl_container);

grpPropertyKey = new Group(container, SWT.None);

FormLayout fl_grpPropertyKey = new FormLayout();
fl_grpPropertyKey.marginHeight = 10;
fl_grpPropertyKey.marginWidth = 10;
grpPropertyKey.setLayout(fl_grpPropertyKey);

lblConnectorSalesforceLoginUserName = new Label(grpPropertyKey, SWT.NORMAL);
lblConnectorLoginSalesforcePassword = new Label(grpPropertyKey, SWT.NORMAL);
lblConnectorLoginSalesforceSecurityToken = new Label(grpPropertyKey, SWT.NORMAL);
lblConnectorLoginSalesforceLoginURL = new Label(grpPropertyKey, SWT.NORMAL);
lblSObject = new Label(grpPropertyKey, SWT.NORMAL);
lblQuery = new Label(grpPropertyKey, SWT.NORMAL);

connectorLoginSalesforceUserNameTextField = new Text(grpPropertyKey, SWT.BORDER);
connectorLoginSalesforcePasswordTextField = new Text(grpPropertyKey, SWT.BORDER | SWT.PASSWORD);
connectorLoginSalesforceSecurityTokenTextField = new Text(grpPropertyKey, SWT.BORDER | SWT.PASSWORD);
connectorLoginSalesforceLoginURLTextField = new Text(grpPropertyKey, SWT.BORDER);
queryTextField = new Text(grpPropertyKey, SWT.MULTI | SWT.BORDER | SWT.WRAP | SWT.V_SCROLL);

cmbSObjectType = new Combo(grpPropertyKey, SWT.DROP_DOWN | SWT.READ_ONLY | SWT.BORDER);

login = new Button(grpPropertyKey, SWT.PUSH);

if (LoginForm.userName != null && LoginForm.password != null && LoginForm.securityToken != null
&& LoginForm.loginUrl != null) {
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceUserNameTextField
.setText(LoginForm.getInstance().getUserName());
GenerateInputSchemaForQueryOperation.connectorLoginSalesforcePasswordTextField
.setText(LoginForm.getInstance().getPassword());
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceSecurityTokenTextField
.setText(LoginForm.getInstance().getSecurityToken());
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceLoginURLTextField
.setText(LoginForm.getInstance().getLoginURL());
}

FormData salesforceLoginUserNameLabelLayoutData = new FormData();
lblConnectorSalesforceLoginUserName.setText(SELECT_CONNECTOR_LOGIN_USERNAME);
lblConnectorSalesforceLoginUserName.setLayoutData(salesforceLoginUserNameLabelLayoutData);

FormData salesforceLoginUserNameTextFieldLayoutData = new FormData();
salesforceLoginUserNameTextFieldLayoutData.left = new FormAttachment(lblConnectorSalesforceLoginUserName, 10,
SWT.RIGHT);
salesforceLoginUserNameTextFieldLayoutData.right = new FormAttachment(100, -5);
connectorLoginSalesforceUserNameTextField.setLayoutData(salesforceLoginUserNameTextFieldLayoutData);

FormData salesforceLoginPasswordLabelLayoutData = new FormData();
salesforceLoginPasswordLabelLayoutData.top = new FormAttachment(lblConnectorSalesforceLoginUserName, 20,
SWT.BOTTOM);
lblConnectorLoginSalesforcePassword.setText(SELECT_CONNECTOR_LOGIN_PASSWORD);
lblConnectorLoginSalesforcePassword.setLayoutData(salesforceLoginPasswordLabelLayoutData);

FormData salesforceLoginPasswordTextFieldLayoutData = new FormData();
salesforceLoginPasswordTextFieldLayoutData.top = new FormAttachment(connectorLoginSalesforceUserNameTextField,
10, SWT.BOTTOM);
salesforceLoginPasswordTextFieldLayoutData.left = new FormAttachment(lblConnectorLoginSalesforcePassword, 10,
SWT.RIGHT);
salesforceLoginPasswordTextFieldLayoutData.right = new FormAttachment(100, -5);
connectorLoginSalesforcePasswordTextField.setLayoutData(salesforceLoginPasswordTextFieldLayoutData);

FormData salesforceLoginSecurityTokenLabelLayoutData = new FormData();
salesforceLoginSecurityTokenLabelLayoutData.top = new FormAttachment(lblConnectorLoginSalesforcePassword, 20,
SWT.BOTTOM);
lblConnectorLoginSalesforceSecurityToken.setText(SELECT_CONNECTOR_LOGIN_SECURITY_TOKEN);
lblConnectorLoginSalesforceSecurityToken.setLayoutData(salesforceLoginSecurityTokenLabelLayoutData);

FormData salesforceLoginSecurityTokenTextFieldLayoutData = new FormData();
salesforceLoginSecurityTokenTextFieldLayoutData.top = new FormAttachment(
connectorLoginSalesforcePasswordTextField, 12, SWT.BOTTOM);
salesforceLoginSecurityTokenTextFieldLayoutData.left = new FormAttachment(
lblConnectorLoginSalesforceSecurityToken, 10, SWT.RIGHT);
salesforceLoginSecurityTokenTextFieldLayoutData.right = new FormAttachment(100, -5);
connectorLoginSalesforceSecurityTokenTextField.setLayoutData(salesforceLoginSecurityTokenTextFieldLayoutData);

FormData salesforceLoginLoginURLLabelLayoutData = new FormData();
salesforceLoginLoginURLLabelLayoutData.top = new FormAttachment(lblConnectorLoginSalesforceSecurityToken, 20,
SWT.BOTTOM);
lblConnectorLoginSalesforceLoginURL.setText(SELECT_CONNECTOR_LOGIN_LOGIN_URL);
lblConnectorLoginSalesforceLoginURL.setLayoutData(salesforceLoginLoginURLLabelLayoutData);

FormData salesforceLoginLoginURLTextFieldLayoutData = new FormData();
salesforceLoginLoginURLTextFieldLayoutData.top = new FormAttachment(
connectorLoginSalesforceSecurityTokenTextField, 12, SWT.BOTTOM);
salesforceLoginLoginURLTextFieldLayoutData.left = new FormAttachment(lblConnectorLoginSalesforceLoginURL, 10,
SWT.RIGHT);
salesforceLoginLoginURLTextFieldLayoutData.right = new FormAttachment(100, -5);
connectorLoginSalesforceLoginURLTextField.setLayoutData(salesforceLoginLoginURLTextFieldLayoutData);

FormData loginButtonLayoutData = new FormData();
loginButtonLayoutData.top = new FormAttachment(connectorLoginSalesforceLoginURLTextField, 10, SWT.BOTTOM);
loginButtonLayoutData.left = new FormAttachment(50, 10);
loginButtonLayoutData.right = new FormAttachment(100, -5);
login.setLayoutData(loginButtonLayoutData);
login.setText(SELECT_SALESFORCE_LOGIN);

login.addSelectionListener(new SelectionAdapter() {
public void widgetSelected(SelectionEvent event) {
try {
LoginForm.getInstance().setUserName(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceUserNameTextField.getText());
LoginForm.getInstance().setPassword(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforcePasswordTextField.getText());
LoginForm.getInstance().setSecurityToken(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceSecurityTokenTextField
.getText());
LoginForm.getInstance().setLoginURL(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceLoginURLTextField.getText());
CallSalesforceOperations.getInstance().login();
String[] sObject = CallSalesforceOperations.callMetaData();
cmbSObjectType.setItems(sObject);
cmbSObjectType.select(0);
} catch (Exception e) {
MessageDialog.openWarning(PlatformUI.getWorkbench().getDisplay().getActiveShell(),
"Error In Login to Salesforce", "Check the Login Credentials and Try Again");
}
}
});

FormData sObjectLabelLayoutData = new FormData();
sObjectLabelLayoutData.top = new FormAttachment(login, 20, SWT.BOTTOM);
lblSObject.setText(SELECT_SALESFORCE_SOBJECT);
lblSObject.setLayoutData(sObjectLabelLayoutData);

FormData sObjectComboLayoutData = new FormData();
sObjectComboLayoutData.top = new FormAttachment(login, 15, SWT.BOTTOM);
sObjectComboLayoutData.left = new FormAttachment(lblSObject, 10, SWT.RIGHT);
sObjectComboLayoutData.right = new FormAttachment(100, -5);
cmbSObjectType.setLayoutData(sObjectComboLayoutData);

cmbSObjectType.addModifyListener(new ModifyListener() {
public void modifyText(ModifyEvent arg0) {
try {
queryTextField.setText(buildQuery());
} catch (Exception e) {
MessageDialog.openWarning(PlatformUI.getWorkbench().getDisplay().getActiveShell(),
"Error While build the Query", "Create a valid Query String");
}
}
});

FormData queryLabelLayoutData = new FormData();
queryLabelLayoutData.top = new FormAttachment(lblSObject, 20, SWT.BOTTOM);
lblQuery.setText(SELECT_SALESFORCE_QUERY);
lblQuery.setLayoutData(queryLabelLayoutData);

FormData queryTextFieldLayoutData = new FormData();
queryTextFieldLayoutData.top = new FormAttachment(lblQuery, 15, SWT.BOTTOM);
queryTextFieldLayoutData.left = new FormAttachment(0, 5);
queryTextFieldLayoutData.right = new FormAttachment(100, -5);
queryTextField.setLayoutData(queryTextFieldLayoutData);
queryTextFieldLayoutData.height = 125;

return container;
}

@Override
protected Point getInitialSize() {
return new Point(450, 550);
}

@Override
protected void okPressed() {

try {
LoginForm.getInstance().setUserName(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceUserNameTextField.getText());
LoginForm.getInstance().setPassword(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforcePasswordTextField.getText());
LoginForm.getInstance().setSecurityToken(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceSecurityTokenTextField.getText());
LoginForm.getInstance().setLoginURL(
GenerateInputSchemaForQueryOperation.connectorLoginSalesforceLoginURLTextField.getText());
value = callQuery();
} catch (Exception e) {
MessageDialog.openWarning(PlatformUI.getWorkbench().getDisplay().getActiveShell(),
"Error While calling the Query Method", "Check the Login Credentials and Try Again");
}
super.okPressed();
}

/**
* The value to generate the Schema from the parent dialog.
*
* @return response.
*/
public String getResponse() {
return value;
}
}

You can find more about the above implementation at:

https://github.com/hariss63/dynamicSchemaForSalesforce/tree/master/org.wso2.tooling.connector.dynamic.schema



Maneesha WijesekaraSetup WSO2 API Manager Analytics with WSO2 API Manager 2.0 using REST Client


Please Note - Statistics publishing using the REST client was deprecated from APIM 2.0.0. Please refer to this to continue.

In this blog post I will explain how to configure WSO2 API Manager Analytics 2.0.0 with WSO2 API Manager 2.0 to publish and view statistics. Before going further into the topic, I thought I'd give a brief summary of the role of WSO2 API Manager Analytics 2.0.0 here.

WSO2 API Manager comes with the ability to view statistics of the operations carried out, such as usage comparison, monitoring throttled-out requests, API last access time, and so on. To do so, the user has to configure an analytics server with API Manager, which allows viewing statistics based on the given criteria. Until WSO2 API Manager 2.0.0, the recommended analytics server was WSO2 DAS (Data Analytics Server), a high-performing enterprise data analytics platform. Before that, WSO2 BAM (Business Activity Monitor) was used to collect and analyze runtime statistics from API Manager. Based on WSO2 DAS, and with the vision of having a separate, custom analytics package including new features that performs all the analytics for API Manager, WSO2 API Manager Analytics was introduced. WSO2 API Manager Analytics fuses batch and real-time analytics with predictive analytics via machine learning, and generates alerts when an abnormal situation occurs.

Hopefully you now have a good idea of what API Manager Analytics is all about, so let's start with the configuration.


Steps to configure,

1. First download the WSO2 API Manager Analytics 2.0.0 release pack and unzip it.
( Download )

2. Start the Analytics server (by default, the port offset is set to 1 in carbon.xml).

3. Go to the Management Console of the Analytics server and log in as administrator (username: admin, password: admin).

4. Go to Manage -> Carbon Applications -> List, and delete the existing org.wso2.carbon.analytics.apim carbon app.

5. Browse to the REST client car app (org_wso2_carbon_analytics_apim_REST-1.0.0.car) in [APIM_ANALYTICS_HOME]/statistics and upload it.

That's it from the APIM Analytics side. Now let's see how to configure API Manager to finalize the configuration.

6. Download the WSO2 API Manager 2.0.0 pack and unzip it ( Download )

7. Open api-manager.xml ([APIM_HOME]/repository/conf/api-manager.xml) and enable Analytics. The configuration should look like this (by default, the value is set to false).

<Analytics> 
<!-- Enable Analytics for API Manager -->
<Enabled>true</Enabled>


8. Then, configure the server URL of the analytics server used to collect statistics. The defined format is 'protocol://hostname:port/'. Admin credentials to log in to the remote DAS server also have to be configured, as below.

<DASServerURL>{tcp://localhost:7612}</DASServerURL> 
<DASUsername>admin</DASUsername>
<DASPassword>admin</DASPassword>


Assuming the Analytics server runs on the same machine as API Manager 2.0, the hostname used here is 'localhost'. Change it to the hostname of the remote location if the Analytics server runs on a different instance. By default, the server port is adjusted with offset 1. If the Analytics server has a different port offset (check [APIM_ANALYTICS_HOME]/repository/conf/carbon.xml for the offset), change the port in <DASServerURL> accordingly. For example, if the Analytics server has a port offset of 3, <DASServerURL> should be {tcp://localhost:7614}.


Now we have to choose between 2 clients to fetch and publish statistics.

  • The RDBMS client which fetches data from RDBMS and publish.
  • The REST client which directly fetches data from Analytics server.

I chose REST client to publish data in this tutorial and will explain how to configure the data fetching using RDBMS in next blog post.

For your information, API Manager 2.0 enables the RDBMS configuration for statistics by default.

9. To enable publishing using the REST client, the <StatsProviderImpl> for the REST client should be uncommented (by default it is commented out) and the <StatsProviderImpl> for RDBMS should be commented out.

<!-- For APIM implemented Statistic client for DAS REST API -->
<StatsProviderImpl>org.wso2.carbon.apimgt.usage.client.impl.APIUsageStatisticsRestClientImpl</StatsProviderImpl>
<!-- For APIM implemented Statistic client for RDBMS -->
<!--StatsProviderImpl>org.wso2.carbon.apimgt.usage.client.impl.APIUsageStatisticsRdbmsClientImpl</StatsProviderImpl-->


10. Then the REST API URL should be configured with the hostname and port, along with the credentials to access it.

<DASRestApiURL>https://localhost:9444</DASRestApiURL> 
<DASRestApiUsername>admin</DASRestApiUsername>
<DASRestApiPassword>admin</DASRestApiPassword>

As mentioned before, the port is associated with the default offset of 1 for WSO2 APIM Analytics 1.0.0.

11. Now Save api-manager.xml and start the API Manager 2.0 server.

That's it. Open the Publisher in a browser (https://<ip-address>:<port>/publisher). Go to Statistics and select API Usage as an example. The screen should look like this, with a message saying 'Data Publishing Enabled. Generate some traffic to see statistics.'




Just create a few APIs and invoke them in order to generate some traffic, so that statistics appear on the graphs. You will then see statistics like this.







Maneesha WijesekaraSetup WSO2 API Manager Analytics with WSO2 API Manager 2.0 using RDBMS

In this blog post I'll explain how to configure RDBMS to publish APIM analytics using APIM Analytics 2.0.0. Check my previous post if you want to configure publishing statistics with the REST client.

The purpose of having an RDBMS is to fetch and store summarized data after the analysis process. API Manager uses this data to display statistics in dashboards on the APIM side.

Since APIM 2.0.0, RDBMS is the recommended way to publish statistics for API Manager. Hence, in this blog post I will explain the step-by-step configuration with RDBMS in order to view statistics in the Publisher and Store.

Steps to configure,

1. First download the WSO2 API Manager Analytics 2.0.0 release pack and unzip it.

2. Go to carbon.xml ([APIM_ANALYTICS_HOME]/repository/conf/carbon.xml) and set the port offset to 1 (the default offset for APIM Analytics).

<Ports>
<!-- Ports offset. This entry will set the value of the ports defined below to
the define value + Offset.
e.g. Offset=2 and HTTPS port=9443 will set the effective HTTPS port to 9445
-->
<Offset>1</Offset>

Note - This is only necessary if both the API Manager 2.0.0 and APIM Analytics servers run on the same machine.

3. Now add the data source for the statistics DB in stats-datasources.xml ([APIM_ANALYTICS_HOME]/repository/conf/datasources/stats-datasources.xml) according to your preferred RDBMS. You can use any RDBMS such as H2, MySQL, Oracle, PostgreSQL, etc.; I chose MySQL for this blog post.


<datasource>
    <name>WSO2AM_STATS_DB</name>
    <description>The datasource used for setting statistics to API Manager</description>
    <jndiConfig>
        <name>jdbc/WSO2AM_STATS_DB</name>
    </jndiConfig>
    <definition type="RDBMS">
        <configuration>
            <url>jdbc:mysql://localhost:3306/statdb?autoReconnect=true&amp;relaxAutoCommit=true</url>
            <username>maneesha</username>
            <password>password</password>
            <driverClassName>com.mysql.jdbc.Driver</driverClassName>
            <maxActive>50</maxActive>
            <maxWait>60000</maxWait>
            <testOnBorrow>true</testOnBorrow>
            <validationQuery>SELECT 1</validationQuery>
            <validationInterval>30000</validationInterval>
        </configuration>
    </definition>
</datasource>

Give the correct hostname and name of the DB in <url> (in this case, localhost and statdb respectively), the username and password for the database, and the driver class name.

4. The WSO2 Analytics server automatically creates the table structure for the statistics database at server start-up when it is started with the '-Dsetup' option.

5. Copy the related database driver into <APIM_ANALYTICS_HOME>/repository/components/lib directory.

If you use MySQL - Download
If you use Oracle 12c - Download
If you use MSSQL - Download

6. Start the Analytics server

7. Download the WSO2 API Manager 2.0.0 pack and unzip it ( Download )

8. Open api-manager.xml ([APIM_HOME]/repository/conf/api-manager.xml) and enable Analytics. The configuration should look like this (by default the value is set to false).

<Analytics>
<!-- Enable Analytics for API Manager -->
<Enabled>true</Enabled>

9. Then configure the Server URL of the analytics server used to collect statistics. The expected format is 'protocol://hostname:port/'. Admin credentials to log in to the remote DAS server also have to be configured as below.

<DASServerURL>{tcp://localhost:7612}</DASServerURL>
<DASUsername>admin</DASUsername>
<DASPassword>admin</DASPassword>

Assuming the Analytics server is on the same machine as API Manager 2.0, the hostname used here is 'localhost'. Change it to the hostname of the remote instance if the Analytics server runs on a different machine.

By default, the server port is adjusted with offset '1'. If the Analytics server has a different port offset (check [APIM_ANALYTICS_HOME]/repository/conf/carbon.xml for the offset), change the port in <DASServerURL> accordingly. As an example, if the Analytics server has a port offset of 3, <DASServerURL> should be {tcp://localhost:7614}.

10. For your information, API Manager 2.0 enables the RDBMS configuration for statistics by default. To enable publishing using RDBMS, the <StatsProviderImpl> for RDBMS should be uncommented (by default it is already uncommented, so this step can be omitted).

<!-- For APIM implemented Statistic client for DAS REST API -->
<!--StatsProviderImpl>org.wso2.carbon.apimgt.usage.client.impl.APIUsageStatisticsRestClientImpl</StatsProviderImpl-->
<!-- For APIM implemented Statistic client for RDBMS -->
<StatsProviderImpl>org.wso2.carbon.apimgt.usage.client.impl.APIUsageStatisticsRdbmsClientImpl</StatsProviderImpl>

11. The next step is to configure the statistics database on the API Manager side. Add the same data source for the statistics DB that was configured in Analytics by opening master-datasources.xml ([APIM_HOME]/repository/conf/datasources/master-datasources.xml).


<datasource>
    <name>WSO2AM_STATS_DB</name>
    <description>The datasource used for setting statistics to API Manager</description>
    <jndiConfig>
        <name>jdbc/WSO2AM_STATS_DB</name>
    </jndiConfig>
    <definition type="RDBMS">
        <configuration>
            <url>jdbc:mysql://localhost:3306/statdb?autoReconnect=true&amp;relaxAutoCommit=true</url>
            <username>maneesha</username>
            <password>password</password>
            <driverClassName>com.mysql.jdbc.Driver</driverClassName>
            <maxActive>50</maxActive>
            <maxWait>60000</maxWait>
            <testOnBorrow>true</testOnBorrow>
            <validationQuery>SELECT 1</validationQuery>
            <validationInterval>30000</validationInterval>
        </configuration>
    </definition>
</datasource>

12. Copy the related database driver into <APIM_HOME>/repository/components/lib directory as well.

13. Start the API Manager server.

Go to Statistics in the Publisher; the screen should look like this, with a message saying 'Data Publishing Enabled. Generate some traffic to see statistics.'


To view statistics, you have to create at least one API and invoke it in order to generate some traffic to display in the graphs.


Lahiru Cooray

How to invoke a REST API Asynchronously


Dependencies:

<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpasyncclient</artifactId>
<version>4.0</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpcore-nio</artifactId>
<version>4.3</version>
</dependency>

Sample Code snippet:

import java.io.IOException;
import java.util.concurrent.CountDownLatch;

import org.apache.http.HttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.concurrent.FutureCallback;
import org.apache.http.impl.nio.client.CloseableHttpAsyncClient;
import org.apache.http.impl.nio.client.HttpAsyncClients;

private static void sendAsyncRequest(final HttpPost postRequest,
        FutureCallback<HttpResponse> futureCallback, CountDownLatch latch) throws IOException {
    CloseableHttpAsyncClient client = HttpAsyncClients.createDefault();
    client.start();
    client.execute(postRequest, futureCallback);
    try {
        // Wait until the callback signals completion, failure or cancellation
        latch.await();
    } catch (InterruptedException e) {
        log.error("Error occurred while calling end point - " + e);
    } finally {
        client.close(); // release the I/O reactor threads
    }
}

private void postRequest() throws IOException {
    // The URL must include the scheme (http:// or https://)
    final HttpPost postRequest = new HttpPost("http://www.google.com");
    final CountDownLatch latch = new CountDownLatch(1);
    FutureCallback<HttpResponse> futureCallback = new FutureCallback<HttpResponse>() {
        @Override
        public void completed(final HttpResponse response) {
            latch.countDown();
            if (response.getStatusLine().getStatusCode() != 201) {
                log.error("Error occurred while calling end point - " +
                        response.getStatusLine().getStatusCode() + "; Error - " +
                        response.getStatusLine().getReasonPhrase());
            } else if (log.isDebugEnabled()) {
                log.debug("Success Request - " + postRequest.getURI().getSchemeSpecificPart());
            }
        }

        @Override
        public void failed(final Exception ex) {
            latch.countDown();
            log.error("Error occurred while calling end point - " +
                    postRequest.getURI().getSchemeSpecificPart() + "; Error - " + ex);
        }

        @Override
        public void cancelled() {
            latch.countDown();
            log.warn("Operation cancelled while calling end point - " +
                    postRequest.getURI().getSchemeSpecificPart());
        }
    };
    sendAsyncRequest(postRequest, futureCallback, latch);
}
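The CountDownLatch is what keeps the calling thread blocked until one of the three callbacks (completed, failed, cancelled) fires and counts it down. Stripped of the HttpClient details, the hand-off looks like this (a minimal sketch, with a plain thread standing in for the asynchronous callback):

```java
import java.util.concurrent.CountDownLatch;

public class LatchDemo {
    public static void main(String[] args) throws InterruptedException {
        final CountDownLatch latch = new CountDownLatch(1);

        // Stands in for the FutureCallback invoked on another thread
        Thread callback = new Thread(new Runnable() {
            public void run() {
                latch.countDown(); // equivalent of completed()/failed()/cancelled()
            }
        });
        callback.start();

        latch.await(); // blocks, like sendAsyncRequest() does, until the callback fires
        System.out.println("done");
    }
}
```

Without the latch, postRequest() would return immediately and the process could exit before the response ever arrives.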

Hariprasath ThanarajahCreating the plug-in project in ECLIPSE

You can use any Java IDE you wish to build Eclipse plug-ins, but of course, the Eclipse SDK provides tooling specific for plug-in development. We'll walk through the steps for building our plug-in with the Eclipse SDK, since this is the typical case. If you are not already familiar with the Eclipse workbench and the Java IDE, consult the Java development user guide or PDE guide for further explanations of the steps we are taking. For now we are focusing on the code, not the tool; however, there are some IDE logistics for getting started.

Creating your plug-in project


You will need to create a project that contains your work. We'll take advantage of some of the code-generation facilities of the Plug-in Development Environment (PDE) to give us a template to start from. This will set up the project for writing Java code and generate the default plug-in manifest files (explained in a moment) and a class to hold our view.

  1. Open the New Project... wizard ( File > New > Project...) and choose Plug-in Project from the Plug-in Development category and click Next.
  2. On the Plug-in Project page, use org.wso2.tooling.connector.dynamic.schama as the name for your project and check the box for Create a Java project (this should be the default). Leave the other settings on the page with their default settings and then click Next to accept the default plug-in project structure and click Next.
  3. On the Plug-in Content page, look at the default settings. The wizard sets org.wso2.tooling.connector.dynamic.schama as the id of the plug-in.  The wizard will also generate a plug-in class for your plug-in and allow you to supply additional information about contributing to the UI. These defaults are acceptable, so click Next.
  4. On the Templates page, check the box for Create a plug-in using one of the templates. Then select the Plug-in with a view template. Click Next.
  5. We want to create a minimal plug-in, so at this point, we need to change the default settings to keep things as simple as possible. On the Main View Settings page, change the suggested defaults as follows:
  • Change the Java Package Name to org.wso2.tooling.connector.dynamic.schama.views (we don't need a separate package for our view).
  • Change the View Class Name to CreateSchema.
  • Change the View Name to Create Schema View.
  • Leave the default View Category Id as org.wso2.tooling.connector.dynamic.schama.
  • Change the View Category Name to Sample Category.
  • Leave the default viewer type as Table viewer (we will change this in the code to make it even simpler).
  • Uncheck the box for Add the view to the java perspective.
  • Click Next to proceed to the next page.
  6. On the View Features page, uncheck all of the boxes so that no extra features are generated by the plug-in. Click Finish to create the project and the plug-in skeleton.
  7. When asked if you would like to switch to the Plug-in Development perspective, answer Yes. Navigate to your new project and examine its contents.

The skeleton project structure includes several folders, files, and a Java package. The important files at this stage are the plugin.xml and MANIFEST.MF (manifest) files and the Java source code for your plug-in. We'll start by looking at the implementation for a view and then examine the manifest files.


The Create Schema view

Now that we've created a project, package, and view class for our plug-in, we're ready to study some code. Here is everything you need in your CreateSchema class. Copy the contents below into the class you created, replacing the auto-generated content.
package org.wso2.tooling.connector.dynamic.schama.views;

import org.eclipse.swt.SWT;
import org.eclipse.swt.widgets.Composite;
import org.eclipse.swt.widgets.Label;
import org.eclipse.ui.part.ViewPart;

public class CreateSchema extends ViewPart {

    Label label;

    public CreateSchema() {
    }

    public void createPartControl(Composite parent) {
        label = new Label(parent, SWT.WRAP);
        label.setText("Hello World");
    }

    public void setFocus() {
        // Set focus to my widget. For a label, this doesn't
        // make much sense, but for more complex sets of widgets
        // you would decide which one gets the focus.
    }
}

The view part creates the widgets that will represent it in the createPartControl method. In this example, we create an SWT label and set the "Hello World" text into it. This is about the simplest view that can be created.

The Hello World manifests

Before we run the new view, let's take a look at the manifest files that were generated for us. First, double-click the plugin.xml file to open the plug-in editor and select the plugin.xml tab.
<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin>

   <extension
         point="org.eclipse.ui.views">
      <category
            name="Sample Category"
            id="org.wso2.tooling.connector.dynamic.schama">
      </category>
      <view
            name="Create Schema View"
            icon="icons/sample.gif"
            category="org.wso2.tooling.connector.dynamic.schama"
            class="org.wso2.tooling.connector.dynamic.schama.views.CreateSchema"
            id="org.wso2.tooling.connector.dynamic.schama.views.CreateSchema">
      </view>
   </extension>
   <extension
         point="org.eclipse.help.contexts">
      <contexts
            file="contexts.xml">
      </contexts>
   </extension>

</plugin>

The information about the view that we provided when we created the plug-in project was used to generate an entry in the plugin.xml file that defines our view extension. In the extension definition, we define a category for the view, including its name and id. We then define the view itself, including its name and id, and we associate it with the category using the id we defined for our category. We also specify the class that implements our view, CreateSchema.
As you can see, the plug-in manifest file wraps up all the information about our extension and how to run it into a nice, neat package.
The other manifest file that is generated by the PDE is the OSGi manifest, MANIFEST.MF. This file is created in the META-INF directory of the plug-in project, but is most easily viewed by clicking on the MANIFEST.MF tab of the plug-in editor. The OSGi manifest describes lower-level information about the packaging of the plug-in, using the OSGi bundle terminology. It contains information such as the name of the plug-in (bundle) and the bundles that it requires.

Running the plug-in

We have all the pieces needed to run our new plug-in. Now we need to build it. If your Eclipse workbench is set up to build automatically, then your new view class should have compiled as soon as you saved the new content. If not, select your new project and choose Project > Build Project. The class should compile without error.
There are two ways to run a plug-in once it has been built.
  1. The plug-in's manifest files and jar file can be installed in the eclipse/plugins directory. When the workbench is restarted, it will find the new plug-in.
  2. The PDE tool can be used to run another workbench from within your current workbench. This runtime workbench is handy for testing new plug-ins immediately as you develop them from your workbench. (For more information about how a runtime workbench works, check the PDE guide.)
For simplicity, we'll run the new plug-in from within the Eclipse workbench.

Launching the workbench

To launch a runtime workbench, choose Run > Run.... This dialog shows all the different ways you can launch a program. Choose Eclipse Application, click New, and accept all of the default settings. This will cause another instance of the Eclipse workbench, the runtime workbench, to start.

Running Hello World

So where is our new view? We can see all of the views that have been contributed by plug-ins using the Window > Show View > Other menu.

This menu shows us what views are available for the current perspective. You can see all of the views that are contributed to the platform (regardless of perspective) by selecting Other.... This will display a list of view categories and the views available under each category.
The workbench creates the full list of views by using the extension registry to find all the plug-ins that have provided extensions for the org.eclipse.ui.views extension point.

There we are! The view called "Create Schema View" has been added to the Show View window underneath our category "Sample Category." The labels for our category and view were obtained from the extension point configuration markup in the plugin.xml.
Up to this point, we still have not run our plug-in code!  The declarations we made in the plugin.xml (which can be seen by other plug-ins using the extension registry) are enough for the workbench to find out that there is a view called "Create Schema View" available in the "Sample" category. It even knows what class implements the view. But none of our code will be run until we decide to show the view.
If we choose the "Create Schema View" view from the Show View list, the workbench will activate our plug-in, instantiate and initialize our view class, and show the new view in the workbench along with all of the other views. Now our code is running. 

There it is, our first plug-in! 

Prakhash Sivakumar8 Days In a Mysterious Land -1

I thought of writing this post as many of our friends showing interest to visit Nepal after our journey :D .

Evanthika AmarasiriHow to create custom references(usedBy, ownedBy, etc) that can be used to associate artifacts in WSO2 Governance Registry 5.3.0 onward

This support was available from G-Reg 5.3.0 onward. For more information, refer [1].

1. Added a new rxt with the below config.

<artifactType hasNamespace="true" iconSet="10" pluralLabel="Tests" shortName="tests"
singularLabel="Test" type="application/vnd.wso2-tests+xml">
        <storagePath>/tests/@{details_name}</storagePath>
        <nameAttribute>details_name</nameAttribute>
        <namespaceAttribute>details_address</namespaceAttribute>
        <ui>
            <list>
                <column name="Name">
                    <data href="@{storagePath}" type="path" value="details_name"/>
                </column>
            </list>
        </ui>
        <content>
            <table name="Details">
                <field required="true" type="text">
                    <name>Name</name>
                </field>
                <field required="true" type="text">
                    <name>Address</name>
                </field>
                <field required="true" type="text">
                    <name>ContactNumber1</name>
                </field>
                <field required="true" type="text">
                    <name>ContactNumber2</name>
                </field>
            </table>
        </content>
    </artifactType>
    
2. From the publisher, added a new artifact of type tests (I've added a test artifact by the name Test3)
3. Added the below config to the <G-REG_HOME>/repository/conf/governance.xml file:
<tests reverseAssociation ="tests" iconClass="fw-globe">tests</tests>
so that the <Association type="soapservice"> section looks like what's given below.

        <Association type="soapservice">
            <security reverseAssociation ="secures" iconClass="fw-security">policy</security>
            <ownedBy reverseAssociation ="owns" iconClass="fw-user">soapservice,restservice,wsdl</ownedBy>
            <usedBy reverseAssociation ="depends" iconClass="fw-globe">soapservice,restservice,wsdl</usedBy>
            <depends reverseAssociation ="usedBy" iconClass="fw-store">soapservice,restservice,wsdl,endpoint</depends>
            <contacts reverseAssociation ="refers" iconClass="fw-globe">contacts</contacts>
            <tests reverseAssociation ="tests" iconClass="fw-globe">tests</tests>
        </Association>


4. From the publisher, try to select the added test-type artifact for your SOAP service. I typed in the name Test3 and it was listed, ready to be selected and added as an association for the SOAP service.


Note that, as mentioned in our documentation, when doing the above you need to add the values you defined as the short name in the RXT file of the artifact within the <Association type> element, to define the association types enabled for that particular asset type.

[1] - https://docs.wso2.com/display/Governance520/Adding+Associations+for+an+Asset

Evanthika AmarasiriDisabling API Console/Swagger tools menu available from store console for anonymous/logged in users

If you need to disable the API Console/Swagger in the Store UI for anonymous users or logged-in users, you can try out the methods below.

There is no straightforward configuration readily available with API Manager to do this. However, with a minor config change it is possible. What you need to do is change the code of block.jag, which resides under the wso2am-1.8.0/repository/deployment/server/jaggeryapps/store/site/blocks/api/api-info folder.

Method 1

Assuming you want the API Console (RESTClient) to be disabled for anonymous users only, this can be done by changing/adding the below lines of code in block.jag.

Step 1
Change the below line of code from

var showConsole=true;
to

var showConsole=false;

Step 2
Then add the below lines of code right after the line var showConsole=false;

        if (user) {
            showConsole = true;
        }

Method 2

If you need this feature to be completely invisible to anonymous and logged-in users, all you have to do is change the below code.
Change the parameter from

var showConsole=true;
to

var showConsole=false;

Once the above changes are done, restart the API Manager server. You will notice that the RESTClient tool is visible only to logged-in users (Method 1) or not visible to anyone at all (Method 2).

Pubudu Priyashan[WSO2 ESB] Copying a file using WSO2 Fileconnector

WSO2 file connector can be used to do various file operations. You can find the instructions on how to install the file connector here.

Malith MunasingheVirtual Networking for a static IP based local cluster with Oracle Virtual Box

Working in a clustered environment was one of the main tasks I had to take on recently. Before going into an actual clustered environment where I could mess things up, I took up the challenge of setting one up on my own. The luxury of a commercial virtual server provider was not an option, so doing it locally through a virtual environment was the best solution.


Since I’ve been using Oracle VirtualBox for quite some time, I went ahead and started deploying servers. Although I’ve managed one or two servers in VirtualBox before, managing a cluster of 4 nodes and maintaining communication between the nodes over several ports became the problem.


A NAT adapter with port forwarding could be used, but configuring several ports for each server made maintaining a cluster a problem. Also, assigning a static IP address for communication, apart from the 10.0.2.15 used by VirtualBox, was not possible with this method. After some reading I figured a host-only adapter would be the solution, and it solved the problems I faced with the NAT adapter.


Initially you will have to add a host-only network adapter to your VirtualBox instance. To do so, go to Preferences -> Network -> Host-only Networks.



Here, by clicking the + icon in the right-hand corner of this panel, you can add a host-only adapter to your VirtualBox. Click on the newly created adapter and configure the IPs that you require. Basically this would use the 192.168.xx.xx range, since it is the private IP address range used.



The IP assigned by default to the host-only adapter goes to the host that VirtualBox is running on; therefore, in this scenario you can use IP addresses from 192.168.56.2 onwards for your virtual servers. After configuring, click OK and start configuring a server.



Choose the server that you want to add the network to and select Settings -> Network -> Adapter 2 (we will keep Adapter 1 as NAT, since it isn’t a blocker and can be used for initial setup and debugging without the new interface we are adding).


Select Enable Network Adapter, and under the Attached to drop-down select Host-only Adapter, assigning the Name of the host-only adapter created above.




Click OK and we are ready to start the server. For this task I have been using Ubuntu Server 14.04; the configuration may differ slightly on other OS versions.


After starting the server, run the ifconfig command and you will only see the eth0 interface, which is bound to 10.0.2.15 as the inet address. Open /etc/network/interfaces and add the below configuration after the eth0 interface.


auto eth1
iface eth1 inet static
    address 192.168.56.4
    netmask 255.255.255.0
    network 192.168.56.0
    broadcast 192.168.56.255


Save the file and run ifup eth1. It will bring up the new interface with the relevant IP address, which you can check by running ifconfig. Try pinging the assigned IP from your local host and confirm that the IP is assigned properly.


Do this for all the servers with different IPs and enjoy the luxury of a cluster running under a set of IPs that can be used for SSH, clustering, load balancing, etc.

Lakshani Gamage[WSO2 App Manager] How to Customize Webapp Overview Page

I have posted several articles on how to add different custom input fields to the Publisher. In this post, let's see how to customize the webapp overview page in the Publisher.

By default, webapp overview page is like below.




If you want to preview custom fields in the overview page, you need to modify <APPM_HOME>/repository/deployment/server/jaggeryapps/publisher/themes/appm/helpers/splitter.js.

Suppose you added a custom field called "Price" as in this post. Then you have to add the below condition inside the splitData function of the above file.

else if (dataPart[i].name == "overview_price") {
    overview_main.push(dataPart[i]);
}

Then, the webapp overview page with the custom field will look like below.

Charini NanayakkaraInstalling ANTLR on Ubuntu and IntelliJ

I recently installed ANTLR on my machine to do some development work related to WSO2 Siddhi. So what is ANTLR? Following is the definition provided on their official web site (http://www.antlr.org/):

ANTLR (ANother Tool for Language Recognition) is a powerful parser generator for reading, processing, executing, or translating structured text or binary files. It's widely used to build languages, tools, and frameworks. From a grammar, ANTLR generates a parser that can build and walk parse trees.

WSO2 Siddhi is integrated with the ANTLR compiler for efficient query compilation.

So... let's move on to see how we can install ANTLR on a Linux machine and do development using the ANTLR plugin for IntelliJ.

Installing ANTLR 4 Plugin for IntelliJ
  1. Start IntelliJ IDE
  2. Go to File -> Settings -> Plugins
  3. Type "ANTLR" on the search text box
  4. Right click on "ANTLR v4 grammar plugin" 
  5. Select "Download and Install" option
  6. After completion of installation, restart IntelliJ for the changes to take effect
Installing ANTLR Run-time on Ubuntu [1]
  1. cd /usr/local/lib
  2. curl -O http://www.antlr.org/download/antlr-4.5-complete.jar
  3. export CLASSPATH=".:/usr/local/lib/antlr-4.5-complete.jar:$CLASSPATH" (Exports class path. Include in .bashrc file)
  4. alias antlr4='java -Xmx500M -cp "/usr/local/lib/antlr-4.5-complete.jar:$CLASSPATH" org.antlr.v4.Tool' (Creates alias for ANTLR tool. Include in .bashrc file)
  5. alias grun='java org.antlr.v4.gui.TestRig' (Creates alias for TestRig. Include in .bashrc file)
  6. Restart the machine
Now your system is ready for ANTLR-based development. A good first example to try out is provided here [2]


Prabath AriyarathnaApplication Monitoring

In the software world, application monitoring is critical for administrators as well as for maintenance (application support) teams. Monitoring is obviously useful for administrators, who need to watch the real-time behavior of the application to give uninterrupted service to end users, but it is just as important for support teams when tracking down application issues.



Support is one of the most important phases of the software development life cycle after delivering the product. End users report different kinds of issues, and support engineers need information about the application's behaviour to solve them. Some issues are domain related and can simply be recreated in a local environment; fixing an issue is not a big deal if we can reproduce the same behaviour in a local setup. But some issues are not easy to replicate locally because they do not happen continuously in the production setup, so identifying the exact root cause is the challenge. Concurrency issues, thread-spinning issues and memory issues are at the top of that list. Software developers should have a proper plan for reporting the status of the application, with the required details, when the application has issues. Putting log messages with the proper details in the proper places matters most, but in some cases, such as high CPU usage, the developer needs more information, like a thread dump, to track the issue. Support engineers or developers may identify an issue by looking at the logs, a thread dump or a heap dump, but in some cases application-specific information is needed, and a proper monitoring mechanism can fulfil that requirement. There are different types of monitoring applications available in the industry for different purposes, but they are all built as general-purpose tools; the application developer needs to implement an application-specific monitoring mechanism to meet this requirement.

Note:- A proper monitoring mechanism can also be a marketing factor, because clients can incorporate JMX APIs into their existing monitoring dashboards seamlessly, or we can provide our own monitoring dashboard to customers.

JMX (Java Management Extensions)


The JMX technology provides the tools for building distributed, web-based, modular and dynamic solutions for managing and monitoring devices, applications, and service-driven networks. Starting with the J2SE platform 5.0, JMX technology is included in the Java SE platform. JMX is the recommended way to monitor and manage Java applications: for example, an administrator can stop or start the application or dynamically change its configuration. Monitoring and management are the basic uses of JMX. JMX can also be used to design fully modularized applications whose modules can be enabled and disabled at any time, but the main focus of this article is its management and monitoring capabilities.

JMX architecture.


Three main layers can be identified in the JMX architecture.

  1. Probe Level
The level closest to the application is called the instrumentation level, or probe level. This level consists of four approaches for instrumenting application and system resources to make them manageable (i.e., making them managed beans, or MBeans), as well as a model for sending and receiving notifications. It is the most important level for developers, because it is where resources are prepared to be manageable. Two main categories can be identified at the instrumentation level.

  • Application resources (e.g. connection pools, thread pools, etc.)
An application resource that needs to be manageable through JMX must provide metadata about its features, known as its management interface. Management applications interact with the resource via this management interface.

  • Instrumentation strategy
There are four instrumentation approaches defined by JMX that we can use to describe the management interface of a resource: standard, dynamic, model and open MBeans.


     2.  Agent Level
The agent level of the JMX architecture is made up of the MBean server and the JMX agent services. The MBean server has two purposes: it serves as a registry of MBeans and as a communications broker between MBeans and management applications (and other JMX agents). The JMX agent services provide additional functionality that is mandated by the JMX specification, such as scheduling and dynamic loading.

    
    3.  Distributed Services (Remote Management) Level
The top level of the JMX architecture is called the distributed services level. This level contains the middleware that connects JMX agents to the applications that manage them (management applications). This middleware is broken into two categories: protocol adaptors and connectors.
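To make the instrumentation level concrete, here is a minimal sketch of a standard MBean. The `OrderCache` resource, the `OrderCacheMBean` interface and the `com.example` object name are purely illustrative, not part of any WSO2 product. The JMX standard MBean convention is that the management interface must be named `<ImplClass>MBean`; the resource is then registered with the platform MBean server (the agent level).

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

// Management interface of the resource: for a standard MBean its name
// must be the implementation class name plus the "MBean" suffix.
interface OrderCacheMBean {
    int getOrderCount();   // exposed as the read-only attribute "OrderCount"
    void reset();          // exposed as an operation
}

// The application resource being instrumented (hypothetical example).
class OrderCache implements OrderCacheMBean {
    private int orderCount = 0;
    public synchronized void add() { orderCount++; }
    public synchronized int getOrderCount() { return orderCount; }
    public synchronized void reset() { orderCount = 0; }
}

public class JmxDemo {
    public static void main(String[] args) throws Exception {
        // Agent level: register the MBean with the platform MBean server.
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        OrderCache cache = new OrderCache();
        server.registerMBean(cache, new ObjectName("com.example:type=OrderCache"));

        cache.add();
        cache.add();

        // A management application would read this attribute remotely;
        // here we query the MBean server directly for brevity.
        Object count = server.getAttribute(
                new ObjectName("com.example:type=OrderCache"), "OrderCount");
        System.out.println("OrderCount = " + count); // prints OrderCount = 2
    }
}
```

Once an MBean like this is registered, tools such as JConsole can attach to the JVM and read or reset `OrderCount` without any application-specific protocol.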

Dimuthu De Lanerolle

Useful Git commands

Q: How can I merge a specific pull request into my local git repo?

A:
   You can easily merge a desired pull request using the following command. If you are doing this merge for the first time, clone a fresh checkout of the master branch to your local machine and run this command from the console.
 
git pull https://github.com/wso2-dev/product-esb +refs/pull/78/head

Q: How do we get the current repo location of my local git repo?

A: The below command will give the git repo location your local repo is pointing to.

git remote -v

Q: Can I point my local repo to a different remote repo URL?

A: Yes. You can point to another repo url as below.

git remote set-url origin https://github.com/dimuthud/carbon-platform-integration-utils.git

Q: What is the git command to clone directly from a non-master branch (eg: two branches master & release-1.9.0 how to clone from release-1.9.0 branch directly without switching to release-1.9.0 after cloning from the master) 

A: Use the following git command.

git clone -b release-1.9.0 https://github.com/wso2/product-apim.git

Maven

Q : I need the build to continue even if I get build failures. Can I do that with a Maven build?

A: Yes. Try building like this.

mvn clean install -fn 

Q : Can I directly clone a tag of a particular git branch ?

A : Yes. Let's imagine your tag is 4.3.0. The following command will let you directly clone the tag instead of the branch.

Syntax : git clone --branch <tag_name> <repo_url>

eg:
git clone --branch carbon-platform-integration-utils-4.3.0 https://github.com/wso2/carbon-platform-integration-utils.git



Q : How do I see git remote URLs in more detail?

A : git remote show origin



Q: Creating  a new branch

git checkout -b NewBranchName
git push origin master
git checkout master
git branch      (The asterisk * shows which branch you are currently on.)
git push origin NewBranchName



    For More Info : http://stackoverflow.com/questions/9257533/what-is-the-difference-between-origin-and-upstream-on-github

Hasunie Adikari

Installing Tomcat 8.5 on macOS 10.12 Sierra


Prerequisite: Java

First we need to make sure Java is installed by running the javac command in a terminal.
If it is already installed, you will see the following:

Hasunie-MacBook-Pro:bin hasunie$ javac
Usage: javac <options> <source files>
where possible options include:
  -g                         Generate all debugging info
  -g:none                    Generate no debugging info
  -g:{lines,vars,source}     Generate only some debugging info
  -nowarn                    Generate no warnings
  -verbose                   Output messages about what the compiler is doing
  -deprecation               Output source locations where deprecated APIs are used
  -classpath <path>          Specify where to find user class files and annotation processors
  -cp <path>                 Specify where to find user class files and annotation processors
  -sourcepath <path>         Specify where to find input source files
  -bootclasspath <path>      Override location of bootstrap class files
  -extdirs <dirs>            Override location of installed extensions
  -endorseddirs <dirs>       Override location of endorsed standards path
  -proc:{none,only}          Control whether annotation processing and/or compilation is done.
  -processor <class1>[,<class2>,<class3>...] Names of the annotation processors to run; bypasses default discovery process
  -processorpath <path>      Specify where to find annotation processors
  -d <directory>             Specify where to place generated class files
  -s <directory>             Specify where to place generated source files
  -implicit:{none,class}     Specify whether or not to generate class files for implicitly referenced files
  -encoding <encoding>       Specify character encoding used by source files
  -source <release>          Provide source compatibility with specified release
  -target <release>          Generate class files for specific VM version
  -version                   Version information
  -help                      Print a synopsis of standard options
  -Akey[=value]              Options to pass to annotation processors
  -X                         Print a synopsis of nonstandard options
  -J<flag>                   Pass <flag> directly to the runtime system
  -Werror                    Terminate compilation if warnings occur

  @<filename>                Read options and filenames from file



If Its not Installed:
As I’m writing this, Java 1.8.0_101 is the latest version, available for download here: http://www.oracle.com/technetwork/java/javase/downloads/index.html
The JDK installer package comes as a dmg and installs easily on the Mac; after opening the Terminal app again,
java -version
Now shows something like this:
Hasunie-MacBook-Pro:bin hasunie$ java -version
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)

Note : My Java version is still JDK 1.7.
JAVA_HOME is an important environment variable, not just for Tomcat, and it's important to get it right. Here is a trick that allows me to keep the environment variable current, even after a new Java version is installed. In ~/.bash_profile, I set the variable like so:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
export PATH=$JAVA_HOME/bin:$PATH
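A sketch of an alternative that avoids hard-coding the JDK path: macOS ships the /usr/libexec/java_home utility, which prints the home directory of the current default JDK, so the ~/.bash_profile lines stay correct across JDK upgrades (assuming a stock macOS install where that utility exists).

```shell
# Resolve the active JDK at shell start-up instead of pinning one version.
export JAVA_HOME=$(/usr/libexec/java_home)
export PATH="$JAVA_HOME/bin:$PATH"
```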

Installing Tomcat:
Here are the easy to follow steps to get it up and running on your Mac
  1. Download a binary distribution of the core module: apache-tomcat-8.5.5.tar.gz from here. I picked the tar.gz in Binary Distributions / Core section.
  2. Opening/unarchiving the archive will create a folder structure in your Downloads folder: (btw, this free Unarchiver app is perfect for all kinds of compressed files and superior to the built-in Archive Utility.app)
    ~/Downloads/apache-tomcat-8.5.5
  3. Open the Terminal app to move the unarchived distribution to /usr/local:
    sudo mkdir -p /usr/local
    sudo mv ~/Downloads/apache-tomcat-8.5.5 /usr/local
  4. To make it easy to replace this release with future releases, we are going to create a symbolic link that we are going to use when referring to Tomcat (after removing the old link, you might have from installing a previous version):
    sudo rm -f /Library/Tomcat
    sudo ln -s /usr/local/apache-tomcat-8.5.5 /Library/Tomcat
  5. Change ownership of the /Library/Tomcat folder hierarchy:
    sudo chown -R <your_username> /Library/Tomcat
  6. Make all scripts executable:
    sudo chmod +x /Library/Tomcat/bin/*.sh
OR

  1. After the first step, rename apache-tomcat-8.5.5 to Tomcat and copy it into the /Library folder.
  2. Start the server:
     Hasunie-MacBook-Pro:bin hasunie$ /Library/Tomcat/bin/startup.sh
Using CATALINA_BASE:   /Library/Tomcat
Using CATALINA_HOME:   /Library/Tomcat
Using CATALINA_TMPDIR: /Library/Tomcat/temp
Using JRE_HOME:        /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
Using CLASSPATH:       /Library/Tomcat/bin/bootstrap.jar:/Library/Tomcat/bin/tomcat-juli.jar
Tomcat started.
      
       3. Then you should be able to see the Tomcat home page at
          http://localhost:8080

Pubudu GunatilakaHow to create a self-signed SSL certificate for multiple domains

Domain names can contain multiple subdomains. For example, esb.dev.abc.com and test.api.dev.abc.com belong to the same organization.

The wildcard certificate *.dev.abc.com covers only esb.dev.abc.com; it does not cover test.api.dev.abc.com, because a wildcard does not match names with multiple dots (.) before .dev.abc.com.

We can add multiple DNS alternative names to the SSL certificate to cover the domain names.

  1. Create a file called openssl.cnf with the following details.

[req]
distinguished_name = req_distinguished_name
req_extensions = v3_req

[req_distinguished_name]
countryName = SL
countryName_default = SL
stateOrProvinceName = Western
stateOrProvinceName_default = Western
localityName = Colombo
localityName_default = Colombo
organizationalUnitName = ABC
organizationalUnitName_default = ABC
commonName = *.dev.abc.com
commonName_max = 64

[ v3_req ]
# Extensions to add to a certificate request
basicConstraints = CA:FALSE
keyUsage = nonRepudiation, digitalSignature, keyEncipherment
subjectAltName = @alt_names

[alt_names]
DNS.1 = *.api.dev.abc.com
DNS.2 = *.app.dev.abc.com

2. Create the Private key.

sudo openssl genrsa -out server.key 2048


3. Create Certificate Signing Request (CSR).

sudo openssl req -new -out server.csr -key server.key -config openssl.cnf

Note: For the common name, type *.dev.abc.com. The default values mentioned above will be used for the other fields.


4. Sign the SSL Certificate.

sudo openssl x509 -req -days 3650 -in server.csr -signkey server.key -out server.crt -extensions v3_req -extfile openssl.cnf


Your server.crt certificate will contain *.dev.abc.com as the common name and the other domain names as DNS alternative names.
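To double-check the result, you can print the certificate's SAN section (assuming the server.crt from step 4 is in the current directory):

```shell
# List the DNS alternative names embedded in the signed certificate.
openssl x509 -in server.crt -noout -text | grep -A1 "Subject Alternative Name"
```

With the openssl.cnf above, the output should list DNS:*.api.dev.abc.com and DNS:*.app.dev.abc.com.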



Isuru WijesingheImplement a WSO2 Carbon Component using eclipse IDE

Introduction

This tutorial mainly focuses on how to implement a WSO2 Carbon component from scratch and helps you understand the project structure that needs to be followed when implementing one. I assume that you have an overall understanding of the WSO2 Carbon platform and how it works.

First of all, a brief introduction to the award-winning WSO2 Carbon platform. It is a component-based, service-oriented platform for the enterprise-grade WSO2 middleware product stack. It is 100% open source and delivered under the Apache License 2.0. The WSO2 Carbon platform is lean and high-performance, and consists of a collection of OSGi bundles.

The WSO2 Carbon core platform hosts a rich set of middleware components encompassing capabilities such as security, clustering, logging, statistics, management and more. These are basic features required by all WSO2 products that are developed on top of the base platform.

All WSO2 products are a collection of Carbon components. They have been developed simply by plugging various Carbon components that provide different features. The WSO2 Carbon component manager provides the ability to extend the Carbon base platform, by selecting the components that address your unique requirements and installing them with point-and-click simplicity. As a result, by provisioning this innovative base platform, you can develop your own, lean middleware product that has remarkable flexibility to change as business requirements change.

Once you have a basic understanding of how the Carbon architecture works, you can start implementing a Carbon component. Before moving on to any code, first look at the prerequisites needed to implement the Carbon component using the Eclipse IDE.

 Prerequisites
  • Java
  • Maven
  • Any WSO2 carbon product (Here I use WSO2 Application Server)
  • Eclipse (or you can use IntelliJ IDEA as well)
Scenario

Suppose we have a simple object called OrderBean for storing order details in the back-end component, and let's try to display that information in the front-end UI.

Creating the Project Structure

Now I will explain the project structure for implementing the Carbon component. Here I'm going to create an Order Process Carbon component using Eclipse. It will consist of two parts: the back-end runtime and the front-end console UI. First, let's look at how to implement the back-end runtime.

As a first step I will create a Maven project. (Before that, you should have installed the Maven plugin in Eclipse.)

File -> New -> Other -> Maven Project (Inside of the Maven folder)


Then click Next and you will see the following UI.


Click Next and then select the appropriate archetype to create the project structure. Here I will use the default project structure. Then click Next again.


Now I have to specify the archetype parameters for my Maven project. See the following figure to set up those parameters (please change the version to 1.0.0-SNAPSHOT), and then click Finish.


Make sure that the packaging type is bundle in the pom.xml file (because both the back end and the front end must be packaged as OSGi bundles in Carbon). I'm using the maven-bundle-plugin to do that.

<groupId>org.wso2.carbon</groupId>
<artifactId>org.wso2.carbon.example.OrderProcess</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>bundle</packaging>
 
This will be an OSGi bundle, so I have to configure the Apache Felix plugin to set up the configuration.

       <build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>1.4.0</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-SymbolicName>${pom.artifactId}</Bundle-SymbolicName>
<Bundle-Name>${pom.artifactId}</Bundle-Name>
<Export-Package>
org.wso2.carbon.example.OrderProcess.*
</Export-Package>
</instructions>
</configuration>
</plugin>
</plugins>
</build>

Since I'm using the Carbon registry to store the items of the OrderBean, the following dependencies should be added to the back-end project. (Remember to use byte arrays when storing values in the Carbon registry.)

<dependencies>  
<dependency>
<groupId>org.wso2.carbon</groupId>
<artifactId>org.wso2.carbon.registry.core</artifactId>
<version>4.2.0</version>
</dependency>
<dependency>
<groupId>org.wso2.carbon</groupId>
<artifactId>org.wso2.carbon.registry.api</artifactId>
<version>4.2.0</version>
</dependency>
</dependencies>
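The byte-array note above can be sketched in isolation. The helper names below (`SerializeDemo`, `serialize`, `deserialize`) are illustrative only; the same pattern is what the service class later uses when writing order items into a registry Resource, since a registry resource's content is stored as a byte array.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

// Registry resources store content as byte[]; Serializable objects are
// round-tripped through Java serialization before put()/get().
public class SerializeDemo {

    public static byte[] serialize(Object obj) throws IOException {
        ByteArrayOutputStream b = new ByteArrayOutputStream();
        try (ObjectOutputStream o = new ObjectOutputStream(b)) {
            o.writeObject(obj);
        }
        return b.toByteArray();
    }

    public static Object deserialize(byte[] bytes)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream o =
                new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return o.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // The byte[] is what would go into resource.setContent(...).
        byte[] content = serialize("order-1");
        System.out.println(deserialize(content)); // prints order-1
    }
}
```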

After adding the dependencies and the plugins, the pom.xml file of the back end will be similar to the following. (If your project shows an error, update it: right-click the project and select Maven -> Update Project.)

(Change the value of the Export-Package element in your pom.xml file according to your package structure.)

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>org.wso2.carbon</groupId>
<artifactId>org.wso2.carbon.example.OrderProcess</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>bundle</packaging>

<name>WSO2 Carbon - Order Process</name>
<url>http://maven.apache.org</url>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<!-- <dependencies> <dependency> <groupId>junit</groupId> <artifactId>junit</artifactId>
<version>3.8.1</version> <scope>test</scope> </dependency> </dependencies> -->

<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>1.4.0</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-SymbolicName>${pom.artifactId}</Bundle-SymbolicName>
<Bundle-Name>${pom.artifactId}</Bundle-Name>
<Export-Package>
org.wso2.carbon.example.OrderProcess.*
</Export-Package>
</instructions>
</configuration>
</plugin>
</plugins>
</build>

<dependencies>
<dependency>
<groupId>org.wso2.carbon</groupId>
<artifactId>org.wso2.carbon.registry.core</artifactId>
<version>4.2.0</version>
</dependency>
<dependency>
<groupId>org.wso2.carbon</groupId>
<artifactId>org.wso2.carbon.registry.api</artifactId>
<version>4.2.0</version>
</dependency>
</dependencies>

<repositories>
<repository>
<id>wso2-nexus</id>
<name>WSO2 internal Repository</name>
<url>http://maven.wso2.org/nexus/content/groups/wso2-public/</url>
<releases>
<enabled>true</enabled>
<updatePolicy>daily</updatePolicy>
<checksumPolicy>ignore</checksumPolicy>
</releases>
</repository>
</repositories>

<pluginRepositories>
<pluginRepository>
<id>wso2-maven2-repository</id>
<url>http://dist.wso2.org/maven2</url>
</pluginRepository>
<pluginRepository>
<id>wso2-maven2-snapshot-repository</id>
<url>http://dist.wso2.org/snapshots/maven2</url>
</pluginRepository>
</pluginRepositories>

</project>

Create the back-end service

I have created a service class called ProcessOrderService inside the package org.wso2.carbon.example.OrderProcess. This service consists of two methods: one for processing an order and the other for cancelling it.

Before creating the service class, I created a package called org.wso2.carbon.example.OrderProcess.data to hold my data objects: OrderBean, Item, Address and Customer.

My OrderBean class implementation is shown below. You will see that it implements the Serializable interface, because I'm going to store OrderBean objects in the Carbon registry.

package org.wso2.carbon.example.OrderProcess.data;

import java.io.Serializable;

public class OrderBean implements Serializable{
private Customer customer;
private Address shippingAddress;
private Item[] orderItems;
private String orderID;
private double totalPrice;

/**
* @return customer
*/
public Customer getCustomer() {
return customer;
}

public void setCustomer(Customer customer) {
this.customer = customer;
}

public Address getShippingAddress() {
return shippingAddress;
}

public void setShippingAddress(Address shippingAddress) {
this.shippingAddress = shippingAddress;
}

public Item[] getOrderItems() {
return orderItems;
}

public void setOrderItems(Item[] orderItems) {
this.orderItems = orderItems;
}

public String getOrderID() {
return orderID;
}

public void setOrderID(String orderID) {
this.orderID = orderID;
}

public double getPrice() {
return totalPrice;
}

public void setPrice(double price) {
this.totalPrice = price;
}

}

package org.wso2.carbon.example.OrderProcess.data;

import java.io.Serializable;

public class Customer implements Serializable{
private String custID;
private String firstName;
private String lastName;

public String getCustID() {
return custID;
}

public void setCustID(String custID) {
this.custID = custID;
}

public String getFirstName() {
return firstName;
}

public void setFirstName(String firstName) {
this.firstName = firstName;
}

public String getLastName() {
return lastName;
}

public void setLastName(String lastName) {
this.lastName = lastName;
}

}

package org.wso2.carbon.example.OrderProcess.data;

import java.io.Serializable;

public class Address implements Serializable{
private String streetName;
private String cityName;
private String stateCode;
private String country;
private String zipCode;

public String getStreetName() {
return streetName;
}

public void setStreetName(String streetName) {
this.streetName = streetName;
}

public String getCityName() {
return cityName;
}

public void setCityName(String cityName) {
this.cityName = cityName;
}

public String getStateCode() {
return stateCode;
}

public void setStateCode(String stateCode) {
this.stateCode = stateCode;
}

public String getCountry() {
return country;
}

public void setCountry(String country) {
this.country = country;
}

public String getZipCode() {
return zipCode;
}

public void setZipCode(String zipCode) {
this.zipCode = zipCode;
}

}

package org.wso2.carbon.example.OrderProcess.data;

import java.io.Serializable;

public class Item implements Serializable{
private String itemName;
private String itemID;
private double unitPrice;
private int quantity;

public String getItemName() {
return itemName;
}

public void setItemName(String itemName) {
this.itemName = itemName;
}

public String getItemID() {
return itemID;
}

public void setItemID(String itemID) {
this.itemID = itemID;
}

public int getQuantity() {
return quantity;
}

public void setQuantity(int quantity) {
this.quantity = quantity;
}

public double getUnitPrice() {
return unitPrice;
}

public void setUnitPrice(double unitPrice) {
this.unitPrice = unitPrice;
}

}

Now you can see my service class implementation below.

package org.wso2.carbon.example.OrderProcess;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Logger;

import org.wso2.carbon.context.CarbonContext;
import org.wso2.carbon.context.RegistryType;
import org.wso2.carbon.example.OrderProcess.data.Item;
import org.wso2.carbon.example.OrderProcess.data.OrderBean;
import org.wso2.carbon.registry.api.Registry;
import org.wso2.carbon.registry.api.RegistryException;
import org.wso2.carbon.registry.api.Resource;


public class ProcessOrderService {
private final static Logger LOGGER = Logger.getLogger(ProcessOrderService.class.getName());

private List<OrderBean> orderList = new ArrayList<OrderBean>();
private int orderCounter = 0;
private double totalAmount = 0;
private Registry registry = null;
private static final String ORDER_PATH = "order_location";

public ProcessOrderService(){
registry = CarbonContext.getThreadLocalCarbonContext().getRegistry(RegistryType.valueOf(RegistryType.LOCAL_REPOSITORY.toString()));
}

/**
* Acquire the order
*
* @param orderBean
* @return OrderBean object
*/
public OrderBean processOrder(OrderBean orderBean) {

// Number of items ordered
if (orderBean.getOrderItems() != null) {
// Set the order ID.
orderBean.setOrderID("ABC-" + (orderCounter++));
try {
Resource orderRes = registry.newResource();
orderRes.setContent(serialize(orderBean.getOrderItems()));
registry.put(ORDER_PATH, orderRes);

Resource getItemsRes = registry.get(ORDER_PATH);
Item[] items = (Item[]) deserialize((byte[]) getItemsRes.getContent());

for (Item item : items) {
double totalItemCost = item.getUnitPrice() * item.getQuantity();
totalAmount += totalItemCost;
}

// set the total price
orderBean.setPrice(totalAmount);
orderList.add(orderBean);

return orderBean;
} catch (RegistryException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} catch (ClassNotFoundException e) {
e.printStackTrace();
}

}

return new OrderBean();
}

/**
* Delete the given order
*
* @param orderID
* @return boolean indicating whether the order was deleted or not
*/
public boolean cancelOrder(String orderID) {
LOGGER.info("cancelOrder method starting");

for (OrderBean orderBean : orderList) {

if (orderBean.getOrderID().equals(orderID)) {
LOGGER.info("canceling OrderBean Processing");
orderList.remove(orderBean);
return true;
}
}

LOGGER.info("cancelProcessing over");
return false;
}

private static byte[] serialize(Object obj) throws IOException {
ByteArrayOutputStream b = new ByteArrayOutputStream();
ObjectOutputStream o = new ObjectOutputStream(b);
o.writeObject(obj);
return b.toByteArray();
}

private static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
ByteArrayInputStream b = new ByteArrayInputStream(bytes);
ObjectInputStream o = new ObjectInputStream(b);
return o.readObject();
}
}

(If you have an App.java class inside your service package, please remove it.)

Now I need to write the service configuration (services.xml) for my service implementation. First create a folder called resources inside src/main/. Then create a folder called META-INF inside the resources folder. Inside the META-INF folder, create a services.xml file with the following content, changing the service and service class names according to your project.

<serviceGroup>
<service name="ProcessOrderService" scope="transportsession">
<transports>
<transport>https</transport>
</transports>
<parameter name="ServiceClass">org.wso2.carbon.example.OrderProcess.ProcessOrderService</parameter>
</service>

<parameter name="adminService" locked="true">true</parameter>
<parameter name="hiddenService" locked="true">true</parameter>
<parameter name="AuthorizationAction" locked="true">/permission/admin/protected</parameter>
</serviceGroup>

Now go to the pom.xml location of the back-end project on the command line and run mvn clean install to build the project. If the build succeeds, you will get a jar file such as org.wso2.carbon.example.OrderProcess-1.0.0-SNAPSHOT.jar inside the target directory. Then copy the jar file to the repository/components/dropins directory of the WSO2 Application Server.

After starting the Application Server, we can't see the WSDL file of the created service by directly accessing the URL (http://192.168.1.2:9765/services/ProcessOrderService?wsdl). That is because I added this as an admin service, and by default the WSDLs of admin services are hidden. To view the WSDL file, open the carbon.xml file in repository/conf and set the value of HideAdminServiceWSDLs to false.

<HideAdminServiceWSDLs>false</HideAdminServiceWSDLs>  

Now start the WSO2 Application Server and open the above URL in the browser (the last part should be the service name that you provided in services.xml). Save the WSDL file on your computer to use it for the front-end project.

 Create the front-end console UI

 Now I will create the front-end project like the one above (a Maven project) and edit the pom.xml file as below. In this pom file you can see that I've used the previously saved WSDL file. Make the necessary modifications to the pom file according to your project.
  • org.wso2.carbon.example.OrderProcess.ui
    • artifactId - org.wso2.carbon.example.OrderProcess.ui
    • packaging - bundle
    • name - WSO2 Carbon - Order Process
    • plugin - maven-bundle-plugin
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>org.wso2.carbon</groupId>
<artifactId>org.wso2.carbon.example.OrderProcess.ui</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>bundle</packaging>

<name>WSO2 Carbon - Order Process</name>
<url>http://maven.apache.org</url>

<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<!-- <dependencies> <dependency> <groupId>junit</groupId> <artifactId>junit</artifactId>
<version>3.8.1</version> <scope>test</scope> </dependency> </dependencies> -->

<dependencies>
<dependency>
<groupId>org.apache.axis2.wso2</groupId>
<artifactId>axis2</artifactId>
<version>1.6.1.wso2v4</version>
</dependency>
<dependency>
<groupId>org.apache.stratos</groupId>
<artifactId>org.wso2.carbon.ui</artifactId>
<version>4.2.0-stratos</version>
</dependency>
</dependencies>

<build>

<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.1</version>
<executions>
<execution>
<id>source-code-generation</id>
<phase>process-resources</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<tasks>
<java classname="org.apache.axis2.wsdl.WSDL2Java" fork="true">
<arg
line="-uri src/main/resources/OrderProcess.wsdl -u -uw -o target/generated-code
-p org.wso2.carbon.example.OrderProcess.ui
-ns2p http://org.apache.axis2/xsd=org.wso2.carbon.example.OrderProcess.ui.types.axis2,http://OrderProcess.example.carbon.wso2.org=org.wso2.carbon.example.OrderProcess.ui,http://data.OrderProcess.example.carbon.wso2.org/xsd=org.wso2.carbon.example.OrderProcess.ui.types.data" />
<classpath refid="maven.dependency.classpath" />
<classpath refid="maven.compile.classpath" />
<classpath refid="maven.runtime.classpath" />
</java>
</tasks>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<executions>
<execution>
<id>add-source</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>target/generated-code/src</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>1.4.0</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-SymbolicName>${pom.artifactId}</Bundle-SymbolicName>
<Export-Package>
org.wso2.carbon.example.OrderProcess.ui.*
</Export-Package>
<Import-Package>
!javax.xml.namespace,
javax.xml.namespace;version="0.0.0",
*;resolution:=optional,
</Import-Package>
<Carbon-Component>UIBundle</Carbon-Component>
</instructions>
</configuration>
</plugin>
</plugins>

</build>

<repositories>
<repository>
<id>wso2-nexus</id>
<name>WSO2 internal Repository</name>
<url>http://maven.wso2.org/nexus/content/groups/wso2-public/</url>
<releases>
<enabled>true</enabled>
<updatePolicy>daily</updatePolicy>
<checksumPolicy>ignore</checksumPolicy>
</releases>
</repository>
</repositories>

<pluginRepositories>
<pluginRepository>
<id>wso2-maven2-repository</id>
<url>http://dist.wso2.org/maven2</url>
</pluginRepository>
<pluginRepository>
<id>wso2-maven2-snapshot-repository</id>
<url>http://dist.wso2.org/snapshots/maven2</url>
</pluginRepository>
</pluginRepositories>

</project>

Now go to the pom.xml file location of the front-end project using command line interface and type mvn compile to compile the project. (It will download the necessary dependencies and then compile the classes as well)

As the next step I will create a client called OrderProcessClient inside the org.wso2.carbon.example.OrderProcess.ui package, which will use the generated stub to access the back-end service created above.

package org.wso2.carbon.example.OrderProcess.ui;

import java.rmi.RemoteException;

import org.apache.axis2.client.Options;
import org.apache.axis2.client.ServiceClient;
import org.apache.axis2.context.ConfigurationContext;
import org.wso2.carbon.example.OrderProcess.ui.ProcessOrderServiceStub;
import org.wso2.carbon.example.OrderProcess.ui.types.data.OrderBean;

public class OrderProcessClient {

private ProcessOrderServiceStub stub;

public OrderProcessClient(ConfigurationContext configCtx, String backendServerURL,
String cookie) throws Exception {
String serviceURL = backendServerURL + "ProcessOrderService";
stub = new ProcessOrderServiceStub(configCtx, serviceURL);
ServiceClient client = stub._getServiceClient();
Options options = client.getOptions();
options.setManageSession(true);
options.setProperty(org.apache.axis2.transport.http.HTTPConstants.COOKIE_STRING, cookie);
}

public OrderBean processOrder(OrderBean orderBean) throws Exception {
try {
return stub.processOrder(orderBean);
} catch (RemoteException e) {
String msg = "Cannot process the order. The backend service may be unavailable.";
throw new Exception(msg, e);
}
}

public boolean cancelOrder(String orderID) throws Exception {
try {
return stub.cancelOrder(orderID);
} catch (RemoteException e) {
String msg = "Cannot cancel the order. The backend service may be unavailable.";
throw new Exception(msg, e);
}
}
}

As mentioned for the back-end project, you will need to create a resources folder inside the <folder-name>/src/main/ folder of your front-end project. After that, create a folder called web inside the resources folder. Inside this web folder, create another directory named orderprocess-mgt.

Create a .jsp file called orderprocessmanager.jsp inside the orderprocess-mgt directory. This JSP page contains the UI; it displays a table of existing orders.

<%@ page import="org.apache.axis2.context.ConfigurationContext" %>
<%@ page import="org.wso2.carbon.CarbonConstants" %>
<%@ page import="org.wso2.carbon.ui.CarbonUIUtil" %>
<%@ page import="org.wso2.carbon.utils.ServerConstants" %>
<%@ page import="org.wso2.carbon.ui.CarbonUIMessage" %>
<%@ page import="org.wso2.carbon.example.OrderProcess.ui.OrderProcessClient" %>
<%@ page import="org.wso2.carbon.example.OrderProcess.ui.types.data.OrderBean" %>
<%@ page import="org.wso2.carbon.example.OrderProcess.ui.types.data.Customer" %>
<%@ page import="org.wso2.carbon.example.OrderProcess.ui.types.data.Address" %>
<%@ page import="org.wso2.carbon.example.OrderProcess.ui.types.data.Item" %>
<%@ taglib prefix="fmt" uri="http://java.sun.com/jsp/jstl/fmt" %>
<%@ taglib uri="http://wso2.org/projects/carbon/taglibs/carbontags.jar" prefix="carbon" %>
<%
String serverURL = CarbonUIUtil.getServerURL(config.getServletContext(), session);
ConfigurationContext configContext =
(ConfigurationContext) config.getServletContext().getAttribute(CarbonConstants.CONFIGURATION_CONTEXT);
String cookie = (String) session.getAttribute(ServerConstants.ADMIN_SERVICE_COOKIE);

OrderProcessClient client;
OrderBean order;
OrderBean orderBean = new OrderBean();

Customer customer = new Customer();
customer.setCustID("A123");
customer.setFirstName("Isuru");
customer.setLastName("Wijesinghe");
orderBean.setCustomer(customer);

Address address = new Address();
address.setCityName("Colombo");
address.setCountry("Sri Lanka");
address.setStateCode("04");
address.setStreetName("Armer Street");
address.setZipCode("02");
orderBean.setShippingAddress(address);

Item item1 = new Item();
item1.setItemID("11");
item1.setItemName("MACBook");
item1.setQuantity(12);
item1.setUnitPrice(100);

Item item2 = new Item();
item2.setItemID("10");
item2.setItemName("UltrasBook");
item2.setQuantity(10);
item2.setUnitPrice(30);

Item[] orderItems = { item1, item2 };

orderBean.setOrderItems(orderItems);

try {
client = new OrderProcessClient(configContext, serverURL, cookie);
order = client.processOrder(orderBean);
} catch (Exception e) {
CarbonUIMessage.sendCarbonUIMessage(e.getMessage(), CarbonUIMessage.ERROR, request, e);
%>
<script type="text/javascript">
location.href = "../admin/error.jsp";
</script>
<%
return;
}
%>

<div id="middle">
<h2>Order Process Management</h2>

<div id="workArea">
<table class="styledLeft" id="moduleTable">
<thead>
<tr>
<th width="20%">Customer ID</th>
<th width="20%">First Name</th>
<th width="20%">Last Name</th>
<th width="20%">Order Price</th>
<th width="20%">Number Of Items</th>
</tr>
</thead>
<tbody>
<%

%>
<tr>
<td><%=order.getCustomer().getCustID()%></td>
<td><%=order.getCustomer().getFirstName()%></td>
<td><%=order.getCustomer().getLastName()%></td>
<td><%=order.getPrice()%></td>
<td><%=order.getOrderItems().length%></td>
</tr>
<%

%>
</tbody>
</table>
</div>
</div>

Here you can see that I've used some style classes and IDs. Those are predefined classes and IDs in Carbon. Don't forget to import the Carbon tag library as well.

Now I have to add the UI component to the menu bar of the application server as a menu item. For that you must create a component.xml file. Before creating it, create a META-INF directory inside the resources folder of the front-end project, and then create the component.xml file inside it as below.

<component xmlns="http://products.wso2.org/carbon">
<menus>
<menu>
<id>orderprocess_menu</id>
<i18n-key>orderprocess.menu</i18n-key>
<i18n-bundle>org.wso2.carbon.example.OrderProcess.ui.i18n.Resources</i18n-bundle>
<parent-menu>manage_menu</parent-menu>
<link>../orderprocess-mgt/orderprocessmanager.jsp</link>
<region>region1</region>
<order>50</order>
<style-class>manage</style-class>
<!--<icon>../log-admin/images/log.gif</icon>-->
<require-permission>/permission/protected/manage</require-permission>
</menu>
</menus>
</component>

Here the i18n-bundle value depends on the package where the created client resides. Create a folder structure matching the package name inside the web folder. For example, I created the package org.wso2.carbon.example.OrderProcess.ui to hold the client code, so I must create a directory structure matching that package name, and inside it create another directory called i18n. Then create a resource bundle called Resources.properties (an empty file named Resources.properties) inside that folder, and update the file content as below.

orderprocess.menu=Order Process

(The key here matches the i18n-key value inside component.xml, and the assigned value is the menu item name shown in the application server's menu bar after deployment. Here I named it Order Process.)

Now go to the pom.xml location of the front-end project and run mvn clean install in the command-line interface.

Deploying the component


Now copy the generated jar files from the target folders of both the back-end and front-end projects into the dropins folder mentioned previously, and restart the WSO2 Application Server. Then, under Services (in the main menu tab), you can see your menu item called Order Process. Once you click it you can see the following output.


Lakshani Gamage[WSO2 API Manager] How to Add a User Signup Workflow

In WSO2 API Manager, new developers can self-sign up to the API Store and then log in using that account. If admins want to approve new accounts before developers can use them, a workflow can be configured. For that, we use WSO2 Business Process Server (BPS).

Let's see how to add a user signup workflow.

  1. If API Manager and Business Process Server are running on the same machine, the BPS ports must be offset to avoid port conflicts.

  2. Set the Offset value to 2 in <BPS_HOME>/repository/conf/carbon.xml, so that the default HTTP port 9763 becomes 9765.
       
    <Offset>2</Offset>

  3. If there is no directory called "epr" inside <BPS_HOME>/repository/conf/, then create it.
  4. Copy the following epr files from <APIM_HOME>/business-processes/epr to <BPS_HOME>/repository/conf/epr.
    • UserSignupService.epr
    • UserSignupProcess.epr
  5. Start BPS and login to Management console.
  6. Go to Home > Manage > Processes > Add > BPEL. Then, upload <APIM_HOME>/business-processes/user-signup/BPEL/UserSignupApprovalProcess_1.0.0.zip.

  7. Go to Home > Manage > Human Tasks > Add. Then, upload <APIM_HOME>/business-processes/user-signup/HumanTask/UserApprovalTask-1.0.0.zip.


  8. Configuration on BPS is finished.  Now, let's see how to configure WSO2 APIM.
  9. Start APIM and  login to Management console.
  10. Go to Home > Resources > Browse and navigate to /_system/governance/apimgt/applicationdata/workflow-extensions.xml. Click on "Edit As Text". 
  11. Comment out <UserSignUp executor="org.wso2.carbon.apimgt.impl.workflow.UserSignUpSimpleWorkflowExecutor"> and uncomment <UserSignUp executor="org.wso2.carbon.apimgt.impl.workflow.UserSignUpWSWorkflowExecutor">.
  12. Also, update the property values below based on your BPS server credentials and service endpoint.
       
    <UserSignUp executor="org.wso2.carbon.apimgt.impl.workflow.UserSignUpWSWorkflowExecutor">
    <Property name="serviceEndpoint">http://localhost:9765/services/UserSignupProcess/</Property>
    <Property name="username">admin</Property>
    <Property name="password">admin</Property>
    <Property name="callbackURL">https://localhost:8243/services/WorkflowCallbackService</Property>
    </UserSignUp>



    Now the configuration in API Manager is finished. 
  13. Now, go to the API Store and sign up. 
  14. When you sign up to the API Store, you will get a notification (User account awaiting Administrator Approval).
  15. Then, log in to the Admin Dashboard (https://<Server Host>:9443/admin). 
  16. Go to Tasks > User Creation. There you will be able to see the pending user approval tasks as shown below.
  17. If you click the "Start" button, the task status changes to "In_Progress". 
  18. Once you click the "Complete" button, the signed-up user account becomes active, and that user can log in to the API Store.

    Lakshani GamageHow To Add new Users, Roles and Tenants to WSO2 Automation Test Framework.

    If we want to add a new user, role, or tenant, we should update automation.xml accordingly.

    a. How to add new Role
    Add any role with a name and a key inside the <roles> tag of <userManagement>. You have to list the permissions of each role inside the <permissions> tag.
       
    <roles>
    <role name = "AdminRole" key = "AdminRole">
    <permissions>
    <permission>/permission/admin</permission>
    </permissions>
    </role>
    <role name = "SubscribeRole" key = "SubscribeRole">
    <permissions>
    <permission>/permission/admin/login</permission>
    <permission>/permission/admin/manage/webapp/subscribe</permission>
    </permissions>
    </role>
    </roles>



    b. How to add new users to super tenant
    You can add any user with a key inside the <tenant> tag under the <superTenant> tag of
    <userManagement> tag.
       
    <superTenant>
    <tenant domain = "carbon.super" key = "superTenant">
    <admin>
    <user key = "superAdmin">
    <userName>admin</userName>
    <password>admin</password>
    </user>
    </admin>
    <users>
    <user key = "testuser1">
    <userName>testuser1</userName>
    <password>testuser1</password>
    </user>
    </users>
    </tenant>
    </superTenant>



    c. How to assign roles to users
    If you want to assign roles to a user, there are two ways.

    1. Get the role from automation.xml (as defined in step a), then add the role key inside the <user> tag.
       
    <user key = "testuser1">
    <userName>testuser1</userName>
    <password>testuser1</password>
    <roles>
    <role>SubscribeRole</role>
    </roles>
    </user>



    2. You can also see the existing roles in the management console. Then set the role name under the <user> tag like this.
       
    <user key = "AppCreator">
    <userName>appcreator</userName>
    <password>appcreatorpass</password>
    <roles>
    <role>Internal/creator</role>
    </roles>
    </user>



    d. Add new tenants
    You can add any tenant with a domain name and key inside the <tenants> tag of the <userManagement> tag. Inside that tag, you can add the admin user information and other user information as below.
       
    <tenants>
    <tenant domain = "wso2.com" key="wso2">
    <admin>
    <user key = "admin">
    <username>admin</username>
    <password>admin</password>
    </user>
    </admin>
    <users>
    <user key = "myuser">
    <username>mytestuser</username>
    <password>mytestuserpass</password>
    </user>
    </users>
    </tenant>
    </tenants>


    Lakshani GamageStart Multiple WSO2 IoTS Instances on the Same Computer

    If you want to run multiple WSO2 IoTS on the same machine, you have to change the default ports with an offset value to avoid port conflicts. The default HTTP and HTTPS ports (without offset) of a WSO2 product are 9763 and 9443 respectively.

    Here are the steps to offset ports. Let's assume you want to increase all ports by 1.

    1. Set the Offset value to 1 in <IoTS_HOME>/repository/conf/carbon.xml
       
      <Offset>1</Offset>

    2. Change the hostURL under <authenticator class="org.wso2.carbon.andes.authentication.andes.OAuth2BasedMQTTAuthenticator"> in <IoTS_HOME>/repository/conf/broker.xml according to the port offset.
       
      <authenticator class = "org.wso2.carbon.andes.authentication.andes.OAuth2BasedMQTTAuthenticator">
      <property name = "hostURL">https://<IoTS_HOST>:<IoTS_PORT>/services/OAuth2TokenValidationService</property>
      <property name = "username">admin</property>
      <property name = "password">admin</property>
      <property name = "maxConnectionsPerHost">10</property>
      <property name = "maxTotalConnections">150</property>
      </authenticator>


    3. Start the server.
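Since the offset is plain addition over every default port, a throwaway snippet (illustrative code, not part of IoTS) can confirm what the effective ports will be:

```javascript
// Default WSO2 Carbon ports, as stated above, and the configured offset.
var defaultPorts = { http: 9763, https: 9443 };
var offset = 1;

// Apply the offset to each port, mirroring what the server does at startup.
function withOffset(ports, offset) {
    var result = {};
    for (var name in ports) {
        result[name] = ports[name] + offset;
    }
    return result;
}

var effective = withOffset(defaultPorts, offset);
console.log(effective.http, effective.https); // 9764 9444
```

So with an offset of 1 you would connect to the management console on 9444 instead of 9443.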

    Lakshani Gamage[WSO2 App Manager] How to Disable App Types

    WSO2 App Manager facilitates creating, publishing, and managing webapps, sites, and mobile applications. By default, these three app types are enabled in WSO2 App Manager. The enabled app types are listed under <EnabledAssetTypeList> in <AppM_Home>/repository/conf/app-manager.xml
       
    <EnabledAssetTypeList>
    <Type>webapp</Type>
    <Type>mobileapp</Type>
    <Type>site</Type>
    </EnabledAssetTypeList>



    If you want to disable any app type in App Manager, you can easily do so. You just have to remove the unwanted app types from the above configuration and restart the server.

    If you disable mobileapp in App Manager, the Publisher is shown as below.

    The Store is shown as below.



    If you want to remove "Site" from App Manager, two additional steps are needed, because webapps and sites share the same create and edit pages in the Publisher.
    You must prevent users from creating "Sites" from the Publisher. For that, remove the relevant div (shown in the image below) from the Publisher UI.




    Comment out the below code block in
    <AppM_Home>/repository/deployment/server/jaggeryapps/publisher/themes/appm/partials/add-asset.hbs

       
    <div class = "form-group" type = 'hidden'>
    <label class = "control-label col-sm-2">Treat as a Site: </label>
    <div class = "col-sm-10 checkbox-div">
    <input type = "checkbox" class = "treatAsASite_checkbox">
    </div>
    </div>


       
    Comment out the below code block in
    <AppM_Home>/repository/deployment/server/jaggeryapps/publisher/themes/appm/partials/edit-asset.hbs

       
    <div class = "form-group" type = "hidden">
    <label class = "control-label col-sm-2">Treat as a Site: </label>
    <div class = "col-sm-10 checkbox-div">
    <label>
    <input type = "checkbox" class = "treatAsASite_checkbox"
    value = "{{{snoop "fields(name=overview_treatAsASite).value" data}}}">
    </label>
    </div>






    Lakshani GamageHow to Calculate Time Difference Between Request and Response in WSO2 ESB

    If you want to calculate the time difference between a request and its response, you can use a Script Mediator.
    The Script Mediator supports scripting languages such as JavaScript, Groovy, and Ruby.

    First, you have to capture the request timestamp using a Property Mediator. For that, add the below line inside the "inSequence".
        
    <property name = "REQUEST_TIMESTAMP" expression = "get-property('SYSTEM_TIME')"/>

    Then, add the below line inside the "outSequence" to get the response timestamp.
        
    <property name = "RESPONSE_TIMESTAMP" expression = "get-property('SYSTEM_TIME')"/>


    Now, you can calculate the response time using the below Script Mediator code.
        
    <script language = "js">
    var requestTimeStamp = mc.getProperty("REQUEST_TIMESTAMP");
    var responseTimeStamp = mc.getProperty("RESPONSE_TIMESTAMP");
    var responseTime = responseTimeStamp - requestTimeStamp;
    mc.setProperty( "RESPONSE_TIME", responseTime);
    </script>
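Note that get-property('SYSTEM_TIME') hands the script millisecond timestamps, and mc.getProperty(...) typically returns them as strings; the subtraction still works because JavaScript's '-' operator coerces numeric strings to numbers. A standalone sketch of the same arithmetic, using made-up timestamp values outside the mediator:

```javascript
// Simulated message-context properties, as strings, the way
// mc.getProperty(...) typically returns them. The values are illustrative.
var requestTimeStamp = "1475466103000";
var responseTimeStamp = "1475466103624";

// The '-' operator coerces both operands to numbers, so this yields
// the elapsed milliseconds rather than string concatenation.
var responseTime = responseTimeStamp - requestTimeStamp;
console.log(responseTime); // 624
```

Had the mediator used '+' instead, the two strings would have been concatenated rather than added.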



    A sample proxy with a Script Mediator that calculates the response time is shown below.
        
    <?xml version = "1.0" encoding = "UTF-8"?>
    <proxy xmlns = "http://ws.apache.org/ns/synapse"
    name = "CalculatingTimeDifference"
    transports = "https,http"
    statistics = "disable"
    trace = "disable"
    startOnLoad = "true">
    <target>
    <inSequence>
    <property name = "REQUEST_TIMESTAMP" expression = "get-property('SYSTEM_TIME')"/>
    </inSequence>
    <outSequence>
    <property name = "RESPONSE_TIMESTAMP" expression = "get-property('SYSTEM_TIME')"/>
    <script language = "js">
    var requestTimeStamp = mc.getProperty("REQUEST_TIMESTAMP");
    var responseTimeStamp = mc.getProperty("RESPONSE_TIMESTAMP");
    var responseTime = responseTimeStamp - requestTimeStamp;
    mc.setProperty("RESPONSE_TIME", responseTime);
    </script>
    <log level = "custom">
    <property name = "Response Time(ms)" expression = "$ctx:RESPONSE_TIME"/>
    </log>
    </outSequence>
    </target>
    <description/>
    </proxy>




    You can see a log message like below with the response time.

    [2016-10-03 09:41:43,531]  INFO - LogMediator API Response Time(ms) = 624.0

    Lakshani GamageGoogle Analytics Tracking for WSO2 App Manager

    Google Analytics is a free web analytics service that provides statistics and basic analytical tools. We can configure WSO2 App Manager to track web application and site invocation statistics through Google Analytics.

    First, let's see how to set up a Google Analytics account.

    1. Go to http://www.google.com/analytics/ and click on the Analytics tab.
    2. Then, click on the "Admin" tab and create a new account.
    3. Click on "Website" and enter the account information as shown below, filling in the details of the site you want to track.
    4. Click on "Get Tracking Id". You will be redirected to a page like the one below, where you can get the Tracking ID.
    5. Configure WSO2 App Manager with the received tracking code. Enable Google Analytics and add the TrackingID in <APPM_HOME>/repository/conf/app-manager.xml as shown below.
       
    <GoogleAnalyticsTracking>
    <!--Enable/Disable Google Analytics Tracking-->
    <Enabled>true</Enabled>
    <!--Google Analytics Tracking ID-->
    <TrackingID>UA-86711225-1</TrackingID>
    </GoogleAnalyticsTracking>


    6. Restart server.
    7. Place the below JavaScript code snippet into the pages that you need to track with your Google Analytics account.
       
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js" type="text/javascript"></script>
    <script type="text/javascript">
    function invokeStatistics(){
    var tracking_code = "<TRACKING_CODE>";
    var request = $.ajax({
    url: "http://<AM_GATEWAY_URL>:8280/statistics/",
    type: "GET",
    headers: {
    "trackingCode": tracking_code
    }
    });
    }
    </script>
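The essential part of the snippet above is just a GET request carrying the tracking code as an extra header. Stripped of the jQuery plumbing, a small helper (illustrative only, not an App Manager API; the gateway host here is an assumed placeholder) makes the request shape explicit:

```javascript
// Build the request options for the statistics call.
// buildStatisticsRequest is an illustrative helper, not part of App Manager.
function buildStatisticsRequest(gatewayUrl, trackingCode) {
    return {
        url: gatewayUrl + "/statistics/",
        type: "GET",
        headers: { trackingCode: trackingCode }
    };
}

// Assumed gateway host; the tracking ID matches the config example above.
var req = buildStatisticsRequest("http://localhost:8280", "UA-86711225-1");
console.log(req.url); // http://localhost:8280/statistics/
```

An object like this could be passed straight to $.ajax(...) in place of the inline literal.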



    Now we have successfully integrated WSO2 App Manager with Google Analytics. Let's see how to view the statistics.

    Real-time statistics


    Go to Google Analytics [http://www.google.com/analytics/] and select the account created above.

    The following image shows an invocation of a specific web application. The Google Analytics graphs and statistics are displayed at runtime in its Real-Time view. In the graphs below, you can see hits on "Pageviews" per second and the active users.


    Reporting statistics

    Google Analytics reporting statistics take more than 24 hours to appear after the time of invocation.

    A sample dashboard with populated statistics is shown below.


    Nipun SuwandaratnaContainerization on Android devices with WSO2 Enterprise Mobility Manager (EMM)

    Data security is one of the main concerns of organizations today. With the increasing use of mobile devices for work organizations are faced with the challenge of protecting confidential corporate data that is accessible through mobile devices.

    If the organization allows corporate data access only via COPE (corporate-owned, personally enabled) devices, then it has control over the device as well as the ability to perform security measures such as a device wipe if the device is lost. However, in most organizations employees are allowed to access company data (e.g., email, shared drives) on their personal devices. This is more cost effective for the company and helps improve productivity as well.

    However, allowing data access on BYOD (bring your own device) raises concerns on both sides. From the organization's point of view, data security is the concern, requiring measures such as limiting certain apps and enabling features like remote device wipe. From the employees' point of view, they are reluctant to let the organization gain total control of their device and impose app restrictions and remote wipes.

    With version 2.2.0, WSO2 EMM provides a solution to this problem: containerization using 'Android for Work'. With containerization you can maintain a separate space within the device for corporate apps and data. This container provides total data isolation and can be managed separately by the organization. With this approach the company will not be able to access the personal space of the user's device, but will be able to manage the work profile. For example, the company may decide to disable some apps in the work profile, but that would not prevent the user from using those apps in his/her personal space. There is no data or context sharing between apps running inside and outside of the work profile. The work profile is saved as encrypted files on the device, so the corporate data cannot be accessed outside of the container. If the organization wishes, it can remote-wipe the corporate data on the device; this would not, however, affect the user's personal data outside of the container.



    Lakshani Gamage[WSO2 App Manager] How to Add a Custom Dropdown Field to a Webapp

    In WSO2 App Manager, when you create a new web app, you have to fill in a set of predefined fields. If you want to add any custom fields to an app, you can easily do so.

    Suppose you want to add a custom dropdown field to the webapp create page, and say the custom dropdown field's name is "App Network Type". 

    First, let's see how to add a custom field to the UI (Jaggery APIs).
    1. Modify <APPM_HOME>/repository/resources/rxt/webapp.rxt.
       
      <field type = "text">
      <name label = "App Network Type">App Network Type</name>
      </field>


    2. Log in to the Management console, navigate to Home > Extensions > Configure > Artifact Types, and delete "webapp.rxt".
    3. Add the below code snippet in the required place of <APPM_HOME>/repository/deployment/server/jaggeryapps/publisher/themes/appm/partials/add-asset.hbs
       
      <div class="form-group">
      <label class = "control-label col-sm-2">App Network Type : </label>
      <div class = "col-sm-10">
      <select id = "appNetworkType" class = "col-lg-6 col-sm-12 col-xs-12">
      <option value = "None">None</option>
      <option value = "Online">Online</option>
      <option value = "Offline">Offline</option>
      </select>
      </div>
      <input type = "hidden" required = "" class = "col-lg-6 col-sm-12 col-xs-12" name = "overview_appNetworkType"
      id = "overview_appNetworkType">
      </div>


    4. Add the below code snippet to <APPM_HOME>/repository/deployment/server/jaggeryapps/publisher/themes/appm/partials/edit-asset.hbs.
       
      <div class = "form-group">
      <label class = "control-label col-sm-2">App Network Type : </label>
      <div class = "col-sm-10">
      <select id = "appNetworkType" class = "col-lg-6 col-sm-12 col-xs-12">
      <option value = "None">None</option>
      <option value = "Online">Online</option>
      <option value = "Offline">Offline</option>
      </select>
      </div>

      <input type='hidden' value="{{{snoop "fields(name=overview_appNetworkType).value" data}}}"
      name="overview_appNetworkType" id="overview_appNetworkType"/>
      </div>


    5. To save the selected value in the registry, add the below function inside $(document).ready(function() {...}) of <APPM_HOME>/repository/deployment/server/jaggeryapps/publisher/themes/appm/js/resource-add.js
       
      $("#appNetworkType").change(function() {
      var selectedNetworkType = $('#appNetworkType').find(":selected").text();
      $('#overview_appNetworkType').val(selectedNetworkType);
      });


    6. To preview the selected dropdown field value on the app edit page, add the below code snippet inside $(document).ready(function() {...}) of <APPM_HOME>/repository/deployment/server/jaggeryapps/publisher/themes/appm/js/resource-edit.js.
       
      var selectedNetworkType = $('#overview_appNetworkType').val();
      $( "#appNetworkType" ).each(function( index ) {
      $(this).val(selectedNetworkType);
      });

      $("#appNetworkType").change(function() {
      var selectedNetworkType = $('#appNetworkType').find(":selected").text();
      $('#overview_appNetworkType').val(selectedNetworkType);
      });


    7. When you create a new version of an existing webapp, to copy the selected dropdown value to the new version, add the below line to <APPM_HOME>/repository/deployment/server/jaggeryapps/publisher/themes/appm/partials/copy-app.hbs
       
      <input type='text' value= "{{{snoop "fields(name=overview_appNetworkType).value" data}}}" name = "overview_appNetworkType" id = "overview_appNetworkType"/>


      Now, let's see how to add customized fields to the REST APIs.
    8. Go to Main > Browse in the Management console, navigate to /_system/governance/appmgt/applicationdata/custom-property-definitions/webapp.json, and click on "Edit As Text". Add the custom fields you want to add.
       
      {
      "customPropertyDefinitions":
      [
      {"name" : "overview_appNetworkType"}
      ]
      }


    9. Restart App Manager.
    10. The web app create page with the newly added dropdown field will be shown as below. 

    Lakshani Gamage[WSO2 App Manager]Registry Extension (RXT) Files

    All data related to any application you create in WSO2 App Manager is stored in the registry embedded in the server. The data is stored in a format defined by a special set of files called "Registry Extensions (RXTs)"[1]. When you save a web application, the format in which it is saved in the registry is given in "webapp.rxt"; for mobile applications it is given in "mobileapp.rxt". You can see these files in the file system under the <APPM_HOME>/repository/resources/rxts folder. When you want to add a new field to an application, you need to edit these RXT files.

    These RXT files can also be found in Home > Extensions > Configure > Artifact Types in App Manager management console like below.


    But when you want to edit these files, it is better to edit them in the file system, because every time a new tenant is created, the relevant RXTs are picked from the file system to populate the data in the registry.

    App Manager reads RXT files from the file system and populates them in the management console only if they are not already populated. So, whenever you edit RXTs from the file system, you have to delete the rxt files from the management console and restart the server to populate updated RXT files in the management console. If you have multiple tenants, you need to delete RXT files of each tenant from the management console.

    There are "field" tags in every RXT file. Each field tag contains the field type and whether or not it is a required field. See the two examples below.

    eg :
    <field type = "text" required = "true">
     <name>AppId</name>
    </field>

    <field type = "text-area">
     <name>Terms and Conditions</name>
    </field>

    In RXT files, there are two field types: "text" and "text-area". "text" is used for text fields, and "text-area" is used for large text content. If you want a field to behave as a double, integer, etc., you have to use a "text" field and perform type validation in the application code.
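Since the RXT itself only knows "text" and "text-area", any numeric constraint has to live in that application code; a minimal validation sketch (the function name is illustrative, not an App Manager API):

```javascript
// Check that a "text" field value actually holds a number before saving it.
// isNumericField is an illustrative helper, not part of App Manager.
function isNumericField(value) {
    // Reject non-strings and empty/whitespace-only input up front,
    // because Number("") would otherwise coerce to 0.
    if (typeof value !== "string" || value.trim() === "") {
        return false;
    }
    return !isNaN(Number(value));
}

console.log(isNumericField("3.14"));  // true
console.log(isNumericField("12abc")); // false
```

A check like this would run before the field value is written to the registry, rejecting inputs the RXT type system cannot express.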

    Rukshan PremathungaConfigure WSO2 APIM Analytics on Cluster environment

    Configure WSO2 APIM Analytics on Cluster environment

    • In a standalone APIM setup, a single node is configured for analytics, and the different APIM components use that configuration for publishing events and for retrieving summary data from the summary database. But in a cluster environment[1] we can configure it per node.
    • So here are the components or profiles[1] of APIM that can run on separate nodes:
      • Gateway manager -Dprofile=gateway-manager
      • Gateway worker -Dprofile=gateway-worker -DworkerNode=true
      • Key Manager -Dprofile=api-key-manager
      • Traffic Manager -Dprofile=traffic-manager
      • API Publisher -Dprofile=api-publisher
      • API Store -Dprofile=api-store
    • However, not all nodes need to be configured for analytics, and not all analytics-enabled nodes publish events or read the summary tables.
    • Here is a summary of each node's analytics usage:

      Profile         | Need to Enable                   | Event Published                  | Read Stat DB
      Gateway manager | YES, only if it accepts requests | YES, only if it accepts requests | NO
      Gateway worker  | YES                              | YES                              | NO
      Key Manager     | NO                               | NO                               | NO
      Traffic Manager | NO                               | NO                               | NO
      API Publisher   | YES                              | NO                               | YES
      API Store       | YES                              | YES                              | YES

    Dimuthu De Lanerolle

    JavaScript Basics 


    [1] Calling a function of another file from your JavaScript file

    file 01:  graphinventor.js
    =================

     console.log("-----------------------------------------Time Unit ");
     console.log("Previous Time Stamp ");


    var test123 = function (data){
    console.log("-----------------------------------------inside test123 ");
    console.log(data);
    alert("This is an alert " + data);

    }

    [3] https://datatables.net/manual/ajax

    file 02: gadgetconf.js
    ===============

    processData: function(data) {

    console.log('data ' + JSON.stringify(data));
     // in the browser console you can now see the content as JSON data

    console.log("------------------------");
    test123(data);
    console.log("------------------------");

    }

    HTML page (loading the graphinventor.js file first)
    ==========

           <!-- Custom -->
              <script src="js/graphinventor.js"></script>
              <script src="js/gadgetconf.js"></script>
              <script src="js/main.js"></script>

    [2] A callback function is a function that is passed into another function as an argument.
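A minimal illustration of the idea (the names greet and received are illustrative): the outer function receives another function and invokes it once its own work is done.

```javascript
// greet does some work, then hands its result to the supplied callback.
function greet(name, callback) {
    var message = "Hello " + name;
    callback(message);
}

// The anonymous function passed here is the callback.
var received;
greet("WSO2", function (message) {
    received = message;
});
console.log(received); // Hello WSO2
```

This is the same pattern used above: gadgetconf.js hands data to test123, which graphinventor.js defined earlier.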



    Sriskandarajah SuhothayanSetup Hive to run on Ubuntu 15.04

    This is tested on hadoop-2.7.3, and apache-hive-2.1.0-bin.

    Improvement on Hive documentation : https://cwiki.apache.org/confluence/display/Hive/GettingStarted

    Step 1

    Make sure Java is installed

    Installation instruction : http://suhothayan.blogspot.com/2010/02/how-to-set-javahome-in-ubuntu.html

    Step 2

    Make sure Hadoop is installed & running

    Instruction : http://suhothayan.blogspot.com/2016/11/setting-up-hadoop-to-run-on-single-node_8.html

    Step 3 

    Add Hive and Hadoop home directories and paths

    Run

    $ gedit ~/.bashrc

    Add flowing at the end (replace {hadoop path} and {hive path} with proper directory locations)

    export HADOOP_HOME={hadoop path}/hadoop-2.7.3

    export HIVE_HOME={hive path}/apache-hive-2.1.0-bin
    export PATH=$HIVE_HOME/bin:$PATH

    Run

    $ source ~/.bashrc

    Step 4

    Create /tmp and the hive.metastore.warehouse.dir directory, and set write permissions on them so that tables can be created in Hive. (replace {user-name} with the system username)

    $ hadoop-2.7.3/bin/hadoop fs -mkdir /tmp
    $ hadoop-2.7.3/bin/hadoop fs -mkdir /user/{user-name}/warehouse
    $ hadoop-2.7.3/bin/hadoop fs -chmod 777 /tmp
    $ hadoop-2.7.3/bin/hadoop fs -chmod 777 /user/{user-name}/warehouse

    Step 5

    Create hive-site.xml:

    $ gedit apache-hive-2.1.0-bin/conf/hive-site.xml

    Add following (replace {user-name} with system username):

    <configuration>
      <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/{user-name}/warehouse</value>
      </property>
    </configuration>


    Copy hive-jdbc-2.1.0-standalone.jar to the lib directory:

    $ cp apache-hive-2.1.0-bin/jdbc/hive-jdbc-2.1.0-standalone.jar apache-hive-2.1.0-bin/lib/

    Step 6

    Initialise Hive with Derby by running:

    $ ./apache-hive-2.1.0-bin/bin/schematool -dbType derby -initSchema

    Step 7

    Run HiveServer2:

    $ ./apache-hive-2.1.0-bin/bin/hiveserver2

    View the HiveServer2 logs:

    $ tail -f /tmp/{user-name}/hive.log

    Step 8

    Run Beeline on another terminal:

    $ ./apache-hive-2.1.0-bin/bin/beeline -u jdbc:hive2://localhost:10000

    Step 9

    Enable fully local mode execution: 

    hive> SET mapreduce.framework.name=local;

    Step 10

    Create a table:

    hive> CREATE TABLE pokes (foo INT, bar STRING);

    Browse the tables:

    hive> SHOW TABLES;

    Sriskandarajah SuhothayanSetting up Hadoop to run on Single Node in Ubuntu 15.04

    This is tested on hadoop-2.7.3.

    Improvement on Hadoop documentation : http://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-common/SingleCluster.html

    Step 1 

    Make sure Java is installed

    Installation instruction : http://suhothayan.blogspot.com/2010/02/how-to-set-javahome-in-ubuntu.html

    Step 2

    Install pre-requisites

    $ sudo apt-get install ssh
    $ sudo apt-get install rsync

    Step 3

    Setup Hadoop

    $ gedit hadoop-2.7.3/etc/hadoop/core-site.xml

    Add the following (replace {user-name} with your system username, e.g. "foo" for /home/foo/):

    <configuration>
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://localhost:9000</value>
        </property>
        <property>
            <name>hadoop.proxyuser.{user-name}.groups</name>
            <value>*</value>
        </property>
        <property>
            <name>hadoop.proxyuser.{user-name}.hosts</name>
            <value>*</value>
        </property>
    </configuration>

    $ gedit hadoop-2.7.3/etc/hadoop/hdfs-site.xml 

    Add 

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
    </configuration>

    Step 4

    Run

    $ ssh localhost 

    If it asks for a password, run:

    $ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
    $ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
    $ chmod 0600 ~/.ssh/authorized_keys

    Try ssh localhost again.
    If it still asks for a password, run the following and try again:

    $ ssh-keygen -t rsa
    #Press enter for each line
    $ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    $ chmod og-wx ~/.ssh/authorized_keys 
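    If you would like to rehearse these commands without touching your real ~/.ssh, the same sequence can be run against a scratch directory first (the paths below are throwaway):

    ```shell
    # Scratch directory standing in for ~/.ssh
    dir=$(mktemp -d)
    # Generate a passphrase-less RSA keypair, as in the real steps above
    ssh-keygen -t rsa -N '' -f "$dir/id_rsa" -q
    # Append the public key to authorized_keys and lock the file down
    cat "$dir/id_rsa.pub" >> "$dir/authorized_keys"
    chmod 0600 "$dir/authorized_keys"
    stat -c '%a' "$dir/authorized_keys"
    # prints: 600
    ```

    The final stat should report 600 (read/write for the owner only), which is what sshd expects before it will honour the key.
    
    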

    Step 5

    Format the NameNode:

    $ ./hadoop-2.7.3/bin/hdfs namenode -format

    Step 6 * Not provided in the Hadoop documentation

    Replace ${JAVA_HOME} with a hardcoded path in hadoop-env.sh:

    $ gedit hadoop-2.7.3/etc/hadoop/hadoop-env.sh

    Edit the file as follows (replace {path} with your JDK location):

    # The java implementation to use.
    export JAVA_HOME={path}/jdk1.8.0_111
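    If you are unsure which path to hardcode, one common way to discover it (assuming java is on the PATH and is a symlink into the JDK, as is typical on Ubuntu) is to resolve the symlink behind the java binary:

    ```shell
    if command -v java >/dev/null 2>&1; then
        # Resolve symlinks to the real binary, then strip the trailing
        # bin/java (or jre/bin/java) component to get the JDK home
        readlink -f "$(command -v java)" | sed 's|/jre/bin/java$||; s|/bin/java$||'
    else
        echo "java not found on PATH"
    fi
    ```
    
    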

    Step 7

    Start Hadoop 

    $ ./hadoop-2.7.3/sbin/start-all.sh

    The Hadoop daemon log output is written to the $HADOOP_LOG_DIR directory (defaults to $HADOOP_HOME/logs).

    Browse the web interface for the NameNode:

    http://localhost:50070/

    Step 8

    Check the running processes:

    $ jps

    Output: 

    xxxxx NameNode
    xxxxx ResourceManager
    xxxxx DataNode
    xxxxx NodeManager
    xxxxx SecondaryNameNode

    Step 9

    Make the HDFS directories required for MapReduce jobs (replace {user-name} with your system username):

    $ ./hadoop-2.7.3/bin/hdfs dfs -mkdir /user
    $ ./hadoop-2.7.3/bin/hdfs dfs -mkdir /user/{user-name}


    Danushka FernandoGenerate JWT access tokens from WSO2 Identity Server

    Identity Server 5.2.0 introduces an interface for generating access tokens, and using it we have developed a sample that generates JWT tokens. You can find the sample under the msf4j samples[1][2]. If you build it as is, you will need Java 8, since msf4j is developed on Java 8, and you will therefore need to run Identity Server on Java 8 as well. After building the project[2], copy the jar from the target directory to the $IS_HOME/repository/components/dropins/ directory. Then add the following configuration to identity.xml, located in the $IS_HOME/repository/conf/identity/ folder, inside the OAuth tag.

     <IdentityOAuthTokenGenerator>com.wso2.jwt.token.builder.JWTAccessTokenBuilder</IdentityOAuthTokenGenerator>  


    Then go to the database used to store OAuth tokens (the database pointed to by the datasource configured in $IS_HOME/repository/conf/identity/identity.xml) and alter the size of the ACCESS_TOKEN column of the IDN_OAUTH2_ACCESS_TOKEN table to the maximum value your database provider allows.
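    As an illustration, on MySQL the alteration could look like the statement below. The VARCHAR type and the 2048 size are assumptions here, so check the maximum length your database supports and adapt the syntax to your provider. One workable approach is to put the statement in a file for review and then feed it to your usual client:

    ```shell
    # Write the (assumed MySQL-syntax) statement to a file for review,
    # then run it against the token database with your usual client.
    printf '%s\n' "ALTER TABLE IDN_OAUTH2_ACCESS_TOKEN MODIFY ACCESS_TOKEN VARCHAR(2048);" > widen_access_token.sql
    cat widen_access_token.sql
    ```
    
    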


    Danushka FernandoWSO2 Identity Server 5.2.0 - Setup Multiple Attribute login with JDBC userstore

    In WSO2 products, multiple attribute login (for example, logging in with either email or username) can be enabled with the LDAP userstore manager simply by changing some configurations. With the JDBC userstore manager, however, some customization is needed: we can achieve it by implementing a custom userstore manager. In this blog entry I am going to make it work with email and username. You can find the full sample here[1].


    For login purposes


    To log in to the server with multiple attributes, you will need to override the doAuthenticate and doGetExternalRoleListOfUser methods. The following are the overridden methods for login (getPasswordExpirationTime is overridden as well, since it uses the same query).


    @Override
    public boolean doAuthenticate(String attribute, Object credential) throws UserStoreException {
        if (!checkUserNameValid(attribute)) {
            return false;
        }
        if (!checkUserPasswordValid(credential)) {
            return false;
        }
        if (UserCoreUtil.isRegistryAnnonymousUser(attribute)) {
            log.error("Anonymous user trying to login");
            return false;
        }
        Connection dbConnection = null;
        ResultSet rs = null;
        PreparedStatement prepStmt = null;
        String sqlstmt = null;
        String password = (String) credential;
        boolean isAuthed = false;
        try {
            dbConnection = getDBConnection();
            dbConnection.setAutoCommit(false);
            sqlstmt = realmConfig.getUserStoreProperty(JDBCRealmConstants.SELECT_USER);
            if (log.isDebugEnabled()) {
                log.debug(sqlstmt);
            }
            prepStmt = dbConnection.prepareStatement(sqlstmt);
            // Insert the attribute as a parameter for each occurrence of '?'
            int paramCount = StringUtils.countMatches(sqlstmt, "?");
            // If the tenant is specified in the query, we assume it is the last parameter
            if (sqlstmt.contains(UserCoreConstants.UM_TENANT_COLUMN)) {
                // Assign the attribute value to all parameters except the last one
                for (int i = 1; i < paramCount; i++) {
                    prepStmt.setString(i, attribute);
                }
                prepStmt.setInt(paramCount, tenantId);
            } else {
                // There is no tenant indication; set all parameters to the attribute value
                for (int i = 1; i <= paramCount; i++) {
                    prepStmt.setString(i, attribute);
                }
            }
            rs = prepStmt.executeQuery();
            if (rs.next()) {
                String storedPassword = rs.getString(3);
                String saltValue = null;
                if ("true".equalsIgnoreCase(realmConfig
                        .getUserStoreProperty(JDBCRealmConstants.STORE_SALTED_PASSWORDS))) {
                    saltValue = rs.getString(4);
                }
                boolean requireChange = rs.getBoolean(5);
                Timestamp changedTime = rs.getTimestamp(6);
                GregorianCalendar gc = new GregorianCalendar();
                gc.add(GregorianCalendar.HOUR, -24);
                Date date = gc.getTime();
                if (requireChange && changedTime.before(date)) {
                    isAuthed = false;
                } else {
                    password = this.preparePassword(password, saltValue);
                    if ((storedPassword != null) && (storedPassword.equals(password))) {
                        isAuthed = true;
                    }
                }
            }
        } catch (SQLException e) {
            String msg = "Error occurred while retrieving user authentication info.";
            log.error(msg, e);
            throw new UserStoreException("Authentication Failure");
        } finally {
            DatabaseUtil.closeAllConnections(dbConnection, rs, prepStmt);
        }
        if (log.isDebugEnabled()) {
            log.debug("User " + attribute + " login attempt. Login success :: " + isAuthed);
        }
        return isAuthed;
    }
    @Override
    public Date getPasswordExpirationTime(String attribute) throws UserStoreException {
        Connection dbConnection = null;
        ResultSet rs = null;
        PreparedStatement prepStmt = null;
        String sqlstmt;
        Date date = null;
        try {
            dbConnection = getDBConnection();
            dbConnection.setAutoCommit(false);
            sqlstmt = realmConfig.getUserStoreProperty(JDBCRealmConstants.SELECT_USER);
            if (log.isDebugEnabled()) {
                log.debug(sqlstmt);
            }
            prepStmt = dbConnection.prepareStatement(sqlstmt);
            // Insert the attribute as a parameter for each occurrence of '?'
            int paramCount = StringUtils.countMatches(sqlstmt, "?");
            // If the tenant is specified in the query, we assume it is the last parameter
            if (sqlstmt.contains(UserCoreConstants.UM_TENANT_COLUMN)) {
                // Assign the attribute value to all parameters except the last one
                for (int i = 1; i < paramCount; i++) {
                    prepStmt.setString(i, attribute);
                }
                prepStmt.setInt(paramCount, tenantId);
            } else {
                // There is no tenant indication; set all parameters to the attribute value
                for (int i = 1; i <= paramCount; i++) {
                    prepStmt.setString(i, attribute);
                }
            }
            rs = prepStmt.executeQuery();
            if (rs.next()) {
                boolean requireChange = rs.getBoolean(5);
                Timestamp changedTime = rs.getTimestamp(6);
                if (requireChange) {
                    GregorianCalendar gc = new GregorianCalendar();
                    gc.setTime(changedTime);
                    gc.add(GregorianCalendar.HOUR, 24);
                    date = gc.getTime();
                }
            }
        } catch (SQLException e) {
            String msg = "Error occurred while retrieving password expiration time.";
            log.error(msg, e);
            throw new UserStoreException(msg, e);
        } finally {
            DatabaseUtil.closeAllConnections(dbConnection, rs, prepStmt);
        }
        return date;
    }
    public String[] doGetExternalRoleListOfUser(String userName, String filter) throws UserStoreException {
        if (log.isDebugEnabled()) {
            log.debug("Getting roles of user: " + userName + " with filter: " + filter);
        }
        String sqlStmt;
        if (this.isCaseSensitiveUsername()) {
            sqlStmt = this.realmConfig.getUserStoreProperty("UserRoleSQL");
        } else {
            sqlStmt = this.realmConfig.getUserStoreProperty("UserRoleSQLCaseInsensitive");
        }
        ArrayList roles = new ArrayList();
        if (sqlStmt == null) {
            throw new UserStoreException("The sql statement for retrieving user roles is null");
        } else {
            String[] names;
            if (sqlStmt.contains("UM_TENANT_ID")) {
                names = this.getStringValuesFromDatabase(sqlStmt, new Object[]{userName, userName,
                        Integer.valueOf(this.tenantId), Integer.valueOf(this.tenantId),
                        Integer.valueOf(this.tenantId), Integer.valueOf(this.tenantId)});
            } else {
                names = this.getStringValuesFromDatabase(sqlStmt, new Object[]{userName});
            }
            if (log.isDebugEnabled()) {
                if (names != null) {
                    for (String name : names) {
                        log.debug("Found role: " + name);
                    }
                } else {
                    log.debug("No external role found for the user: " + userName);
                }
            }
            Collections.addAll(roles, names);
            return (String[]) roles.toArray(new String[roles.size()]);
        }
    }

    With this, you will need to modify the user store manager configuration section of $CARBON_HOME/repository/conf/user-mgt.xml as below.

         <UserStoreManager class="org.wso2.carbon.userstore.jdbc.CustomJDBCUserStoreManager">  
    <Property name="TenantManager">org.wso2.carbon.user.core.tenant.JDBCTenantManager</Property>
    <Property name="ReadOnly">false</Property>
    <Property name="ReadGroups">true</Property>
    <Property name="WriteGroups">true</Property>
    <Property name="UsernameJavaRegEx">^[\S]{3,30}$</Property>
    <Property name="UsernameJavaScriptRegEx">[a-zA-Z0-9@._-|//]{3,30}$</Property>
    <Property name="UsernameWithEmailJavaScriptRegEx">[a-zA-Z0-9@._-|//]{3,30}$</Property>
    <Property name="UsernameJavaRegExViolationErrorMsg">Username pattern policy violated</Property>
    <Property name="PasswordJavaRegEx">^[\S]{5,30}$</Property>
    <Property name="PasswordJavaScriptRegEx">^[\S]{5,30}$</Property>
    <Property name="PasswordJavaRegExViolationErrorMsg">Password length should be within 5 to 30 characters</Property>
    <Property name="RolenameJavaRegEx">^[\S]{3,255}$</Property>
    <Property name="RolenameJavaScriptRegEx">^[\S]{3,255}$</Property>
    <Property name="CaseInsensitiveUsername">true</Property>
    <Property name="SCIMEnabled">false</Property>
    <Property name="IsBulkImportSupported">false</Property>
    <Property name="PasswordDigest">SHA-256</Property>
    <Property name="StoreSaltedPassword">true</Property>
    <Property name="MultiAttributeSeparator">,</Property>
    <Property name="MaxUserNameListLength">100</Property>
    <Property name="MaxRoleNameListLength">100</Property>
    <Property name="UserRolesCacheEnabled">true</Property>
    <Property name="UserNameUniqueAcrossTenants">false</Property>
    <Property name="PasswordHashMethod">SHA</Property>
    <Property name="SelectUserSQL">SELECT distinct u.* FROM UM_USER u left join UM_USER_ATTRIBUTE ua on u.UM_ID = ua.UM_USER_ID WHERE u.UM_USER_NAME = ? OR (ua.UM_ATTR_NAME = "mail" AND ua.UM_ATTR_VALUE = ?) AND u.UM_TENANT_ID = ?</Property>
    <Property name="UserRoleSQLCaseInsensitive">SELECT UM_ROLE_NAME FROM UM_USER_ROLE, UM_ROLE, UM_USER WHERE LOWER(UM_USER.UM_USER_NAME) IN (SELECT LCASE(u.UM_USER_NAME) FROM UM_USER u left join UM_USER_ATTRIBUTE ua on u.UM_ID = ua.UM_USER_ID WHERE u.UM_USER_NAME = ? OR (ua.UM_ATTR_NAME = "mail" AND ua.UM_ATTR_VALUE = ?) AND u.UM_TENANT_ID = ? GROUP BY u.UM_USER_NAME) AND UM_USER.UM_ID=UM_USER_ROLE.UM_USER_ID AND UM_ROLE.UM_ID=UM_USER_ROLE.UM_ROLE_ID AND UM_USER_ROLE.UM_TENANT_ID = ? AND UM_ROLE.UM_TENANT_ID = ? AND UM_USER.UM_TENANT_ID = ?</Property>
    </UserStoreManager>


    To use User Info endpoint with Oauth

    Extending this further, if you need to use the User Info endpoint with OAuth2, you will need to override the following method as well.

    public boolean doCheckExistingUser(String userName) throws UserStoreException {
        String sqlStmt;
        if (this.isCaseSensitiveUsername()) {
            sqlStmt = this.realmConfig.getUserStoreProperty("IsUserExistingSQL");
        } else {
            sqlStmt = this.realmConfig.getUserStoreProperty("IsUserExistingSQLCaseInsensitive");
        }
        if (sqlStmt == null) {
            throw new UserStoreException("The sql statement for is user existing null");
        } else {
            boolean isExisting = false;
            String isUnique = this.realmConfig.getUserStoreProperty("UserNameUniqueAcrossTenants");
            if (Boolean.parseBoolean(isUnique) && !"wso2.anonymous.user".equals(userName)) {
                String uniquenesSql;
                if (this.isCaseSensitiveUsername()) {
                    uniquenesSql = this.realmConfig.getUserStoreProperty("UserNameUniqueAcrossTenantsSQL");
                } else {
                    uniquenesSql = this.realmConfig.getUserStoreProperty("UserNameUniqueAcrossTenantsSQLCaseInsensitive");
                }
                isExisting = this.isValueExisting(uniquenesSql, (Connection) null, new Object[]{userName});
                if (log.isDebugEnabled()) {
                    log.debug("The username should be unique across tenants.");
                }
            } else if (sqlStmt.contains("UM_TENANT_ID")) {
                isExisting = this.isValueExisting(sqlStmt, (Connection) null,
                        new Object[]{userName, userName, Integer.valueOf(this.tenantId)});
            } else {
                isExisting = this.isValueExisting(sqlStmt, (Connection) null, new Object[]{userName});
            }
            return isExisting;
        }
    }

    You will also need to add more configuration; the following is the updated user store manager configuration.

         <UserStoreManager class="org.wso2.sample.userstore.jdbc.CustomJDBCUserStoreManager">  
    <Property name="TenantManager">org.wso2.carbon.user.core.tenant.JDBCTenantManager</Property>
    <Property name="ReadOnly">false</Property>
    <Property name="ReadGroups">true</Property>
    <Property name="WriteGroups">true</Property>
    <Property name="UsernameJavaRegEx">^[\S]{3,30}$</Property>
    <Property name="UsernameJavaScriptRegEx">[a-zA-Z0-9@._-|//]{3,30}$</Property>
    <Property name="UsernameWithEmailJavaScriptRegEx">[a-zA-Z0-9@._-|//]{3,30}$</Property>
    <Property name="UsernameJavaRegExViolationErrorMsg">Username pattern policy violated</Property>
    <Property name="PasswordJavaRegEx">^[\S]{5,30}$</Property>
    <Property name="PasswordJavaScriptRegEx">^[\S]{5,30}$</Property>
    <Property name="PasswordJavaRegExViolationErrorMsg">Password length should be within 5 to 30 characters</Property>
    <Property name="RolenameJavaRegEx">^[\S]{3,255}$</Property>
    <Property name="RolenameJavaScriptRegEx">^[\S]{3,255}$</Property>
    <Property name="CaseInsensitiveUsername">true</Property>
    <Property name="SCIMEnabled">false</Property>
    <Property name="IsBulkImportSupported">false</Property>
    <Property name="PasswordDigest">SHA-256</Property>
    <Property name="StoreSaltedPassword">true</Property>
    <Property name="MultiAttributeSeparator">,</Property>
    <Property name="MaxUserNameListLength">100</Property>
    <Property name="MaxRoleNameListLength">100</Property>
    <Property name="UserRolesCacheEnabled">true</Property>
    <Property name="UserNameUniqueAcrossTenants">false</Property>
    <Property name="PasswordHashMethod">SHA</Property>
    <Property name="SelectUserSQL">SELECT distinct u.* FROM UM_USER u left join UM_USER_ATTRIBUTE ua on u.UM_ID = ua.UM_USER_ID WHERE u.UM_USER_NAME = ? OR (ua.UM_ATTR_NAME = "mail" AND ua.UM_ATTR_VALUE = ?) AND u.UM_TENANT_ID = ?</Property>
    <Property name="UserRoleSQLCaseInsensitive">SELECT UM_ROLE_NAME FROM UM_USER_ROLE, UM_ROLE, UM_USER WHERE LOWER(UM_USER.UM_USER_NAME) IN (SELECT LCASE(u.UM_USER_NAME) FROM UM_USER u left join UM_USER_ATTRIBUTE ua on u.UM_ID = ua.UM_USER_ID WHERE u.UM_USER_NAME = ? OR (ua.UM_ATTR_NAME = "mail" AND ua.UM_ATTR_VALUE = ?) AND u.UM_TENANT_ID = ? GROUP BY u.UM_USER_NAME) AND UM_USER.UM_ID=UM_USER_ROLE.UM_USER_ID AND UM_ROLE.UM_ID=UM_USER_ROLE.UM_ROLE_ID AND UM_USER_ROLE.UM_TENANT_ID = ? AND UM_ROLE.UM_TENANT_ID = ? AND UM_USER.UM_TENANT_ID = ?</Property>
    <Property name="GetUserPropertiesForProfileSQLCaseInsensitive">SELECT UM_ATTR_NAME, UM_ATTR_VALUE FROM UM_USER_ATTRIBUTE, UM_USER WHERE (UM_USER.UM_ID = UM_USER_ATTRIBUTE.UM_USER_ID OR (UM_USER_ATTRIBUTE.UM_ATTR_NAME = 'mail' AND LOWER(UM_USER_ATTRIBUTE.UM_ATTR_VALUE) = LOWER(?))) AND UM_PROFILE_ID=? AND UM_USER_ATTRIBUTE.UM_TENANT_ID=? AND UM_USER.UM_TENANT_ID=?</Property>
    <Property name="IsUserExistingSQLCaseInsensitive">SELECT distinct u.UM_ID FROM UM_USER u left join UM_USER_ATTRIBUTE ua on u.UM_ID = ua.UM_USER_ID WHERE u.UM_USER_NAME = ? OR (ua.UM_ATTR_NAME = "mail" AND ua.UM_ATTR_VALUE = ?) AND u.UM_TENANT_ID = ?</Property>
    </UserStoreManager>

    Further, in $IS_HOME/repository/conf/identity/application-authentication.xml, you will need to add the following parameter under the AuthenticatorConfig for BasicAuthenticator:

     <Parameter name="UserNameAttributeClaimUri">http://wso2.org/claims/username</Parameter>  

    The resulting BasicAuthenticator AuthenticatorConfig tag is as below.

             <AuthenticatorConfig name="BasicAuthenticator" enabled="true">  
    <Parameter name="UserNameAttributeClaimUri">http://wso2.org/claims/username</Parameter>
    <!--Parameter name="showAuthFailureReason">true</Parameter-->
    </AuthenticatorConfig>


    With this, you will be able to get the configured claims when you log in using different attributes.

    References

    [1] https://drive.google.com/file/d/0ByTCb2KmTk76dWMwcHMzbWJmVzA/view?usp=sharing

    Anupama PathirageWSO2 APIM 2.0.0 DB Configuration

    APIM 2.0.0 uses the following databases.

    • Local database (WSO2_CARBON_DB) – Local registry space which is specific to each APIM instance.
    • User Manager database (WSO2UM_DB) - Stores information related to users and user roles.
    • API Manager database (WSO2AM_DB) - Stores information related to the APIs along with the API subscription details
    • Registry database (WSO2REG_DB) - Content store and a metadata repository for SOA artifacts
    • Statistics database (WSO2AM_STATS_DB )- Stores information related to API statistics. After APIM analytics is configured, it writes summarized data to this database.
    • Message Broker database (WSO2_MB_STORE_DB) - Used as the message store for the broker when advanced throttling is used. This applies to the APIM instance that acts as the Traffic Manager. If there is more than one Traffic Manager node, each node must have its own message broker database.

    Following are the databases required for APIM analytics.

    • WSO2_ANALYTICS_EVENT_STORE_DB - Analytics Record Store which stores event definitions
    • WSO2_ANALYTICS_PROCESSED_DATA_STORE_DB - Analytics Record Store which stores processed data
    • WSO2_GEO_LOCATION_DB - Stores the statistics generated for selected geographic locations
    • WSO2AM_STATS_DB – Stores API statistics related data; this should be shared with the APIM instances.
    • WSO2UM_DB – Stores information related to the users. This also should be shared with APIM instances.
    • WSO2_CARBON_DB – Local Database for the APIM Analytics.
    • WSO2REG_DB – Registry database for APIM Analytics. We can configure a separate one or use the WSO2_CARBON_DB itself.

    For two active-active all-in-one instances of WSO2 API Manager with analytics we can use DB connections as follows.