Tag Archives: API Management

Four Warning Signs an Integration Wall is Approaching

The integration and API management markets are expanding in both popularity and use. Enterprise application integration is projected to surpass $33 billion by 2020, and adjacent markets like iPaaS and data integration are growing at double-digit CAGRs. Enablers such as containers and serverless technologies are only accelerating the move toward increased disaggregation of applications.

All seems rosy. And it mostly is.

But with the explosive growth of APIs and endpoints, traditional centralized tools like ESBs will become unsuitable, and simple low-code snap-together tools won’t scale to address the broader scope. We’re potentially about to hit an “integration wall” at high speed.

Consider the following four warning signs – some technical, some process-related – that I find are beginning to plague the integration market:

1. Waterfall Development for integration is hitting a wall.

Although most code development has shifted to an agile model, the same can’t be said for integration tools. As the quantity and diversity of endpoints increase, and as integration projects become more diverse and complex, the waterfall model is beginning to slow those projects down. In a future with billions of integratable endpoints, it’s obvious that an agile development model for integration will need to become the norm.

2. Existing integration tools aren’t optimized for integration-at-scale.

The low-code, snap-together, centralized integration technologies (including iPaaS) that enterprises use today are not optimized for orchestrating, integrating, observing, and governing an expanding set of constantly changing endpoints. Nor are traditional centralized approaches (think EDI and older ESBs) prepared to handle increasing endpoint scale or diversity. Many of these existing tools are well adapted for line-of-business or citizen integrators working on relatively small-scale implementations, but they are far from well adapted for more complex integration-at-scale projects.

3. Current programming languages are not optimized for integration.

With languages like Java/Spring or JavaScript/Node, developers can engineer the flow, but they must take responsibility for solving the hard problems of integration themselves: writing their own logic for concerns like retries, transactions, and resilience, or reaching for bolt-on frameworks. Clearly, a new programming paradigm will be needed in the long term.
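To make the point concrete, here is a minimal sketch (not from the original article) of the kind of plumbing a Java developer ends up writing by hand just to call one remote endpoint reliably; the inventory URL and retry policy are hypothetical.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hand-rolled retry logic around a remote call -- the kind of integration
// plumbing that general-purpose languages leave to the developer or to
// bolt-on frameworks. The endpoint URL below is hypothetical.
public class InventoryClient {
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    public static String fetchStockLevel(String sku) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://inventory.example.com/stock/" + sku))
                .GET()
                .build();

        int attempts = 0;
        while (true) {
            attempts++;
            try {
                HttpResponse<String> response =
                        CLIENT.send(request, HttpResponse.BodyHandlers.ofString());
                if (response.statusCode() == 200) {
                    return response.body();
                }
                // Treat non-200 responses as retryable failures for this sketch.
                if (attempts >= 3) {
                    throw new RuntimeException("Gave up after " + attempts + " attempts");
                }
            } catch (java.io.IOException e) {
                if (attempts >= 3) {
                    throw e;
                }
            }
            // Simple backoff; real systems also need timeouts,
            // circuit breaking, and compensation logic.
            Thread.sleep(200L * attempts);
        }
    }
}
```

Multiply that by hundreds of endpoints, each with its own error semantics, and the appeal of language-level support for integration becomes obvious.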

4. The Exploding Endpoint Problem is very real.

As I referenced above, IT is ill-prepared to address the oncoming wave of service disaggregation, the diverse types of APIs, differing sources of service endpoints, challenges from big data, and multiple approaches to serverless IT. The industry is about to hit a scale and diversity wall. To wit:

  • 917 apps in use per enterprise (Netskope, 2016)
  • 893-1206 average cloud services used per employee (Kleiner Perkins, April 2017)
  • 19,000 APIs as of January 2018 (ProgrammableWeb, 2018)

And if you don’t believe those numbers, Matt Eastwood of IDC recently pointed out that the number of containerized services has expanded well beyond where VMs ever were. Yep, billions of programmable endpoints aren’t kid’s stuff.

Where does this leave us?

A new approach to integrating thousands, or even millions, of endpoints could lie in a new programming language: Ballerina.

Ballerina is a simple programming language whose syntax and runtime have been optimized for the hard problems of integration. Its focus is integration – bringing concepts, ideas and tools of distributed system integration into the language. Based on the concepts of interactions within sequence diagrams, Ballerina has built-in support for common integration patterns and connectors, including distributed transactions, compensation and circuit breakers. And it supports JSON and XML, making it simple and effective to build robust integration across distributed network endpoints.

So, watch this space for future developments. And in the meantime, beware of the approaching wall.

Three Months into PSD2 – Confessions of the WSO2 Open Banking Team

It’s been three months since the PSD2 compliance deadline and the dust is settling. Or is it really? Just as when it started, the post-PSD2 landscape is viewed from different angles. It has been called everything from a ticking time bomb to a slow burn to a never-ending honeymoon period. We think the biggest surprise was that everyone thought January 13 was the end. It wasn’t; it was the beginning.

When we created WSO2 Open Banking, we knew customer needs would be diverse and every technology experience we deliver would be unique. Turns out we were right. Our journey with WSO2 Open Banking has surfaced some interesting experiences while working with different stakeholders in this compliance ecosystem. Here’s what we learned.

Confession #1: (Almost) Everyone was late to the party

Everyone (including us) started counting down to PSD2 from six months to three months to one month. But the reality was that January 13 was just the date when PSD2, adopted by the European Parliament, came into effect as a Europe-wide regulation.

Several countries across Europe chose to deal with imposing PSD2 in their own way. We’ve been tracking the country-specific deadlines quite closely, and about 46% are yet to set an official deadline for compliance. We believe the final date for compliance will be when the Regulatory Technical Standards (RTS) come into effect in September 2019. That’s good news for us because there’s still a large viable market for compliance technology! ;)

Confession #2: Compliance confusion did not discriminate

Over the past several months, we’ve worked with many banks of different sizes across Europe, and they all had remarkably similar questions.

This led us to believe that banks, regardless of size, require a lot of guidance in the compliance process. It’s a good thing we have a team of experts to do just that!

Confession #3: They came, they saw, they vanished

When PSD2 first started gaining traction in 2016, the knee-jerk reaction of every API management and integration vendor was “this is a goldmine of opportunity we cannot miss”. So they went head-on into the market with an existing product. Come 2018, with the need for compliance technology having evolved, these “first mover” technology vendors have gone quiet.

Whether it was the lack of a well-thought-out strategy to sustain market demand, fintech domination, or simply not giving the compliance market the attention it deserved remains uncertain. One thing is for sure: this is a highly competitive market for technology vendors like us. But no complaints, we love a challenge and are pretty good at winning them!

Confession #4: API standards (and the organizations writing them) are a solution provider’s BEST friends

A lot of shade gets thrown at the lack of a common API standard across Europe (version 1.1 of the Berlin Group API specification is yet to come; we’ve got our eyes peeled for that). However, Open Banking UK has got this in the bag with a comprehensive API specification that WSO2 Open Banking supports.

When we first started out, these standards really helped set the base for building our solution. Our development team continues to spend a good couple of hours every week identifying the latest improvements in the specifications and contributing to their development by participating in working groups.

Confession #5: Compliance is not a backbreaker…it just needs a well-thought-out strategy

A lot of banks think of compliance as a major headache and seek a “quick fix” just so they can tick the checkbox. The reality is, quick fixes can do more damage than good. PSD2 compliance is a big deal, and going into it without a strategy is cause for alarm. Even if you don’t have a dedicated open banking or compliance team, you can still get the job done.

You just need to rally the right members, set your goals for compliance and figure out what you need from a technology vendor. Then you need to pick the technology that gives you value for money and won’t take eons to work with your systems and deliver compliance. It’s a matter of working closely with a solution provider towards a common goal.

Confession #6: Do your research or go home – The learning never stops

At least three articles a week are written on open banking. Everything from thought leadership material and opinion pieces (like this one) to publications from standards bodies continues to explore and discuss this ecosystem. What we learn from our conversations with customers is an equally invaluable source of research for keeping abreast of where the market is heading. We treat each of these as a unique source of intelligence, and they continue to nurture our product management, sales, and marketing strategies. It’s the only way to survive in an ecosystem as dynamic as this one.

It’s been a great ride so far and we can’t wait to see what comes up next! No doubt there will be plenty more surprises and exciting developments to look forward to!

The WSO2 Open Banking Team

Why Swiss Chocolate Relies on WSO2

The Swiss Federal Office of Information Technology, Systems and Telecommunication (FOITT) is one of the internal ICT service providers in the Federal Administration. It supports the administration by developing and providing efficient, secure, and user and public-friendly IT solutions. As part of its responsibilities, FOITT manages more than 40,000 enterprise users of two of their key platforms – one an electronic customs declaration process for imports/exports and the other, an automated way to manage revenue from taxes.

While these platforms have proved successful, FOITT embarked on a digital transformation initiative to make them more efficient. The goal was to be able to scale and provide a more seamless experience to users.

At WSO2Con US 2017 Dr. Gion Sialm, chief architect at FOITT, explored how they leveraged WSO2 technology to achieve their objectives. They worked together with Yenlo, a WSO2 Premier Certified Partner, to implement their solution. To illustrate how the two platforms work, Gion took the example of Swiss chocolate – the process of importing cocoa to make chocolate and the distribution of the end product within and outside Switzerland.

The e-Dec (Electronic Declaration) Platform

All goods, in this instance the cocoa being imported and the chocolate being exported, need to be declared, and there’s a specific process that must be followed. Given that it’s a fairly complex process involving many functions and stakeholders, FOITT created the e-Dec platform to simplify it. The platform essentially digitized the process, making it more efficient and user-friendly. As with any digital platform, e-Dec too needed to be refreshed and revamped to align with new requirements.

For instance, the platform supported a lot of different protocols, some of them extremely outdated, like POP3S and FTPS. Apart from this challenge, the application was based on the Oracle WebLogic Server, which follows the eXtended Architecture (XA) pattern. “Previously, WSO2 products didn’t support XA, but because of FOITT’s requirement, it’s now a part of their feature list,” noted Gion.

The Fiscal-IT Platform

On the retail side, all goods sold within Switzerland, like chocolate, carry a value-added tax (VAT). Previously, these transactions were handled manually, so FOITT built the Fiscal-IT platform to automate the process. Like the e-Dec platform, this too required improvements to further streamline operations.

For instance, the platform was created in a modular manner so as to have best-of-breed technology for each feature, resulting in a mix of different technologies, like FileNet, Java, and SAP, which all needed to be integrated. “Because we decided to employ microservices, we ended up with a lot of REST and SOAP APIs as well as JMS, so we needed an enterprise service bus that was flexible enough to maintain these things easily,” said Gion.

The WSO2 Solution

They followed the same architecture for both platforms so as to reduce cost and speed up their go-to-market. The API Gateway, Publisher, and Store components of WSO2 API Manager, along with WSO2 Identity Server as the Key Manager, form their core API management solution. WSO2 integration technology was used for routing and message transformation between the sender’s and receiver’s different protocols. WSO2 analytics also plays an important role in the solution: FOITT, together with their service providers, developed a dashboard using WSO2 Data Analytics Server to identify any problems that occur in the application. The user just has to type in the source and destination program, and within a few seconds the metadata of all the messages is collected (message tracing) so that errors can be easily identified. The dashboard can even correlate the messages with the log files, which is a very important feature in a distributed landscape like this.

“WSO2 products relate to digital transformation like the Swiss army knife relates to MacGyver. Our platforms are evolving rapidly. In order to keep pace with this innovation it’s important to have a strong relationship and collaborate well with WSO2,” says Gion. “Automation is also key. We have to manage 11 stages throughout our platforms and doing it manually would be quite impossible,” he adds.

To learn more about how FOITT is leveraging WSO2 technology for key government initiatives, watch Gion’s presentation at WSO2Con US 2017:

Travis Perkins: Disrupting the Retail Industry with WSO2 Integration Technology

Travis Perkins, the UK’s largest supplier of building materials, embarked on its digital transformation journey last year in the hopes of enhancing customer experiences, growing the business, and improving the usability of its systems. At WSO2Con EU 2017, Christopher Stone, the head of integration at Travis Perkins, talked about the steps they took to go digital with the help of WSO2 technology and key partners.

To help understand Travis Perkins’ current situation, Christopher used an analogy coined by their previous CIO Neil Pearce – the house of IT.

By looking at all the areas of the house that need improvement, Travis Perkins realized that they needed to adopt integration technologies that would allow their systems to be flexible, future-proof, innovative, and reactive. “Integration is the plumbing, electrical wiring, and foundation — basically the rooms of the house,” said Christopher. “Effective integration and quality data are the key enablers for our digital agenda that is built on a solid foundation such as reliable cloud infrastructure and networking.”

Once Christopher explained this analogy, he explored how they previously worked on integration projects. He would most likely be part of a program delivery team that is pressured to deliver fast and within a strict budget. This hinders the team’s view of the overall enterprise benefits and results in a point-to-point spaghetti architecture, which leads to high maintenance costs, difficult support, inconsistent standards, reliability issues, and limited reusability.

Today, Travis Perkins has a central integration layer powered by WSO2 integration and API management products. Services are developed according to project requirements, but built with the entire enterprise in mind using a set of policies, patterns, and standards governed by the project diagnostic team. This results in maximum reusability, easier support and maintenance, and continuous improvement in delivery, quality, and speed. The replacement of their legacy core ERP system with an ERP product named M3, covering sales order entry, sales order management, pricing, tool hire, finance, and supply chain, is the largest program Christopher’s team is currently delivering. But they have been involved in various other integration projects too. “The right tools combined with the correct mindset, architecture, and governance allows you to meet your goals in achieving good enterprise integration which benefits your company as a whole,” he says.

Many external parties helped Travis Perkins along their journey. In the early days, Wheeve, integration technology experts partnered with WSO2, gave them a theoretical and conceptual grounding in how an integration department should run. They helped set up the department, build the team, and shape the architecture and processes. During the delivery phase, Travis Perkins’ engineers and analysts were supplemented by ICT solutions providers partnered with WSO2 — Chakray and Mitra Innovation. They offered integration specialists well-versed in the WSO2 platform who helped analyze the requirements and worked in an Agile Scrum fashion to deliver the projects. “They have helped us come leaps and bounds, not only in delivering projects but also in terms of learning from their experience and knowledge of the WSO2 platform,” said Christopher.

“WSO2 is a great platform. It enables us to deliver quickly and complements our strategy to utilize open source technology wherever possible,” Christopher concluded. “When any of our engineers come across difficulties in development, the WSO2 Subscription gives us great SLAs. We don’t need to use production maintenance support very often, but when it does happen we have a very good relationship with WSO2 for them to support us in getting back to business as usual.”

To learn more about how Travis Perkins is successfully traversing its digital transformation path, watch Christopher’s video at WSO2Con EU 2017 below.

Bringing an Efficient Home Care Solution to Life with WSO2 Technology

Senior citizens and disabled people—many in fragile health and requiring assistance—often have limited resources for managing their health and ensuring their security. Effective home care solutions allow such people to safely go about their day-to-day lives and enhance their quality of life. To aid home caregivers and patients, Raffaello Leschiera, a solution architect at Engineering Ingegneria Informatica, proposed a reference architecture for efficient home care using WSO2 technology at WSO2Con EU 2017.

Raffaello began by exploring the proposed reference architecture that connected and interfaced with all stakeholders, like the patient, his/her family and medical staff. Firstly, they need to collect data from medical devices in the patient’s home. Protocols like IEEE VU specifications are used and medical devices are mediated using Arduino and Raspberry Pi boards. Once collected, the data needs to be normalized and stored so it’s represented in the same way no matter which device it was collected from.

This data needs to run through analytics to monitor the patient’s health, process events, and, if needed, send notifications through various communication channels. Data integration channels using HL7, the standard protocol for healthcare, are used to send this data to medical staff. The medical staff can then access it through web and mobile interfaces, and an API gateway decouples all features from these user interfaces. Finally, the entire system needs to be synchronized and controlled by identity and access management to ensure security and privacy.

Reference architecture for a home care solution

Raffaello noted that WSO2’s comprehensive technology platform, particularly its integration and analytics capabilities, was the main reason for picking WSO2 as their technology partner. The open source nature of the products was also a key deciding factor, since Raffaello and his team work with many public administrations that prefer to adopt solutions that are completely open source. “WSO2 has a wide technology platform so you can find the right answer to every part of your problem,” said Raffaello. “And because all the products seamlessly integrate with each other it’s easy to focus on the domain problem rather than the technology problem,” he added.

To describe how WSO2 products were used for different tasks, Raffaello compared the home care solution to a football game:

  • Goalkeeper: WSO2 Microservices Framework for Java (WSO2 MSF4J) serves as the goalkeeper. This is the entire back-end of the system, which is based on lightweight microservices that are developed, deployed and monitored through MSF4J in a highly scalable and reliable manner with integrated security.
  • Defenders: WSO2 Data Analytics Server serves as one defender that receives data, analyzes it in real-time, and sends notifications. WSO2 Enterprise Integrator is the next defender who transforms disparate types of data into a normalized format and sends it to the hospital IT systems.
  • Forwards: WSO2 API Manager is one of the forwards, which faces the medical staff and is used to design, prototype and publish APIs and govern API usage. WSO2 IoT Server is another forward, which faces the medical devices for data collection, device management and protocol support.
  • Wings of the pitch: Here the WSO2 Identity Server takes care of all the strict security and privacy requirements.
  • Center of the pitch: Finally, WSO2 Governance Registry serves as the ‘Lionel Messi’ at the center of the pitch; in other words it governs the solution through surveillance just like how Messi would guide and lead his team to victory.
For this solution to work, Engineering Ingegneria Informatica needed a remote device that can track a patient’s movements within his/her home. Enter Joe Care (also known as the Joker), a remote presence device that is flexible and agile enough to move around the patient’s home. It combines various technologies, like Arduino boards and software that deals with movement, the sense of space, and handling (touch). It serves as the medical eyes, ears, voice, and fingers within the patient’s home.

In the future, Raffaello and his team aim to engage with users more, further analyze threat paths, and include more technology, like wearables that monitor movement and exercise. They would also like to create more intelligent early warning score models and move their entire solution to the cloud so more patients and operators can access it.

Watch Raffaello’s presentation at WSO2Con EU 2017 below to learn more about their home care solution powered by WSO2.

A Smarter Transport Management System for London with the Help of WSO2

Transport for London (TfL) has a daily challenge – keeping a city of over 8 million people moving around the metropolis. At this magnitude, the transport system can neither always absorb commuters nor guarantee them a congestion-free experience. It is a place where the smallest of changes can have a massive impact on your journey. Citing an example, Roland Major, a former enterprise architect at TfL, says that a London Underground strike once saw a 3% increase in traffic and a staggering 90-minute increase in journey time. Estimates project a 60% increase in congestion around central London by 2031.

Given all these complications, TfL decided to become more intelligent with technology to reduce commuter times, make the roads safer for pedestrians, cyclists and drivers, and to slow the pace of traffic. Intelligence and data with a purpose are the buzzwords here. “We need better understanding of real-time demand. What insight can we get from our data, and how can we get innovative with all this information?” says Roland. He was actively involved with TfL’s Surface Intelligent Transport System (or SITS), a project that aims to better manage the city’s entire road space of pavements, cycle lanes, and motorways.

SITS’ business proposition is that it can deliver a billion pounds’ worth of benefit to London by identifying delays in the road networks sooner than is done at present: “We weren’t detecting incidents, and by the time we had detected them, they were already over. With technology, we can see these incidents early. We recognized that the market can do sensible things with our data,” says Roland. For example, within London’s traffic light system, TfL manages an estimated 7,000 junctions around the city, and 14,000 magnetometers detect millions of daily events. This data is discarded after analysis; however, TfL realized that if it were used, the response time to delays could improve by 15 minutes.

TfL has a 10-year plan in place, with all of the different required components mapped out. Data analytics forms the core of this operational model. Data is obtained from GPS systems and bus routes. Road incidents are logged and used to determine what additional information is needed to understand and manage each leg of commuter journeys. All the data is hosted on the cloud, and TfL is currently in the process of adding these components to the framework.

TfL’s transport management system

London’s new road management system relies on WSO2’s API management, integration, identity and access management, and analytics products for the intelligent work needed. These products are deployed on a private cloud managed by WSO2. The starting point is LondonWorks, a registry of all road works and street-related events, both planned and current, in the Greater London area. LondonWorks is used to assess road networks, coordinate the various road works to minimize congestion, and support inspection, compliance, and monitoring. Maps and form data have been integrated so that incidents can be entered into the system and identified on the map.

As their model progresses, TfL has ambitious plans for all the data streaming in – big data analytics to give them more insight into road movements, which will enable the necessary alerts and empower them with smarter ways to deliver better, safer commuter experiences for London.

Watch Roland’s presentation for more details on TfL’s plans for London.

Explore the WSO2 middleware platform with its offerings in API management, integration, identity and access management, analytics, and IoT.

Did you know that WSO2 won TfL’s data analytics Hackathon contest? Learn all about it.

Building a Cloud Native Platform for CitySprint’s On the Dot Delivery Service

Picture a scenario where you are analyzing the results of a marketing survey which shows that a high percentage of consumers prefer same-day shipping, online tracking of their orders, a choice of shipping options, and deliveries within a specific time slot. Then you find out that retailers already fulfill around 65% of these needs, but there is a gap in the market, a gap that you can fill by offering a novel service. This is precisely what UK-based logistics and delivery service provider CitySprint did when they developed the On the dot delivery service, which allows shoppers to receive their orders during a one-hour time slot of their choice at no extra cost.

“We wanted to positively disrupt the time slot delivery space. In doing so, we wanted to build an API ecosystem that sparks interaction, opens new channels, and reaches new streams of revenue,” says Eduard Lazar, Senior Solutions Consultant at LastMileLink Technologies (a CitySprint Innovation Lab). At the heart of this project was generating value for users and driving innovation: “On the dot is all about convenience for consumers, be it as a fulfillment method or in terms of collection and delivery time slots. We also wanted to simplify integration and create a developer community through our API ecosystem,” he adds.

Defining the key challenges was one of the first steps before introducing On the dot to consumers. To begin with, CitySprint had to move their data centers to the cloud in order to become a cloud native platform. They also had to create open RESTful APIs, enable identity federation, simplify integration, and foster innovation that could result in a community of developers thinking up new marketable ideas. Selecting open source software is one of the main tenets at CitySprint, and as such, they set about developing an open source platform built on WSO2’s API management, integration, and identity and access management capabilities, using a DevOps approach. The architecture was developed using Apache Tomcat and Cassandra, with WSO2 Carbon used for continuous deployment.

By placing API management at its core, CitySprint has been able to achieve the required functionality and form their innovation community. (An interesting anecdote on the latter: a TechSprint event was organized where high-profile companies sent teams of developers to CitySprint to build innovative products within 24 hours. The results were quite amazing, with the added bonus of introducing CitySprint to new leads.)

From a business perspective, implementing this project was primarily underpinned by issues of cost, in addition to those of speed, integration, lifecycle, and skillset. When CitySprint introduced more complexity into the system, they also potentially introduced a time lag. Yet, can this platform control costs through simplification and reuse? Is there a way to save time by simplifying integration? Is the skillset future-proof? Can they model the whole lifecycle?

The result – On the dot – answers all of the above with a yes. The On the dot cloud native platform has empowered CitySprint to enter the market with an adaptable offering that allows developers to sign themselves up and begin using the APIs. It is integrated, with multiple systems working together; CitySprint has also connected data and devices, integrated its platforms with those of its partners, and connected the user experiences of both customers and partners. Following their successes in the UK, plans are underway to make On the dot a global phenomenon, and CitySprint is certain they can achieve this with the right technology.

If you need more details on how CitySprint made On the dot, watch their presentation.

Learn more about WSO2’s API management, integration and identity and access management capabilities.

UNRWA and Capgemini: Creating a Refugee Centric Data Model for Better Insights

The United Nations Relief and Works Agency for Palestine Refugees (UNRWA) has over 5 million registered refugees requiring education, healthcare, and social safety assistance, among other services. UNRWA aids refugees across five fields of operation – Lebanon, Jordan, the West Bank, Syria, and the Gaza Strip – and currently runs 692 schools with over 500,000 students, as well as hundreds of primary health facilities.

In order to automate several processes across the region, the team based in Gaza had already developed the Education Management Information System (EMIS), consisting of three modules (students, staff, and premises) plus reporting tools. EMIS captures information and manages the educational progress of half a million students by integrating data from existing registration, health, facility management, and human resources systems.

Yet, given the numbers and scale of its operations, a central data model that has the capacity to integrate data from several entities was the need of the hour to support its regional operations and EMIS. To transform their information management system, UNRWA and Capgemini used WSO2 technology to create a model which mirrors UNRWA’s organizational ethos – placing the refugees at the heart of all their operations.

“The technology is there, but it’s really about the people,” says Francesco Lacoboni, Managing Consultant at Capgemini. Accordingly, the main drivers of the new UNRWA Enterprise Architecture are built upon the strategic principles of people, information, collaboration, and security. People influence how the information is created, managed, and consumed. The platform is an information-centric one – rather than managing documents, it manages open data and content. Its shared approach design aims to improve collaboration, reduce costs, maintain standards, and ensure consistency across the board. Security and privacy features for data protection round off the principles of this platform.

Before the new model was introduced, there was a time when the information that streamed through the system was physically replicated via the transaction log. For reasons of ease and efficiency, UNRWA and Capgemini decided to provide a common set of APIs to all developers, not only to fulfill the needs of a specific application, but also to create the framework for future use of this semantic concept. Every entity has its own API that can be used to navigate the knowledge, eliminating the need to design a new API each time. The resultant Common Data Model (CDM) was created using OWL (the Web Ontology Language), and its architecture and governance were implemented using WSO2’s integration and API management platforms.

For Luca Baldini, Chief of Information Management Services at UNRWA, it was the first time such an approach was used and now that it has been rolled out, he praises its benefits: “The new model has been very productive, as it created a common language between IT specialists and our business representatives. We can use different kinds of technology for data retrieval and distribution.” Francesco believes one of the main benefits of the new model is that it helps increase the transparency of UNRWA’s operations. Now that the new model is successfully in practice, analytics is the next frontier and they hope to leverage WSO2’s analytics capabilities to meet their requirements. Spurred by the possibilities of analytics, plans are in the pipeline to use this data model along with unstructured data provided from the field to improve operations and add further value.

You can watch Luca’s and Francesco’s presentation at WSO2Con USA 2017 to hear more about their project.

Learn more about WSO2’s integration, API management and analytics capabilities if you would like to use them in your enterprise.

Brigham Young University: Enabling API Discoverability and Data-driven Business Insights with WSO2

Brigham Young University (BYU) began their API management story two years ago when they decided to adopt an API-first architecture that follows a governed process. With more than 450 APIs for both external and internal customers, and several development teams working independently of one another, Brayden Winterton (Software Engineer at BYU) likens managing it all to running a small city.

Modernizing their API management was a response to the problematic system that existed at the time. For one, the API manager in place was closed source and built on old, unsupported third-party code. Adding some confusion to the mix, BYU had two versions of their API infrastructure in production – having started with one version, developed a second along the way, and left the migration forever a work in progress. Due to a memory leak, boxes had to be rebooted nightly (if not, all API traffic ceased by noon the next day). Furthermore, there was no monitoring of API usage, and the documentation support was out of date. In short, BYU was in a “serious situation,” to use Brayden’s exact phrase.

Faced with all these scenarios, BYU was looking to implement a new API management solution. A key need was to create a centralized repository for all the APIs at BYU, which enables developers to search for and find all the available APIs, in addition to the respective authorization processes. A seamless transition without drastic changes to their existing developer work was another one of their important requirements. Low latency, up-to-date documentation, integrating with legacy systems and the ability to keep track of all the APIs being utilized completed their wish list.

To implement their requirements, they turned to WSO2 API Manager and WSO2 Identity Server. BYU now has subscriptions that allow consumers to access each API and enable subsequent monitoring; they were able to integrate all legacy systems using message mediation and minimize latency even while mediating quite heavily, and, of course, it is all open source. The BYU model works on open subscription first; however, there are instances where they have needed to block a subscription until further approval was granted, and they have been able to do this with an open source platform. Another huge plus has been the ability to utilize industry standards, and BYU even got something that was not available to them previously – monitoring and analytics to support their business decision making. Improving discoverability and keeping the documentation up to date were the last pending issues for BYU, ultimately solved by the BYU developer portal in the second stage of their implementation.

“Our developers who have migrated are having a fantastic experience. They’re able to use things in a standard way, able to find the documentation they are looking for, utilize libraries, things aren’t drastically different, all of their old systems are continuing to work and they are getting a lot better reliability out of what they’re trying,” says Brayden. Adding to this success, BYU has seen higher API consumption as of late and with the improvements in place, Brayden is excited about the future.

If you would like to listen to Brayden’s full presentation at WSO2Con USA, click here.

Learn more about WSO2 API Manager and WSO2 Identity Server if you haven’t tried them out yet.

Cashing in on APIs – Leveraging Technology to Boost Your Business

Even if you’re not an excessively tech-savvy individual, you most likely have used a mobile app over an internet connection, checked Gmail, Twitter, or Facebook, or purchased something online. In today’s tech world, you’re already reaping the benefits of application programming interfaces (APIs). The use of APIs is becoming even more popular as service providers scramble to embrace the Internet of Things. With the availability of new tracking devices, smart homes, smart vehicles, mobile phones, and tablets, consumers now have more options for how they consume applications.

Let’s take a step back and try to understand what this all means. An API is a well-defined interface for accessing certain resources – in other words, a service made available to an end user. If you haven’t worked with web APIs before, think of one as a type of service exposed over the Internet to perform certain operations. APIs are the foundation of today’s software engineering industry, and enterprises are jumping on the bandwagon, using them to integrate and automate and to make their online services more appealing and user-friendly to end users. Well-designed APIs will enable your business to expose content or services to internal and external audiences in a versatile manner. Today, most organizations use APIs to build their solutions internally and expose these services to the world at large. APIs benefit both service development teams and service consumers.

A good yet simple example that illustrates this well is a weather application on your mobile device. The application, which runs locally on the device, cannot provide a weather forecast for a specific area without connecting to an external service. However, it can read the device’s GPS or ask the user for the coordinates of the area for which a forecast is wanted. Once the location is defined, the mobile application can simply call a weather service API and request the required information. What’s important to note here is that you don’t need to perform any complicated tasks, do calculations, or run an analysis on the mobile device. You simply push the relevant parameters to an API and obtain the results you want.
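To make this concrete, here is a minimal sketch of what such a call might look like from a Java client; the endpoint, parameters, and response format are hypothetical stand-ins for whatever a real weather provider would publish.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// The client only supplies coordinates and reads back the forecast; all the
// forecasting work happens behind the API. The URL and parameter names here
// are illustrative, not a real provider's contract.
public class WeatherLookup {
    public static void main(String[] args) throws Exception {
        double lat = 51.5074, lon = -0.1278;   // e.g. London

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.example-weather.com/v1/forecast"
                        + "?lat=" + lat + "&lon=" + lon))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The app simply renders whatever the service computed server-side.
        System.out.println(response.body());
    }
}
```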

If you view this same example from one level up, you’d see that there’s a client application and a service, and the two are connected by an API. That’s essentially what an API does; it can integrate your services, data, content, and processes with external parties in a very effective and efficient manner. So, what’s the difference between services and APIs? Essentially, the functions of both are the same, but a slight differentiator is that an API generally has a well-defined interface to its services.

That said, there’s a notable difference between managed APIs and plain web APIs/services. Managed APIs are enriched with additional features on top of a standard API or service, referred to as quality of service (QoS) features. Common QoS features include security, access control, throttling, and usage monitoring. Security forms the foundation of any API infrastructure across the entire digital value chain. Malicious users can reach your systems just as legitimate users do, so it’s important to enable security at all points of engagement. Usage monitoring helps enterprises improve their APIs, attract the right app developers, troubleshoot problems and, ultimately, translate all of this into better business decisions.
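As a rough illustration of what those QoS layers add, here is a toy “managed” endpoint built with the JDK’s built-in HTTP server: it applies an API-key check (access control) and a crude usage counter before serving the business payload. The path, key, and response are made up for the example; a real API management platform handles these policies declaratively, outside the service code.

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Set;
import java.util.concurrent.atomic.AtomicLong;

// A toy "managed" endpoint: access control and usage monitoring are applied
// in front of the actual service logic. Everything here (path, key, payload)
// is illustrative only.
public class MiniGateway {
    private static final Set<String> VALID_KEYS = Set.of("demo-key-123"); // hypothetical key
    private static final AtomicLong USAGE = new AtomicLong();             // usage monitoring, simplified

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/menu", MiniGateway::handle);
        server.start();
        System.out.println("Listening on http://localhost:8080/menu");
    }

    private static void handle(HttpExchange exchange) throws IOException {
        USAGE.incrementAndGet();                                   // record the call
        String apiKey = exchange.getRequestHeaders().getFirst("X-Api-Key");
        if (apiKey == null || !VALID_KEYS.contains(apiKey)) {
            exchange.sendResponseHeaders(401, -1);                 // access control: reject
            exchange.close();
            return;
        }
        byte[] body = "{\"menu\":[\"hot chocolate\",\"espresso\"]}"
                .getBytes(StandardCharsets.UTF_8);
        exchange.sendResponseHeaders(200, body.length);
        try (OutputStream os = exchange.getResponseBody()) {
            os.write(body);                                        // the "real" service response
        }
    }
}
```

A full API management layer would add the remaining QoS features (throttling, key issuance, analytics) in front of any backend without changes to the service itself.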

Boosting efficiency to become more competitive

Enterprises, too, are seeing the potential of APIs to propel business growth, irrespective of the size and nature of the business or the industry they operate in. The key is to get started now in order to maintain a competitive edge. A typical example is the extensive use of APIs in the hospitality industry; for instance, the owner of a restaurant or a small hotel might operate a simple website and some internal services. But at some point, as the business grows, they can no longer rely on the same internal systems alone and must work with external parties. At this stage, business owners would need to think about consuming external services and exposing their own services to the external world. And that’s when APIs and API management solutions come into play.

Large, global companies in the financial, transportation, logistics, and consumer sectors have already started to expose their systems and services to the outside world as APIs. The real benefit lies in being able to seamlessly integrate internal systems with external ones, and in creating properly structured services that are shared within the company, e.g. a human resources department exposing non-sensitive employee data to other departments that need it. A typical example is an online retail business that needs a payment solution to integrate with its system. Such a solution would not need to be implemented from scratch; rather, the business can integrate with the APIs of already available payment solution providers like Stripe, Zuora, or PayPal.

To explain this further, let’s consider a restaurant owner who exposes menus and ordering services via APIs. This enables external developers to consume these APIs from their apps and incorporate the restaurant’s menus and services into, say, the travel applications they’re building. When exposing APIs, the restaurant owner would need to consider throttling, which regulates the rate at which each consumer can invoke the APIs, as well as the security aspects of exposing them. On top of this, a service provider may need some insight into the usage of these APIs – for instance, details about service consumers (which apps invoke the APIs most), usage patterns (most popular food types), and traffic patterns (peak order times) – in order to make certain business decisions and make the service more efficient. For this, you would need some sort of analytics and usage monitoring capability as part of your overall API management solution.
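For intuition, here is a minimal sketch of the throttling idea: a fixed-window counter per consuming app, rejecting calls once a quota is exceeded. The limits and app names are invented, and production gateways use far more sophisticated, distributed rate limiting.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Fixed-window rate limiter: each consuming app may make at most `limit`
// calls per `windowMillis`. This is only a sketch of the concept.
public class FixedWindowThrottle {
    private static class Window {
        long start;
        int count;
    }

    private final int limit;
    private final long windowMillis;
    private final Map<String, Window> windows = new ConcurrentHashMap<>();

    public FixedWindowThrottle(int limit, long windowMillis) {
        this.limit = limit;
        this.windowMillis = windowMillis;
    }

    /** Returns true if the app is still within its quota for the current window. */
    public synchronized boolean allow(String appId) {
        long now = System.currentTimeMillis();
        Window w = windows.computeIfAbsent(appId, id -> new Window());
        if (now - w.start >= windowMillis) {   // window expired: reset the counter
            w.start = now;
            w.count = 0;
        }
        return ++w.count <= limit;
    }

    public static void main(String[] args) {
        // e.g. allow each app 100 order requests per minute (hypothetical quota)
        FixedWindowThrottle throttle = new FixedWindowThrottle(100, 60_000);
        String app = "travel-planner-app";     // hypothetical consumer
        for (int i = 0; i < 105; i++) {
            if (!throttle.allow(app)) {
                System.out.println("Request " + (i + 1) + " rejected (HTTP 429 Too Many Requests)");
            }
        }
    }
}
```

The per-app counts gathered by checks like this are exactly the raw material that the analytics and usage monitoring side of an API management solution turns into the consumer and traffic insights described above.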

How Internal Services Can Be Exposed to the External World via APIs


Ultimately, what you achieve in terms of business benefits is brand awareness by becoming a smart business. Moreover, in addition to the gains from direct API consumption, providers can earn additional revenue by charging users for API/service usage. This concept is known as API monetization, and most API management solutions already offer this feature built in or as an extension, enabling creative users to turn cool ideas into revenue-generating APIs within minutes. And open source products have proved most useful for meeting API management requirements, as they’re cost effective and easy to deploy.