Simplify Building Content-Rich Portals – A “Knockout” Cloud Journey with Oracle PaaS

Oracle Content and Experience Cloud service

Transform customer engagement and the digital experience using the Oracle Content and Experience Cloud service, with content workflow and collaboration through Oracle Process Cloud, and seamlessly surface data from various enterprise applications, including SaaS and on-premise ERP systems.

Introduction

Heather, a marketing team lead, spends her days creating a lot of collateral and content for upcoming campaigns and workshops. She starts with draft versions of all the digital assets: images, view-lets, teasers (quick two-minute videos), branding, slide decks, etc. Day to day she makes many changes (edits/revisions) to the collateral and would like to share it with her supervisor for review before she finalizes it. Additionally, she wants to be able to check how those assets appear on various devices such as mobile phones, tablets, and web browsers.

Laura, an HR executive, would like to announce an upcoming holiday party and quickly get the word out to all internal employees, with some examples from last year (images, videos, etc.). As part of the recruitment process, she would also like to post current openings (with referral bonuses) on an internal portal/website.

The question is: is there a comprehensive, enterprise-wide digital content and experience management platform that can help both Heather and Laura manage their day-to-day activities efficiently, with easy configuration, controlled access, built-in integration and, of course, zero coding? Can that system be accessed from anywhere, through any channel, with no disruption? Can it be available on a need-based subscription model that is flexible enough to turn on and off?

There is: the Oracle Content and Experience Cloud service, which provides omni-channel digital content management and lets you create engagement portals with one click of a button, right from a web browser.

In this whitepaper we take a deep dive into Oracle Content and Experience Cloud features and show how easily it can be integrated with other PaaS offerings such as Process Cloud Service (PCS) and Integration Cloud Service (ICS) to surface data from back-end systems and SaaS applications.

Continue reading…

Sofbang Tech Team Tips Series: Data Extraction ETL – An effective method to manage data


What is ETL?

ETL stands for Extract, Transform and Load, a process used to collect data from various input sources, transform the data according to business rules and needs, and load it into a destination system. The need for this process comes from the fact that, in modern computing, business data lives in many distributed locations and in multiple formats. For example, organizations save data in formats such as Word documents, PDFs, spreadsheets, plain text, etc., or keep it in commercial database servers like MS SQL Server, Oracle, and MySQL. Managing this business information efficiently is a great challenge, and ETL plays an important role in solving this problem.

The ETL process has three main steps: Extract, Transform, and Load.


Extract – The first step in the ETL process is extracting the data from the various sources. The data in each source can be in any format, such as flat files or database tables.

Transform – Once the data has been extracted, various filters, validations, aggregate functions, or other business logic can be applied to it to produce output in the desired format.

Load – This is the final step, where the ‘transformed’ data is loaded into the target destination, which may again be a flat file or a predefined RDBMS table.
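As a minimal illustration of these three steps, here is a short sketch in Python; the file employees.csv, the warehouse.db database, and the column names are hypothetical, purely for illustration:

```python
import csv
import sqlite3

# Extract: read raw rows from a flat file (hypothetical employees.csv)
with open("employees.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: drop incomplete rows, normalize names, and cast salaries to numbers
transformed = [
    {"name": r["name"].strip().title(), "salary": float(r["salary"])}
    for r in rows
    if r.get("name") and r.get("salary")
]

# Load: write the cleaned rows into a target table (here, a SQLite database)
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS employees (name TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO employees (name, salary) VALUES (:name, :salary)", transformed
)
conn.commit()
conn.close()
```

Real ETL tools layer scheduling, error handling, and a wide range of connectors on top of this same basic pattern.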


Why and Where Is ETL Required?

Companies or organizations with years of history and/or a global presence will inevitably go through technological changes at some point, ranging from manual systems to simple in-house applications, with data storage ranging from flat files to RDBMSs. This can create subprocesses within the larger business process, with completely different applications running on their own hardware and architectural platforms.

In such a scenario, the organization’s unit in location “X” might be using mainframes while another unit at location “Y” uses an SAP system to manage operations-related data. In this type of setup, if the organization’s top management needs a consolidated report of all of the company’s assets, gathering all the data and reports can be a challenge. Collecting the right data from disparate systems and then consolidating it manually is a cumbersome process that could take days to deliver a final report to management. A more efficient approach is a system that fetches data from these disparate sources, stores it in a data warehouse environment, and generates a report whenever needed.

So how do you fetch the data from these different systems, make it coherent, and load it into a data warehouse?

To do this, we need a methodology or a tool that can extract the data, cleanse it, and load it into a data warehouse application. To consolidate the historical information from all the disparate sources, we set up an ETL system, which transforms the data from the smaller databases into more meaningful, long-term databases.

ETL is useful when:

  • Companies need a way to analyze their data for critical business decisions.
  • The transactional database cannot always answer complex business queries.
  • The flow of transactional data needs to be captured.
  • Data from multiple sources needs to be adjusted so it can be used together.
  • Data needs to be structured for use by the various Business Intelligence (BI) tools.
  • Subsequent business/analytical data processing needs to be enabled.


Fig: ETL process graphic


There are a variety of ETL tools available in the market. Some of the prominent ones are:

No. | ETL Tool | Vendor
1 | Informatica PowerCenter | Informatica Corporation
2 | InfoSphere DataStage | IBM
3 | Decision Stream | IBM Cognos
4 | Oracle Data Integrator | Oracle
5 | Data Integrator (BODI) | SAP Business Objects
6 | SAS Data Integration | SAS
7 | Talend Studio | Talend
8 | Pentaho Data Integration | Pentaho
9 | Pervasive Data Integrator | Actian / Pervasive Software
10 | CloverETL | Javlin


Advantages of ETL Tools

  • ETL tools normally provide better performance, even for large datasets.
  • They have built-in connectors for all the major RDBMS systems.
  • They make it easy to reuse complex programs for validations and similar logic.
  • They offer an intuitive, visual integrated development environment.
  • They also offer performance optimization options such as parallel processing, load balancing, etc.

At Sofbang, I have worked with Talend Open Studio, an open-source project for managing various facets of the ETL (Extract, Transform, Load) process for BI and data warehousing. It is one of the most innovative data integration solutions on the market today.

It’s open source, free to use, and community-supported. It encapsulates every operation that loads, retrieves, transforms, and shapes data, and provides very easy-to-use ‘drag and drop’ components that enable intuitive, faster job development, as shown below:

Fig: Talend IDE Screen

For example, let’s try this with an Excel sheet as raw input, to which some validations and filters need to be applied. Based on those rules, we should get the desired data in the ‘output’ Excel file.

Step 1: The sample input Excel file is shown below; it contains some invalid names along with other employee details.

Step 2: Drag and drop the relevant components (in this case, the ones for processing Excel) from the component palette on the right-hand side, place them on the canvas, and draw the output connections as shown below:

Step 3: Now define the validations and filters to be applied to the input data by clicking on the ‘map’ component, as shown below. In this case, we define our filters and validations as follows (a rough code equivalent is sketched after this list):

  • Names should be valid
  • Date of birth should be greater than ‘01-JAN-2012’
  • All employees drawing a salary greater than 20000 should be filtered and stored separately.
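For readers who want to see the same logic outside the Talend UI, here is a minimal sketch in Python with pandas; the input file name and the column names (Name, DOB, Salary) are assumptions for illustration, not taken from the actual sheets, and reading and writing Excel also requires the openpyxl package:

```python
import pandas as pd

# Read the raw input sheet (hypothetical file and column names)
df = pd.read_excel("employees_input.xlsx")

# Validation: keep only rows whose Name is non-empty and purely alphabetic
valid = df[df["Name"].fillna("").str.replace(" ", "").str.isalpha()]

# Filter 1: date of birth greater than 01-JAN-2012
dob_filtered = valid[pd.to_datetime(valid["DOB"]) > pd.Timestamp("2012-01-01")]

# Filter 2: salary greater than 20000, stored separately
high_salary = valid[valid["Salary"] > 20000]

# Write the two result sets to separate output files
dob_filtered.to_excel("output_dob.xlsx", index=False)
high_salary.to_excel("output_salary.xlsx", index=False)
```

In Talend, the ‘map’ component expresses these same rules visually instead of in code.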

Step 4: Click on the ‘Run’ button to execute the job and get the results.

Step 5: When clicking the ‘Run’ button, we get the following screen:

Step 6: The resulting ‘filtered’ and ‘validated’ Excel files are shown below:

Fig: Excel with Valid names and Salary > 20000


Fig: Excel with Valid names and DOB > ’01-JAN-2012’

Manual vs. Automation – Let the Battle Begin

Testing is like a stage production, showcasing an organization’s standard of work to the market. It is a vast, and generally overlooked, part of the software development process that can act as a change agent, highlighting risk early to provide efficiencies in the way we develop and deploy our software. It is broadly categorized into two methods: manual and automated testing.

While more organizations now understand the essential role testing plays in producing quality software, many are still discovering the best way to examine the quality of their software.

So what is the difference between Manual and Automated Testing?

Both manual testing and automation have their own benefits and setbacks, and it is worth knowing when to use each type for an improved outcome.

Manual testing reflects its definition and usage: testing is conducted by humans, while with automation the tester relies on the support of tools to perform the tests. Both approaches cover all testing methods, such as black box, white box, load testing, etc. Some cases are better performed manually while others work best with automation; it all depends on the situation and the requirements of the software. Trends show that organizations are keeping an eye on automation, but that does not mean manual testing is going anywhere anytime soon.

Fig: Advantages and disadvantages of test automation

Let’s take a look at the pros and cons of each:

Where Automation Testing Wins

Automation has advantages over manual testing in that test execution is fast, reliable, repeatable, and programmable. In automated testing, test execution speed is faster, like a racing car, which reduces the manpower, time, and effort spent during the testing process. It also plays an important role in long-term projects and is well suited for regression testing. Take the example of filling in the same registration form manually, over and over: across different cycles or iterations this becomes inefficient, since manual testing does not offer code reusability, and the complete work has to be redone whenever resources change. With automation, by contrast, any team member can run the test case at any time. Additionally, the cost of tools plus a smaller team is less than the cost of the large manual-testing team required to complete the same tasks. The point of automation is not just to reduce testing effort; it is also productive and results-oriented, as the small sketch below illustrates.
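To make “repeatable and programmable” concrete, here is a small data-driven test sketch using pytest; the validate_registration function and its rules are hypothetical, standing in for whatever registration form a team actually tests:

```python
import re

import pytest


def validate_registration(username: str, email: str) -> bool:
    """Hypothetical form validation: non-empty username and a plausible email address."""
    has_username = bool(username.strip())
    has_email = re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email) is not None
    return has_username and has_email


# The same check is replayed automatically for every input row, which is what
# makes an automated test repeatable and reusable across test cycles.
@pytest.mark.parametrize(
    "username, email, expected",
    [
        ("heather", "heather@example.com", True),
        ("", "laura@example.com", False),
        ("laura", "not-an-email", False),
    ],
)
def test_registration_form(username, email, expected):
    assert validate_registration(username, email) == expected
```

Running pytest replays all of these cases in seconds, whereas re-entering the form by hand in every cycle is exactly the kind of repetition automation is meant to remove.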

Where Manual Testing Wins

So does this mean the end of manual testing? Does it have a future?

There are still plenty of technologies that require manual testing and where automation falls short. Applications based on touch technologies, such as the Kindle, iPad, and other tablets, still require manual testing. GUI testing is an area where manual testing is often preferable to automation: layout changes are difficult to test through automation, because when you play back recorded scripts, any gestures or buttons that were available at recording time but can no longer be found may cause the test scripts to stop working properly. Also, with manual testing a person can perform random testing, which allows for finding possible bugs.

Below is a chart of the differences.

Manual Testing | Automation Testing
Tests are run by a person | Tests are run through tools
The initial phase of testing; without it, automation would not be possible | A continuation of manual testing
All the STLC phases, such as test planning, execution, and bug tracking, are carried out by human hands | The same phases can be handled using various open-source and licensed tools such as Bugzilla, HP ALM, JIRA, etc.
Lower cost | Higher cost
Time-consuming | Takes less time
Difficult to do regression testing | Regression testing is simple with the help of tools
More resources required to execute test cases manually | Fewer resources needed, as testing is done with the help of tools
Random testing can be performed to track bugs | Can only test according to the automated scripts
No programming skills are required; a non-technical person can also do manual testing | Programming skills are needed; testers can program complex tests to find bugs
Considered to be less reliable | Considered to be more reliable
Lower accuracy of results | Higher accuracy of results
Difficult to do non-functional tests manually | Non-functional tests are effortless with the help of tools


Who wins the battle?

So which is better, manual or automation? They both have their benefits, so it depends on which testing approach is best for the situation. Choosing the right approach points you in the right direction to achieve your goals while saving you time and effort. The full value of these tests comes when the right type of testing is applied in the right environment. Both testing approaches have their own benefits and drawbacks, but for software testing quality, you need to utilize both methods sensibly.


Serve, Innovate and Save


More than ever, CIOs are faced with critical challenges that will shape the future of their IT departments and how they serve their enterprises. It is important not only to effectively orchestrate people, process, and technology, but also to serve as a catalyst for the business, empowering stakeholders while reducing the overall IT budget. The need is simple, but daunting: enhance service, continuously innovate, and save costs.

With these pressures, IT departments are forced to re-evaluate their vision and mission and assess how they manage ongoing operations, SLAs, LOB project requests, support contracts, and much more. As business needs rapidly evolve, it is vital that IT not only responds to those ever-changing needs but also plays a proactive role to serve, innovate, and save on the bottom line.

Continue reading…