A Canonical Data Model, the missing link within a Service Oriented Environment

Chris Judson gave an interesting presentation on the Canonical Data Model (CDM) within a Service Oriented Architecture.

First he gave an example of the different aspects and problems you can face when mapping out the existing architecture and business flows within an organisation.

One of the things needed to accomplish this is getting IT and business to consolidate and collaborate with each other, so there is a clear understanding of today’s architecture and of the goals defined for the future.

The Canonical Data Model defines a common format to describe business entities across the enterprise-wide organisation, for business as well as for IT.

Takeaways from this session:

  • The CDM reduces interface maintenance and encapsulates business logic in one central place
  • Put the CDM on the bus: new applications can be plugged in to listen to existing events without defining a new format for each new consumer, and there is a common understanding of the data model for business as well as IT
  • Use the 80/20 rule to define the CDM: start from the unique identifiers combined with the superset of data used by most consumers. In other words, if 80% of the consumers find the data they need in the CDM, the remaining 20% can be served using the enrichment pattern, without enlarging the CDM payload
  • Managing change is hard in such a model, because the dependencies between the applications are usually high. To manage change, the 80/20 rule applies here as well: when 80% of the consumers need new attributes or changes to existing attributes, the CDM can be changed; the other consumers can be given the same functionality using the enrichment pattern again
  • For schema versioning the Format Indicator pattern is mostly used
  • Use generic XML types in the XSD instead of DB-specific types
  • Use declarative namespaces to manage data domains, so a generic enterprise-wide data definition strategy is in place (a short sketch illustrating these points follows below)
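To make a few of these tips concrete, here is a minimal, hypothetical XSD sketch of a canonical Customer entity (all element and namespace names are invented for illustration): it uses generic XML Schema types instead of DB-specific ones, gives the data domain its own namespace and carries a version attribute as format indicator.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- hypothetical canonical definition of the Customer business entity -->
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                targetNamespace="http://example.com/cdm/customer"
                elementFormDefault="qualified">
      <xsd:element name="Customer">
        <xsd:complexType>
          <xsd:sequence>
            <!-- unique identifier plus the superset of data used by most consumers -->
            <xsd:element name="CustomerId" type="xsd:string"/>
            <xsd:element name="Name"       type="xsd:string"/>
            <xsd:element name="BirthDate"  type="xsd:date" minOccurs="0"/>
          </xsd:sequence>
          <!-- format indicator used for schema versioning -->
          <xsd:attribute name="version" type="xsd:string" use="required"/>
        </xsd:complexType>
      </xsd:element>
    </xsd:schema>

Attributes that only 20% of the consumers need are left out of the payload and can be delivered through the enrichment pattern.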

Chris’s presentation was very enlightening, because a lot of these tips & tricks are valuable for any design or implementation involving XML data and service enablement.

Oracle Data Integrator (ODI) – Data Integration Strategy

In the ODI CAB meeting held during Open World we heard a lot of new insights regarding the ODI Suite and several customer cases.

To sum up the main differentiators of ODI when talking about a Data Integration Strategy:

  • ODI Suite: data delivery services by using the ESB (Mediator)
  • ODI Suite: Orchestration of composite services through Oracle BPEL Process Manager integration
  • Data Profiling & Data Quality: end-to-end governance, statistical analysis, cleansing of data and prevention of bad data being loaded
  • Unified workflows using knowledge modules
  • Declarative design using only databases and PL/SQL, no need for other technologies
  • Data marts to deliver star schemas
  • Hyperion integration for fast querying
  • Web services to integrate with third-party applications, e.g. for campaigns
  • Metadata Navigator for data lineage (and documentation purposes)

ODI will be incorporated throughout the entire Oracle Stack to enable pervasive data integration.

Pre-packaged data integration is provided for Apps (PeopleSoft, Siebel, SAP), so all ETL flows are performed by ODI.

Tips & Tricks when using ODI for your enterprise-wide data integration:

  • Define groups in ODI
  • Use the ODI Security module to create generic and non-generic profiles and to arrange user rights
  • Use Oracle Analytic functions
  • Use naming standards for variables, user functions, …
  • Don’t call PL/SQL functionality outside ODI; this would introduce tight coupling

For more information regarding ODI you can download our presentation where we’ve compared OWB, ODI and ODI Suite: The Next Generation of Business Integration: Making the right choice!

Handling request-response messages in ESB Routing Services

Ever wondered how to define routing services based on SOAP services or database stored procedures, where you can define routing rules on both the request and the response message?

Let’s say you want to define the following scenario:
- Call an existing stored procedure that takes ‘employee.id’ as input parameter and returns the employee information (firstname, lastname, address, …) as output parameter
- Define routing rules on this stored procedure so you can send the response message, the employee information, to a third-party application when the response message isn’t empty. In other words, when employee information is found for the given employee.id, this XML message is sent to the third-party application (the message shapes are sketched below).
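To make the scenario a bit more tangible, the request and response messages could look roughly like this (a purely illustrative sketch; the element names are invented, and the real XSDs are generated by the DB adapter wizard):

    <!-- request message: the employee id passed to the stored procedure -->
    <EmployeeRequest>
      <EmployeeId>1042</EmployeeId>
    </EmployeeRequest>

    <!-- response message: the employee information returned by the procedure -->
    <EmployeeResponse>
      <Firstname>John</Firstname>
      <Lastname>Doe</Lastname>
      <Address>Some Street 1</Address>
    </EmployeeResponse>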

This isn’t rocket science, right? It should be a piece of cake …

Well, let’s start building the scenario in ESB, using our IDE JDeveloper:
1) Create a new ESB project
2) Create a new ESB system
3) Create a DB adapter calling the stored procedure we’ve defined to extract employee information, using employee.id as input parameter
4) Create a routing service based on the DB adapter

=> We stumble upon the first problem: we’re not able to define a routing rule on the response message; only the request message (the input parameter employee.id) can be used in our routing service.

Well, let’s try to create an empty routing service and use the XSDs generated by the DB adapter, with the request parameter as request message and the response parameter as response message.

=> Nope, still not the right path to follow; we still can’t define a routing rule on our response message.

Having a look at OTN doesn’t provide me any answers, and using the ESQREQUEST parameter is much too complicated for my basic scenario.

Now let’s have a look at how this can be accomplished in a much more straightforward manner:
1) Define the DB adapter (as in the scenario defined above)
2) Define a routing service using the request parameter of the XSD (don’t fill in the response message)
3) Define a second routing service using the response parameter of the XSD as the request parameter of that routing service

In other words:
define two routing services, the first one handling the request message and the second one handling the response message.

4) Link the first routing service, handling the request message, to the second routing service, and you can start using your services as needed (an example filter expression is sketched below).
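In the second routing service you can now define the routing rule you wanted in the first place: a filter on the employee information. Assuming the hypothetical response message sketched earlier, the XPath filter expression of the routing rule could look something like:

    string-length(/EmployeeResponse/Lastname) > 0

i.e. only forward the message to the third-party application when employee information was actually found.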

I will provide screenshots for this scenario ASAP.

So what is FUSION exactly … well, it’s exactly what you want it to be!

When you look into Oracle 11g and what’s coming up, you will notice one thing very clearly … FUSION. SCA, WebCenter, JDeveloper, OWB, ODI, BPM Suite, Oracle Service Bus, … it’s all becoming one user experience.

If we take a look at the Oracle Fusion approach, you will have the same user experience in each environment you’re working in.

The IDEs are converging, the management consoles are converging … in other words, the developer experience, DBA experience, management experience, ETL experience … every person and every team will work in the same UI, have the same experience and can therefore team up with all the different teams and projects.

It’s amazing to look at the OWB 11g release … it’s ODI put in a JDeveloper UI; it feels like you’re working inside JDeveloper, not on a web application, SOA architecture or database application … no, you’re designing your data warehouse.

AIA patterns can be used in BPA Suite, and BPM Suite will hold all the features of BPMN to deliver a BPEL blueprint which can be enriched in your BPEL designer. The BPEL designer can be used in JDeveloper or Eclipse, as can the ESB designer. Want the JDeveloper experience for data integration and ETL? No problem: open up OWB 11g and you have the same developer experience. The IDE looks the same and works the same way as JDeveloper, for OWB as well as for ODI; all suites converge … where else do you get that experience today?

Moving towards 11g you will see the convergence of BEA and Oracle into a stronger middleware offering, an enhanced governance approach (using the AquaLogic Enterprise Repository) and a whole new feature set coming up in SCA.

So now the million dollar question … when can we use all this great stuff ;o)

Case at Oracle SOA Partner Community Forum

On August 26th and 27th, the Oracle SOA Partner Community Forum will be held in Utrecht, The Netherlands. During this event, technical sessions as well as partner success stories are presented. We were invited to bring our customer case study of Forms modernization using web services and ESB.
The story: Empowering Oracle Forms within a SOA Architecture: The case of ZLM.

During this forum you can learn from partner success stories, join different breakout sessions, gain information from other SOA partners and listen to a vibrant panel discussion.

In addition to the SOA Partner Community Forum, you can participate in technical hands-on workshops for BPA, SOA and J2EE Infrastructure on August 28th and 29th. The goal of these workshops is to prepare you for customer implementations.

You can find the paper and the case itself at the following locations:
http://otn.oracle.com/products/forms
http://otn.oracle.com/goto/formsmodernize
http://otn.oracle.com/formsdesignerj2ee

Empty error message for an instance in ESB

Yesterday I was creating an ESB project that we wanted to use in a demo.
I was using a DB schema that I had exported from a database used in one of our internal applications.

The ESB itself was in fact a very simple example, with just a file adapter to get the content of a CSV file and a database adapter to insert the content retrieved from this CSV into the database.
I also checked the option to remove the CSV file once it was successfully retrieved.
Everything looked fine and I registered the application on the application server.

So far, so good…


The application was registered without any problems and I wanted to test it.
So I created a test CSV file and copied it into the directory; the file was removed after about 5 seconds, so I thought that everything had worked correctly. I looked in the database, but my records were not inserted into the table.

Time to take a look at the ESB Control…
And indeed the status of my instance was invalid, so the next step was to take a look at the error message…
Strange: the error message was completely empty, no error, no trace, nothing…
This of course made it a bit more difficult to find out what the problem was.

So I took a closer look and tried to find out what I had missed. I changed some things here and there, but did not find a solution; I kept getting the empty error.
Then I looked a bit closer at the table I wanted to insert the data into and compared it with the content of the CSV file. All the values in my CSV for the primary key, foreign key and not-nullable columns looked OK.

So why did it give me that empty error message?
Then I tried to insert a record manually into the table, and I found out that this didn’t work either. It turned out that there were insert triggers on this table that insert data into a column when that field is empty, and that these triggers used a package that was invalid (because it referenced another schema that I hadn’t imported). So, just for testing, I disabled all these triggers and executed my ESB process again. And it worked, so the problem was caused by a trigger on the table calling a procedure in an invalid package…

So, if you ever get an empty error message, the best thing to do is to check your database first…

SOA Suite 10.1.3.3 – ESB, BPEL – Nice-to-knows, pitfalls

I’ve been checking out the different capabilities and new features of the adapter framework in ESB and BPEL for some weeks now and came across some nasty pitfalls and nice-to-knows which I would like to share with you.

Of course I would like to share thoughts, opinions and start discussions on these topics.

ESB:

  • How to define XSD validation on the file adapter (the ‘validate payload at runtime’ option isn’t available in the file adapter): in the ESB Console, select the routing service that is invoked after the inbound file adapter and go to the “Definition” tab. The validation option is in the “Operation Details” section. (with thanks to Ronald)

BPEL:

  • Inserting master-detail data using the DB adapter: based on my experience so far, it is best to use stored procedures instead of the TopLink mappings file. I mainly use stored procedures because of the tooling support in JDeveloper (wizards, TopLink UI); I still miss a good UI for the TopLink support. It also makes it easy to hand the task of creating a PL/SQL API to the PL/SQL developers who are working on a certain application. (with thanks to Orjan)

  • [Error ORABPEL-10007]: unresolved messageType for “{http://schemas.oracle.com/bpel/extension}RuntimeFaultMessage”: when you’ve defined an empty BPEL process (which is a best practice for doing the brain-work first), you will face this issue when defining fault handling inside your BPEL process. To solve this error you need to import RuntimeFault.wsdl in the WSDL of the adapter you’re using. An import statement along the following lines needs to be added:
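    <!-- added to the partner link WSDL; this assumes RuntimeFault.wsdl has been
         copied into your BPEL project directory -->
    <import namespace="http://schemas.oracle.com/bpel/extension"
            location="RuntimeFault.wsdl"/>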

  • Best practices when invoking BPEL processes from different UIs (Flex, JSF, …): many thanks to Hajo for his explanation. It is best practice to use the default ways to invoke a BPEL process – create a WSDL that maps to a WSIF binding in a controlled environment and to a SOAP/HTTP binding in a more B2B type of scenario. A call to the BPEL API would be a “custom” solution that needs way more governance to communicate with fellow developers and to maintain properly, compared to the straightforward standard way. For more details see the OTN thread: http://forums.oracle.com/forums/thread.jspa?messageID=2329384&#2329384

  • Use multiple sources in a transform activity: in 11g a new feature has been added that allows multiple sources, using BPEL 2.0 (bpel:doXslTransform(string, node-set, (string, object)*)); a small sketch follows below this list. The workaround in 10.1.3.3 is the params approach => http://blogs.oracle.com/rammenon/2007/05/07, or an assign activity with append functionality to add the variable inside your source, combined (in the same assign activity) with the process-xslt functionality to call your XSL and populate the source with the target information.
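For the 11g doXslTransform feature mentioned above, the usage could look roughly like this (an illustrative sketch with made-up variable, part and parameter names, following the signature quoted in the bullet):

    <copy>
      <!-- first argument: the stylesheet, second: the main source document,
           then pairs of XSL parameter name and value for additional sources -->
      <from>bpel:doXslTransform('xsl/MergeSources.xsl', $mainInput.payload,
                                'secondSource', $secondInput.payload)</from>
      <to variable="mergedOutput" part="payload"/>
    </copy>

Here the second source document is handed to the stylesheet as an XSL parameter named ‘secondSource’.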

Invoking Web Services from Database:

  • Calling an ESB service using the UTL_HTTP package => ORA-29266: end-of-body reached => make sure to pass variables using string notation instead of character notation

Interesting New Features in 10.1.3.3:

  • Controlling the Size of a Rejected Message (10133technotes.pdf):
    You can now control the size of a rejected message by specifying the following
    endpoint property for the inbound File/FTP adapter partner link.
    In this example, you reject 100 lines from the file since the actual file is too large:
    oracle.tip.adapter.file.debatching.rejection.quantum="100"

  • ESB endpoint properties: e.g. the ability to add a RejectedMessageHandler to file adapter services

Enhancement Requests:

  • Ability to validate the XML payload at runtime at adapter level instead of at domain level or routing service level
  • Ability to add the xsi:nil attribute using XSL functionality in the transform activity (see the note below this list)
  • File adapter: ability to skip columns besides skipping rows + ability to use special characters in column headers
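A side note on the xsi:nil request: as long as the mapper doesn’t support it, one possible workaround (a sketch, assuming you hand-edit the generated XSL and that the target element, here a hypothetical Address, is declared nillable in the XSD) is to write the attribute literally on the target element:

    <!-- hand-edited fragment in the generated XSL: the mapper UI cannot add xsi:nil,
         but a literal result element can carry the attribute directly -->
    <Address xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:nil="true"/>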

Well that’s it for now … feel free to share thoughts, comments, etc.