
JiBX - Part 1

When we treat XML as a data model while programming, the focus is mostly on strings, tags, and elements while parsing. ETL operations are performed on this ubiquitous document interchange model whenever applications communicate. Because the internet is fond of documents, XML became the foundation of services on the web. Objects, accessed through object graphs in memory, have features such as inheritance, polymorphism, attributes, and object-based relationships; XML has none of these. It is merely a grammatical (hierarchical) representation of data. Still, both represent real-world data, so each can serve as a representation of the other, which eases programming and makes them effective for defining business use cases. But there is an impedance mismatch between objects and XML. An application written in Java has its data types defined within the language's scope, while XML Schema, which defines the data, is richer than Java's type system, so complex objects are difficult to serialize. Have a look at this paper, which explains the X/O impedance mismatch.

We can generate Java classes from XML or vice versa. Generating classes from XML is a "schema-centric" approach: we define an XML schema, then one or more XML documents, and then generate Java classes from them. This requires a stable schema that can be used to validate data, which is essential for a "reliable" web service. But the application code is forced to use an interface that reflects the XML structure, which ties it tightly to the defined contracts. Any change in the schema forces you to regenerate the object model and change the application code to match.
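
To make that coupling concrete, here is a hypothetical schema fragment (the customer element and its fields are invented for this example). Classes generated from it must mirror its structure exactly, so any schema change ripples into the application code:

    <!-- Hypothetical schema: a generated Customer class must mirror this
         structure, so changing it forces regenerating the object model -->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="customer">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="name" type="xs:string"/>
            <xs:element name="street" type="xs:string"/>
          </xs:sequence>
          <xs:attribute name="id" type="xs:int" use="required"/>
        </xs:complexType>
      </xs:element>
    </xs:schema>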

If instead we map existing classes to XML using binding definitions, we have a "Java technology-centric" approach. This can be adopted when you don't want the object model tied to the schema: when you need to support different versions of the schema with the same object model, or need a common data exchange format for existing objects. Binding definitions are themselves XML documents, with a schema defined for them; a sketch of one follows.
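
As a rough sketch following the pattern in the JiBX tutorial linked below (the example.Customer class and its fields are assumptions for illustration), a binding definition associating a Java class with XML structure looks roughly like this:

    <!-- binding.xml: maps the (hypothetical) example.Customer class
         to a <customer> element with child value elements -->
    <binding>
      <mapping name="customer" class="example.Customer">
        <value name="name" field="name"/>
        <value name="street" field="street"/>
      </mapping>
    </binding>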

JiBX is fundamentally a Java technology-centric (mapped binding) approach that uses binding definitions: you define how XML relates to Java objects, starting from the schema, the code, or both. The binding code is compiled (wired, really) into the class files using BCEL bytecode enhancement. This can be done at build time or on the fly at runtime, which makes JiBX compact and fast: it achieves its performance through post-compilation bytecode manipulation rather than reflection. The advantage is that there is no need for getters, setters, or no-arg constructors; you can write a class without considering mapping issues and then map it without modification. XML schemas describe domain objects, and JiBX maps really well to XSD. JiBX is built on an XML pull parser architecture (XPP3). Rather than generating code from a DTD or schema, JiBX works with a binding definition that associates user-supplied classes with the XML structure. The binding compiler runs after the code is compiled and adds the marshalling/unmarshalling code to the class files. There are tools bundled with it, like BindGen, which can generate a schema from existing classes.
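
For flavor, here is a minimal sketch of the runtime side using the JiBX runtime API (the Customer class and file names carry over from the hypothetical binding sketch above, and the binding compiler must already have been run against Customer for the lookup to succeed):

    // Minimal sketch: marshal/unmarshal with the JiBX runtime API,
    // assuming the binding compiler has already enhanced Customer
    import java.io.FileInputStream;
    import java.io.FileOutputStream;

    import org.jibx.runtime.BindingDirectory;
    import org.jibx.runtime.IBindingFactory;
    import org.jibx.runtime.IMarshallingContext;
    import org.jibx.runtime.IUnmarshallingContext;

    public class JibxDemo {
        public static void main(String[] args) throws Exception {
            // Look up the binding factory wired into the Customer class
            IBindingFactory factory = BindingDirectory.getFactory(Customer.class);

            // Unmarshal: XML document -> Customer object graph
            IUnmarshallingContext uctx = factory.createUnmarshallingContext();
            Customer customer = (Customer) uctx.unmarshalDocument(
                    new FileInputStream("customer.xml"), null);

            // Marshal: Customer object graph -> XML document
            IMarshallingContext mctx = factory.createMarshallingContext();
            mctx.setIndent(2);
            mctx.marshalDocument(customer, "UTF-8", null,
                    new FileOutputStream("customer-out.xml"));
        }
    }

Note that Customer itself needs no getters, setters, or mapping annotations; the generated marshalling code accesses its fields directly.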


Tutorials

http://jibx.sourceforge.net/binding/tutorial/binding-start.html
PDFs: JiBX Part 1, JiBX Part 2, Intro

Previous articles
Creating a Java WebService using Axis 2 (a lazy approach)
A simple RSS parser

What I think about event processing...

An amateur thought.

I think the most interesting area of information processing is event processing. Most large-scale enterprise applications are based on event-driven architecture. Event-based information processing is an advanced area I haven't gone through yet, but reading about it I found it really interesting. State models, lexical analysis, reactor patterns, callback event models, and the like are used behind it. Event-driven design is an approach to program design that focuses on the events a program reacts to; the event handlers registered for those events respond to them. This is the fundamental of any GUI-based application: an event listener is attached to a button, and its handler responds to events (a minimal sketch follows at the end of this rambling). I think it is the basic underlying architecture of any responsive application. If you have worked on a 3D application, events on the 3D positions of polygons have to be registered; every movement in space can trigger an event... good gaming. Thinking bigger, consider an online stock viewer: stock movements are reflected in real time, and most people know about the Ajax-based technology behind the dynamic graphs.

But what about the complex business logic? A rule engine defines a set of rules to act on changes in input. I can compare such a system to the stimulus response of an organism. In the human brain, predefined genetic rules exist to adapt to an ever-changing environment; stimuli can be sudden or gradual, depending on the inputs. And what about pattern recognition? The human brain is highly sophisticated... but I am boring you now.

If it is about real-time processing, then I like to refer to Complex Event Processing (CEP), a technology for low-latency filtering, correlating, aggregating, and computing on real-world event data. What if this complex event processing were enabled across a network? A collective intelligence? I read that context-based switches are now implemented in CDNs. Whatever... it's really complex and interesting. No wonder the huge amount of data on the web can be used for social "business" intelligence. CEP actually builds on what business intelligence (BI), service-oriented architecture (SOA), cloud computing, and business process modeling (BPM) provide. Mashup technologies along with the semantic web can provide the more granular data that most technology products are moving toward. Some people say SOA, some WOA, SaaS, cloud, and so on.

Consider NASA satellite data: huge amounts of data gushing from satellites through the channels, processed with various image processing and signal processing algorithms. What about all that RFID-based data affecting supply chain tracking? What if we were to track every consumption of fuel in the world in real time using GPS trackers and sensors? What about the streams of data processed by supercomputers for weather forecasts based on certain models? They are crucial and brain-forging. That's how information technology becomes the backbone and most sophisticated part of human civilization.
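
To make the listener idea above concrete, here is a minimal, self-contained sketch (all class and method names are invented for illustration) of handlers registering with an event source and reacting when it fires:

    // Minimal sketch of the event-listener pattern: handlers register
    // with an event source and are called back when the event fires
    import java.util.ArrayList;
    import java.util.List;

    public class EventDemo {

        // A handler reacts to an event payload
        interface EventListener {
            void onEvent(String payload);
        }

        // An event source keeps its registered listeners and notifies them
        static class Button {
            private final List<EventListener> listeners = new ArrayList<>();

            void addListener(EventListener l) {
                listeners.add(l);
            }

            // Fire the event: every registered handler responds
            void click() {
                for (EventListener l : listeners) {
                    l.onEvent("button clicked");
                }
            }
        }

        public static void main(String[] args) {
            Button button = new Button();
            button.addListener(payload -> System.out.println("Handler 1: " + payload));
            button.addListener(payload -> System.out.println("Handler 2: " + payload));
            button.click(); // both handlers react to the same event
        }
    }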

It's all about data, and the network is the computer!!

Maybe we are trying to build a system as efficient and fast as our brains; at the very least, the models behind all these logical applications are expert systems. Why should I write about stuff that is too complex for me? I am not an expert in all this, just blogging out of curiosity. There are basics to learn...

There is a good article on Wikipedia about CEP:

http://en.wikipedia.org/wiki/Complex_Event_Processing

Another article on InfoQ

others... Link Link .

An article on a NASA-funded CEP project: Link