

About HATEOAS

I think designing REST URLs is really challenging. Resources are represented as hyperlinks, so when a REST application is designed, do the business logic states depend on the URL design? Changes in application state happen through hyperlinks: every possible state of the application is exposed as a service, and state changes are exposed as URLs. So we can draw a graph of the state changes, and the transitions happen at run time while the application is being used, driven by the server. That is why hypermedia as the engine of application state (HATEOAS) becomes a design constraint: if the server responds with a set of links describing the possible transitions, the consuming client can traverse them to arrive at the final state.

The concept seems pretty straightforward, but are there underlying implementation complexities? The relationship between resources can be logical, but it is the server that defines the relationships for the application. The client's states are defined by hypermedia elements: tags like IMG, OBJECT and SCRIPT embed resources within the page, and CSS can carry image background links. These URLs locate resources globally; the namespace is defined by the web itself. The scene gets more interesting when web service discovery, ordering of interactions and business logic make the client dynamic using this concept. Libraries and frameworks, as we know, let you call a single function while the system calls other functions internally. In the same way, calling a single URL as the entry point, which then links to other related URLs carrying logic or resource states, behaves much like an interactive website.

But URLs can change at any time, so they become a constraint when designing a dynamic application: permanent URLs that must live forever create tight coupling between the resources and the client. So the solution? The response can be a document of URLs for the client to traverse, starting from a single entry point... which is what HATEOAS is about.
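As a rough sketch of the idea (the resource names and link relations below are invented for illustration, not taken from any real API), the client hard-codes only the entry point URL and follows whatever links the server hands back:

// Hypothetical HATEOAS-style traversal: only the entry point is hard-coded;
// every further URL comes from the "links" the server returns.
async function submitOrder(entryPointUrl, item) {
  // 1. Fetch the entry point; its response advertises what we can do next.
  const entry = await (await fetch(entryPointUrl)).json();
  // e.g. entry.links = { orders: "http://example.com/orders" }

  // 2. Create an order by following the advertised "orders" link.
  const order = await (await fetch(entry.links.orders, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ item: item })
  })).json();
  // e.g. order.links = { payment: ".../orders/42/payment", cancel: ".../orders/42" }

  // 3. The next transitions (pay or cancel) are whatever the server offered.
  return order.links;
}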

With a truly REST based architecture you are free to change just about anything about the disposition and naming of resources in the system, except for a few well known start point URIs. You can change the URI shapes (for example, if you decide that you really hate the current scheme). You can relocate resources to different servers (for example, if you need to partition your data). Best of all, you can do those things without asking anyone's permission, because clients will not even notice the difference.

http://barelyenough.org/blog/2007/05/hypermedia-as-the-engine-of-application-state/

Ambient Intelligence and location aware web

Ambient intelligence (AmI) is explained by Wikipedia as:
In computing, ambient intelligence (AmI) refers to electronic environments that are sensitive and responsive to the presence of people. In an ambient intelligence world, devices work in concert to support people in carrying out their everyday life activities, tasks and rituals in an easy, natural way, using information and intelligence that is hidden in the network connecting these devices.
It is also called pervasive computing or ubiquitous computing. We can build up Neuromancer-like future visions of ultra sci-fi devices roaming invisibly around us, mumbling 0s and 1s and wrapping us inside their matrix... In the Wikipedia article you can read an interesting scenario about the use of ambient intelligence. The idea is already finding its way into most gadgets now in the market: smart home appliances, WiFi devices, mobiles, RFID, GPS and so on. There is a lot to know about the concepts and implementation (complex middleware architectures) behind AmI.

In the web world, some recent work that has caught everyone's attention is around location-aware technology standards and APIs. Geographical Information Systems have been around for a long time, and tools like ArcGIS are prominent in the field. After the Gmap revolution hit the web, there was a surge of applications built on geospatial information. A virtual representation of the real world has always been the cyberpunk reader's dream, and recent technological innovations are making it viable.

Yes, the web browser is the ubiquitous application that will connect real space to the virtual one. Most mobile devices now have browsers, so the web is getting ready for the opera of next-generation location-aware applications. The W3C has recently released a draft of the Geolocation API specification.
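A minimal sketch of using that API from a web page: the browser chooses a location provider and hands the coordinates to a callback.

// Ask the browser for the current position (W3C Geolocation API).
if (navigator.geolocation) {
  navigator.geolocation.getCurrentPosition(
    function (position) {
      console.log("lat:", position.coords.latitude,
                  "lon:", position.coords.longitude,
                  "accuracy (m):", position.coords.accuracy);
    },
    function (error) {
      console.log("position unavailable:", error.message);
    },
    { enableHighAccuracy: true, timeout: 10000 }
  );
}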

Mozilla has developed Geode, an experimental add-on to explore geolocation in Firefox 3. It includes a single experimental geolocation service provider so that any computer with WiFi can get accurate positioning data.






Yahoo! Fire Eagle is a service that acts as a broker for your location, creating a single place where any web service, from any device, can get access to your last updated location. The service enables you to share your geographic position across many different applications, websites and services. You can update your location with a GPS-enabled phone (e.g. iPhone 3G) or any other software that integrates with Fire Eagle, and you can allow websites and services to use this information.

Google released its Geolocation API for Google Gears. The Geolocation API provides the best estimate of the user's position using a number of sources (called location providers). These providers may be on-board (GPS, for example) or server-based (a network location provider).


So it is possible to use these APIs to develop environmentally intelligent applications, especially on mobile devices. I could read my feeds or news based on locality... like reading The Times of India epaper in the Bangalore edition or the Hyderabad edition, depending on where I am while travelling.
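A toy sketch of that idea (the city list and coordinates are only illustrative): pick the edition whose city is nearest to the reader's current position.

// Hypothetical: choose a newspaper edition by picking the city
// closest to the reader's latitude/longitude.
const editions = [
  { name: "Bangalore", lat: 12.97, lon: 77.59 },
  { name: "Hyderabad", lat: 17.38, lon: 78.48 }
];

function nearestEdition(lat, lon) {
  let best = null, bestDist = Infinity;
  for (const e of editions) {
    // A rough planar distance is enough to decide which city is closer.
    const d = Math.pow(e.lat - lat, 2) + Math.pow(e.lon - lon, 2);
    if (d < bestDist) { bestDist = d; best = e; }
  }
  return best.name;
}

// e.g. nearestEdition(12.9, 77.6) returns "Bangalore"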

What about a web-based advertising platform based on location?

No wonder Google is urging that ambient spaces be allocated for WiFi... free the airwaves!! These technologies will open up numerous possibilities for Android development.

Cartography helped give light to the Renaissance. It made revolutions, wars, civilizations... Now the neo-geographers are leading the world into cyberspace navigation. The world is shrinking again... an invisible revolution?

Ambient intelligence is realized by human-centric products rather than technology-centric ones. Information technology is now spreading from big corporates dealing with mammoth analytical data and transactions to the common man... where the data is unimaginably galactic... geospatial data... the paradigm of consumerism enabled by the new generation of web ideologies. These ubiquitous location-aware devices could form the new era of invisible computing, a term coined by Donald Norman in his book The Invisible Computer, published a decade ago, to describe the coming age of ubiquitous task-specific computing devices: devices so highly optimized for particular tasks that they blend into the world and require little technical knowledge on the part of their users.

image from here

Edge Computing

For our final-year BTech project, we built a Linux cluster to demonstrate how an on-demand SaaS application (we didn't actually know the term at the time!) could be implemented. It was not a 100% ideological demonstration: we could run the cluster of 4 nodes in the lab and the application on top of it, but not a grid-level implementation. We worked well as a team to make it happen. It was cutting edge, at a smaller level...

Edge computing ... ?

I did write an article on CDNs (Content Delivery Networks) before, but I didn't know about the concept of edge computing even when I watched the Beijing Olympics through NBC online, and I never guessed it when I watched YouTube videos. A real transformation is happening in this web 2.0 cyberspace. The edge uses more metadata to cache and transport data than traditional web caching does; it is content-aware caching (CAC). This form of computing is made available to on-demand applications too: all those web 2.0 mashups, widget networks and so on. I think it is too complex at the architectural and deep theoretical level to understand fully, because it all seems so "cloudy": network latencies, jitter, content replication, databases spread throughout... and so on.

A request for an HTML page or a JavaScript file is delivered to our PC or gadget from a nearby CDN node in networks like Akamai or Limelight (used by YouTube). What about personalized pages? They need dynamically generated content, which comes from databases spread across the web, and database cache mechanisms deliver that data. What about business applications? Their database results have to be cached too. Suppose two queries share the same subset of data: they can be cached to reduce frequent database access, using query templates and containment checks. I saw a presentation from here explaining web application replication strategies. Technologies like sharding with MySQL, database clustering and so on are in common use these days, and papers have been published on this for a long time.

It is practical and is becoming the backbone of the internet for video streaming, VoIP, mashup services, even illegal botnets!! It provides high QoS, availability and virtualized computing, with security through firewalls at all the nodes (centralized as well as at the network level). So the internet at the network level is becoming stronger and needs more complex administration... cool for network engineers. It is complex, but it sounds interesting and promising.
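A very rough sketch of the query-template idea (all names here are invented, and a real containment check, where one query's result is a subset of another's, is more involved than this exact-key lookup): results are cached by query template plus parameters, so repeated requests for the same data skip the database.

// Hypothetical edge cache for database query results, keyed by a
// query template and its bound parameters.
const resultCache = new Map();

async function cachedQuery(template, params, runQuery) {
  const key = template + "|" + JSON.stringify(params);
  if (resultCache.has(key)) {
    return resultCache.get(key);                 // served from the edge, no DB round trip
  }
  const rows = await runQuery(template, params); // fall through to the origin database
  resultCache.set(key, rows);
  return rows;
}

// e.g. cachedQuery("SELECT * FROM items WHERE category = ?", ["books"], dbCall)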

It's about data... I'm more curious: what about the computing and implementation level?

I grabbed an article from Akamai about deploying Java enterprise applications on their CDN, from here.

I also found an article about ObjectWeb's JOnAS application server working on self-* autonomic computing based on IBM research.

In the manifesto I read a comparison between today's internet autonomic computing and our autonomic nervous system, which frees the conscious mind from all those involuntary activities...

Many intelligent students, professors and engineers are working on it. Yes, the collective intelligence is in its embryonic stage...

The breadcrumb folk tale and intelligent user interfaces

In the old German folk tale of Hansel and Gretel, the two young children attempt to mark their trail by leaving breadcrumbs on their path as they walk through a forest. In information architecture, especially in interface design and GUIs, a breadcrumb refers to a visual path that allows users to see where they are in the interface and to retrace their steps if needed. The underlying technology is the hyperlink connecting Universal Resource Identifiers. It is human psychology to virtually traverse one's thoughts while sorting out the information one needs; the words in the mind-space are connected through virtual links tracing the inner thoughts in the process of the data quest. The cliché of information at the tip of one's fingers... The links in the web page you see in the browser are the virtual representation of information deep down in the cyberspace. Are we entering a metaphysical psyche warp of our minds through these portals? :P
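As a small sketch of the interface idea (not tied to any particular site or framework), a breadcrumb trail can be generated straight from the current URL path, each segment linking one level back up:

// Build breadcrumb links from a URL path, e.g. "/blog/2008/json"
// becomes Home > blog > 2008 > json, each crumb linking upward.
function breadcrumbs(path) {
  const crumbs = [{ label: "Home", href: "/" }];
  let href = "";
  for (const part of path.split("/").filter(Boolean)) {
    href += "/" + part;
    crumbs.push({ label: part, href: href });
  }
  return crumbs;
}

// e.g. breadcrumbs("/blog/2008/json")
//   -> [{label:"Home",href:"/"}, {label:"blog",href:"/blog"},
//       {label:"2008",href:"/blog/2008"}, {label:"json",href:"/blog/2008/json"}]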

A psychology paper from Wichita State University shows that when breadcrumbs are present in the model, users tend to use these shortcuts more. This is definitely a good approach to defining related entities, so that users can navigate through the huge volume of data on the web. Is it possible for an intelligence in the web, some call it collective intelligence, to converse with the user through this navigation tool?


As semantic technologies become more prominent in the internet information architecture of the near future, the user interfaces that communicate this collective intelligence to the user are expected to be simple and elegant. Maybe touch screens and 3D interfaces will provide an immersive user interaction. What will be the fastest way of traversal?

An old Buddhist monk revealed to his disciple that the mind is the fastest traveller in this universe. Will future interfaces catch up with the lightning speed of the mind? Can we expect warp holes in our web pages? Will there be information teleporters on our desktops (that sucks!)? Even hyperlinks are based on these ideologies... maybe that will suffice. The problem is that, for now, we choose the path...

Ha, there I saw an old paper describing AI-based interfaces. It says:

Most researchers would agree that a system that can maintain a human dialogue would be considered intelligent (remember the Turing test?). The problem is that there are a lot of interfaces that we would consider intelligent, that do not look "human" in any sense at all. An example is the PUSH interface , which presents hypertext in a manner that is adapted to the user's current task. The system is controlled mainly through direct manipulation, but the output consists of a text where certain pieces of the text are "hidden" from view, to give a comprehensive overview of the pieces of text that are most relevant to the user in his or her current task. This very passive form of user adaptation does not in any way mimic human behaviour, but is constructed to be a natural extension of the hypertext view of information.
I like playing computer games. In games, the interface or the protagonist's goals can be changed by the game's AI according to the user's interaction. I would say that is what is happening on the web: the information we seek will be based on the understanding built by a collective intelligence in the web. Suppose you want to travel somewhere during your vacation... what if the application itself chose the best destination, the information about it, the room, the flight, the traveller's tools and so on, and displayed it all in the browser within a second? Sounds cool, even if you are on your way back home with your personal intelligent assistant (possibly a future iPhone!). See this movie made by Apple in the 1980s, a short film that imagined a future of computing via intelligent agents long before the internet took off. Link


In the future the choice won't be yours...

What is Computational REST ?

I have written about REST before. CREST is Computational REST. I think the concept will be a milestone in developing applications on a Resource Oriented Architecture (ROA); ROA is a set of guidelines for implementing the REST architecture. This Wikipedia article gives a simple idea of the world of representations.

The image, taken from here, gives an illustration of the kind of message passing used in Representational State Transfer.

REST is an architectural style for distributed systems. RESTful web services are gaining importance in developing scalable applications, and the web 2.0 technologies, Ajax, mashups, Comet and pub-sub, show the power of computing over the internet. The computational resources on the web can be accessed by URIs. So what is the buzz about CREST?

A client needs to execute a program; the origin server executes the code and returns the result. This forms the basis of CREST, or Computational REST. Code snippets containing conditional logic can be sent, and the response will reflect the behaviour of the application, so the application can become more client-centric: the service's behaviour changes with different client conditions. Computational REST (CREST) is proposed as an architectural style to guide the construction of computational web elements. The idea is similar to mobile code, and it originates from Rethinking Web Services from First Principles, which lays out the underlying principles of REST and CREST. The technology is still under research, so internet technologies are heating up.

Using REST for process-intensive integration is only in a budding stage: distributed processing, the REST way. The programmability of the web is becoming complex and disruptive!
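A toy sketch of that mobile-code flavour (the endpoint and the payload format below are entirely invented for illustration): the client ships a small computation to the server and gets back the result of evaluating it, rather than a fixed representation of a resource.

// Hypothetical CREST-style exchange: the client POSTs a computation
// (a tiny expression the server knows how to evaluate) and receives the result.
const computation = {
  op: "sum",
  args: [{ op: "price", sku: "A-100" }, { op: "price", sku: "B-200" }]
};

async function compute(endpointUrl) {
  const res = await fetch(endpointUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(computation)
  });
  return res.json();   // the evaluated result, shaped by the code we sent
}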

Death of ECMA 4 and future of Native JSON parsing



JSON is a subset of JavaScript. It represents data as tokens of name-value pairs, and it provides efficient data portability for mashable applications. The JSON syntax is like JavaScript's object literal syntax, except that the object is not assigned to a variable: JSON just represents the data itself, and the data is a string. This literal has to be converted into an object. We could use eval() in JavaScript to evaluate the text, but using eval() is harmful, so a JSON parser should recognize only JSON text, rejecting all scripts.




If you need a JavaScript parser, use:

JSON.parse(strJSON) - converts a JSON string into a JavaScript object.
JSON.stringify(objJSON) - converts a JavaScript object into a JSON string.
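A quick round trip with those two functions:

// Parse a JSON text into an object, then serialize it back to a string.
const text = '{"name": "json", "tokens": ["name", "value"]}';
const obj = JSON.parse(text);     // safe: accepts only JSON, never executes code
obj.native = true;
const out = JSON.stringify(obj);  // '{"name":"json","tokens":["name","value"],"native":true}'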
Other parsers are

Jackson JSON Processor, a streaming JSON parser modelled after StAX, the Streaming API for XML.

JSON-lib is a Java library for transforming beans, maps, collections, Java arrays and XML to JSON and back again to beans and DynaBeans.

If you are using Mozilla Firefox and your browser is based on Gecko 1.9 or above, there is a regexp:test() function.

According to the specification proposed by D. Crockford:

A JSON text can be safely passed into JavaScript's eval() function (which compiles and executes a string) if all the characters not enclosed in strings are in the set of characters that form JSON tokens. This can be quickly determined in JavaScript with two regular expressions and calls to the test and replace methods.

var my_JSON_object = !(/[^,:{}\[\]0-9.\-+Eaeflnr-u \n\r\t]/.test(
text.replace(/"(\\.|[^"\\])*"/g, ''))) &&
eval('(' + text + ')');


This particular technology is an alternative to XML for porting data. Web service implementations return XML data, and applications use parsers to generate native objects from that XML. XML is also the data format used in Ajax, which became the buzz of the new-age web. The data format is vital to the efficiency of any application, and JSON provides an efficient data transfer.

Read JSON: The Fat-Free Alternative to XML

If you want to add JSON to an application, read here how Yahoo! web services implemented it.

According to John Resig, browsers should have native JSON support.

He summarises that

The current, recommended, implementation of JSON parsing and serialization is harmful and slow. Additionally, upcoming standards imply that a native JSON (de-)serializer already exists. Therefore, browsers should be seriously looking at defining a standard for native JSON support, and upon completion implement it quickly and broadly.
Read here for the ECMA4 proposal

but now ?

ECMAScript 4.0 Is Dead


JavaScript standards wrangle swings Microsoft's way

The industry is again at war over browsers and internet standards.

So different methodologies for producing JSON will exist for some years to come. As systems become more complex, vulnerabilities will arise, and we will have to find resolutions for those issues every time.