Cloud Computing + IoT: new challenges for software
Technology | Posted on October 27, 2016
By Roberto C. Mayer*
The spread of Cloud Computing and the Internet of Things (IoT) has opened a wide space for innovative applications. Thousands of startups are being founded to develop and exploit this potential.
Over the last 50 years, the number of CPUs on our planet has grown roughly tenfold every decade. The history can be summarized as a progression: mainframes, then minicomputers, personal computers, mobile devices and, finally, autonomous devices (called connected “things” because they do not need a human being operating them).
At the same time, telecommunications networks have become so pervasive that some people have claimed that “the computer is the network”. But today’s global network is far less reliable than traditional local networks: not only are there internet links that are unavailable or slow, but there is also a growing number of devices powered by batteries (which, according to Murphy’s law, run out at the worst possible moments).
In other words, Cloud Computing and IoT present us with connections that are not fully reliable and with devices that simply disappear from the network (for lack of energy: cell phones, tablets, credit card machines, etc.). At the same time, we want to integrate traditional mission-critical servers into this new environment without losing security, scalability and auditability, among other properties.
We thus realized that not only the pace of software innovation, which has always lagged behind the speed of hardware innovation, but also the characteristics of this new environment form a new software development bottleneck, which we need to resolve at the lowest possible cost.
Incorporating new users, through the cloud, into corporate mission-critical information systems is a challenge for most companies.
This challenge is even bigger when companies have many interaction channels with their clients. Banks and retailers, for example, engage with their clients in physical stores or branches, at ATMs, over the internet and on cell phones, through e-commerce (or online banking), and so on. The quality of customer service suffers because it is difficult to share information about client preferences among the various channels.
For example, if a bank client personalizes his mobile banking app so that the transactions he uses most frequently appear first on the screen, it is uncommon for that information to be used at the ATMs of the same bank. Providing this kind of omni-channel service demands sharing information that belongs neither to the channel-specific systems nor to the bank’s mission-critical systems.
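One way to read this requirement is that preferences must live in a store keyed by client rather than by channel. The following is a minimal sketch of that idea, not a description of any real bank system; the class and identifiers (`PreferenceStore`, `client-123`) are illustrative assumptions.

```python
class PreferenceStore:
    """Channel-agnostic store: preferences are keyed by client, not by
    channel, so the mobile app and the ATM read the same record."""

    def __init__(self):
        self._prefs = {}  # client_id -> dict of preference key -> value

    def set(self, client_id: str, key: str, value) -> None:
        self._prefs.setdefault(client_id, {})[key] = value

    def get(self, client_id: str, key: str, default=None):
        return self._prefs.get(client_id, {}).get(key, default)


# The mobile channel records the client's most-used transactions...
store = PreferenceStore()
store.set("client-123", "favourite_transactions", ["balance", "transfer"])

# ...and the ATM channel, a completely different front end, reads the
# same data when it builds its menu.
atm_menu = store.get("client-123", "favourite_transactions", [])
```

The point of the design is that neither channel owns the data: both are clients of a shared service, which is precisely the information that, as the article notes, fits neither the channel systems nor the mission-critical core.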
Similar difficulties arise when implementing the Internet of Things. As an example, consider an “intelligent city” covered by rain sensors placed at regular intervals, each reporting to a central system the volume of rain that falls.
This rainfall information can be shown on a city hall map so that citizens can check where it is raining hardest; it can be fed to the servers of an app such as Waze so that it routes fewer drivers onto the streets where the rain is most intense; and it can be used by the telemetry system of the urban buses, so that they reduce speed drastically when entering an area of heavy rain (decreasing the chance of accidents).
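The pattern described here is a central system fanning one sensor reading out to several independent consumers. A minimal publish-subscribe sketch of that fan-out might look as follows; the names (`RainHub`, `sensor-42`) and the thresholds are illustrative assumptions, not part of any real deployment.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class RainReading:
    sensor_id: str        # hypothetical identifier of a street sensor
    mm_per_hour: float    # measured rainfall intensity

class RainHub:
    """Central system that fans each sensor reading out to every subscriber."""

    def __init__(self):
        self._subscribers: List[Callable[[RainReading], None]] = []

    def subscribe(self, handler: Callable[[RainReading], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, reading: RainReading) -> None:
        for handler in self._subscribers:
            handler(reading)  # each channel interprets the same data its own way

# Three independent consumers of the same reading:
city_map, traffic_hints, bus_speed_limits = {}, {}, {}

hub = RainHub()
# The city hall map just records the raw intensity.
hub.subscribe(lambda r: city_map.update({r.sensor_id: r.mm_per_hour}))
# A navigation service only cares whether rain is heavy (assumed cutoff: 30 mm/h).
hub.subscribe(lambda r: traffic_hints.update({r.sensor_id: r.mm_per_hour > 30}))
# Bus telemetry derives a speed limit from the same reading.
hub.subscribe(lambda r: bus_speed_limits.update(
    {r.sensor_id: 30 if r.mm_per_hour > 30 else 50}))

hub.publish(RainReading("sensor-42", 45.0))
```

Each consumer subscribes once and the sensor publishes once; neither side needs to know about the other, which is what makes it cheap to add a new channel later.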
The most important conclusion from the examples above is that these new applications share requirements that are common to most of them: put differently, the requirements are much more a consequence of Cloud Computing and IoT themselves than specific needs of each application.
Time-limited transactions
While designing a new tool, we also recognized an opportunity to innovate in the concept of “transactions”, a concept in widespread use since the adoption of relational database management systems at the beginning of the 1980s.
Transactions are defined as “a group of operations that only make sense if all of them (or none) are successfully executed”. The software tells the server when a transaction starts and later confirms that the group of operations succeeded (a commit, in IT jargon) or, if one of the operations in the group fails, asks the server to undo the operations already executed as part of the transaction (a rollback).
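The classic commit/rollback cycle can be illustrated with SQLite, which ships with Python. This is a generic example of the concept, not the article’s tool; the `transfer` function and account names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

def transfer(conn, src, dst, amount):
    """Debit src and credit dst as one unit: both succeed, or neither does."""
    try:
        conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE name = ?",
            (amount, src))
        cur = conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE name = ?",
            (amount, dst))
        if cur.rowcount == 0:       # destination missing: the group has failed
            raise ValueError("unknown destination account")
        conn.commit()               # confirm the whole group of operations
    except Exception:
        conn.rollback()             # undo everything done in this transaction
        raise

transfer(conn, "alice", "bob", 30)  # both updates take effect together
```

If the second update fails (say, the destination account does not exist), the rollback also undoes the first one, so no money disappears: exactly the all-or-nothing property in the definition above.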
Initially, the programs and the data manager ran on the same computer. Later, in the client-server era, they started to run on separate machines, but still within the same local network.
In the “world of the cloud”, however, the communication between the program and the server is not reliable, and the program may be turned off (when the device’s battery runs out) in the middle of a transaction. In such cases, transactions that were started but received neither a commit nor a rollback remain pending, occupying server resources indefinitely (which is why some administrators of these servers choose to restart them every night).
We concluded, however, that it would be much more adequate to change the protocol around transactions so that only the commit needs to be signaled, in the case of successful execution. To achieve this, it is only necessary to specify, at the beginning of the transaction, the maximum time allowed for its execution.
That way, if the server does not receive the commit within the declared maximum time, it can execute the rollback automatically, releasing its internal resources. This also makes it possible to build transactions involving individual operations executed on different back-end servers.
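The server-side behavior being proposed can be sketched as follows. This is my own minimal model of the idea, not the article’s actual implementation: the transaction records a deadline at begin time, buffers its work, and discards it on its own if the commit does not arrive in time (here the expiry check runs lazily on each access; a real server would use a background sweep).

```python
import time

class TimedTransaction:
    """Sketch of a transaction with a client-declared deadline: if no
    commit arrives before the deadline, the server rolls the work back
    by itself and frees its resources."""

    def __init__(self, max_seconds: float):
        self.deadline = time.monotonic() + max_seconds
        self.operations = []     # work buffered until commit
        self.state = "open"

    def execute(self, op: str) -> None:
        self._expire_if_due()
        if self.state != "open":
            raise RuntimeError(f"transaction is {self.state}")
        self.operations.append(op)

    def commit(self) -> bool:
        self._expire_if_due()
        if self.state != "open":
            return False         # too late: the server already rolled back
        self.state = "committed"
        return True

    def _expire_if_due(self) -> None:
        # Lazy stand-in for the server's background cleanup task.
        if self.state == "open" and time.monotonic() > self.deadline:
            self.operations.clear()      # rollback: discard buffered work
            self.state = "rolled_back"
```

A client that loses its battery mid-transaction simply never calls `commit()`; once the deadline passes, the server reclaims the resources instead of holding the transaction open indefinitely, addressing the nightly-restart problem described above.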
*Roberto C. Mayer is a member of the Ethics Council of Assespro-SP; Director of Communications of Assespro Nacional; founder and CEO of MBI; and vice-president of ALETI.