Jul 23, 2012

Top technologies for future enterprise IT

10. HTML5:


HTML5, the fifth major revision of the HTML markup language, was originally developed by the WHATWG (a working group founded by engineers from Apple, Mozilla, and Opera) and is capable of accomplishing tasks well beyond its predecessors, HTML 4.0 and HTML 4.01.

HTML5 brings local storage, the canvas element, and native video, which let you do much more than drag images and text into one rectangular box. The new WebSockets spec adds full-duplex communication between browser and server, a capability driving a drastic shift for web developers from Flash to HTML5.
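The full-duplex WebSocket connection begins life as an ordinary HTTP request: the server proves it speaks the protocol by hashing the client's key together with a fixed GUID, as defined in RFC 6455. A minimal Python sketch of that server-side handshake computation:

```python
import base64
import hashlib

# Magic GUID defined by RFC 6455; the server appends it to the client's key.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def websocket_accept(sec_websocket_key: str) -> str:
    """Compute the Sec-WebSocket-Accept header a server must return
    to upgrade an HTTP connection to a full-duplex WebSocket."""
    digest = hashlib.sha1((sec_websocket_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# The example key from RFC 6455 yields the accept value given in the spec.
print(websocket_accept("dGhlIHNhbXBsZSBub25jZQ=="))
# → s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

Once this handshake completes, both sides can send frames at any time, which is what makes WebSockets genuinely full-duplex rather than request/response.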




9. Client-Side Hypervisors:


Desktop virtualization failed for two reasons: it required a constant connection between client and server, as well as a huge server to run the desktop virtual machines. Client-side hypervisors solve both problems. Installed on an ordinary machine, a client hypervisor harnesses the processing power of the client itself. Laptop users can carry a business virtual machine that contains the OS, apps, and personal configuration settings. This VM stays isolated from everything else running on the machine, including accidentally downloaded malware, and you keep all the virtualization management advantages, such as VM snapshots, portability, and easy recovery. Client hypervisors point to a future where we bring our own computers to work and download or sync our business VMs to start the day. As of July 2012, client hypervisors are still maturing; once they arrive in force, they will revolutionize the way we work.



8. Continuous Build Tools:


Continuous integration helps developers implement quality control through small amounts of effort applied repeatedly. It supports ongoing quality analysis of software and shortens the time in which the product is delivered. Continuous integration tools such as Jenkins and Hudson have been around for a long time. They put each check-in through a battery of tests, warning developers about problems with recently committed code so that everybody keeps moving toward the same goal.


Efficient proprietary continuous integration tools have existed for some time, but the emergence of open source alternatives like Hudson and Jenkins encourages the kind of experimentation and innovation that comes when programmers get the chance to improve their own tools. Advantages such as early warning of conflicting or broken code and immediate unit testing of every change explain why continuous integration is a technology that will stay for a long time.
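At its core, the job a CI server runs is a sequence of build and test steps that stops at the first failure and reports which step broke. A minimal sketch of that loop; the step names and commands below are hypothetical stand-ins for a real compile/test pipeline:

```python
import subprocess
import sys

def run_ci_steps(steps):
    """Run each build/test step in order; stop at the first failure
    and report which step broke, like a minimal CI server job."""
    for name, cmd in steps:
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode != 0:
            return f"FAILED at step '{name}'"
    return "SUCCESS"

# Hypothetical pipeline: tiny commands stand in for real build stages.
pipeline = [
    ("unit-tests", [sys.executable, "-c", "assert 1 + 1 == 2"]),
    ("lint", [sys.executable, "-c", "pass"]),
]
print(run_ci_steps(pipeline))  # → SUCCESS
```

A real server like Jenkins wraps this loop with triggers (polling or commit hooks), build history, and notifications, but the fail-fast step runner is the heart of it.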



7. Trusted Hardware Check:


Promising security at the application level requires trust at every layer beneath it, down to the physical construction of the computing device. The Trusted Platform Module (TPM) from the Trusted Computing Group (TCG) was among the first widely adopted hardware chips for verifying trusted hardware and boot sequences. Along the same lines, Intel last year combined the TPM chip with a hardware hypervisor layer to guard boot sequences, memory, and other components, and any software vendor can build on it. Hardware trust solutions are not perfectly secure: the Princeton memory-freeze ("cold boot") and electron-microscope attacks have exposed weaknesses in TPM in the past. But hardware protection schemes will only get better. Recent advances show tremendous promise, and soon you can expect every computing device you use to run a combined hardware/software protection solution.
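The trusted-boot check a TPM enables rests on a simple primitive: platform configuration registers (PCRs) that can only be extended by hashing, never written directly. A sketch of how a measured boot chain accumulates into one value (the stage names are illustrative; TPM 1.2 uses SHA-1 for PCR extends):

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style PCR extend: new_value = SHA-1(old_value || measurement).
    Because the register can only be extended, the final value commits
    to every measurement and to the order in which they occurred."""
    return hashlib.sha1(pcr + measurement).digest()

# Measure a hypothetical boot chain: each stage hashes the next before running it.
pcr = bytes(20)  # PCRs reset to all zeros at power-on
for stage in [b"bios", b"bootloader", b"kernel"]:
    pcr = pcr_extend(pcr, hashlib.sha1(stage).digest())

# Any change to any stage, or to their order, yields a different final PCR.
print(pcr.hex())
```

Verification then amounts to comparing the final PCR values against known-good ones (or having the TPM attest to them) before releasing secrets or declaring the platform trusted.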



6. JavaScript Replacements:


JavaScript has been the most widely executed code in the world, but for all its success, everyone wants to move on to the next big thing. Some want to build an entirely new language that fixes all the troubles they face with JavaScript; others just want to translate their code into JavaScript so they can act as if they don't use it. Translated code is the new rage. Google Web Toolkit cross-compiles Java into JavaScript, so the developer writes only type-checked Java. Some translations are superficial: programmers who write in CoffeeScript don't need to worry about much of JavaScript's punctuation because the cross-compiler inserts it during compilation. Other translations, such as Google's Dart, are more ambitious, pointing to a future of ever more options. These will bring a necessary change to the software industry, enabling swifter development.
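The punctuation-inserting kind of translation is easy to picture. Here is a toy source-to-source translator in the CoffeeScript spirit (not CoffeeScript's actual grammar or output): it accepts a terse `name = (args) -> body` form and emits the braces, `return`, and semicolons that JavaScript demands:

```python
import re

def transpile(line: str) -> str:
    """Toy cross-compiler sketch: turn `name = (args) -> body`
    into a JavaScript function expression, inserting the
    punctuation (braces, return, semicolons) for the programmer."""
    m = re.match(r"(\w+)\s*=\s*\(([^)]*)\)\s*->\s*(.+)", line)
    if not m:
        raise ValueError("unsupported syntax")
    name, args, body = m.groups()
    return f"var {name} = function({args}) {{ return {body}; }};"

print(transpile("square = (x) -> x * x"))
# → var square = function(x) { return x * x; };
```

Real transpilers parse to a full syntax tree rather than using regexes, which is what lets tools like GWT and Dart perform type checking and deeper transformations before emitting JavaScript.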


5. Distributed Storage Tier:


Much faster than disk and many times cheaper than DRAM, NAND flash memory is a hot commodity that will heat up further as storage management software catches up with its potential in the data center. Its combination of high speed and low cost makes it excellent for server-side caching and a natural choice for tier-one SAN storage. With flash getting cheaper by the day and SSD capacities on the rise, the days of disk drives in servers and SANs seem numbered. Best of all, flash storage lets server-side storage be administered as an extension of the SAN, keeping only the most frequently accessed or I/O-intensive data close to the application. It resembles caching, but smarter and more cost-effective, which is why it can be touted as a game changer in the enterprise IT sector.
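The caching behavior described above boils down to keeping the hottest blocks in the small, fast flash tier and going to the SAN only on a miss. A minimal least-recently-used sketch (the block IDs and the SAN fetch callback are hypothetical):

```python
from collections import OrderedDict

class FlashCache:
    """Sketch of a server-side flash tier in front of a SAN: keep the
    hottest blocks locally and evict the least recently used block
    when the (much smaller) flash capacity fills up."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.blocks = OrderedDict()

    def read(self, block_id, fetch_from_san):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)      # hit: mark block as hot
            return self.blocks[block_id]
        data = fetch_from_san(block_id)            # miss: go to the SAN
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)        # evict the coldest block
        return data

cache = FlashCache(capacity=2)
san_reads = []
fetch = lambda b: san_reads.append(b) or f"data-{b}"
cache.read("A", fetch); cache.read("B", fetch); cache.read("A", fetch)
cache.read("C", fetch)   # capacity exceeded: evicts B, the least recently used
print(san_reads)         # only misses reached the SAN: ['A', 'B', 'C']
```

Production flash tiers add write-back policies, persistence, and coherence with the SAN, but the hot-data/eviction logic is the core idea.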


4. Apache Hadoop:


An open source software framework licensed under Apache v2, Hadoop supports data-intensive distributed applications. It lets users work across clusters of independent computers and churn through terabytes of unstructured data. Tools such as Apache Hive and Apache Pig make exploring Hadoop much easier. Hadoop is a top-level Apache project with contributions from across the globe. Yahoo! has been the major contributor and uses the platform extensively across almost its entire business. Other Hadoop users include Amazon, American Airlines, HP, IBM, Intuit, Microsoft, LinkedIn, Twitter, eBay, Foursquare, and many more. The best part about Hadoop is its wide acceptance and how much the platform has to offer; its importance over the long haul can be gauged from the fact that we have only started discovering what it can do.
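Hadoop's programming model is MapReduce: a mapper emits key/value pairs, the framework sorts them by key across the cluster (the shuffle), and a reducer aggregates each key's values. The classic word-count example, sketched here with the shuffle simulated in-process rather than distributed:

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Emit (word, 1) for every word in a line of input."""
    for word in line.split():
        yield (word.lower(), 1)

def reducer(word, counts):
    """Sum the counts collected for one word."""
    return (word, sum(counts))

def run_job(lines):
    """Simulate the MapReduce flow locally: map, shuffle/sort by key,
    then reduce each group of values sharing a key."""
    mapped = [kv for line in lines for kv in mapper(line)]
    mapped.sort(key=itemgetter(0))                      # the shuffle/sort phase
    return [reducer(k, (c for _, c in group))
            for k, group in groupby(mapped, key=itemgetter(0))]

print(run_job(["the quick fox", "the lazy dog"]))
# → [('dog', 1), ('fox', 1), ('lazy', 1), ('quick', 1), ('the', 2)]
```

On a real cluster Hadoop runs many mapper and reducer processes in parallel over HDFS blocks, but the per-record logic a developer writes is exactly this pair of functions.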


3. Advanced Synchronization:


There has been one major flaw in the computing model we have used for years: data is tied to a single device. Apple and Microsoft understood that single-device environments limit their users, so Apple built cloud synchronization into iOS with iCloud, and Microsoft has gone a step further by putting applications themselves in the cloud so that users can sync their devices regardless of location and working environment. This advancement will change the way we use computers and give apps a whole new utility. It will give rise to a more user-centric model of computing, free from contexts such as location, available input methods, and connectivity, and will profoundly change how IT has approached applications of late.
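One common (and simple) way to converge two devices on the same state is last-writer-wins: every key carries a timestamp, and both sides adopt the newer value. This is an illustrative assumption, not the algorithm iCloud or Microsoft's services actually use; real sync services layer richer conflict handling on top:

```python
def sync(device_a, device_b):
    """Last-writer-wins sync sketch: each key maps to (value, timestamp);
    after syncing, both devices hold the newer value for every key.
    (Hypothetical strategy for illustration, not a vendor's algorithm.)"""
    merged = {}
    for key in device_a.keys() | device_b.keys():
        a = device_a.get(key, (None, -1))
        b = device_b.get(key, (None, -1))
        merged[key] = a if a[1] >= b[1] else b
    device_a.clear(); device_a.update(merged)
    device_b.clear(); device_b.update(merged)

laptop = {"note": ("draft v1", 100)}
phone  = {"note": ("draft v2", 250), "todo": ("buy milk", 50)}
sync(laptop, phone)
print(laptop["note"][0])  # → draft v2 on both devices
```

The weakness of last-writer-wins, silently discarding the older edit, is precisely why more advanced synchronization (version vectors, per-field merging) matters as apps move to the cloud.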


2. Software-Defined Networks:


Data center networks have grown drastically in the recent past. While servers and storage have benefited from software abstraction, networks remain hardware-bound and static, which has been a major hurdle to implementing cloud computing. Enter software-defined networking (SDN), which places a software layer over switch and router hardware to serve as both a centrally managed control plane and a platform for innovation. SDN is not network virtualization; rather, it is a way to program the network, allowing cloud providers and ISVs to build new networking capabilities that the rest of us can draw on. The leading example of SDN today is OpenFlow, the idea of university researchers who wanted to experiment with new network protocols on large production networks. Software-defined networks can bring drastic change and have the capability to become an integral part of enterprises over the long term.
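The OpenFlow model separates the control plane from forwarding: a controller installs prioritized match/action rules in a switch's flow table, the switch applies the highest-priority matching rule to each packet, and unmatched packets go back to the controller. A sketch of that split (the field names here are simplified stand-ins for real OpenFlow match fields):

```python
# OpenFlow-style flow table sketch: the controller programs the table
# in software; the "switch" just applies the best matching rule.
flow_table = []

def install_flow(priority, match, action):
    """Controller-side call: push a match/action rule into the table."""
    flow_table.append((priority, match, action))
    flow_table.sort(key=lambda rule: -rule[0])   # highest priority wins

def handle_packet(packet):
    """Switch-side lookup: apply the first (highest-priority) rule
    whose match fields all agree with the packet's headers."""
    for _, match, action in flow_table:
        if all(packet.get(field) == value for field, value in match.items()):
            return action
    return "send-to-controller"                  # table miss: ask the controller

install_flow(10, {"dst_port": 80}, "forward:port2")
install_flow(100, {"src_ip": "10.0.0.66"}, "drop")   # block a misbehaving host

print(handle_packet({"src_ip": "10.0.0.66", "dst_port": 80}))  # → drop
print(handle_packet({"src_ip": "10.0.0.1", "dst_port": 80}))   # → forward:port2
print(handle_packet({"src_ip": "10.0.0.1", "dst_port": 22}))   # → send-to-controller
```

Because the rules live in software on the controller, new forwarding behavior can be rolled out across the whole network without touching switch firmware, which is the programmability SDN promises.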


1. Private Clouds:


A private cloud is a cloud that an organization implements in its own data center, borrowing technologies from public cloud providers. Private clouds can redefine the way organizations store data: they improve on how data is stored today and let an organization's workers reach that data on the move, regardless of location or connectivity, greatly empowering the organization in dealing with storage problems. Several private cloud orchestration platforms have gained popularity in the recent past, among them the open source OpenStack, which has won wide acceptance in a short span of time; Eucalyptus is another private cloud orchestration option. Private clouds will surely transform the conventional, location-bound way of storing data.







