August 25, 2014
Volume XLIX, Issue 5
Executives Predict Cloud-Enabled Transformation
Excerpted from Baseline Magazine Report by Dennis McCafferty
The majority of US companies recognize that the cloud will play an essential role in innovation in the very near future. Yet, most enterprises don't have a clear cloud migration plan in place, according to a recent survey conducted by Oxford Economics for Windstream.
The accompanying survey report, titled "The Path to Value in the Cloud," reveals that organizations are falling short when it comes to getting the right performance and ROI metrics in place to measure cloud-enabling success.
There are also concerns about the cloud's impact on security, costs, platform compatibility, and privacy. However, overall sentiments remain upbeat, as the cloud is expected to help increase geographic market expansion, business transformation, collaboration, and other strategy drivers.
"Cloud computing today is fundamentally altering business processes and changing the way organizations interact with customers, partners, and employees," according to the report.
"This transformation brings incredible opportunities, including the ability to build a real-time enterprise where interaction and innovation flourish." A total of 350 US business and technology executives took part in the research.
Please click here for the full report.
VC Investors Hot for the Cloud & Mobile
Excerpted from ComputerWorld Report by Sharon Gaudin
Venture capital fundraising has picked up steam in the US, with cloud computing and mobile technology getting solid backing.
US technology companies are looking at a strong investment climate with global investor confidence in US companies up significantly for the third year in a row, according to the 2014 Global Venture Capital Confidence Survey released by consulting firm Deloitte & Touche and the National Venture Capital Association (NVCA), a trade group for the US venture capital industry.
"For the past three years the US has seen a significant increase in investor confidence, continuing the trend which began to take hold in 2012," said Jim Atwell, a Managing Partner at Deloitte.
So where are investors looking to sink their money? According to the 2014 study, which surveyed more than 300 global venture capital, private equity and growth equity investors in May and June, they've been largely backing cloud computing, mobile technologies, and robotics companies.
The survey noted that for the second year in a row, US venture capitalists named cloud computing as the area in which they were most confident of investing. Mobile technology came in a close second to the cloud and enterprise software wasn't far behind.
In the survey, investors were asked to rate their confidence levels in different industry sectors on a scale of 1-to-5, with a score of 5 showing the most confidence.
Investors, with a confidence level of 4.11, were the most interested in backing cloud computing. Mobile technologies came in next with a confidence level of 4.02. Healthcare IT and services was next with 3.94 and enterprise software showed 3.77.
The survey also showed that global investor confidence in the cloud was up 2% from 2013. It's holding steady for mobile and is up 9% for healthcare IT and services. Enterprise software is up 2% year-over-year.
"Both the cloud and robotics are hot areas right now, so there's lots of opportunity for change there," said Zeus Kerravala, an analyst with ZK Research. "This is great. Typically, innovation comes from start-ups, not the huge IT companies. These smaller, more nimble companies can bring innovative solutions to market. VC investments help drive new ideas."
"For the cloud, I think we can expect a broader set of cloud offerings that are really optimized for the mobile world," Kerravala said. "A lot of the cloud offerings today are desktop apps that are made to run on the cloud. To me, cloud apps should be more predictive and have contextual knowledge of who you are, where you are and what you're doing."
Cloud Technology Is the Final Piece of the Globalization Puzzle
Excerpted from Science 2.0 Report
If you were a 1990s protester in a developed nation, you probably hated the idea of globalization, though the democratization of culture and wealth has clearly been a very good thing. Globalization used to be controversial, but by now no one sentient really thinks cultures that condone the rape and stoning of women should be preserved.
Cloud computing will take that globalization to the next level, because it is a key enabler of innovation and economic development, and it gives groups without giant hardware budgets a way to compete, much as food science that lets crops grow in inhospitable climes saves lives.
The democratization of access to knowledge and resources it promises is as palpable in the business world as it is for end users. Developing-world groups, for example, can compete technologically, optimizing their use of resources with minimal investment, while individual users gain access to an immeasurable range of information and specialized tools through nothing more than a web browser.
Cloud computing is an emerging paradigm for distributed computing systems whose goal is to offer software as a service over the Internet. The CLOUDS research project has focused on advancing the state of the art of this technology trend, one that is revolutionizing computing and the way in which users and network providers interact online.
Like everything the Internet touches, this development bears the imprint of the globalization spirit. From the self-sufficient individual device that stores all data and contains all the applications the user requires, we are evolving toward an on-demand service model for computation, communication, and information storage that dynamically adapts to variations in consumption and meets the needs of a global market.
The cloud allows the crossing of technological, geographical, and administrative boundaries, concentrating information and services in data centers and devices that are remote but accessible online at any time, from anywhere, and from almost any device or terminal. The level of autonomy, scalability, automation, and flexibility this computing model provides is unprecedented. Technological resources are distributed globally and information is stored on Internet servers, freeing users from their traditional dependence on a single device, enhancing mobility, accessibility, and security, and enabling previously unthinkable pay-per-use access to next-generation services, that is, without a substantial up-front investment.
Research on cloud computing poses challenges and promises advances that go beyond what the CLOUDS project covers. Its potential applications touch many areas that share a high degree of innovation, ranging from cellular telephony (HLRs, fraud detection, real-time processing of CDRs), banking and finance (detecting fraudulent card payments and money laundering), business intelligence (real-time data warehousing and scalable targeted advertising), security (mitigating denial-of-service attacks, processing security events), and sensor networks (processing the output of massive sensor networks) to domotics (smart buildings).
The CLOUDS (Cloud Computing for Scalable, Reliable and Ubiquitous Services) project has been financed by the Department of Education, Youth and Sports of the Regional Government of Madrid and was in operation from January 2010 to May 2014. Researchers at IMDEA Networks Institute collaborated with research groups from two Madrid universities, the Polytechnic University of Madrid and the University Rey Juan Carlos.
Report from CEO Marty Lafferty
The Distributed Computing Industry Association (DCIA) and Cloud Computing Association (CCA) are very pleased to welcome IBM to our all new co-hosted CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014), featuring industry leaders Amazon, Dell, Google, Microsoft, NetSuite, Oracle, Rackspace, and SAP, among many other cloud-computing innovators.
Delegate registration at early-bird rates ends September 6th for CDSE:2014, which will take place in Austin, TX on October 1st and 2nd.
Business strategy and technical sessions covering the latest trends — Mobile Cloud, DevOps, and Big Data — as well as general interest cloud service topics will be featured along with a special focus on three economic sectors experiencing the most cloud adoption: Media & Entertainment, Government & Military, and Healthcare & Life Sciences.
IBM is a CDSE:2014 Gold Sponsor.
IBM's keynote address, "Mobile Cloud Architectures," by Sal Vella, Vice President, Rational Product Development and Customer Support, IBM Software Group, will highlight how government and healthcare entities, and businesses from banks to retailers, are using cloud-based mobile development and delivery environments to very quickly develop in the cloud, deliver on mobile, and test using innovative mobile and cloud capabilities.
Most industries are also demonstrating a strong preference for hybrid-cloud environments by blending the use of on-premises and off-premises environments. Real client examples will be featured.
Sal Vella oversees worldwide development and support of the extensive IBM Rational Product Portfolio.
In previous positions, he was the Vice President for IBM Software Group Architecture and Technology based in Somers, NY, responsible for the IBM Software Group Technical Architecture including architectural standards, support of industry standards, and setting standards and guidelines for IBM's large software development organization.
Sal Vella was also IBM Vice President, Development, Distributed Data Servers and Data Warehousing, and IBM Director of Enterprise Content, responsible for managing IBM's worldwide development of its Content Management software portfolio; and has also been responsible for the development of Storage Software and development of the DB2 Universal Database product set.
IBM will also present two workshops: "DevOps Services in the Cloud: From Idea to Production in Minutes" and "IBM Bluemix: Build, Deploy and Test a Mobile App in the Cloud."
Sandhya Kapoor, Senior Software Engineer, IBM Ecosystem Development, Strategic Initiatives, will lead the DevOps Workshop.
Do you want to build, deploy, and optimize mobile and cloud applications quickly? Attend this session to experience a rapid development environment with integrated services for DevOps — you just need minutes, not weeks, to be up and running with a fully integrated environment offering all the services and tools you need.
If you need a secure, private environment to collaborate across your team and your external consultants (as others are doing today), this is the place for you: Get your hands on an end-to-end solution with the services you need to deliver your innovative ideas quickly. Mobile testing and feedback are included!
Sandhya Kapoor has been working in the Austin, TX Lab since 1989, contributing to IBM's major software projects including Bluemix (IBM's cloud-based Platform-as-a-Service offering), the Distributed Computing Environment (DCE), WebSphere Application Server, stack products using WebSphere Application Server, Business Process Manager, and PureApplication System.
Check out Sandhya Kapoor's blog here and follow her on Twitter @sandhya_ibm.
Leigh Williamson, IBM Distinguished Engineer, IBM Mobile Software Development Strategy, CTO Team, Rational Software, will lead the Mobile Apps Workshop.
Do you need a way to have users just shake their mobile device to provide detailed In-App Bug Reporting, without time-consuming manual entry of status and issue details? How about Over-the-Air Distribution to get your latest app version into the hands of testers fast?
What about Automated Crash Reporting and User Feedback and Sentiment? You need all of this and more. Attend this session to see how you can instantly gain these capabilities through the cloud. Plus, you'll learn how you can build and deploy an App in minutes in the IBM Bluemix environment.
Leigh Williamson has been working in the Austin, TX lab since 1988, contributing to IBM's major software projects including OS/2, DB2, AIX, Java, WebSphere Application Server, and the IBM Rational portfolio of solutions.
Check out Leigh Williamson's blog here and follow him on Twitter @leighawilli.
IBM's participation exemplifies the two major offerings of CDSE:2014:
During the business conference at CDSE:2014, thirty-six highly focused strategic and technical keynotes, breakout panels, and Q&A sessions will thoroughly explore cloud computing solutions, and ample opportunities will be provided for one-on-one networking with the major players in this space.
At eighteen co-located CDSE:2014 instructional workshops and special seminars facilitated by industry-leading speakers and world-class technical trainers, attendees will see, hear, learn, and master critical skills in sessions devoted to the unique challenges and opportunities facing developers, programmers, and solutions architects.
All aspects of cloud computing will be represented: storage, networking, applications, integration, and aggregation.
To learn more about conducting an instructional workshop, exhibiting, or sponsoring CDSE:2014, contact Don Buford, Executive Director, or Hank Woji, VP Business Development, at the CCA.
If you'd like to speak at this major industry event, please contact me at the DCIA.
Register now. Share wisely, and take care.
The Internet Is Officially More Popular than Cable in the US
Excerpted from Wired Report by Marcus Wohlsen
You can't call them "cable companies" anymore.
For the first time, the number of broadband subscribers with the major US cable companies exceeded the number of cable subscribers, the Leichtman Research Group reported today. Among other things, these figures suggest the industry is now misnamed. Evidently these are broadband companies that offer cable on the side.
To be sure, the difference is minimal: 49,915,000 broadband subscribers versus 49,910,000 cable subscribers. But even assuming a huge overlap in those numbers from customers who have both, the primacy of broadband demonstrates a shift in consumer priorities. Nearly all the major cable companies added broadband subscribers over the past quarter, for a total of nearly 380,000 new signups. Cable subscribers don't have to worry about TV as they know it going away any time soon. But cable is on its way to becoming secondary, the "nice to have" compared to the necessity of having broadband access.
Such a transition might suit the "cable companies" just fine. I first saw these numbers pointed out by Peter Kafka at Recode, who wrote: "Some smart people suggest that the cable guys would not be unhappy if most of their business moved over to broadband instead of video, since there are much better margins—and almost no competition—for broadband."
The better margins boil down to the fact that broadband is purely about access, while cable is about content. The crux of the cable side of the cable business is hatching deals with the makers of sports, news, and entertainment so there's something to send through the box. And the costs can be steep: ESPN, the priciest by far, tops $5 per subscriber per month.
With broadband, the cable companies don't have to put anything through those pipes themselves. They just have to be the plumbers. They may not like the way Netflix and its more than 36 million US subscribers are eating into their TV businesses. But Netflix and other streaming services are helping drive demand for broadband—a service cable operators can provide without having to serve up any content themselves at all.
What this means for the future of TV is still tough to predict. While these figures may suggest the inevitable transition to an Internet-dominated future, nearly 50 million cable subscribers don't appear ready to cut the cord just yet. Even with a plethora of on-demand options, people are still watching TV like they used to, which means a business model still based around ads and subscription fees. But that's still a loss of millions of cable subscribers over the past half-decade, while the number of broadband subscribers has climbed at a much faster clip.
Meanwhile, traditional TV as a format already is being engulfed by the open-endedness of the Internet. From mainstream streaming services like Netflix, Hulu, and Amazon Instant Video to niche sites like Funny or Die to YouTube celebrities—to name just some of the options that fall under entertainment—the kinds of moving pictures available and the ways to consume them have never been greater. Within this broader spectrum, cable as a concept could become just another niche, one channel among many as the insatiable Internet swallows everything it encounters.
China Telecom & IBM Team to Deliver Cloud Computing to Chinese Market
With consulting services from IBM and infrastructure support from China Telecom, businesses can build SAP clouds.
China Telecom and IBM have entered into a three-year agreement to help small and medium businesses (SMBs) implement secure, cost-effective and scalable SAP cloud-based applications.
Additionally, the cloud-based SAP applications will better enable China Telecom SMB customers that have implemented, or are planning to implement, SAP applications to reduce operating and application management costs with enterprise-ready applications. By taking advantage of cloud computing technology, businesses will be able to accelerate implementation while gaining the scalability and elasticity to meet their individual needs.
As a leading Chinese telecommunication company and the largest infrastructure network operator in China, China Telecom will manage clients' infrastructure that includes cloud platform resources, networking and mobile devices. IBM will integrate the software, hardware and end-to-end service capabilities to create a complete environment that supports SAP applications on the cloud.
Working with China Telecom, IBM will provide integrated and seamless management across all SAP architectures and delivery models. Clients can migrate and integrate new applications on the cloud, while maintaining and operating current applications.
Given IBM's extensive experience delivering successful cloud and SAP implementations to drive enterprise transformations across Greater China, IBM will provide the tools and skill education China Telecom requires to ensure support for their SMB clients throughout all phases of implementation. This includes testing, customization and ongoing issue resolution.
Under the agreement signed today, China Telecom and IBM will first focus on clients in the Guangdong province and then extend the project to such key areas as Yangtze River Delta, Pearl River Delta, Beijing and Tianjin.
"Today's announcement represents an important step in the Chinese cloud computing market that will provide one of the fastest growing segments within China greater access to cloud resources," said D.C. Chien, general manager and chief executive officer, IBM Greater China Group. "Our work with China Telecom will provide small-to-medium sized businesses in China with the means to more quickly deploy enterprise-grade cloud capabilities and assist them in driving innovation via the cloud."
Telefonica Proves Brocade Router Performs for NFV
Excerpted from Light Reading Report by Carol Wilson
Tests run by Telefonica in its labs show Brocade's software-based router can achieve 80Gbit/s throughput, matching the performance levels required for carrier applications and setting a benchmark that supports network functions virtualization (NFV) deployment.
Brocade Communications Systems provided Telefonica SA with a thumb drive containing the latest iteration of its Vyatta 5600 vRouter, and the carrier deployed that on a commercial off-the-shelf Intel based x86 server within a Red Hat KVM environment. Deployed as a single virtual machine, the Vyatta 5600 was able to support all of the server's available ports at line rate.
"In less than two hours, we deployed the Brocade Vyatta 5600 vRouter from a memory stick and completed our performance tests in our NFV Reference Lab," notes Francisco-Javier Ramon, head of Telefonica's NFV Reference Lab, in the press release. "These results are allowing us, as network operators, to aggressively change our perspective regarding what is possible with software-driven networking in order to accelerate the adoption and deployment of these revolutionary technologies."
By hitting the 80Gbit/s mark, Brocade actually exceeded its own original goal, which was to prove that a software router can support the 10Gbit/s performance that is mainstream in carrier environments, says Andrew Coward, VP of service provider strategy at Brocade.
"The performance of software networking products has been pretty abysmal," Coward concedes. "It's been less than one gig -- sometimes more like a couple hundred megs. For carrier-type applications, it became really important to have a much better performance, otherwise there is a significant disconnect between the 10-gig interfaces on most routers today and the software networking type product."
By over-delivering on this promise, Brocade believes it has created a software router that makes the price-performance curve look far more attractive for NFV, and enables a server to not only fill the 10Gbit/s pipes widely in use today, but also have processing power left over for applications and other network functions, Coward notes.
Intel's Xeon processor-based servers and its Data Plane Development Kit were key to the performance improvements, and Brocade rewrote its architecture around them, with drivers that make the most of the Intel processors, Coward says. The company provided the new software router to a number of its service provider customers for testing.
Telefonica was clearly pleased enough with the test results to announce them publicly -- an unusual move these days for many telecom service providers.
Coward says the Brocade Vyatta 5600 vRouter is in about 40 proofs-of-concept currently, but the next big step will be determining functions or applications that can go beyond the single-use test and be more repeatable.
Rackspace Expands Cloud Offerings and Verizon Retakes Speed Crown
Excerpted from San Antonio Business Journal Report by Mike Thomas
Rackspace took steps to broaden its appeal to next-generation app developers by integrating Redis, the open source in-memory key-value data store, into ObjectRocket, the database-as-a-service (DBaaS) platform the company acquired in 2013.
ObjectRocket will now offer fully-managed support for Redis as well as turnkey provisioning, administration and orchestration tools.
The addition of Redis support expands ObjectRocket's purview beyond its previous focus on supporting MongoDB, the open source, NoSQL database.
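For readers less familiar with the key-value model, the sketch below illustrates in plain Python the kinds of operations (set-with-expiry, get, atomic-style increment) that Redis exposes and that a managed service like ObjectRocket handles on a customer's behalf. The dict-backed class is a hypothetical stand-in for illustration only, not the actual Redis client or any ObjectRocket API.

```python
import time

# Minimal stand-in for the key-value operations Redis provides.
# A real deployment would use a Redis client library against a
# managed instance; this dict-backed class only mimics the model.
class KeyValueStore:
    def __init__(self):
        self._data = {}
        self._expiry = {}

    def set(self, key, value, ttl=None):
        # Store a value, optionally expiring after ttl seconds
        # (analogous to Redis SET with an expiry).
        self._data[key] = value
        if ttl is not None:
            self._expiry[key] = time.time() + ttl

    def get(self, key):
        # Return the value, or None if missing or expired.
        exp = self._expiry.get(key)
        if exp is not None and time.time() > exp:
            self._data.pop(key, None)
            self._expiry.pop(key, None)
            return None
        return self._data.get(key)

    def incr(self, key):
        # Increment a counter, creating it at zero if absent
        # (analogous to Redis INCR).
        self._data[key] = int(self._data.get(key, 0)) + 1
        return self._data[key]

store = KeyValueStore()
store.set("session:42", "alice", ttl=3600)  # cache entry with expiry
store.incr("pageviews")                     # counter-style usage
```

In contrast to MongoDB's document model, where a query navigates structured records, every operation here addresses a single key, which is what makes the in-memory model fast for caches, sessions, and counters.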
San Antonio, TX based Rackspace is a leading provider of cloud computing services.
Verizon Communications has regained the wireless speed crown it lost last year to AT&T after making substantial infrastructure investments in major cities like New York.
Dallas, TX based AT&T was able to snag the title of fastest wireless service provider last year after Verizon got bogged down with network congestion in some of its biggest markets.
Verizon was the fastest national service in the first half of the year and retained its No. 1 spot in overall quality, reliability, calling and data service, according to a study by RootMetrics released today. AT&T took the top spot in text messaging.
New York, NY based Verizon, which operates the largest wireless service in the US, has expanded capacity using AWS (Advanced Wireless Services) spectrum, making for a speedier service.
Verizon Promises to Get its Cloud Service Online in Early September
Excerpted from GigaOM Report by Barb Darrow
Verizon's promised new cloud infrastructure will go live the first week of September. Like some rivals, Verizon will offer base-level infrastructure for one price, with additional charges for layered services atop it.
Things have been pretty quiet on the cloud front from Verizon since last October, when the company made some pretty big promises for its brand new, built-from-scratch cloud. But, starting the first week in September, that cloud, which has been in beta for months, will be broadly available to paying customers.
The game plan has been tweaked a bit in the intervening months. "We've enlarged the scope of our next-gen cloud and included a managed service tier — which is one thing our customers have been asking for. And we've been on-boarding services from a brand-new console," said Siki Giunta, global SVP for Verizon Enterprise Solutions, in an interview. Giunta joined Verizon 5 months ago (well, 5 months and 23 days ago, but who's counting?) from CSC, where she directed that integrator's global cloud business.
"We have the base compute, object store, network for very competitive pricing and then we have a rate card for guided services, where we take over more of the traditional management like monitoring and patching but the customer still brings their own templates. And then we have premium where we do all the maintenance of the applications," she said.
This tiering of base level from higher-end services is becoming the norm for cloud providers. Rackspace and CenturyLink — which also has roots in the telecom universe — are doing similar price breakouts.
What is new is that Verizon will offer its cloud in tiers: Verizon Cloud Compute is the base platform as described last year, and a "unified" Verizon Cloud offering will layer service tiers and a new interface atop it, Giunta said. The promise of Amazon-like capabilities without the noisy-neighbor headaches that can afflict workloads on public clouds remains in place. And for the record, Verizon continues to field its legacy Terremark-based Enterprise Cloud, though the plan is to migrate those customers to the new infrastructure in time.
Giunta said she came to Verizon because she saw big possibilities in aggregating the company's various telecom and networking assets into a broad offering that can accommodate not just current business workloads but also the growing mass of next-gen applications that fall under the broad Internet of Things rubric. In those applications, machines often talk to machines (hence the M2M jargon) without human intervention, and data from all those devices, from Fitbits or Jawbones to jet engine sensors, gets aggregated and parsed.
"Verizon is a strong M2M player and we've aggregated 400 partners in that area. We see a trend that IoT creates the dynamic of a cloud that uses network and wireless network — and we feel strongly that we have a cloud that is IoT ready," she said. Verizon and its partners are migrating existing M2M applications to its cloud, she said.
Over the past few months, I had heard indirectly from Verizon insiders that there were some growing pains and glitches in the rollout of what is, after all, a very ambitious architecture. Giunta seemed to acknowledge as much, noting that while Verizon's new cloud is running for 500 customers in various stages of beta, "everything is great and hunky dory until they have to pay something." When billing starts, the rubber meets the road.
Gartner VP and Distinguished Analyst Lydia Leong said the gist of this news is that Verizon is now doing managed hosting on a more cloud-like platform than in the past. "They'll compete with Amazon on some deals, but I see this as more directly competitive with Rackspace, AT&T, and CenturyLink," she said.
DataDirect Networks Doubles Revenue in Enterprise Big Data Markets
DataDirect Networks (DDN) has achieved impressive revenue traction for the first six months of 2014, demonstrating a strong trajectory of profitable growth for the company heading into the second half of the year.
The company added more than 20 net new petabyte-class customers, including one of the world's largest banks, a leader in global oil and gas exploration, and a major Japan-based automobile manufacturer, increasing the storage capacity sold year-to-date to over 250PB.
In the first half of the year, DDN experienced more than 100 percent growth in its Enterprise Big Data and Object based business compared to the first half of 2013, driven by significant growth in Financial Services, Life Sciences and Energy Exploration markets.
Building on the company's first half momentum, which included new partnerships, expanded footprint, and industry recognition, DDN also opened a Paris Advanced Technology Center to enhance its European technology innovation.
As worldwide demand for DDN's high-performance, massively scalable storage for Big Data and cloud use cases grows, current customers continue to broaden their use of the company's technology offerings, while new customers including James Cook University, Japan's National Statistics Center (NSTAC), and the NSW Office of Environment & Heritage adopt DDN technology.
Coinciding with rapid sales growth, DDN has accelerated its worldwide employee hiring. The company has grown headcount 20 percent since Q1 and plans to add significant further headcount by year's end. Key hires include new additions to the company's technical, engineering, sales, marketing and quality assurance teams.
Building on its history of supporting the world's largest, most data-intensive environments, DDN continued to strengthen its technology partnerships in a number of areas notably in supporting object and cloud storage environments. In addition to joining the Active Archive Alliance, DDN also announced a partnership with Hyve Solutions to release the industry's first enterprise OCP-compliant webscale object storage appliance.
DDN also invested in strengthening its leadership team in the first half of 2014, adding Molly Rector as the company's first Chief Marketing Officer, Bob Merkert as Vice President of Federal Sales, and current Cisco Vice President and Treasurer Roger Biscay to its Board of Directors.
Huawei Gets inside SAP
Excerpted from Business-Cloud Report by Ian Murphy
Huawei and SAP have been growing their relationship for a while. Now Huawei has opened its own Innovation Center inside the SAP Partner-Port.
Huawei has been growing above the average for tech companies over recent years. Like Cisco, it has moved out of its traditional telecoms and router market to become a successful player in the integrated system market. That success has seen Huawei Enterprise grow to over $2.5 billion in just three years. Part of that success has been its relationship with other companies and its solid performance among the BRIC countries.
In March, at CeBIT, Huawei announced the FusionCube, its second-generation SAP HANA appliance, and plans to work with SAP on a range of other initiatives such as Smart City and Bring Your Own Device (BYOD). It should come as no surprise, then, that Huawei is one of the first companies to open its own Innovation Center in the new SAP Partner-Port.
According to Kevin Tao, President of Huawei Western Europe: "We look forward to building a comprehensive partnership with SAP to collaborate on technologies and innovations such as the SAP HANA platform, cloud data center technology, mobility, smart city, big data, machine-to-machine (M2M) and others to complement our respective strengths and capabilities."
This is more than just a sales promotion exercise for Huawei. It allows the company to hook into SAP's large enterprise customer base and work with those customers on specific business solutions. One of Huawei's key targets is for this to become its largest joint partner innovation center within five years. That is a very bold target, because it will mean achieving substantial successes in Europe through the SAP relationship. There is, however, a risk that in tying itself so heavily to SAP, Huawei closes doors to other potential partnerships.
One of the key challenges for Huawei here is that SAP has been working very hard to move everything, including SAP HANA, to the cloud. There has been a concerted move by SAP to work with a larger number of partners in both the appliance and pure cloud space. IBM, HP, Fujitsu, Hitachi Data Systems, and Cisco all have well-established SAP HANA appliances in the market, along with cloud infrastructure.
If Huawei is going to be in the top-tier SAP HANA club, it will have to bring something different to the table. One of those differences is likely to be Huawei's own Software Defined Networking (SDN) and Network Functions Virtualization (NFV) plans. While HP, IBM, and Cisco also have divisions and projects in these areas, it is really only Cisco that Huawei will be competing with inside telecommunications operators.
Another benefit for Huawei will be its perceived position inside the BRIC (Brazil, Russia, India, China) group of countries. Huawei is keen to be seen as a local supplier inside this trading bloc and while it is not untainted by suggestions of spying via its hardware, it has suffered no direct damage from the Snowden revelations unlike HP, IBM and in particular Cisco.
This looks like a good move for Huawei and it will be interesting to see how long it takes to achieve its goal of making this its biggest joint partner project.
IoT: Out of the Cloud & Into the Fog
Excerpted from Network Computing Report by Andrew Froehlich
Cloud computing architectures won't be able to handle the communication demands of the Internet of Things (IoT). The future is in fog computing.
By now, most IT organizations have embraced the concept of cloud computing and are using it in some capacity. But if grand predictions regarding the Internet of Things (IoT) turn out to be true, even the most advanced, distributed cloud architectures aren't going to be able to handle the IoT's data and communications needs.
That's where the idea of "fog computing" comes into play. It's a term coined by Cisco, but most major IT vendors are developing architectures that describe how the IoT will work by bringing the cloud closer to the end user -- similar to how fog is nothing more than a cloud that surrounds us on the ground.
The problem that IoT forward thinkers see with the current cloud architecture is that it relies heavily on centralized processing and on the bandwidth available between the edge device and the backend server. Most data in this model is sent to the cloud to be processed, leaving our edge devices as dumb portals into the cloud.
Though this architecture works well today, it falls apart when we're talking about adding billions of devices and microdata transactions that are incredibly latency sensitive. Instead of forcing all processing onto backend clouds and all IoT device intercommunication through a cloud intermediary, fog computing proposes that devices talk directly to one another when possible and handle many of their own computational tasks.
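The fog model described here can be sketched in a few lines: an edge node processes its own readings locally and sends only a compact summary (plus any anomalies) upstream, rather than streaming every raw reading to a backend cloud. This is an illustrative sketch, not any vendor's API; the function names and threshold are hypothetical.

```python
# Toy edge node: aggregate raw sensor readings locally and forward only
# a summary and flagged anomalies to the backend cloud, instead of
# shipping the entire raw stream upstream.

def summarize(readings, threshold):
    """Process readings at the edge; only this return value crosses the network."""
    avg = sum(readings) / len(readings)
    anomalies = [r for r in readings if r > threshold]
    return {"count": len(readings), "avg": avg, "anomalies": anomalies}

readings = [21.0, 21.4, 20.9, 35.2, 21.1]   # e.g., temperature samples
payload = summarize(readings, threshold=30.0)
print(payload["count"], round(payload["avg"], 2), payload["anomalies"])
# prints: 5 23.92 [35.2]
```

Five raw readings collapse into one small payload; at IoT scale, that difference is what keeps backhaul bandwidth and cloud round-trips from becoming the bottleneck.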
This evolutionary shift from the cloud to the fog makes complete sense to me. The original cloud boom began when mobile devices like smartphones and tablets were becoming all the rage. Back then, these devices were weak on computing power, and mobile networks were both slow and unreliable. Therefore, it made complete sense to use a hub-and-spoke cloud architecture for all communications.
But now that most of us are blanketed in reliable 4G coverage, and mobile devices rival many PCs in computational power, it makes sense to move from a hub-and-spoke model to one that resembles a mesh or edge computing architecture. Doing so eliminates the bandwidth bottlenecks and latency issues that would otherwise cripple the IoT movement in the long run.
So if you thought that cloud computing was the pinnacle of infrastructure designs for the foreseeable future, think again. If we're talking billions of devices and instant communication, current cloud models won't be able to handle the load. Fortunately, advances in mobile processing power and wireless bandwidth have allowed many to design a far more capable architecture that brings us out of the clouds and into the fog.
Internet of Things Is Overwhelming IT Networks
Excerpted from Baseline Magazine Report by Dennis McCafferty
By 2020, the Internet of Things (IoT) is expected to interconnect 26 billion computing devices in businesses, homes, cars, clothes, animals, and pretty much everything else, according to Gartner. That's a thirtyfold increase over the past five years.
While the potential for innovation is exciting, it's taking a toll on IT resources, according to survey research from Infoblox.
Many tech professionals surveyed said that any required deployments for the IoT will become part of their existing IT network, even though most said their network is already at capacity. It doesn't help, findings reveal, that the business side often does not keep the IT organization informed about their IoT-related projects.
"It's encouraging that IT professionals recognize the demands the Internet of things will make on their networks," says Cricket Liu, Chief Infrastructure Officer at Infoblox.
"But business units often get deep into the buying process before calling IT, sometimes forcing IT to scramble to provide support for devices that lack the full set of connectivity and security protocols found in established categories such as PCs, tablets and smart phones."
On the positive side, IT employees feel their companies are committed to providing the budget and staffing needed to accommodate IoT-related demands. A total of 400 IT professionals from the United States and the United Kingdom took part in the research.
Click here for the full report.
Microsoft Research May Have Solved Latency Issue for Cloud Gaming
Excerpted from ITWorld Report by Andy Patrizio
Many concepts of computing have moved to the cloud, but gaming has not been one of them. Even with the fastest pipe into your home, latency is inevitable, and who wants to die in a "Call of Duty" deathmatch because of lag? We get enough of that as it is with the software loaded on our PCs.
Cloud-based gaming would also help overcome the limits of console hardware, because it would require just a thin client to display the game rather than hefty hardware to render it. Displaying video is far less system-intensive than rendering each frame. Given how underpowered the Xbox One is, cloud-based rendering would help overcome its shortcomings.
But how do you get the rendered frames down the pipe to the gamer quickly? Microsoft Research may have a solution in a project called DeLorean. In a nutshell, it renders frames before an event occurs in the game, based on a number of variables, and then sends the correct set of frames down to your device.
A recently published white paper from Microsoft lays out the concept and the solution. Microsoft notes that cloud gaming could let people enjoy high-end graphics without needing a high-end GPU. However, cloud gaming is hindered by network lag; even latency as low as 60ms is enough to degrade the experience.
Microsoft calls its solution "speculative execution." It predicts future input, which is feasible because player behavior is fairly predictable, speculates on multiple possible outcomes, and compensates for errors. Microsoft also came up with a new form of bandwidth compression that uses the speculation component to take advantage of the similarity between consecutive frames.
With this, Microsoft was able to achieve playable cloud-based versions of "Doom 3" and "Fable 3," both framerate-intensive games, which were easily playable on thin clients despite a latency of over 250ms. Microsoft found that players preferred DeLorean over traditional thin clients and that DeLorean can successfully mimic a low-latency network.
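The speculation idea can be illustrated with a toy sketch (this is not Microsoft's actual DeLorean code, and the input set and render function are hypothetical): the server pre-renders a frame for each plausible next input, so when the player's real input arrives one round-trip later, the matching frame already exists and no extra render delay is added on top of the network latency.

```python
# Toy illustration of speculative execution for cloud gaming: render one
# frame per plausible next input *before* the input is known, then deliver
# whichever frame matches the input that actually arrives.

POSSIBLE_INPUTS = ["left", "right", "fire", "none"]

def render(state, player_input):
    # Stand-in for an expensive GPU render pass.
    return f"frame({state}, {player_input})"

def speculate(state):
    """Pre-render a candidate frame for every plausible input."""
    return {i: render(state, i) for i in POSSIBLE_INPUTS}

frames = speculate("tick42")   # done while the real input is still in flight
actual = "fire"                # the player's real input arrives one RTT later
print(frames[actual])          # already rendered, so the round-trip is hidden
# prints: frame(tick42, fire)
```

The real system also has to handle mispredictions (error compensation) and the bandwidth cost of shipping several candidate frames, which is where the similarity-based compression described above comes in.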
So when will we see it? As with other Microsoft Research projects, there is no release date; this is still a lab experiment. But it could herald a day when gaming, like Salesforce's CRM, is a SaaS experience rather than 5-10GB on your hard drive.
Mitigating Security Risks at the Network's Edge
Distributed enterprises with hundreds or thousands of locations face the ongoing challenge of maintaining network security.
With locations that typically process credit cards, distributed enterprises are at a particularly high risk of suffering data breaches.
This CradlePoint white paper provides strategies and best practices for distributed enterprises to protect their networks against vulnerabilities, threats, and malicious attacks.
Please click here for the white paper.
Netflix Inks Time Warner Cable Pact to Pay for Direct Internet Connections
Excerpted from Variety Report by Todd Spangler
Netflix has agreed to pay Time Warner Cable for guaranteed bandwidth to deliver its streaming-video service to the cable operator's broadband users, following similar deals with Comcast, AT&T and Verizon.
The No. 1 streaming service, which represents about one-third of all downstream Internet traffic in North America, now has deals with the country's top four broadband providers, which together represent about 68% of all high-speed Internet subscribers in the US.
"Time Warner Cable reached an agreement with Netflix in June, and we began the interconnection between our networks this month," a rep for the cable company said. Netflix confirmed that it has a deal in place with TW Cable.
Comcast is in the process of acquiring Time Warner Cable, pending regulatory reviews.
Netflix execs have said they will "reluctantly" pay ISPs interconnection fees, with the goal of delivering better-quality video to their mutual customers. But CEO Reed Hastings has said such payments represent an arbitrary "toll" for gaining access to a provider's customers, and the company has lobbied the Federal Communications Commission to reclassify broadband as a telecommunications service, which would give the agency authority to impose new price controls and enact other regulations.
Comcast and Verizon have defended their paid-peering agreements with Netflix, saying they are a standard way the Internet bandwidth market works. The broadband operators argue that Netflix is seeking to avoid paying its fair share of the cost of delivering Internet video, and that it is free to procure additional bandwidth from third-party content-delivery networks (CDNs).
News of Netflix's deal with TW Cable was first reported by GigaOM.
4 Things You Need to Know about Cloud Security
Excerpted from FutureGov Report by Medha Basu
Most government agencies are storing their data on the cloud, but this isn't without risks. Red Hat explains 4 things that you must know about cloud security.
1. Traditional security methods won't work.
Cloud computing requires a totally new approach to data security. Previously, agencies would physically isolate one network from another based on the levels of data classification and clearance. This approach is not suitable for cloud environments because that would require a cloud for every combination of data classification and employee clearance level.
Thankfully, cloud security measures have grown more accessible with open source alternatives. For example, the US National Security Agency developed the Security-Enhanced Linux (SELinux) project, which allows any Linux system to separate information based on confidentiality and integrity requirements.
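SELinux enforces this separation in the kernel via security labels. As a rough illustration of the underlying label model (a toy sketch, not SELinux code or policy syntax), a mandatory "no read up" check on confidentiality levels looks like this:

```python
# Toy model of mandatory label-based separation (the idea behind
# multi-level security as enforced by SELinux), not actual SELinux policy.
# A subject may read an object only if its clearance dominates the
# object's classification: the classic "no read up" rule.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

def can_read(subject_clearance, object_label):
    """Mandatory check: clearance must be at least the object's level."""
    return LEVELS[subject_clearance] >= LEVELS[object_label]

print(can_read("secret", "confidential"))    # True: clearance dominates
print(can_read("unclassified", "secret"))    # False: read up is denied
```

Because the check is mandatory rather than discretionary, no user or application can opt out of it, which is what lets a single Linux system safely host data at multiple classification levels instead of requiring one physically isolated network per level.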
2. Standardize security approach across agencies.
Governments cannot leave cloud security details up to individual agencies. Since cloud resources are shared among many agencies, those agencies need to be able to communicate with each other quickly in the event of a security threat. Without a uniform approach to security across agencies, that coordination breaks down.
The Security Content Automation Protocol is a set of rules that allows governments to specify and standardize how they would like their systems to be secured and to maintain the security configurations. Since it is based on open standards, it can operate across a number of cloud products. These same rules can also be applied to the government's on-premise data centers so that there is consistency across existing and new infrastructure.
3. Be prepared for the worst.
Even with the best efforts of cloud providers, governments have to be prepared to deal with any risk of data falling into the wrong hands. While this risk exists even with on-premise servers, governments' perceived lack of control over cloud environments makes this risk a real barrier to public sector adoption of cloud.
Open source security solutions allow agencies to encrypt their data in the cloud environment, so that even if the physical storage is compromised, the data is still protected. Moreover, network encryption tools prevent eavesdropping on data and keep information within their own security enclaves.
4. Trust international public sector standards.
Even with all these security measures in place, governments are unsure about which products and vendors to trust. Civil servants are reluctant to let go of the servers sitting in the back of their offices and trust a third-party to take care of their data in another location.
International public sector security standards such as the Common Criteria make things easier for these government officials. Authorizing governments evaluate and certify the security features of products. In Asia Pacific, the Common Criteria certificate is recognized by governments in Australia, India, Japan, Malaysia, New Zealand, Korea and Singapore.
Red Hat ensures that its open source cloud solutions meet the security standards of the governments it works with, Glenn West, Cloud Business Unit Manager, Asia Pacific, at Red Hat, told FutureGov. Red Hat's latest release of Linux is now being evaluated against the highest security standard in Common Criteria. "Red Hat is big about taking everything from the operating system all the way up, and getting it all public sector certified. That's one of the differences between pure open source solutions and Red Hat," he said.
X-as-a-Service (XaaS): What the Future of Cloud Computing Will Bring
Excerpted from Rickscloud Blog by Rick Blaisdell
In this post, I'll discuss XaaS: what it is and why you might want to consider using it.
First, what is XaaS? Is this just more marketing fluff? Why do we need to define yet another model to describe cloud services? I contend that XaaS is a legitimate term, and that it is useful for describing a new type of cloud service: one that makes use of IaaS, PaaS, and SaaS, all neatly delivered in one package.
Such packages are intended to fully displace the delivery of a commodity IT service. My favorite example of XaaS is desktop as a service, or DaaS. A service provider might assemble a DaaS product from the following:
Servers to run Virtual Desktop Infrastructure from a provider such as Terremark (IaaS)
An office suite such as Microsoft Office365 (SaaS)
Patching and maintenance services
A physical endpoint such as a Chromebook or thin client device
The organization providing DaaS would design, assemble, and manage the product out of best-of-breed offerings in this case. The customer would pay one fee for the use of the product and have the all-important "one throat to choke" for the delivery of the product.
At GreenPages, we see the emergence of XaaS (such as DaaS) as a natural evolution of the market for cloud services. This sort of market behavior is nothing new for other industries in a competitive market. Take a look at the auto industry (another one of my favorite examples).
When you purchase a car, you are buying a single product from one manufacturer. That product is assembled from pieces provided by many other companies — from the paint, to the brake system, to the interior, to the tires, to the navigation system, to name a few.
GM or Ford, for example, doesn't manufacture any of those items itself (though they did in days past). They source those parts from specialist providers. The brakes come from Brembo. The interior is provided by Lear Corp. The tires are from Goodyear. The navigation system is produced by Harman. The auto manufacturer specializes in the design, marketing, assembly, and maintenance of the end product, just as a service provider does in the case of XaaS.
When you buy an XaaS product from a provider, you are purchasing a single product, with guaranteed performance, and one price. You have one bill to pay. And you often purchase XaaS on a subscription basis, sometimes with $0 of capital investment.
You can download John's "The Evolution of Your Corporate IT Department" eBook here
So, secondly, why would you want to use XaaS? Let's go back to our DaaS example. At GreenPages, we think of XaaS as one of those products that can completely displace a commodity service that is delivered by corporate IT today.
What are commodity services? I like to think of them as the set of services that every IT department delivers to its internal customers. In my mind, commodity IT services deliver little or no value to the top line (revenue) or bottom line (profit) of the business.
Desktops and email are my favorite commodity services. Increased investment in email or the desktop environment does not translate into increases in top-line revenue or bottom-line profit for the business. Consider that investment includes financial and time investments. So, why have an employee spend time maintaining an email system if it doesn't provide any value to the business?
Two key questions:
1. Does investment in the service return measurable value to the business?
2. In the market for cloud services, can your IT department compete with a specialist in delivering the service?
When looking at a particular service, if your answer is "No" to both questions, then you are likely dealing with a commodity service; email and desktops are, again, my favorite examples. Coming back to the original question: you may want to source commodity services to specialist providers in order to focus your investment (time and money) on services that do return value to the business.
This is a guest post from John Dixon, from Journey to the Cloud.
Coming Events of Interest
Cloud Connect China — September 16th-18th in Shanghai, China. This event brand was established in Silicon Valley (US) in 2008. Last year, it was introduced into China for the first time, providing all-dimensional cloud computing solutions through paid conferences and exhibitions.
International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. IDCS 2014 is the sixth conference in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovation has laid the foundation for the rapid growth of the Internet.
CLOUD DEVELOPERS SUMMIT & EXPO 2014 — October 1st-2nd in Austin, TX. CDSE:2014 will feature co-located instructional workshops and conference sessions on six tracks facilitated by more than one-hundred industry leading speakers and world-class technical trainers.
CloudComp 2014 — October 19th-21st in Guilin, China. The fifth annual international conference on cloud computing. The event is endorsed by the European Alliance for Innovation, a leading community-based organization devoted to the advancement of innovation in the field of ICT.
International Conference on Cloud Computing Research & Innovation — October 29th-30th in Singapore. ICCRI:2014 covers a wide range of research interests and innovative applications in cloud computing and related topics. The unique mix of R&D, end-user, and industry audience members promises interesting discussion, networking, and business opportunities in translational research & development.
GOTO Berlin 2014 Conference — November 5th-7th in Berlin, Germany. GOTO Berlin is the enterprise software development conference designed for team leads, architects, and project managers, and is organized "for developers by developers". It presents new technology and trends in a non-vendor forum.
PDCAT 2014 — December 9th-11th in Hong Kong. The 16th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT 2014) is a major forum for scientists, engineers, and practitioners throughout the world to present their latest research, results, ideas, developments and applications in all areas of parallel and distributed computing.
Storage Visions Conference — January 4th-5th in Las Vegas, NV. The fourteenth annual conference theme is: Storage with Intense Network Growth (SWING). Storage Visions Awards presented there cover significant products, services, and companies in many digital storage markets.