March 3, 2014
Volume XLVII, Issue 4
Saluting CCE:2014 Charter Sponsors
The DCIA salutes charter sponsors and exhibitors for signing up early to support CLOUD COMPUTING EAST (CCE:2014).
CCE:2014 is the must-attend strategic summit for business leaders and software developers in our nation's capital, and initial backers include Apptix, Aspiryon, Axios Systems, Clear Government Solutions, CSC Leasing Company, CyrusOne, IBM, Oracle, SoftServe, and VeriStor Systems.
Please contact Don Buford, CEO, or Hank Woji, VP Business Development, at the CCA to learn more about attractive conference exhibition and sponsorship opportunities.
This important gathering of thought leaders and first movers will thoroughly examine the current state of adoption and the outstanding challenges affecting two major and increasingly related sectors of the economy, whose principals are currently engaged in migrating to the cloud.
CCE:2014 will focus on the gCLOUD (The Government Cloud) and the hCLOUD (The Healthcare Cloud).
Jointly presented by the DCIA & CCA, CCE:2014 will take place on Thursday and Friday, May 15th-16th, at the DoubleTree by Hilton Hotel in Washington, DC.
Plenary sessions will feature principal representatives of such leading organizations as Amazon Web Services, Google, IBM, and Verizon, providing delegates with a real insider's view of the latest issues in this space from those leading our industry's advancement.
gCLOUD speakers will include representatives of such organizations as ASG Software, Clear Government Solutions, DST, HP, IBM, Ipswich Public Library, NASA, NetApp, The City of New York, QinetiQ-NA, SAP America, Tech Equity, Unitas Global, Verizon, Virtustream, and WSO2.
hCLOUD speakers will include representatives of such organizations as AVOA, BP LOGIX, BrightLine, Dell, DICOM Grid, Johnson & Johnson, Level 3, MultiPlan, NTP Software, Optum, ServerCentral, SYSNET Intl., and Vuzix.
Other featured speakers will include authors, analysts, industry observers, and representatives of such organizations as ActiveState, Aspera, BUMI, CDAS, Edwards Wildman Palmer, Expedient, Intuit, Juniper Networks, Kwaai Oak, M*Modal, Mobily, Numecent, The PADEM Group, Rackspace, SOA Software, Stratus Technologies, Trend Micro, Trilogy Global Advisors, V2Solutions, VeriStor, Visionary Integration Professionals, and WikiPay.
To review conference topics and apply to join the speaking faculty for this major industry event, please click here, or contact DCIA CEO Marty Lafferty directly.
Federal IT Reform Bill Passes House
Excerpted from InformationWeek Report by William Welsh
On Tuesday, the US House of Representatives passed a bill by voice vote that contains the most far-reaching changes to the federal IT procurement system in almost two decades. It could save the federal government as much as $2 billion annually by changing wasteful buying practices.
The bipartisan Federal Information Technology Acquisition Reform Act (FITARA) embodies several measures designed to boost the role of agency CIOs. Under FITARA, government agencies would have a single CIO, appointed by the President, with greater authority over IT operations within component agencies.
FITARA would also require government agencies to eliminate duplicative or overlapping software IT assets, and to curtail the purchase of new software licenses until an agency's needs exceed the number of existing unused licenses. Agencies would also be required to make stronger business cases for their acquisition plans and to track and report their progress in consolidating data centers.
"FITARA puts in place needed changes to IT acquisition and management, including increasing the authority of agency CIOs, promoting data center optimization, and recognizing the importance of a highly skilled IT workforce," said Mike Hettinger, Senior VP for Public Sector and Federal Government Affairs with the trade group TechAmerica, in a letter the trade group released to thank Congressman Darrell Issa (R-CA) for his work on the legislation.
Issa, who chairs the House Oversight and Government Reform Committee, cosponsored the bill with Subcommittee on Government Operations ranking member Gerry Connolly (D-VA). "Good software saves billions of dollars and countless lives and countless hours if it works," Issa said. "Bad or poorly done software can frustrate the American public and deprive them of the very product or service they expected to receive. This is a significant and timely reform that will enhance both defense and non-defense procurement."
If the bill is signed into law, it could save the government upwards of $2 billion a year, according to IT industry estimates, by eliminating unused and underused software licenses and eradicating non-compliant software that incurs financial penalties for the federal government.
"We know from experience that as much as 25% of the government's software budget could be saved by implementing industry best practices and technology around software license optimization," said Jim Ryan, COO at software optimization firm Flexera Software. "There is no need for this massive waste of taxpayer dollars to continue when the means to solve the problem are known and readily available to government agencies."
Significantly, the legislation requires federal agencies to adopt proven private-sector practices that employ new technologies and use analytical tools to improve and manage IT systems, said Tom Schatz, President of the watchdog group Citizens Against Government Waste.
"Section 301 of FITARA requires federal agencies to optimize their software licenses by knowing the software licensing requirements of their workforce prior to purchasing or renewing licenses, thus avoiding overpaying for software licenses they do not need or being subject to penalties for using more software licenses than they purchased," Schatz said.
The House bill is divided into sections that address management of IT resources, improve data center optimization, and eliminate duplication and waste in IT acquisition. Other sections seek to strengthen IT management and to improve transparency and competition in IT solicitations.
The IT industry had strong objections to the original legislation introduced in 2012, concerned that it created unnecessary red tape and that it put proprietary software on an uneven playing field against open source software. Lawmakers made significant changes to it before including it in the fiscal year 2014 National Defense Authorization Act. Although industry has some remaining concerns, it will support the bill, Hettinger told Issa.
Report from CEO Marty Lafferty
The DCIA asks for your support of an important measure — the Email Privacy Act (HR 1852) — advancing now through the US House of Representatives.
This bill would end warrantless searches of email and sensitive data stored in the cloud, and it is critical to continued industry advancement.
Introduced by Congressmen Kevin Yoder (R-KS) and Jared Polis (D-CO) in May, HR 1852 now has 180 co-sponsors.
It needs 218 votes for passage in the House.
HR 1852 would reform the now seriously outdated Electronic Communications Privacy Act (ECPA), which was written more than a quarter of a century ago.
ECPA was forward-looking when it was signed into law in October 1986 — considering that the World Wide Web didn't even exist back then.
But the law was enacted at a time when landline voice calls, not cloud storage and mobile-phone location data, were the central focus.
Applying its privacy standards to new technologies simply doesn't work.
For example, the law doesn't address mobile-phone location tracking; it allows the government to seize emails without a warrant; and it is unclear how it applies to newer online services like social networks, search engines, and cloud storage services.
This gap between yesterday's law and today's technologies makes each of us vulnerable to abuse.
ECPA contains loopholes that law enforcement has already exploited to conduct warrantless searches of Americans' email and other digital data.
The government should be explicitly required to go to a judge and get a warrant before it can read our email, access private files we store online, or track our location using our mobile phones.
Because that's not currently the case, you need to act.
Last December, over 100,000 people signed the We the People petition for ECPA reform, requiring a White House response that has yet to come. Now it's critical that we communicate with Congress.
As a Digital Due Process (DDP) Member, the DCIA urges you to take advantage of the "Call Your Representative" feature on the Vanishing Rights website and voice your support of HR 1852 now.
If your elected Representative is not yet a sponsor of this bill, ask him or her to become one; and if he or she already is, express your thanks.
The Electronic Frontier Foundation (EFF) and Demand Progress also provide background information and automated tools for contacting Members of Congress.
TechFreedom has also blogged on the issue; and reddit, DuckDuckGo, and Free Press are lending their support through social media.
You can also share the Center for Democracy & Technology's (CDT) Facebook post.
Join us in demanding a privacy upgrade now. Share wisely, and take care.
Government-as-a-Platform: Cloud Computing Inside the Beltway
Excerpted from Forbes Report by Joe McKendrick
Cloud computing is more than just the latest in a series of attempts to pare down government spending — it's a gateway to unprecedented innovation in a sector known more for bureaucratic inertia. That's the view of Dr. Rick Holgate, President of the American Council for Technology (ACT), an independent advisory group and community of government managers and employees. In his view, cloud is one of three forces of innovation sweeping US federal agencies, along with restructuring and opening up to private industry partnerships.
I recently caught up with Holgate, who is CIO of the Bureau of Alcohol, Tobacco, Firearms & Explosives, to talk about the disruptive forces that are reshaping the federal government. Cloud computing is the first and foremost of such forces — "bringing about more modern and more flexible, agile ways of delivering services and technology to our customers," he says. "We're really thinking differently about how we build services and build solutions."
How is the adoption of cloud progressing inside the Beltway, among government agencies?
While launched with great promise several years ago, a recent Accenture survey of 286 federal IT executives suggests that progress has been slow to date. The initial effort, the 2011 Federal Cloud Computing Strategy known as "Cloud First," required federal agencies to consider cloud-based solutions anytime new solutions are sought.
More recently, a Government Accountability Office (GAO) report found that only one of 20 cloud migration plans submitted by agencies to the GAO in 2012 was complete, and only 10 percent of agencies had migrated the bulk of their IT assets to the cloud. The Accenture study shows about 30 percent have implemented cloud strategies.
Holgate says there is a sizable degree of momentum toward cloud computing at the federal level, but agrees progress has been uneven. Progress with cloud has "depended on agencies' missions and the level of comfort they can achieve with cloud computing," he says. "Many agencies, plus the General Services Administration, have moved a number of their traditional internal IT services, including email and customer relationship management, to cloud-based commercial providers."
A notable cloud consumer has been the Central Intelligence Agency, which is "contracting with cloud providers to meet their needs for huge amounts of infrastructure on demand," he continues.
Cloud is part of a growing "Government-as-a-Platform" initiative that has been developing since those earlier days of Cloud First. "Government has been opening up access to its data, allowing others to innovate using that data," says Holgate. "Some examples of that are weather data and GPS data, which have really fostered huge amounts of innovation in the private sector."
Cloud lowers the barrier to standing up new capabilities, he continues. "We can start looking at huge volumes of data, and be more thoughtful and analytic about the way we do business. Cloud-type and shared environments enable us to stand up those types of capabilities much more easily."
Opening up access to data is also helping government agencies improve their internal operations. "Technology can shine a light on what had been historically inefficient business practices within the federal government," he explains. This spans a range of government activities, from procurement costs to healthcare to student loans.
"Previously, we didn't have an easy way to see into those transactional data sets. Now we can look at that data and we can understand where we're seeing inefficiencies and where we're incurring expenses." In addition, cloud makes it easier for agencies to deliver commodity-level services, such as email, to users — areas "we don't want to spend a whole lot of effort and capital and attention to delivering, while allowing us to shift our focus to some of those things that are more transformative and innovative."
The ability to focus more on value-added activities is a key selling point of cloud, Holgate states — it's part of the transformative effect technology is having on the way government agencies are run.
"Innovation comes from all sorts of places in the organization," he says. "You need to make sure you have a way to surface the great ideas that occur within all organizations, and make sure that you bring all of those voices of creativity and innovation to the table. More and more, they're coming from many different sources — because everyone has exposure to mobile capabilities and cloud-based platforms."
Government Cybersecurity Guidance Wanted By Private Sector
Excerpted from InformationWeek by Elena Malykhina
Despite the government's poor track record on cybersecurity, most IT leaders in the US and elsewhere believe government can help the private sector create a solid security strategy and protect organizations against internal and external threats, concludes a new global study of IT leaders by Dell.
Dell interviewed 1,440 IT leaders in 10 countries from public and private sectors to gauge their awareness and preparedness for a new wave of threats in IT security. The study found nearly three-quarters of respondents had experienced a security breach within the last 12 months, confirming the growing seriousness of the security threats IT leaders face.
The scope of threats, both known and unknown, poses a multitude of new risks for organizations, the study found, especially as enterprises expand their reliance on cloud computing and allow employees to bring their own devices to work.
Organizations need to restructure their IT processes and collaborate more with other departments to prepare for the next security threat, the study concludes, although that sentiment varies, from 85% among US respondents to 43% in the UK and 45% in Canada. The study also surveyed IT leaders in France, Germany, Italy, Spain, India, Australia, and China.
But the majority — 76% globally and 93% in the US — agreed that to combat today's cyber criminals, organizations must protect themselves both inside and out. Threats come from all perimeters. They are often caused by poorly configured settings or permissions or by ineffective data governance, access management, or usage policies, according to the study.
Companies that have experienced a security breach dedicate an average of 18% of their IT budget to security, according to the survey. Half of IT leaders surveyed believe security training for new and current employees is a priority and two-thirds say they have increased funding for security training and education. Nearly three-fourths (72%) of US respondents, and 54% worldwide, say they increased spending for monitoring services over the past year.
Nearly 90% of respondents believe that government involvement is necessary to help the private sector determine cyberdefense strategies. In fact, 78% of IT leaders in the US believe that the federal government plays a positive role in protecting organizations against internal and external threats. More than half (53%) said the government's role in security is helping operational effectiveness, with only 17% claiming that the government is hindering effectiveness.
Despite positive attitudes toward the government's involvement, federal agencies continue to face security challenges, many of which were highlighted in a February 4 report published by Senator Tom Coburn (R-OK). According to the report, "The Federal Government's Track Record on Cybersecurity and Critical Infrastructure," there were more than 48,000 cyber "incidents" involving government systems, which agencies detected and reported to the Department of Homeland Security (DHS) in fiscal year 2012.
The US government has spent at least $65 billion to secure its computers and networks since 2006, yet "weaknesses in the federal government's own cybersecurity have put at risk the electrical grid, our financial markets, our emergency response systems, and our citizens' personal information," Coburn said on his website.
Those who participated in the Dell study reported that, on average, it took seven hours to detect a breach. "On government networks, seven hours is much too long, potentially providing cyber criminals with access to critical national security information," Paul Christman, vice president of public sector for Dell Software, said in an interview with InformationWeek Government.
A week after Coburn's report, the National Institute of Standards and Technology (NIST) released its "Framework for Improving Critical Infrastructure Cybersecurity," a catalog of best practices and standards for companies to use in developing security programs. The framework follows an executive order to protect privately owned critical infrastructure. DHS is launching a program to support private sector adoption of the framework by offering access to the department's cybersecurity experts.
Christman said that's exactly what companies are looking for. "There is a clear need for strong leadership and guidance from public sector organizations in helping the private sector," he said.
HealthCare.gov Cloud Computer System Cost 5X More than Expected
Excerpted from Hit and Run Report by Peter Suderman
The cloud computing contract for the federal government's Obamacare exchange came in a lot higher than originally planned, reports NextGov.
The government's contract with Terremark, Verizon's cloud division, had already quadrupled from $11 million when it was first awarded in 2011 to $46 million at the time of HealthCare.gov's disastrous launch in October 2013.
That included a $9 million adjustment just days before launch when testing revealed the cloud could only support 10,000 concurrent HealthCare.gov users rather than the expected 50,000.
CMS ordered an additional $15.2 million worth of cloud services from Terremark between the launch date, when most users were unable to access key portions of the site, and November 30th, when officials declared the site was performing at an acceptable level, according to a "justification for other than full and open competition" document posted on Thursday.
That contract adjustment paid for added cloud storage plus firewall upgrades, additional software and various other services.
Once again, it suggests that the federal government didn't know what it was getting into when the exchanges launched last October.
Asked about the increased cost, a federal health official tells NextGov that "if the additional services were not added urgently, the exchanges would not function as designed and citizens would continue to have issues using the marketplace."
In other words, the original plan had been for a system that wouldn't work.
Community Healthcare Network Provides Reliable, Quality Care
Community Healthcare Network (CHN), a group of nonprofit community health centers providing medical, dental and social services in neighborhoods throughout New York, has selected Microsoft Office 365 and email in the cloud to keep its business and healthcare teams running smoothly and to help enable seamless communication in crisis situations.
"Prior to selecting Office 365, in emergency situations, our health teams were more at risk of losing communication with each other and not being able to help those most in need," said Catherine Abate, CHN's president and CEO. "With Office 365, if a weather emergency hits, we're able to use the power of the cloud to help maintain email and server connectivity, keep our operations moving forward, and stay in contact with our staff members and other organizations so we can focus on helping patients."
When Hurricane Sandy hit, CHN lost its on-premises email communications, which wreaked havoc on an organization that relies heavily on email to coordinate with health teams and manage healthcare operations. Once the storm was over and email was restored, CHN decided to look at moving the entire organization to the cloud permanently to help avoid similar adverse situations in the future. Office 365 not only provided virtually anytime, anywhere access to an increasingly mobile workforce, it also helped ensure HIPAA compliance and a commitment to maintaining the highest security and privacy standards. In addition, Office 365 has helped improve the efficiencies of CHN's electronic medical records (EMRs).
"Even since the beginning of 2014, we've experienced some significant winter weather here in New York, but despite the weather, we've been able to keep everything up and running thanks to the cloud," said Jason Pomaski, CHN's assistant vice president of Technology. "Office 365 has really transformed how our organization is able to provide care and respond to emergencies by keeping us connected in a highly secure, user-friendly way."
CHN provides confidential care to men, women and children, regardless of citizenship status or ability to pay. The nonprofit has a mobile workforce that serves four boroughs with 11 health centers, along with a mobile medical van, and it helps more than 75,000 people per year. CHN is also called upon by state and local governments as a responder to natural disasters and potential health emergencies, which makes reliable communications imperative.
Microsoft and CHN made the announcement as part of the annual HIMSS Conference, the nation's largest health IT event. From empowering easier collaboration among healthcare teams to securing virtually anytime, anywhere access to vital health information, Microsoft Office 365 capabilities are better equipping healthcare organizations to deliver care for patients.
A Checklist for Negotiating Government Cloud Service Contracts
Excerpted from FierceCIO Report by David Weldon
Government agencies at the federal and state level are lured to cloud computing by the promise of cost savings and expanded services. But there are also risks associated with moving operations and applications to the cloud.
A recent report by researchers at the University of North Carolina at Chapel Hill looks at contracting issues that government agencies face with adopting cloud services. The report was written for the IBM Center for the Business of Government.
"As with any form of government contracting, there are risks to be considered," notes an article at Governing. "Do governments lose control over their data? Do they risk losing access to it? Are they locked into a single vendor? The key to success is writing and negotiating a strong contract."
With that in mind, the report offers several tips on how government IT executives can write and manage an effective cloud services contract. Study authors Shannon Tufts and Meredith Weiss developed a 12-part checklist of the issues that should be addressed:
- Pricing "typically includes initial or upfront costs, maintenance and continuation costs, renewal costs and volume commitments."
- Infrastructure security encompasses "the supplier's responsibilities in the areas of information security, physical security, operations management, and audits and certifications."
- Data assurances include "ownership, access, disposition, storage location and litigation costs."
- Governing law should specify how and where legal disputes will be settled.
- Service-level agreements should specify parameters as well as remedies and penalties for non-compliance.
- Outsourced services should be the responsibility of the vendor, including informing the government agency of any services that are outsourced.
- Functionality provisions should cover not only the service being purchased but also a requirement of advance notice if a change is required.
- Disaster recovery and business continuity processes and safeguards should be clearly spelled out.
- Merger and acquisition activity impacts should be covered in the event the vendor becomes involved in a corporate buy-out.
- Compliance with laws is an obvious requirement, in addition to language related to warranties and liabilities.
- Terms and conditions at the time of the contract signing should be posted so that they can be referred to at any time.
- Contract renewal and termination clauses "should specify how data will be retrieved/returned upon termination by either party."
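A checklist like the one above lends itself to a machine-checkable form. As a rough sketch (the section names and helper function below are hypothetical illustrations, not part of the report), a contract-review team could encode the twelve items as data and flag what a draft contract has yet to address:

```python
# Hypothetical encoding of the 12-part checklist as data, so a review
# team can track which clauses a draft cloud contract already covers.
CHECKLIST = [
    "pricing", "infrastructure_security", "data_assurances",
    "governing_law", "service_level_agreements", "outsourced_services",
    "functionality", "disaster_recovery", "merger_acquisition",
    "compliance_with_laws", "terms_and_conditions", "renewal_termination",
]

def missing_clauses(contract_sections):
    """Return checklist items not yet addressed by the draft contract."""
    covered = set(contract_sections)
    return [item for item in CHECKLIST if item not in covered]

# A draft that so far covers only three of the twelve items:
draft = ["pricing", "governing_law", "service_level_agreements"]
gaps = missing_clauses(draft)  # nine items still to negotiate
```

Nothing in the report requires this form, of course; the point is simply that each item is a discrete, verifiable obligation rather than general guidance.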
Why Smart Cities Need Cloud Services
Excerpted from InformationWeek Report by Mary Jander
The cloud approach, in which providers outside city government deliver a technological platform for gathering and mining data and producing city applications over the public Internet or a virtual private network, has become the favored means for municipalities to move to the next level.
"While it's possible for cities to enable smart technology without using cloud-based services, it's unlikely you will be able to do so in any meaningful way," wrote Brian Robinson on GCN last month.
Why is this? Why can't cities just get smarter without any involvement with cloud services? A closer look turns up several reasons.
For one thing, most city networks aren't equipped to work with real-time input from sensors, smartphones, electric and water meters, or other sources of input about the functions of cities. Doing so requires specialized hardware and software to gather data, as well as applications to make sense of it. That calls for a lot of expertise that usually isn't in a city CIO's bag of tricks.
Cloud suppliers, on the other hand, are typically in a position to forge the alliances required to get sensors and other real-time data sources to work with city applications. They have to, after all, in order to offer services that are widely marketable.
Read the rest of this article on Future Cities.
2014 Cloud Computing Trends Show Positive Signs
Excerpted from Midsize Insider Report by Robert Lawson
Cloud computing trends for 2014 appear to show positive signs, according to reports. How are these trends playing out so far?
The business and financial news website DailyFinance reported predictions for the cloud in 2014, and things are looking good. The article provided an analysis of the ways in which migration to the cloud will become more of a mainstream process this year.
A Growing Cloud Market
This news is not particularly surprising because reports have been indicating an obvious and ongoing shift to cloud computing, even from the consumer side. It has been referred to as the "consumerization of IT," and 2014 will likely bring forth the "capitalization of IT." Businesses and customers will be tied together through systems that exist in virtual machines and in servers managed by providers.
The market for computing and storage of this type has been spreading like wildfire, and it has been segmented by specialists and some companies overseas. However, small and midsize businesses in the United States will likely be very selective when choosing a provider. They will insist on a vendor that is flexible and provides services that align with their business needs.
A Change in How Businesses View the Cloud
The year 2013 could be thought of as the year of testing and research when it comes to migrating to the cloud. Many companies reportedly spent a great deal of time using demos and license-free versions of software to run development staging and other testing phases. These versions lacked tech support, but IT managers saw an opportunity to learn and grow into cloud computing before making an important purchase decision.
The bottom line is that companies want to move more production into the cloud in order to be more competitive and responsive to their customers. They also want to save money on data centers and IT personnel costs.
The cloud has become a powerful and versatile tool for businesses across industries and of every size, including small, midsize and enterprise organizations. Key trends are that prices are falling and features are expanding, and these developments bode well for new adopters aiming to get ahead in the rapidly evolving global marketplace.
The Best Cloud Computing Companies and CEOs to Work for in 2014
Excerpted from Forbes Report by Louis Columbus
2014 continues to be a year marked by accelerating hiring cycles across nearly all cloud computing companies.
Signing bonuses of $3K to $5K for senior engineers and system design specialists are becoming common, and the cycles from screening to interviews to offers are shortening. The job market in the cloud computing industry is leaning in favor of applicants who have a strong IT background in systems integration, legacy IT expertise, business analysis and, in many positions, programming as well.
One of the most common questions I receive from readers is which companies are the best to work for. I've put together the following analysis based on the latest Computer Reseller News (CRN) list of The 100 Coolest Cloud Computing Vendors of 2014.
Using the CRN list as a baseline, I compared each company's Glassdoor.com scores for the percentage of employees who would recommend the company to a friend and the percentage of employees who approve of the CEO. You can find the original data here.
Many companies on the CRN list have few or no entries on Glassdoor, and they were excluded from the rankings. You can find the excluded companies here. You can view the selected company rankings here.
The highest rated CEOs on Glassdoor as of February 23rd include the following:
- Jeremy Roche of FinancialForce.com (100%)
- Robert Reid, Intacct (100%)
- Randy Bias, Cloudscaling (100%)
- Sridhar Vembu, Zoho (98%)
- James M. Whitehurst, Red Hat (96%)
- Larry Page, Google (95%)
- Christian Chabot, Tableau Software (95%)
- Aneel Bhusri, Workday (94%)
- Bill McDermott & Jim Hagemann Snabe, SAP (93%)
- Marc Benioff, Salesforce (93%)
- David Friend, Carbonite (93%)
Software Tool Reduces Cost of Cloud Computing
Excerpted from Scientific Computing Report by Tom Abate
We hear a lot about the future of computing in the cloud, but not much about the efficiency of the data centers that make the cloud possible. In those facilities, clusters of server computers work together to host applications ranging from social networks to big data analytics.
Data centers cost millions of dollars to build and operate, and buying servers is the single largest expense the centers face. Yet at any given moment, most of the servers in a typical data center are only using 20 percent of their capacity.
Why? Because the workload can vary greatly, depending on factors such as time of day, the number of users logged in or sudden, unexpected demand. Having excess capacity is the usual way to deal with this peak-demand issue.
But as cloud computing grows, so will the cost of keeping such large cushions of capacity. That's why two Stanford University engineers have created a cluster management tool that can triple server efficiency while delivering reliable service at all times, allowing data center operators to serve more customers for each dollar they invest.
Christos Kozyrakis, associate professor of electrical engineering and of computer science, and Christina Delimitrou, a doctoral student in electrical engineering, will explain their cluster management system, called Quasar, when scientists who design and run data centers meet for a conference in Salt Lake City, beginning March 1, 2014.
"This is a proof of concept for an approach that could change the way we manage server clusters," said Jason Mars, a computer science professor at the University of Michigan at Ann Arbor.
Kushagra Vaid, general manager for cloud server engineering at Microsoft, said that the largest data center operators have devised ways to manage their operations but that a great many smaller organizations haven't.
"If you can double the amount of work you do with the same server footprint, it would give you the agility to grow your business fast," said Vaid, who oversees a global operation with more than a million servers catering to more than a billion users.
How Quasar works takes some explaining, but one key ingredient is a sophisticated algorithm that is modeled on the way companies such as Netflix and Amazon recommend movies, books and other products to their customers.
How it works
To grasp what's new about Quasar, it's helpful to think about how data centers are managed today.
Data centers run applications such as search services and social media for consumers or data mining and large-scale data analysis for businesses. Each of these applications places different demands on the data center and requires different amounts of server capacity.
The cloud ecosystem includes software developers who run applications, and cluster management tools that decide how to apportion the workload and assign which applications to which servers. Before making such assignments, the cluster managers typically ask developers how much capacity these applications will require. Developers reserve server capacity much as you might reserve a table at a restaurant.
"Today data centers are managed by a reservation system," said Stanford's Kozyrakis. "Application developers estimate what resources they will need, and they reserve that server capacity."
It's easy to understand how a reservation system lends itself to excess idle capacity. Developers are likely to err on the side of caution. Because a typical data center runs many applications, the total of all those overestimates results in a lot of excess capacity.
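The arithmetic behind that idle capacity is easy to sketch. The numbers below are purely hypothetical (they are not from the article), but they show how cautious per-application reservations compound into cluster-wide utilization in the neighborhood of the 20 percent figure cited above:

```python
# Illustrative sketch with hypothetical numbers: how cautious per-app
# reservations aggregate into low cluster utilization.

apps = [
    # (true peak demand in server units, safety factor the developer applies)
    (10, 3.0),   # developer reserves 3x the true peak
    (25, 2.0),
    (8,  4.0),
    (40, 1.5),
]

reserved = sum(peak * factor for peak, factor in apps)
actual_peak = sum(peak for peak, _ in apps)

# Average load sits well below even the true peak; assume 40% of peak here.
average_load = 0.4 * actual_peak

utilization = average_load / reserved
print(f"Reserved capacity: {reserved:.0f} server units")
print(f"Average load:      {average_load:.0f} server units")
print(f"Utilization:       {utilization:.0%}")
```

Even though each developer's safety margin looks reasonable in isolation, the reservations sum to far more capacity than the cluster ever uses at once.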
Kozyrakis has been working with Delimitrou, a graduate student in his Stanford lab, to change this dynamic by moving away from the reservation system.
Instead of asking developers to estimate how much capacity they are likely to need, the Stanford system would start by asking what sort of performance their applications require. For instance, if an application involves queries from users, how quickly must the application respond and to how many users?
Under this approach the cluster manager would have to make sure there was enough server capacity in the data center to meet all these requirements.
"We want to switch from a reservation-based cluster management to a performance-based allocation of data center resources," Kozyrakis said.
Quasar is designed to help cluster managers meet these performance goals while also using data center resources more efficiently. To create this tool the Stanford team borrowed a concept from the Netflix movie recommendation system.
If you liked this application …
Before delving into the algorithms behind Quasar, understand that servers, like some people, can multitask. So the simplest way to increase server utilization would be to run several applications on the same server.
But multitasking doesn't always make sense. Take parenting, for instance. A mom or dad might be able to wash dishes, watch television and still spell a word to help a child with homework. But if the question involved algebra, it might be wise to dry your hands, turn off the TV and look at the problem.
The same is true for software applications and servers. Sometimes differing applications can coexist on the same server and still achieve their performance goals; other times they can't.
Quasar automatically decides what type of servers to use for each application and how to multitask servers without compromising any specific task.
"Quasar recommends the minimum number of servers for each application and which applications can run best together," said Delimitrou.
This isn't easy.
Data centers host thousands of applications on many different types of servers. How does Quasar match the right applications with the right server resources? By using a process known as collaborative filtering — the same technique that sites such as Netflix use to recommend shows that we might want to watch.
Applying this principle to data centers, the Quasar database knows how certain applications have performed on certain types of servers. Through collaborative filtering, Quasar uses this knowledge to decide, for example, how much server capacity to use to achieve a certain level of performance, and when it's OK to multitask servers and still expect good results.
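The article does not publish Quasar's exact algorithm, but the collaborative-filtering idea can be sketched in miniature: briefly profile a new application on a couple of server types, find previously profiled applications with similar behavior, and use their known scores to predict how the new application will perform on server types it has never run on. All application names, server types, and scores below are invented for illustration:

```python
import math

# Known performance scores (0..1) of previously profiled apps on each
# server type. Every name and number here is hypothetical.
known = {
    "web-frontend":    {"type-A": 0.9, "type-B": 0.4, "type-C": 0.7},
    "batch-analytics": {"type-A": 0.3, "type-B": 0.9, "type-C": 0.5},
    "key-value-store": {"type-A": 0.8, "type-B": 0.5, "type-C": 0.9},
}

def cosine(u, v):
    """Cosine similarity over the server types both profiles share."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[s] * v[s] for s in shared)
    nu = math.sqrt(sum(u[s] ** 2 for s in shared))
    nv = math.sqrt(sum(v[s] ** 2 for s in shared))
    return dot / (nu * nv) if nu and nv else 0.0

def predict(partial_profile, server_type):
    """Similarity-weighted average of known apps' scores on server_type."""
    num = den = 0.0
    for profile in known.values():
        if server_type not in profile:
            continue
        w = cosine(partial_profile, profile)
        num += w * profile[server_type]
        den += w
    return num / den if den else 0.0

# A new app, profiled only briefly on two server types, gets a predicted
# score for an untried server type from the apps it most resembles.
new_app = {"type-A": 0.85, "type-B": 0.45}
score = predict(new_app, "type-C")
```

Because the new application's short profile most resembles the web-frontend and key-value-store workloads, its predicted type-C score lands near theirs rather than near the batch-analytics score, which is the essence of the recommendation-style matching described above.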
Thomas Wenisch, a computer science professor at the University of Michigan, is intrigued by the Quasar paper, in which Kozyrakis and Delimitrou show how they achieved utilization rates as high as 70 percent in a 200-server test bed, compared with the current typical 20 percent, while still meeting strict performance goals for each application.
"Part of the reason the Quasar paper is so convincing is that they have so much supporting data," Wenisch said.
Increasing data center efficiency will be essential for cloud computing to grow. These installations draw so much electricity that escalating demand threatens to overtax power plant output. So throwing more servers into the data center isn't the answer, even if money were no object.
But while they pursue higher efficiency from multitasking servers, data center operators must deliver consistent levels of service. They can't allow some customers to suffer because the servers are processing the wrong mix of tasks, a shortcoming known as "tail latency."
"The explosive growth of cloud computing is going to require more research like this," said Partha Ranganathan, a principal engineer at Google who is on the team that is designing next-generation systems and data centers. "Focusing on resource management to address the twin challenges of energy efficiency and tail latency can have significant upside."
Kozyrakis and Delimitrou are currently improving Quasar to scale to data centers with tens of thousands of servers and manage applications that span multiple data centers.
"No matter how well we manage resources in one data center, there will always be cases that exceed its capacity," Delimitrou said. "Offloading parts of work to other facilities in an efficient manner is key to achieving the flexibility that cloud computing promises."
Huawei & Telefonica Jointly Accomplish PoC Testing
Excerpted from Light Reading Report
Telefonica, the world's leading international carrier, and Huawei, a leading global information and communications technology (ICT) solutions provider, recently announced that they have completed phase II Proof of Concept (PoC) testing for the UNICA Infrastructure project, Telefonica's future-proof ICT infrastructure.
This testing includes 188 test cases that cover the maintenance and operation of the Virtual Data Center, Software Defined Networking (SDN), cloud deployment of Telco services, and compliance of the OpenStack infrastructure.
On the Telco services side, Huawei's virtual IP Multimedia Subsystem (IMS) and value-added service (VAS) Cloud have been verified on the platform. According to the test results, the UNICA Infrastructure can manage data centers distributed across regions in a centralized manner, provide a unified Virtual Data Center for efficient data sharing among internal departments and OBs, and rapidly provision Telco services and internal applications through the Virtual Data Center.
In this way, the UNICA Infrastructure helps Telefonica build a unified, collaborative, and highly efficient next-generation ICT infrastructure, and supports network transformation. In addition, the Huawei SoftCOM infrastructure, including the Distributed Cloud Data Center (DC2) and SDN network, closely matches the UNICA Infrastructure.
Telefonica demonstrated the UNICA Infrastructure PoC at the Mobile World Congress (MWC) in Barcelona in February 2014, with Huawei's virtual IMS shown as a Telco service platform during the demonstration.
With the development of Telco networks and information technologies, carriers can decrease costs, increase revenues, and significantly improve service capabilities. The UNICA Infrastructure aims to establish a general ICT infrastructure for deploying new network functions and service platforms in a standardized, multivendor way. To promote this network transformation, Telefonica and Huawei began planning in July 2012.
Drawing on its deep understanding of IT and CT services, the Huawei SoftCOM infrastructure integrates self-developed ICT technologies such as a cloud operating system, SDN networking, data center management, and Network Functions Virtualization (NFV). This infrastructure matches the majority of the UNICA Infrastructure's requirements and has passed all test cases of the phase II PoC testing.
BitTorrent Revamps Android App with Easier File Sharing
Excerpted from Make Use Of Report by Saikat Basu
BitTorrent has brought a new look and feel to its Android app. BitTorrent Version 2 introduces significant changes, such as the ability to select and download individual files within a torrent, along with a completely redesigned interface for simpler file downloading and easier sharing of BitTorrent Bundles. Similar UI changes are also coming to BitTorrent's desktop application and uTorrent.
Downloading is now far more intuitive: you can select individual files within a torrent, either before or during the download. The mobile app lets you pick a download location and track the progress of individual downloads with nicely colored indicator bars. You can also now delete torrents only, or remove both torrents and files.
Mobile usage of BitTorrent has crossed 50 million installs in two years. It is a significant milestone in our increasing interaction with smartphones as go-to devices for everything.
BitTorrent has announced that bundles will be integrated into the desktop torrent applications; making it easier for users to discover and choose what to download. Creatives can tap into the wide userbase for promotion of their work, and also for revenue. BitTorrent bundles as a publishing medium still needs a push from its present Alpha stage, though. Making it a core feature for desktop BitTorrent and uTorrent users can only help.
Verizon Cloud Adds CloudBees PaaS
Excerpted from InformationWeek Report by Charles Babcock
The revamped Verizon Cloud has added a platform-as-a-service to its Verizon Cloud Compute and Cloud Storage services. Its CloudBees PaaS is noted for its depth of Java development services and middleware, and for its ability to deploy applications to different environments.
Verizon announced at Interop in New York October 3 that it was remaking its basic cloud offering to become more competitive in compute and storage services. The revamped offerings became available alongside its existing Verizon Terremark enterprise services, both a hosted service and a self-service cloud.
Though still in beta, the Verizon Cloud is offered from seven data centers that have been re-equipped with SeaMicro hardware. The Verizon centers boast of green operating practices. The SeaMicro servers are three times as dense as their predecessor servers and use less electricity, according to Verizon CTO John Considine. The abilities both to operate efficiently and to host the development of next-generation applications are becoming more important competitive factors in cloud services. VMware, a newcomer to public cloud services, offers Cloud Foundry PaaS.
CloudBees was founded in 2010 in Woburn, Mass., by the former CTO of JBoss, Sacha Labourey, who served several years as CTO of middleware at Red Hat after Red Hat acquired the open source firm. CloudBees has been available since launch to run on Amazon Web Services, OpenStack clouds (including HP's), and on VMware's Cloud Foundry.
Applications developed on CloudBees may be deployed to any of these environments or back into the enterprise, if their owners prefer. Verizon, in selecting CloudBees, risks being the development site of applications that will end up running in some other cloud. But hybrid cloud operations appeal to many enterprise developers, particularly the ability to deploy some applications in a public cloud and some back behind the enterprise firewall, a practice that CloudBees supports.
"We are working with best-in-class enterprise technology companies to bring additional value to our core availability, performance, and security," said Considine on February 19th.
CloudBees has taken a more direct DevOps approach to the marketplace than earlier PaaS providers. Its platform includes support for deployment via open-source Jenkins, geared to provide "continuous delivery" of freshly developed code to rapidly evolving production applications. CloudBees includes a Jenkins Operations Center. CloudBees can also establish update centers with designated masters in charge of pushing code into production.
Both CloudBees and Apprenda, another PaaS software maker, have provided versions of their platform designed to run on the enterprise premises as opposed to only being available through a public cloud service.
The arrival of the CloudBees Java platform on Verizon is timely, because Verizon and Oracle have announced they will soon make both the Oracle database and Oracle Fusion middleware available on Verizon Compute Cloud. Fusion is middleware for running Java applications. Oracle has owned Java since its acquisition of Sun Microsystems.
"This deal represents two market leaders coming together to create a compelling cloud offering," said Oracle President Mark Hurd on January 10 in making that announcement. No launch date has been announced. Once available, it too will be a beta service.
Octoshape to Power the Education Sector
Research and Educational institutions across the world are switching to Octoshape, an industry leader in cloud-based streaming technology, for the linear, high quality delivery of their educational-based programming to multiple connected campuses, devices and beyond.
Octoshape's patented Infinite HD stream acceleration technologies, which include a cloud based and/or on premise infrastructure, achieve consistent, high quality video playback across any device. The technology is used by educational institutions to efficiently deliver content across campuses for on-net delivery of content to students as well as for off-campus delivery of content to mobile devices, tablets and web browsers. Educators, students, parents and university alumni can stay in touch with the schools while experiencing a new way of watching live and on-demand programming.
Located in the Metropolitan Area of Guadalajara and comprised of eight separate campuses across the State of Jalisco, the University of Guadalajara is the second oldest university in Mexico and the fifth oldest in North America with an enrollment of more than 200,000 students. The University recently launched "Channel 44," an online channel dedicated to offering culture, news, documentaries and infotainment. Channel 44 will now also be available over mobile devices and tablets, using Octoshape Infinite HD services.
"It was exceedingly important to us that our programming was distributed, to an ever-growing audience, with the same level of quality and scale as demanded by major international broadcasters," said M.Sc Gabriel Torres Espinoza, Director of the University System of Radio and Television. "Octoshape Infinite HD has enabled us to grow our audience over the Internet and via smart devices while ensuring the highest quality and scale to meet the demand. We are very pleased with the result."
"Octoshape's strong adoption across the Educational space is a testimony to the common need our clients have," said Michael Koehn Milland, CEO of Octoshape. "Our technologies not only enable educational institutions to exceed their high quality standards, but, as they are rolled out across campus, also enable the efficient delivery of linear video without the need to deploy a native multicast infrastructure."
Another major customer of Octoshape, the National University of Cuyo, which has more than 40,000 students, 4,500 teachers and 1,300 academic support personnel, recently launched "Senal U", the first public, free digital channel in the region offering cultural and science-based programming and analysis. In order to extend the content beyond the university campus and make it available to other cities in Latin America, the university contracted with Octoshape for its stream acceleration needs. Octoshape's Infinite HD enables programs like "Radio Universidad" and "Edicion UNCuyo" for high quality global distribution across multiple devices.
"While researching various methods and vendors to deliver our live video streaming needs, we tested the performance of several suppliers," said Diego Pistone, Head of Streaming at Senal U. "Only Octoshape was able to achieve high quality streaming, in HD, at scale and without buffering, interruptions or delays. The world now has an open window into the Mendoza community via Senal U."
Roughly 10,000 miles away, on the Asian continent, the private educational institutions Singapore American School (SAS) and the International School Manila (ISM) offer curricula targeted at expatriates and serve students of all grade levels.
"To better serve the family of our international students, we have adopted Octoshape's live streaming platform," said Alexander van Iperen, IT Technical Director, for the International School Manila. "The parents and family members of the students are now able to enjoy graduation and other important ceremonies from afar without having to compromise on quality."
SAS is the largest international school in the world, located on one campus, with an enrollment of approximately 3,800 students. Requirements for both educational institutions were to be able to plan and deliver frequent, live streamed events to the diverse student body. Cost containment was important but given the high standards of the governing bodies, delivery of a high quality product was critical.
Coming Events of Interest
Interop Las Vegas — March 31st to April 4th in Las Vegas, NV. The leading independent technology conference and expo series designed to inspire and inform the world's IT community. New in 2014: Cloud Connect Summit and the InformationWeek Conference.
CLOSER 2014 — April 3rd-5th in Barcelona, Spain. The Fourth International Conference on Cloud Computing and Services Science (CLOSER 2014) sets out to explore the emerging area of cloud computing, inspired by recent advances in network technologies.
NAB Show — April 5th-10th in Las Vegas, NV. From broadcasting to broader-casting, NAB Show has evolved over the last eight decades to continually lead this ever-changing industry. From creation to consumption, NAB Show has proudly served as the incubator for excellence — helping to breathe life into content everywhere.
Media Management in the Cloud — April 8th-9th in Las Vegas, NV. This two-day conference provides a senior management overview of how cloud-based solutions positively impact each stage of the content distribution chain, including production, delivery, and storage.
CLOUD COMPUTING EAST 2014 — May 15th-16th in Washington, DC. Three major conference tracks will zero in on the latest advances in the application of cloud-based solutions in three key economic sectors: government, healthcare, and financial services.
International Conference on Internet and Distributed Computing Systems — September 22nd in Calabria, Italy. The IDCS 2014 conference is the sixth in its series to promote research in diverse fields related to Internet and distributed computing systems. The emergence of the web as a ubiquitous platform for innovation has laid the foundation for the rapid growth of the Internet.