May 16, 2011
Volume XXXV, Issue 2
Amazon Chief Holds Court in Yonkers
Excerpted from Marketing Daily Report by Karl Greenberg
Jeff Bezos, President, CEO, and Chairman of the Board at Amazon, held forth before a full house at the Yonkers, NY headquarters of Consumer Reports during its ShopSmart symposium this week. He spoke on everything from feedback abuse to the future of online marketing and the importance of a corporate vision that transcends Amazon.
The largest online retailer posted $9 billion in revenue in the first quarter of this year alone. The company, whose first edition of the Kindle sold out in five hours, has done something new with the latest version: a cost structure partly offset by ads. The company is also moving into cloud computing with its Cloud Music business.
Bezos said the most interesting trend in online retail is mobile. "You will see more and more over time that people will buy from tablet computers. That's very exciting for us. It gives us a new environment to experiment and invent in."
He said the company's Windowshop for iPad represents the company's efforts to create a shopping channel for devices one can hold with two hands and use on the go. One element of Windowshop is a virtual directional controller on the lower right side of the tablet pane allowing one's thumb to skate across a matrix of product offerings. "It will become a very good shopping tool," said Bezos. "The key is that as we continue to move to a better bandwidth on mobile devices, as processors get faster, the apps can become more and more snappy. We will be innovating in that area."
An editor from Consumer Reports noted that people have criticized the Amazon customer service experience for its difficulty in reaching a live person. "Are they valid criticisms?" he asked. "And what is Amazon doing?"
Bezos responded that the best customer service is when the customer never needs to call you. "The best customer service is when it just works. And that is one of the primary metrics that we follow and have improved every year since we have been in business," he said. "Customer contacts per units sold -- we endeavor to drive that down everywhere we can. The number-one contact has always been, 'Where's my stuff?' We have driven that way down."
Bezos said he has been influenced in at least a couple of ways by what Japanese corporations have done. He said, for example, that the company has improved service by doing its own version of Toyota's operating philosophy of "kaizen", or continuous improvement. "It's a bunch of different things, not one big panacea. We find a defect, and find the cause. We use Toyota's five whys: we ask 'why did it happen' five times as a heuristic to get back to the root cause. Really, the only contact we have should be, 'I just thought I should thank you guys.'"
The company CEO's other role model from Japan is Sony's founder, because of the larger vision he set for the company, which helped transform not only Sony but the Japanese consumer-product industry. "My role model is the guy who founded Sony right after World War II, Akio Morita. He said, 'We are going to make Japan known for quality.'"
Amazon's larger goal? "We want to raise the entire bar on customer experience, not just customer service. Customer service means it's already too late," he said.
Amazon, Bezos said, has a program in which every employee has to spend two days in the customer service trenches, which he said is excruciating because it involves dealing with problems that are tangled within other problems, so that by solving the first you create a second one. "Since we have eliminated most of the problems that make easy contacts, the ones that remain are excruciating. There is a special kind of hell called snowball contacts where we mess up something and in the process of fixing it we mess up the fix. You have to hire a very special person to do that job day in and day out."
To succeed long term, companies - Amazon included - must have a willingness to invent, according to Bezos. He said that such a drive also implies a willingness to fail and be misunderstood, perhaps for a long period of time, "even if you actually succeed." He added that criticism is to be expected, "both sincere and from those with a vested interest in 'the old way.'"
As for trends, he said he's generally way more focused on what's not going to change than on trying to discern what will, "because we can build plans around that. Things that don't change rapidly are customer needs. I know 10 years from now customers will want low prices, fast delivery, selection, and choice. We can build a strategy around that. We can build a flywheel: a low-cost structure, finding the root cause of defects and eliminating them. If I say, 'We want to build a low-price model 10 years from now,' we can work on that."
Sony Compensates for PlayStation Network Outage
Excerpted from Digital Media Wire Report by Mark Hefflinger
Sony will provide PlayStation Network account holders, whose personal data and possibly credit card information were exposed in a hacker attack, free identity-theft protection services from Debix.
Subscribers will also receive a free month of service, plus one free day for every day past a month that the network remains offline. Sony said on its blog that it "will likely be at least a few more days" before PlayStation Network services - which have been down for over three weeks now - are restored.
Report from CEO Marty Lafferty
For the third consecutive week, we're obliged to report to DCINFO readers about high-profile outages experienced by large companies that provide cloud computing software and infrastructure services.
Three weeks ago, we reported that a relatively small subset of Amazon Web Services (AWS) incurred downtime due to an error in one of its zones affecting social websites such as Foursquare, Quora, and Reddit. While Amazon's retail customers did not experience the outage directly, popular consumer sites using its EBS service didn't function properly. Amazon corrected the issue, apologized to affected clients and provided service credits, and took steps to prevent a recurrence and bolster related customer communications.
Last week, we noted that Sony suffered a breach of its PlayStation Network, which is used to play multi-player games, support various online offerings, and to access third-party products, such as Netflix. Sony's site was malevolently hacked and user account information stolen. In response, Sony took down its network, brought in third-party experts to help analyze the vulnerability, developed a comprehensive action plan to correct it and restore the network, and announced a consumer credit program.
Now, this week, both Google and Microsoft faced online service failures, offering an additional reminder that cloud computing is continuing to experience growing pains - attributable more to explosive growth than any common architectural or systemic flaw - and basically has yet to achieve the degree of stability expected from utilities like power companies.
For Google, the issue was its consumer blogging service, Blogger, which was inaccessible or slow on Thursday. Google has owned and operated Blogger since 2003 as a free, ad-supported service, offered in a manner similar to its Google Docs and Gmail. Earlier this week, Google rolled out a maintenance release for Blogger that resulted in its users being locked out of their accounts. Google's engineers worked to restore service as quickly as possible.
A Blogger Service Disruption update offered four entries, starting with, "We have rolled back the maintenance release from last night and as a result, posts and comments from all users made after 7:37 am PDT on May 11, 2011 have been removed. Again, we apologize that this happened and our engineers are working hard to return Blogger to normal and restore your posts and comments."
Overnight updates promised "We're making progress" and "We expect everything to be back to normal soon."
On Friday, Blogger began restoring posts and the service is now operating normally. In an apologetic blog post, Blogger technology leader Eddie Kessler attributed the problem to data corruption.
Microsoft has been experiencing problems for the past few days with its Business Productivity Online Suite (BPOS), a set of online applications that includes Exchange Online, SharePoint Online, Office Communications Online, and Office Live Meeting.
On Tuesday morning, the company's BPOS-S Exchange service had trouble dealing with malformed e-mail traffic.
"Exchange has the built-in capability to handle such traffic, but encountered an obscure case where that capability did not work correctly," explained Dave Thompson, Corporate VP of Microsoft Online Services in a blog post. "The result was a growing backlog of e-mail." The backlog lasted several hours for some customers, but was resolved.
Then on Thursday, malformed e-mail again tripped up BPOS-S Exchange, resulting in the delay of some 1.5 million messages. This second backlog was also resolved in a matter of hours.
The e-mail issues were compounded by an unrelated DNS server problem early Thursday morning, which, for about three hours, prevented US customers from using Outlook Web Access, and also had some impact on Microsoft Outlook and Microsoft Exchange ActiveSync devices.
Thompson said, "As a result of Tuesday's incident, we feel we could have communicated earlier and been more specific. Effective today, we updated our communications procedures to be more extensive and timely. We understand that it is critical for our customers to be as fully informed as possible during service impacting events."
Thompson added that Microsoft will continue to rely on its Service Health Dashboard to communicate about issues affecting its online suite of services. Microsoft's dashboard, unlike Google's publicly accessible Apps Status Dashboard, is accessible only to registered customers.
The common lesson from these incidents is that a hybrid strategy, with local storage and local content creation and editing tools supplementing cloud solutions, is advisable. If local storage or service fails, you can access your data and applications from the cloud. Conversely, if the cloud service fails, you can access them locally. But if you rely exclusively on either local or cloud-based solutions, you're more vulnerable to failure.
While the outages experienced by Amazon, Sony, Google, and Microsoft - and most importantly their customers - reminded us that these services are still young and evolving, the DCIA does not believe they will negatively impact the continually accelerating adoption of cloud computing services.
The benefits accruing to customers of these services are too compelling. And the problems are being corrected expeditiously, with valuable lessons learned quickly applied to improved service and communications.
Because cloud providers' businesses heavily depend on their service levels, the companies are taking extraordinary measures to prevent recurrences.
And despite the downtime, cloud providers remain better, faster, more reliable, and more specialized in the services that they provide than what most institutional clients provide for themselves or can replicate with alternative approaches.
The greatest outcome of these recent incidents will be a very rapid maturation of cloud vendors, to the substantial benefit of all industry participants. Share wisely, and take care.
Verizon's Incoming CEO Sees Big Future in Cloud Computing
Excerpted from St. Petersburg Times Report by Robert Trigaux
Running communications giant Verizon, number 18 on this year's Fortune 500, demands skills spanning a technological century.
While Verizon still services copper-wire-connected telephones little changed from the days of Alexander Graham Bell, it also runs an advanced wireless phone network and competes for high-speed Internet access and TV services with its FiOS fiber optic cable system.
Now it's pushing into the red-hot business of "cloud computing" - applications and services housed and offered over the Internet - with the recent $1.4 billion purchase of a Miami business called Terremark.
All this is under way as new (think Microsoft buying Skype) and old (think AT&T buying T-Mobile) players seem to multiply daily. Longtime CEO Ivan Seidenberg, who helped create Verizon, is stepping down later this year.
His successor will be company President Lowell McAdam, who was in downtown Tampa on Thursday to visit the Verizon staff and chat with the local press.
McAdam, 56, acknowledged Verizon suffered several years ago from overextending itself, seeking scale in its costly businesses while annoying too many customers with poor service. Those problems are on the mend, McAdam said.
McAdam, polished and thoughtful, touched on many topics. Some highlights follow.
The economy: Verizon slowed its expansion plans in Florida as the recession left so many homes and condo projects vacant. It's expensive to enter a new neighborhood and hard to sell FiOS services when nobody's home. "I do not see the economy getting worse but building continuously," McAdam said. Verizon's key targets will be condos, apartment buildings and office buildings, where there are concentrations of potential customers in one place.
FiOS expansion: While competing mostly with cable provider Bright House Networks, FiOS still is not available in all parts of the market, including portions of St. Petersburg. McAdam suggested it's coming. But slowly.
Wireless: Remarkably, given all the buzz, McAdam did not even mention Verizon offering Apple's iPhone this year. But the executive did speak of the 4G (higher speed) wireless network and this summer's arrival of tiered data pricing on cell phones.
Easing rules on landline phone service: While Verizon focuses on the new, its legacy business - old school copper-wire landline phone service - comes with many obligations. New state legislation would lift limits on rate hikes for home phones and reduce regulators' ability to handle consumer billing complaints.
That's all good news to McAdam, who said "hats off" to Florida Governor Rick Scott and argued that the free market "will more effectively regulate bad behavior."
AT&T Plans to Invest up to $1 Billion in Cloud Services
Excerpted from TelecomTV Report
AT&T announced yesterday that it is accelerating plans to deploy global network-based cloud, mobility, and network sourcing solutions to companies across a range of industries.
It plans to invest almost $1 billion in 2011 to deploy next-generation services for businesses (which will be part of its previously announced $19 billion capex budget for 2011), said John Stankey, President and CEO of AT&T Business Solutions.
"We continue to invest significantly in cloud-based, mobility, and network sourcing solutions because customers are increasingly recognizing that transformative services like these increase productivity, improve operational effectiveness, and lower costs," he said.
AT&T intends to invest in key focus areas, including enterprise mobility applications and cloud-as-a-service (CaaS) enhancements, while also rolling out platforms, systems, and e-capabilities to enhance business customer support. The telco is targeting its investment to companies and institutions in industries such as manufacturing, retail, hospitality, healthcare, and automotive.
Andrew Edison, head of AT&T's operations in EMEA, explained that, "We are seeing strong growth in Europe and this investment will help underpin that. In 2010 we saw growth of around 8% - numbers which have been driven by the acquisition of new clients; demand from our existing customers for services that help their own business transformation; and also the sale of value-added services to our customers."
In 1Q 2011, AT&T said it added 1.6 million "emerging devices", a category of connected and embedded computing devices such as tablets, netbooks, and laptops. More than 12 million of these devices are now connected to its network, and it reports that the use of mobile applications has tripled since 2009.
Cloud Computing Is "Picking up Steam"
Excerpted from Computer World UK Report by Bernard Golden
Last week marked the second OpenStack Design Summit. OpenStack, if you're not familiar with it, is an open-source project founded through a joint effort and code contribution of NASA and Rackspace; however, the project has grown rapidly and has many more participants today. Companies participating in the OpenStack project include Cisco, Dell, NTT, Citrix, and many others.
The energy at the conference was quite amazing and attendance went well beyond what the organizers expected. In the interest of full disclosure, I chaired the Service Provider track, which brought together presenters from AT&T, KT (Korea Telecom), and other companies rolling out OpenStack offerings. I expect that this track will become a fixture at future Design Summits, as many service providers throughout the world will be interested in a low-cost, high-quality open source-based cloud computing software stack.
Beyond the technical presentations there were a couple of items that I found really interesting. They both applied well beyond OpenStack itself and offered insights and opportunities for users no matter what cloud infrastructure is used.
The first item was the keynote presentation by Neil Sample, VP Architecture for eBay. eBay is considering Rackspace as a platform for its use of public cloud (although Sample was careful to note that eBay is also considering Azure as well).
What was fascinating about Sample's presentation was that he walked the audience through eBay's thinking on the topic and offered real financials about the numbers driving its decision.
The fundamental reality confronting eBay is that it has extremely spiky computing use. Even after taking all the obvious, straightforward actions (e.g., move non-critical computing to off hours to reduce peak load; move remaining excess computation load to off-hours locations to take advantage of unused capacity), eBay still faces peaky load that has, in the past, required it to own more capacity than it typically uses.
So eBay set out to investigate how it could leverage public cloud computing to reduce its computing costs. eBay spends about $80 million per year on data centers, and each "computational unit" (its normalized measure) runs it around $1.07. What eBay found was that, for a broad range of public cloud computing costs, it could reduce its total spend significantly. In fact, even if the public cloud cost for a comparable computational unit was up to four times eBay's internal cost - in other words, even if a computational unit from a cloud provider cost $4.28 - eBay would still save money. A lot of money.
Why is this? The primary factor driving the public cloud computing benefit is the fact that an eBay data center's cost structure is almost entirely fixed - $0.88 of the $1.07 computational unit remains whether there is any work done in the data center or not.
If eBay can avoid purchasing computing capacity that sits idle by using a public provider, it can save money even if the public provider costs significantly more than eBay's own resources.
Essentially, this is an example that illustrates something we preach all the time - data center utilization rates are the key to cloud computing economics. Unless one can guarantee that a cloud data center will operate - on a sustained basis - at 70% or more of capacity, it will be hopelessly uncompetitive from a financial perspective.
If eBay can manage its capacity such that its own data centers operate at high utilization rates and it can harvest additional capacity from public providers at anything like typical rates, it will drop its overall computing cost by something like 40%.
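The break-even arithmetic behind these figures can be sketched with a simple model. Note this is an illustrative assumption, not eBay's actual analysis: treat $0.88 of the $1.07 per computational unit as fixed capacity cost (paid whether the hardware works or sits idle) and the remainder as variable. The effective cost of owned capacity then depends entirely on how heavily it is utilized.

```python
# Illustrative cost model (assumed, not eBay's actual analysis):
# $0.88 of each $1.07 computational unit is fixed capacity cost,
# paid whether or not any work runs; the rest is variable.
FIXED, VARIABLE = 0.88, 0.19

def owned_cost_per_used_unit(utilization):
    """Effective cost of one unit of work on owned hardware
    that is busy only `utilization` fraction of the time."""
    return FIXED / utilization + VARIABLE

# Capacity bought only to cover peaks sits idle most of the time:
for u in (1.0, 0.5, 0.25, 0.20):
    print(f"utilization {u:>4.0%}: ${owned_cost_per_used_unit(u):.2f} per unit")
```

At full utilization the model reproduces the $1.07 internal cost, but at 20% utilization an owned unit effectively costs $4.59 - more than the $4.28 cloud figure cited above. This is one way to see why a public provider charging four times eBay's internal rate can still be the cheaper way to absorb spiky peak load.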
The second interesting item at the Design Summit related to hardware that Dell was showing off. The company demonstrated a high-density collection of servers and storage. Particularly interesting was the configuration of the server portion of this collection. Dell's offering does not use the common blade design that is often used to increase computing density. This is because blade designs commonly fail to offer redundant system services, especially network connections, so that if the network connectivity of the blade chassis fails, the entire blade collection is unable to continue working.
Dell's design, by contrast, provides system resources for each computing device, which it refers to as "sleds", although to me they looked more like trays. Each "sled" has one or two sockets, a boatload of memory, and network connectivity completely separate from the other sleds. The only shared resource among all the sleds in a system is the power supply, and two are included for robustness.
Each sled is connected to twelve 2.5-inch drives, making very large storage capabilities part of this system. You can see an actual sled and get a sense of how they are constructed in this video I shot of Rob Hirschfield of Dell describing one.
Dell's hardware demonstration is indicative of another aspect of cloud computing: the rapid evolution of different components within the total aggregation of resources necessary to support a cloud computing environment. Two weeks ago, I wrote about Facebook's Open Compute initiative, which addresses the physical infrastructure of a cloud environment; Dell's offering complements it with a high-density power-efficient computing platform. Suddenly, data centers constructed with the existing components seem horribly out-of-date and uncompetitive.
I don't expect that Dell's offering is the pinnacle of what we'll see on the hardware side - far from it. But it is, for sure, one of an ongoing number of steps that will be taken to support the vastly higher scale of computing for the future, offering products that push the boundaries of capability and efficiency.
These are just two of the elements that struck me about the Design Summit, and this doesn't even address the main subject of the conference, which was to enable collaboration and help push OpenStack toward increased functionality and quality. As I noted at the beginning of this post, the energy at the conference was palpable.
What these two elements do illustrate, however, is how cloud computing continues to morph as providers and users gain more experience with the domain. eBay's presentation indicates why cloud computing has so much end-user attention - the cost structures associated with traditional computing environments in the face of scale growth make existing infrastructure approaches obsolete. Dell's "sled" computing shows how new infrastructure products are being created by vendors to better suit these new computing environments.
From my perspective, cloud computing appears to be picking up steam and gaining even more prominence. To quote Al Jolson, the star of "The Jazz Singer," the first motion picture talkie, "You ain't seen nothin' yet."
Incredible Economies of Scale Await Cloud Users
Excerpted from Information Week Report by Charles Babcock
A Windows Server instance costs between 5 cents and 96 cents an hour because Microsoft has been able to drive down operational expenses.
There are "incredible economies of scale in cloud computing" that make it a compelling alternative to traditional enterprise data centers, said Zane Adam, General Manager of Microsoft's Windows Azure Cloud and Middleware.
Adam is one of the first cloud managers to speak out on the specifics of large data center economics. He did so during an afternoon keynote address Tuesday at Interop 2011 in Las Vegas, NV, a UBM TechWeb event.
But Adam ended up emphasizing the cloud as spurring "faster innovation" in companies because of its ability to supply reliable infrastructure, as users concentrate on increasing core business value.
To make his point on economies of scale, he cited Microsoft's new cloud data center outside Chicago, IL, which started operating in September 2009. It hosts Microsoft Bing searches, Microsoft's Dynamics CRM, SharePoint, and Office Live software as a service, and over 31,000 Microsoft Azure customers.
Microsoft spent $500 million to build a cement-floor, warehouse-type facility with trucks able to drive in on the ground level and drop off containers packed with 1,800 to 2,500 blade servers. The 700,000-square-foot facility has 56 parking places for containers, and containers in each spot can be double stacked. When each 40-foot container is plugged into the data center's power supply, the servers inside start humming. They can be brought into production use in eight hours.
The building has more typical server racks on its upper floor. It was built for a total of 300,000 servers. It is served by 60 megawatts of power and contains 190 miles of conduit.
Adam said building a data center on such a scale is being done by a limited number of cloud computing suppliers, including Google, Amazon Web Services, Rackspace, and Terremark. In contrast, fewer than 1,000 Microsoft customers are running over 1,000 servers, and only "a few" have 10,000 or more, Adam said.
Consequently, there are economies of scale possible in such a setting that are impossible in the more heterogeneous, raised floor, enterprise data center. Although they are rapidly moving away from the practice, enterprise data centers at one time assigned a server administrator to devote much of his time to a single application running on one server.
At the Chicago center, one administrator is responsible for "several thousand servers," Adam said. "It costs an estimated $8,000 a year to run a typical server. For us, the cost goes down to less than $1,000."
If operations are typically 15% of data center costs, Microsoft has driven out 70% of that cost, Adam estimated. Microsoft executives have told the Chicago business press that they operate the facility with just 45 people, including security guards and janitors.
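Adam's figures support some quick back-of-envelope arithmetic. The calculation below is illustrative only, using the per-server estimates quoted above; the full-facility extrapolation is my own and assumes, hypothetically, that the per-server figure holds across the building's designed capacity.

```python
# Back-of-envelope sketch using the per-server estimates quoted above.
typical_cost = 8000   # $/server/year, typical enterprise (Adam's estimate)
cloud_cost   = 1000   # $/server/year at the Chicago center (upper bound)

reduction = 1 - cloud_cost / typical_cost
print(f"per-server operations cost cut by at least {reduction:.1%}")

# Hypothetical extrapolation across the building's designed capacity:
servers = 300_000
annual_savings = servers * (typical_cost - cloud_cost)
print(f"≈ ${annual_savings / 1e9:.1f}B/year if the per-server figure held")
```

That works out to an operations-cost reduction of at least 87.5% per server, which is consistent with the claim that a single administrator at the Chicago center covers several thousand machines rather than one application on one box.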
Given the scale at which Microsoft builds, it can negotiate special prices on volume orders of servers. Some are supplied by Dell, which has a container/server manufacturing capability. Microsoft gets tax breaks and commercial credits "that small data centers can't get," when it comes into a suburban community to build such a facility, he added.
The location is known for its low-cost power and access to fiber optic communications. "Special power deals lower our cost of power 90%" over what a more typical industrial customer would have to pay, Adam said.
Microsoft charges between 5 cents and 96 cents an hour for an extra-small to an extra-large instance of Windows Server. Azure opened for pay-for-use computing in February 2010 with a small instance of Windows Server at 12 cents an hour.
But at the end of his delineation of data center costs, Adam said it wasn't low cost that would drive use of public clouds. "In the cloud, there's no patching or version updating," saving the cloud customer a major headache that eats up IT staff time.
The availability of reliable infrastructure at low prices will enable companies of all sizes to use more computing in the business. In the end, "it's that rapid innovation that will drive acceptance," he said.
Cloud Computing - A Driving Engine for Growth
Excerpted from Zawya Report by Mohammad Ghazal
Cloud computing can help Jordan accelerate its economy and trigger growth in different sectors, according to a key IT expert.
"Resorting to cloud computing can enable the public sector to streamline its spending on IT and thus focus on innovation and providing premium services in different sectors," Chuck Hollis, Vice President for the Global Marketing Chief Technology Office at EMC, a leading provider of storage hardware solutions that promote data recovery and improve cloud computing, said in an interview with The Jordan Times on Monday.
His remarks came on the sidelines of the four-day 11th annual EMC World Conference held under the theme "Cloud Meets Big Data", which brought together some 10,000 IT leaders and experts from 103 countries across the world.
In a Gartner survey earlier this year, cloud computing ranked as the number-two technology priority, immediately after virtualization. In 2009, cloud computing ranked 16th, according to Gartner, which "delivers technology research to global technology business leaders to make informed decisions on key initiatives."
"The IT sector is as vital and fundamental for economic growth as other sectors. Creating an information-based economy and a centralized data center can help the government accelerate economic growth, as cloud computing is cost-effective, flexible and saves money spent on maintenance of equipment," added Hollis.
In Jordan, the use of cloud computing, which is Internet-based computing whereby shared resources are provided on demand, is still weak even though the potential, according to Jordanian experts, is huge.
"There is an interesting and huge opportunity for growth in Jordan and the Middle East. The future is for cloud computing and Jordan and other countries in the region can benefit from the latest technologies and solutions which have been used and tested over the years," Hollis added.
"Cloud computing is the infrastructure needed for accelerating economic growth in all sectors," said Hollis.
Highlighting the importance of cloud computing, Chairman, President and Chief Executive Officer of EMC Corporation Joe Tucci said about 73% of IT budgets of companies and entities across the world are allocated for maintenance, which he labeled as "huge".
Tucci said the volume of data is growing "rapidly" across the world, adding that cloud computing is the "most efficient" way to reduce costs and deal with this growing volume of data.
In a recent interview with The Jordan Times, National Information Technology Center (NITC) General Manager Nabeel Fayoumi said the government is focusing on the decentralization of computing and IT-related issues.
Fayoumi, who noted that many countries across the world started to create major data centers to serve their organizations, said the NITC plans to create a data center this year to serve all public entities.
Planet Earth is Becoming a Massive Computational Cloud
Excerpted from Computer Weekly Report by Adrian Bridgwater
The cloud is becoming massive. Actually, I hate it when people call it "the" cloud. It's the cloud computing model of IT service delivery in the form of applications and data. But that's a bit long, so we'll stick with calling it "the" cloud.
So if my original statement holds water, just how many clouds are there right now?
CEO of Eucalyptus Systems Marten Mickos says that his company has launched over 25,000 clouds, making it the planet's most widely deployed software platform for on-premise IaaS clouds.
Mickos suggests that soon we will have 10 billion connected devices on this planet - phones, pads, laptops, servers, GPSs and vehicles, medical devices, meters and recorders and so on.
It's almost like the world is becoming one massive computational machine.
If you haven't heard of Eucalyptus, then your fun fact for the day is that it started as an advanced research project over four years ago at the University of California, Santa Barbara.
Why is it called Eucalyptus? Isn't it obvious? Here's a clue -- Elastic Utility Computing Architecture Linking Your Programs To Useful Systems.
Eucalyptus is an open-source offering, so it's freely open to modification and redistribution.
So what does Mickos think is being done with all the 25,000 clouds that he claims carry the Eucalyptus brand? In his own blog he writes as follows:
"Experimental and production clouds are being established across most types of organizations and among both small and large ones. Earlier predictions that private clouds would appeal only to the most conservative organizations, or just to large ones, seem to have been incorrect. Private clouds are of interest to anyone. Those who are using a public cloud service appear to have an even higher interest in private clouds."
"A key design principle in the early days of Eucalyptus was providing users with choice, control and freedom. Importantly, we decided to implement on top of our elastic cloud machine the same API functionality that the leading public cloud vendor uses. This has proven to be highly valuable to our users. If it runs on Amazon Web Services, it runs on Eucalyptus, and vice versa."
"Clouds rely on virtualization that is typically implemented in the form of a hypervisor. Combine the AWS API compatibility, the hypervisor agnosticism and the open source software model, and you get an on-premise IaaS platform that fits right into your existing data-center infrastructures while effectively preventing lock-in. If you don't like the hypervisor, you can replace it with another one. You can mix multiple hypervisors in the same cloud. If you need to move your apps, then the industry standard API gives you the widest possible freedom."
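The AWS API compatibility Mickos describes can be pictured with a minimal sketch: because EC2-style clouds expose the same Query API, a client can target Amazon Web Services or a private Eucalyptus cloud by changing only the endpoint. (This is illustrative only; the endpoint URLs and the simplified request builder below are assumptions, not Eucalyptus's actual tooling, and real clients would also sign each request.)

```python
from urllib.parse import urlencode

def build_describe_instances(endpoint: str, version: str = "2010-08-31") -> str:
    """Build an EC2-style DescribeInstances Query API request URL.

    Real clients also attach an authentication signature; omitted here
    for brevity.
    """
    params = urlencode({"Action": "DescribeInstances", "Version": version})
    return f"{endpoint}/?{params}"

# The request itself is identical; only the host changes.
aws_url = build_describe_instances("https://ec2.amazonaws.com")
euca_url = build_describe_instances(
    "https://cloud.example.internal:8773/services/Eucalyptus")  # hypothetical
```

The point of the design is exactly this symmetry: an application written against the public-cloud API needs no code changes to run against the on-premise cloud, which is what prevents lock-in.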
So then, should we view planet Earth as some kind of massive organic computational cloud in the making? In some ways it seems reasonable -- after all, we are "empowering" everything from motor cars to fridges with more technology than was ever thought possible (or indeed necessary).
Chrysalis Brings Content Distribution to BitTorrent
Excerpted from CNET News Report by Seth Rosenblatt
BitTorrent launched its next-generation torrent client in a public beta this week, offering people a unique system not just for sharing content via torrents but also for socializing the experience, turning the tool into one with deep content-discovery hooks.
BitTorrent 8 beta (download) contains one enormous change from the alpha that launched in March: personal content channels, which streamline the torrent creation and sharing process to allow you to share high-quality versions of your homemade videos, audio, and photos with friends.
As announced at CES 2011, the implementation is unique to BitTorrent, and an integral part of its push to emphasize the use of the torrent protocol for legally shared files. BitTorrent currently has more than 100 million active users spread across BitTorrent, uTorrent, and uTorrent for Mac, BitTorrent Chief Strategist Shahi Ghanem said during an interview at BitTorrent's San Francisco office. He also said the company holds 80% of the torrenting market.
The new channels feature benefits from leveraging current file-sharing link-distribution techniques as used in YouSendIt to share both the torrent program and the torrent itself. It also removes the requirement that videos be compressed before being posted to public websites, while providing a more controlled environment to share personal files privately. "We're using a very advanced technique that can be described as distributed cloud storage," Ghanem said when explaining how BitTorrent channel users will share files in the channels they subscribe to.
To ensure that a channel retains its health -- which is to say that it always has a minimum number of people seeding the files -- Ghanem said BitTorrent will guarantee the minimum number of active seeds. Said BitTorrent lead engineer Thomas Rampelberg, "We expect the content to be fast-distributed and short-lived on our servers." He also said that while the company had yet to settle on the minimum number of seeds, the current figure was around seven.
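The back-fill logic described above can be sketched in a few lines. This is an illustrative guess at the policy, not BitTorrent's implementation: the service would only add server-hosted seeds when peer seeding falls below the guaranteed minimum, and the figure of 7 reflects the "around seven" number mentioned above.

```python
def server_seeds_needed(active_peer_seeds: int, minimum: int = 7) -> int:
    """Return how many server-hosted seeds must back-fill the swarm
    to keep a channel at its guaranteed minimum of active seeds."""
    return max(0, minimum - active_peer_seeds)

# With 2 subscribers seeding, the service would add 5 seeds of its own;
# once 7 or more peers are seeding, the servers can stop hosting the
# content entirely -- hence "short-lived on our servers."
```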
BitTorrent 8 beta also includes a new personal channel feature for sharing personal files among a small network of friends or a large network of colleagues.
To use this new service, users simply click an arrow link in the upper-right corner, just below the Options menu, to create a channel. They then customize the channel, including choosing a channel avatar that will appear in the channel bar above the main interface; add files to upload; invite others via e-mail, Facebook, or Twitter to the channel; and publicly leave messages for and respond to channel subscribers.
When you first invite somebody to the channel, the link that gets sent out detects whether they have BitTorrent 8. If they don't, the link downloads the beta and automatically subscribes them to the channel. If they do, it simply adds the channel. The channel acts as a grouping mechanism for the torrents contained within. Each file added gets its own torrent, so that subscribers don't have to fiddle with choosing files within a torrent.
Files can be added to a channel over time, allowing channel owners to create content themes. The parent of a child on a baseball team, for example, can add new videos throughout the season, and the parents of other children on the team can be invited to download them at their convenience.
There are also public, legal channels for file distribution under the "Discover Content" button on the left of the interface. BitTorrent divides these into the aforementioned personal content channels and artist-endorsed content. The artist-endorsed content so far includes the TED conference videos, the Bill Gates-endorsed Khan Academy free education series, Make Magazine, ClearBits-featured media, and the music discovery tool Musicshake.
The punk-pop band Sick of Sarah also has a channel of its own, illustrating that musicians can share high-quality versions of their videos and music. The band's latest album recently passed the 1 million torrent downloads mark, and the 2008 movie "The Yes Men," recently made legally available, has drawn more torrent downloads on BitTorrent than it had HBO viewers.
Personal channels in BitTorrent 8 beta include commenting and social-networking features, and Ghanem noted that the beta has basic monetization features built in via a PayPal link.
Video playback is a major concern not just for browsers, which are a more generalized content delivery tool, but for BitTorrent as well. The cost of licensing codecs for streaming and playback can be steep financially and cause otherwise unnecessary bloat in a program. Ghanem said that BitTorrent has plans for a "global transcoding strategy," and currently employs both H.264 wrapped in MKV, and MPEG4 ASP wrapped in AVI. However, he noted, "we'll probably use our own proprietary 4CC code," eventually.
BitTorrent intends the channels to be shared among both private and public social groups, but if a channel link were accidentally posted in public, the channel creator could delete the channel without affecting the locally stored files or the files already downloaded by channel subscribers.
The beta is available only in English and only for Windows computers. Also, at least during the beta phase of development, the channels feature in BitTorrent 8 will not impose file size restrictions and is free to use. Ghanem said he was unable to comment on whether the services would continue to be free of restrictions after BitTorrent 8 final was released.
LimeWire Settles Out-of-Court for $105 Million
Excerpted from Digital Music News Report
That's all? LimeWire and the four major labels have now reached an out-of-court settlement for $105 million, according to information confirmed by the RIAA. That includes a portion of personal liability for founder Mark Gorton, a potentially large deterrent to future entrepreneurs.
All in all, the agreed-upon damages are a far cry from the trillions (yes, trillions) originally demanded by litigating labels, an amount quickly tossed by District Court judge Kimba Wood.
The RIAA wanted blood in this fight, though the jury proceedings introduced some variables. According to one source, that included the possibility of a sub-$10 million award, potentially due to the RIAA's bullying image. On the flip side, damages could also have reached the maximum of $1.4 billion, and either side risked one of those extremes.
The result is the $105 million amount, hardly petty cash but far from the devastating level of monetary punishment the RIAA was seeking.
Regardless, RIAA Chairman Mitch Bainwol expressed happiness with the "large monetary settlement" after five years of litigation. "We are pleased to have reached a large monetary settlement following the court's finding that both LimeWire and its founder Mark Gorton are personally liable for copyright infringement," Bainwol said. "As the court heard during the last two weeks, LimeWire wreaked enormous damage on the music community, helping contribute to thousands of lost jobs and fewer opportunities for aspiring artists."
But the labels also paid in other ways: years of litigation aren't cheap, and Bainwol himself earned millions in salary during the period.
Earlier, publishers secured an $11 million award. Five years ago, the RIAA secured a $115 million settlement against Kazaa.
New Bill Would Criminalize Unauthorized TV Show Streaming
Excerpted from Multichannel News Report by John Eggerton
A bipartisan trio of Senators on Thursday introduced a bill that would make unauthorized streaming of TV shows or movies a felony.
The bill (S. 978) was introduced by Senators Amy Klobuchar (D-MN), John Cornyn (R-TX), and Christopher Coons (D-DE), and came the same day that Senator Patrick Leahy (D-VT) re-introduced a bipartisan bill to give the government more tools to shut down websites that traffic in stolen intellectual property (IP), including TV shows and movies.
It is already a felony to download or upload unlicensed content, so the bill would simply extend that to streaming, a recommendation made by White House Intellectual Property Enforcement Coordinator Victoria Espinel, the Copyright Alliance pointed out in praising the move.
In March, the Obama administration recommended that Congress clarify that streaming content without permission, in addition to downloading it, can be a felony. Espinel pointed out at that time that, under existing law, it is unclear that streaming copyrighted work can be subject to felony penalties because such penalties are "predicated on the defendant either reproducing or distributing the copyrighted work. While, intuitively, streaming would seem to pretty clearly be distribution, there has been some legal question about that designation."
The same groups -- unions, studios, and independent producers -- that praised the Leahy bill lined up to salute the Klobuchar legislation.
"While downloading of our members' creative works remains the best known method, Internet streaming has actually become the preferred viewing and listening experience," said AFTRA, SAG, and others in a joint statement. "Unfortunately, the law has not kept pace with these new consumer habits. While unauthorized downloading and distribution is a felony, commercial streaming of unlicensed films, TV programs, and music remains only a misdemeanor. We applaud the Senators for their leadership in introducing legislation today to remove unwarranted obstacles to the prosecution of websites that willfully stream valuable copyrighted works for commercial advantage or private financial gain."
Jean Prewitt, President of the Independent Film & Television Alliance (IFTA) added, "We strongly believe government enforcement and consequences are the only effective remedies for these types of unlawful activities. We highly commend Senators Klobuchar and Cornyn for this significant legislative proposal to ensure unauthorized streaming is treated as seriously under the law as is unauthorized downloading."
The Obama administration has made protection, security, and privacy of online content a priority given its push for universal broadband as a critical infrastructure component of the country's future.
Coming Events of Interest
Cloud Computing Asia - May 30th - June 2nd in Singapore. Cloud services are gaining popularity among IT users, allowing them to access applications, platforms, storage, and whole segments of infrastructure over a public or private network. CCA showcases cloud computing products and services. Learn from top industry analysts, successful cloud customers, and cloud computing experts.
Cloud Expo 2011 - June 6th-9th in New York, NY. Cloud Expo is returning to New York with more than 7,000 delegates and over 200 sponsors and exhibitors. "Cloud" has become synonymous with "computing" and "software" in two short years. Cloud Expo is the new PC Expo, Comdex, and InternetWorld of our decade.
The Business of Cloud Computing - June 13th-15th in San Diego, CA. Cloud Computing is the latest disruptive technology. Enterprises, large and small, are looking to cloud computing providers for savings, flexibility, and scalability. However, potential adopters of all sizes are concerned about security, data management, privacy, performance and control.
CIO Cloud Summit - June 14th-16th in Scottsdale, AZ. The summit will bring together CIOs from Fortune 1000 organizations, leading IT analysts, and innovative solution providers to network and discuss the latest cloud computing topics and trends in a relaxed, yet focused business setting.
Digital Media Conference - June 17th in Washington, DC. The DCIA presents CONTENT IN THE CLOUD as part of the digital media business issues and law & policy tracks at this eighth annual gathering of over 500 of the most influential decision-makers in the media, entertainment, and technology industries.
Cloud Leadership Forum - June 20th-21st in Santa Clara, CA. This conference's enterprise-focused agenda, prepared with the help of nearly a dozen IT executives, will bring you case studies and peer insights on how leading organizations are approaching the cloud opportunity - plus much more.
Cloud Computing World Forum - June 21st-22nd in London, England. This third annual event is free to attend and will feature all of the key players within the cloud computing and software-as-a-service (SaaS) market, providing an introduction, discussion, and look into the future for the ICT industry.