CloudOpen Keynote: The Cloud in 20 Years


Last week I had the pleasure of giving a keynote at LinuxCon / CloudOpen in San Diego. I got so many good comments afterward that I decided to publish it as a blog post. (The slides are shared on SlideShare, hyperlinked below.)

---

LinuxCon / CloudOpen keynote by Ulander

At Citrix, we obviously think a lot about the cloud: not only the cloud of today, but the cloud of tomorrow and beyond.

When looking to the future of a technology, it is useful to examine existing technologies, see what patterns emerge, and ask whether the lessons of today's ecosystem can be applied tomorrow and beyond.

To that end, we think there is no better model for the open cloud to study than Linux and its ecosystem. Not just the kernel, of course, though the kernel holds many fine lessons for any student of development practices and open community. But also the entire ecosystem of open source projects and vendors (including distributions) that has formed around the kernel.

Here are some of the things we have learned from Linux's more than 20 years.

The Cloud Is Not Highlander

Great movie, but a bad model for the cloud market. One thing we learned from Linux vendors: there is plenty of room in the market for multiple open solutions. Community distributions like Debian and Fedora alongside Red Hat, SUSE, and Ubuntu; PostgreSQL and MySQL; Apache and Nginx. Multiple solutions can and do coexist, and even cooperate and compete simultaneously. There is no reason the cloud needs to be different.

Be User-Focused

I once had the pleasure of joining a small dinner where I sat next to Evan Williams, the technology founder behind Twitter. He is often quoted as saying that to be successful, developers must be passionate about their technology's users and understand their user community. In my experience, successful projects focus not just on the developer but also on the customer. Linux solved real problems: a lower-cost Unix, optimization for x86 hardware, faster servers. Linus did not build it only because it was "cool," and Red Hat grew its business on features derived directly from customer needs. At Citrix, we take a lot of pain over complexity due to options or features, but every piece of code ties back to a customer. Similarly, the Apache community has encouraged customers not only to take part in the community, but to have the opportunity to lead and influence it: a powerful engine for a project's success.

Manual Software Management Doesn't Scale

Vendors and administrators learned the hard way that managing software manually, whether admins compiling from source or vendors shipping software that must be maintained separately from the system's package management, does not scale. Proprietary installers or source-configured packages do not allow for easy updates or rapid deployment across many machines.

Likewise in the cloud, best practice demands that administrators embrace configuration management tools like Puppet or Chef to get the most out of their environments. Manually configuring templates and virtual machines locks admins into a slow, error-prone process that does not scale, especially in environments spanning two or more hypervisors and thousands of machines.
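The core idea behind tools like Puppet and Chef is declaring desired state once and applying it idempotently everywhere, rather than hand-configuring each machine. A minimal sketch of that principle in Python (the file path and contents are illustrative assumptions, not anything from a real tool):

```python
from pathlib import Path

def ensure_file(path: str, content: str) -> str:
    """Bring a file to the desired state; repeat runs are no-ops."""
    p = Path(path)
    if p.exists() and p.read_text() == content:
        return "unchanged"          # idempotent: nothing to do
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(content)
    return "changed"

# Desired state declared once (hypothetical example resource)...
desired = {"/tmp/demo-config/motd": "Managed by configuration management.\n"}

# ...then applied to every node (here, just the local machine).
for path, content in desired.items():
    print(path, ensure_file(path, content))
```

Because each run converges on the same state and reports only what it changed, the same declaration can be pushed to thousands of machines and re-run safely, which is exactly what hand-edited templates cannot do.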

Early Technology Favorites Are Often Abandoned

Being first to market is no guarantee of long-term success, or even survival. Technology favorites can be abandoned with astonishing speed when better technology emerges or political problems plague a project.

Consider the Linux distributions that have been popular over the years. Soft Landing Systems (SLS) Linux was the first Linux distribution, but it was quickly superseded by Slackware because it was buggy and not updated frequently enough. Slackware, though still developed and used today, has been passed in "mainstream" use by Red Hat, SUSE, Debian, Ubuntu, and others.

At each turn, the distribution that best met users' needs was the one that succeeded, not the distribution with first-mover status. The lesson? You cannot assume that being an early favorite ensures long-term success, or even survival.

Only Individuals Have Standing in the Community

It doesn't matter whether you work for Red Hat, IBM, Citrix, SUSE, or a picayune hosting company: a developer's position in the community is based on their contributions and reputation alone. You cannot just walk onto the Linux kernel mailing list and expect a patch to be accepted because you work for Company A.

In well-managed projects such as the Linux kernel, Debian, or PostgreSQL, it is individual developers who drive the project. Governance is designed to ensure that the project's health comes before the interests of any particular company.

This, incidentally, is one of the main reasons we chose Apache.

Apache provides a well-understood and tested governance model, a well-understood meritocracy, and an umbrella that gives individual contributors and companies confidence that they will be on an equal footing when participating in CloudStack development. Citrix employees have to earn their way in just like any other community contributor, which is exactly how projects should be governed.

Do Your Work in the Open

Another lesson we have observed over the last 21 years of Linux development is that the work must be done in the open. When companies or individuals hold back their changes, either for competitive reasons or to get things "just right" before submitting them to public scrutiny, the community is poorer for it. Often this means technical debt comes due when merging those lines of code into the project. We have observed this time and again, most recently with all the headaches the Linux kernel community experienced with the various ARM trees.

We believe that open source means more than dropping code over the wall at intervals: the work needs to be done in the open as well, so the project can benefit from the contributions of the entire community rather than only those behind the corporate firewall.

Be Boring, but Useful

Once upon a time, Linux was "exciting" in the sense that the kernel and distributions were constantly adding big new features that helped Linux become competitive with proprietary Unix and/or Windows in the business and consumer markets.

Although Linux still adds functionality at an incredible pace, sometime in the mid-2000s Linux became boring.

And that was great. It meant Linux had become ordinary, quietly doing its job in the background without much fuss. Linux conquered the data center. It conquered the Top 500 supercomputer list. Linux became the most widely used smartphone operating system kernel in the world.

We aspire for the open cloud to be just as boring, and just as necessary, as Linux.

Good Enough Wins, but Plan for the Future

You have heard the saying "the best is the enemy of the good," and it applies particularly to technology.

We watched projects like GNU Hurd flounder and never quite deliver a viable operating system, while the Linux community continually shipped, getting code into the hands of users and organizations that needed a robust product now.

The Linux community's approach has allowed less elegant solutions (ipchains, various schedulers, the original Sys-V init system, and more) to be phased out and replaced by better, more robust technology.

At the same time, we have learned that you should avoid saddling yourself with so much technical debt that it becomes impossible to iterate and improve.

Rome (and the Linux Kernel) Was Not Built in a Day

Most companies, projects, and individuals in the market are racing to the finish, staking out claims and celebrating victory. It is important to see where we are in this market: 100 clouds? 1,000 clouds? 10,000 clouds? We have only scratched the surface of where this technology is headed and where we stand at this moment in time. Only a small percentage of the total market has even begun to understand the technology, the market, and what it means for them. On the Linux timeline, we may be no further along than when the first Linux distributions appeared in 1994. We have a long way to go on our journey to the cloud.

In 20 Years

So what does this tell us about the future of the cloud and where we will be in 2032? We know there is a lot we do not know. Twenty years ago, we did not imagine that Linus's baby would be all grown up and powering huge parts of the Internet. We did not expect smartphones more powerful than the computers of NASA's lunar missions.

It is unlikely that Linus envisioned his hobby project powering millions of digital video and broadcast systems, or that Linux would give life to Google, Facebook, and Netflix, or power the majority of the Top500 supercomputers. It is difficult to imagine what systems will look like in ten years, much less twenty.

Today, the open cloud runs on commodity x86 systems running open source operating systems and hypervisors. Tomorrow? We could see much more ARM in the data center, as is being developed by companies like Calxeda. The cloud might be used to manage high-density, many-core ARM machines, with each customer getting a dedicated bare-metal host rather than stacking several customers on multi-core x86-64 systems.

We do know, however, that the future is open, and that the cloud's path forward lies with open, non-proprietary systems. Our customers and the community have spoken loudly in favor of systems they can not only manage easily but also study and contribute to. Customers have learned time and again that closed systems make for poor infrastructure.

The future of the cloud in 20 years? It is wide open.
