Entries tagged [asf]

Tuesday August 07, 2018

Success at Apache: the Apache Legal Shield - a pragmatic view

by Bertrand Delacretaz

I became active in the ASF in 2001 via Gianugo Rabellino -- he was the one who started the discussions with Apache FOP about me donating the jfor XSL-FO to RTF converter that I had developed earlier. It was already too late to uninvent RTF, which is a terrible format, but I digress. I am currently a member of the Board of Directors of the ASF and have been doing a lot of thinking (and presentations) about what makes the ASF tick in terms of collaboration and Shared Neurons.

Section 12.1 of the Apache Bylaws https://www.apache.org/foundation/bylaws describes the legal protection that the Apache Software Foundation provides to our directors, officers and members.

I'm not a lawyer by far, however, and that language is a bit hard for me to parse, so I thought I'd try to clarify what this means for our contributors and learn more about it in the process.

If you go into detail there's certainly more to it, but I think the items below are the absolute basics that every PMC member https://www.apache.org/foundation/how-it-works.html should understand in order to benefit from the legal shield that the Foundation provides.

What is a "Legal Shield"?

An important goal of the Apache Bylaws and policies is to isolate our contributors from any legal action that might be taken against the Foundation, if they act as specified in those policies.

That's what we mean by "legal shield": a way for our individual volunteers to be sheltered from legal suits directed at the Foundation's projects, as mentioned in our "How the ASF works" document https://www.apache.org/foundation/how-it-works.html .

Acts of the Foundation

The first thing is to make sure our software releases are "Acts of the Foundation" as opposed to something that people do in their own name. This is natural if we follow our release policy https://www.apache.org/legal/release-policy.html , which defines a simple release approval process for releasing source code that makes the project's PMC https://www.apache.org/foundation/how-it-works.html responsible for the release, as opposed to our individual contributors and release managers.

This means that if the released software is ever involved in legal action and someone has to testify or produce information as part of a subpoena, or worse, it's the Foundation which is in charge of that and not our individual contributors. These things happen from time to time, not very often but they can represent a lot of work and aggravation that none of us are looking for. The 2011 subpoena to Apache around Java and Android http://www.groklaw.net/articlebasic.php?story=20110509221136468 is just one example of that. Produce documents reflecting all communications between someone and Apache, how fun is that?

The goal of our release process is to make it very clear what an Apache Release is, and also clarify that anyone using our software in other ways, by getting it directly from our code repositories for example, does so at their own risk. If it's not an Apache Release we didn't give it to them, they grabbed it on their own initiative and have to accept the consequences of that.

The Rest is for Contributors

This leads to a second and related item: developer builds, which happen much more often than releases, often daily, and that people can easily download and use.

Those builds are meant for contributors to our projects, to use in development and testing as part of their contribution activities.

To avoid any confusion, it is important to clearly label them as such, and to draw a clear line between them and official Apache Releases. They should only be advertised in places where developers who are part of our communities (as opposed to the general public) can see them, and with suitable disclaimers.

In our world of continuous deployment and automated builds, the lines between what's a release and what's just tagged code that works for someone are often blurred. That's totally fine from a technical point of view, and often desirable when one wants to move fast, but we shouldn't forget about the possible legal implications of distributing software.

Let's make sure we take advantage of the well-designed Apache Legal Shield that the Foundation provides to us, by strictly following our release policy and clearly specifying what is what in terms of downloadable software.

I never thought I'd write a blog post on a legal topic, so here's the FUN DISCLAIMER: As mentioned, I am not a lawyer by far, and the above should not be considered legal advice - just a pragmatic view that can hopefully help our contributors better understand the related issues. For legal advice, consult your own legal advisor! And if you're thirsty after reading all this, get a drink and give a toast to the ASF and its founders!

Many thanks to the fellow Apache members who provided feedback and additional ideas for this post.

. . . 

Bertrand Delacretaz works as a Principal Scientist with the Adobe Research team in Basel, Switzerland. He spends a good portion of his time advocating and implementing Open Development as a way to make geographically dispersed teams more efficient and more fun for his coworkers. Bertrand is also an active Member of the Apache Software Foundation, currently on his tenth term on the Foundation's Board of Directors (Fiscal Year 2018-2019).

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works" https://blogs.apache.org/foundation/category/SuccessAtApache

Monday July 09, 2018

Success at Apache: The Apache Way for Executives

by Alex Karasulu

I'm a long time member of the Apache Software Foundation and have been an executive officer of several corporations over the course of the past 20 years. I've co-founded several projects in the community and mentored several others.

The "Apache Way" has benefited several aspects of my life, however I never imagined it would help make me a better executive. Even non-technical executives, in organizations totally outside of the realm of technology, can benefit from the Zen of the Apache Way.

Life is hard when you're stupid

I was involved in a number of early dot-com startups as an executive; however, that was before my involvement with Apache and long before any exposure to the Apache Way. To this day, I remember how opportunistic decisions for short-term gains, and the lack of collaboration, openness and communication, kept causing friction that made my job, and ultimately my life, much harder than it had to be.

Learning while on the job

Exposure to the philosophy began early even while lurking on mailing lists but picked up more while incubating the Apache Directory Project where I worked with others to grow an active community. Meanwhile, I was the Chief Technology Officer of a large financial services company called Alliance Capital Partners. It was 2002, and the first time I had to conduct myself as a C-Suite executive in an enterprise that was obviously not a technology company. Incidentally, the lack of hands-on coding got me working on a pet project that ultimately became the Apache Directory Server and Apache MINA. The project was medicine to keep me sane and technically up to date. Unbeknownst to me, this would save my career, not as a developer, but as an executive.

The Apache Way makes life easier

The first and most important lesson I learned from the Apache Community was to avoid short-term gains that are unsustainable in the long term. This very important core principle derives in part from the concept of "community over code". It does not matter how much code you write, or how good your code is, if you cannot get along, compromise, and communicate respectfully with your peers. The code does not write itself; it's the community behind it that keeps the code alive. Involving only the most technically proficient contributors should never trump the need to build a sustainable community. I often saw projects suffer from self-centered yet skilled coders added as committers for short-term gain, to the detriment of a healthy, sustainable community. So, as a corollary to community over code: avoid short-term gains that get in the way of the long-term sustainability of an organization's culture. This has immense applications for any executive, in both technical and non-technical fields.

While growing my new development organization in this financial services company, I decided to avoid hiring people who seemed very skilled technically but lacked the desire or the social skills to collaborate with others. Thanks to my experiences at Apache, I could tell them apart much better than before. I was also calmer and less anxious when hiring to fill gaps on the team: it was better not to have the resource than to introduce a bad apple onto the team.

This was contrary to how I had operated earlier, and it started producing great results. The application of this basic principle led to a solid team that worked together better than ever before. Through collaboration, they were able to leverage each other's skills and outperform any one skilled developer. This is all thanks to the concept of community over code, where social skills and collaboration are stressed more than technical skills. In the end, being kind, listening, and asking smart questions begets the kind of collaboration needed to build complex software.

Not only did this help with developers, it also worked with teams that did not produce code, like the project managers under the CTO office. The rule is golden and, IMHO, should be applied to any executive's decision-making process regardless of the nature of the business or the topic at hand.

Inner Source is the Apache Way

Executives drive the architecture and cultural direction of their organizations, and the Apache Way provides a solid framework for creating healthy foundations through open collaboration, open communication, and the availability of knowledge so that everyone can participate.

Several very successful technology companies have adopted the Apache Way without really realizing they're doing so. In 2000, Tim O'Reilly coined the term Inner Source https://en.wikipedia.org/wiki/Inner_source to apply Open Source principles within any organization. Tim was essentially talking about applying the Apache Way inside organizations. The Apache Way has proven itself with companies like IBM, Google, Microsoft, SAP, and PayPal, and even financial institutions like Capital One, which have adopted the Inner Source methodology; the two are one and the same.

Without going into the details, of which we in the Apache Community are intimately aware (we use the approach daily within our projects), I would like to stress how important it is for executives outside of Apache to understand. The Apache Way can save organizations from all-out disaster, not to mention billions of dollars, by improving the quality of the services and products they produce. Again, this does not only apply to companies in technology sectors: Capital One, a financial services company, has used Open Source methods for internal projects with great success https://www.oreilly.com/ideas/using-open-source-methods-for-internal-software-projects .


The Apache Way provides several benefits to executives aware of the approach. Executives can directly integrate its principles into their own thinking to improve their potential for personal success. The biggest value, however, comes from the cultural framework it produces for the entire organization -- but to leverage it, executives must first be aware of it. The Apache Way has personally helped me grow as an effective executive, and it can help others as well. It also provides a compass for how to properly build effective organizations, not only technical ones.

Alex Karasulu is an entrepreneur with over 25 years of experience in the software industry and a recognized leader in the Open Source community. He is widely known as the original author of the Apache Directory Server, used by IBM both as the foundation of the Rational Directory Server and as a component integrated into the WebSphere Application Server. Alex co-founded several Apache projects, including MINA and Felix, among others, which, along with their communities, thrive independently beyond his day-to-day involvement. He is the founder of Safehaus, where he authored the first low-resource mobile OTP algorithms in Open Source with the OATH community, later adopted by Google in their Authenticator product. In addition to IBM, Atlassian, Cisco, and Polycom are just a few of the many companies that sell commercial hardware and software solutions bundling or embedding software and products that Alex has created. Alex holds a BSc in Computer Science and Physics from Columbia University. He is the founder and co-CEO of OptDyn.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works" https://blogs.apache.org/foundation/category/SuccessAtApache

Monday June 04, 2018

Success at Apache: the Chance to Influence the World

by Weiwei Yang

I submitted my first patch to Apache Hadoop in 2015, a very simple bug fix with just a few lines of changes. But the feeling when the patch was accepted is still vivid to me: I felt a great sense of accomplishment. It was not about how big the change was, but rather that I knew even a small change would help a lot of people. This is what I like best about working in Open Source: the work I've done has the chance to influence the world.

As of today, I have contributed nearly 200 patches to Apache Hadoop, over 20k lines of code. I still feel happy when the community accepts my patches. I believe such passion is essential for an individual contributor to make their way to Apache. Unless your company pays you to work on Open Source, you must find that sense of accomplishment in the work itself, otherwise the commitment won't last. In my case, it took over three years before I received commit privileges for Hadoop. In retrospect, it was a tough and challenging journey, but one of fast growth. I am glad I did not give up and finally got to where I am now.

If you are hired by a commercial company that sells products or services powered by Open Source software, then congratulations, you are on a shortcut to Apache. Such companies usually have a strong team working directly on Open Source projects and a lot of committers. As a member of such an organization, you will have more time to work on the project, faster feedback on your patches, opportunities to participate in more discussions, and much deeper involvement. Unfortunately, I was not working for such a company. Moreover, my native language is not English, and I have a big timezone gap with the majority of people in the community. That made my path to Apache much more difficult. I believe there are many people, just like me three years ago, who are willing to contribute but find it hard to. In this post, I will share some tips on how to work with the Apache community and how to grow into a committer.

First, get to know what is public to everyone. Every Open Source project has its own tutorials introducing how to contribute; be sure you have read them before working on any patches. Those documents generally tell you how to contribute code the "Apache" way, and how to collaborate with the community.

Second, don't be shy about fixing bugs. In fact, I suggest beginning with bug fixes. You may find bugs in your daily work, or somebody may have reported them to the community. Big or small, bugs must be fixed, so bug fixes make it easier to get attention from the community. In an Open Source community, everyone volunteers to review other people's patches, so don't be upset if nobody gets to your patch quickly; try a soft ping to committers who work in that area. But never push them for anything, and always be polite.

Get more involved. There are many ways to do this. First, if a community sets up a meetup once in a while, try to attend even if you are remote or it falls at an inconvenient local time. Such meetups help you gather information about the development status, the current community focus, and so on; they also help others get familiar with your face. Second, try to participate in more discussions. These could be on mailing lists, on issue tracking systems, or in a Web conference that discusses a particular issue or design. In my opinion, this is the hardest part, especially for contributors from overseas.

Be self-motivated and passionate. Nobody forces you to work on Open Source projects, so you need to keep motivating yourself. As I mentioned at the start of this post, there are more ways to be self-motivated than just feeling accomplished. Working in the community gives you the chance to work in a diverse environment and meet people from different companies and different countries; you get as many chances as you want to solve difficult real problems and improve your skills; and you can build a reputation in the community, which also helps your career development.

I truly hope my experiences will help people. I am now working at Alibaba Group, which gives me all the more reason to write this post. I see a lot of talented people around me; they have solid skills, and they have done and are doing a lot of work to make Hadoop better. They are open to contributing back but face various difficulties working with the community. I am committed to helping grow this community, and I do believe an open and diverse community will help the project thrive.

Weiwei Yang is a Staff Engineer at Alibaba Group. He has been working in the Big Data area for over 8 years, most of that time on Apache Hadoop. He has contributed to several Apache projects such as YARN, HDFS, MapReduce, Ambari and Slider, and is an active Hadoop committer. At present, he works in Alibaba's data infrastructure team, focusing on evolving Apache YARN to support mixed workloads and improve performance and cluster utilization. Prior to that, he worked at IBM for several years and won multiple Open Source contribution awards.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works" https://blogs.apache.org/foundation/category/SuccessAtApache

# # #

Monday May 07, 2018

Success at Apache: Dip into the Apache Way

by Nick Couchman

Like other recent contributors to this blog, I am not a developer by trade. My day job is as a Linux Systems Engineer and team manager, and, truth be told, my programming skills are not something I would rely on to make a living. Despite these facts, I've found something beyond acceptance in being a part of the Apache Guacamole project: mentoring.

Most of my experience with The Apache Software Foundation (ASF) has been retrieving the Apache Web Server (httpd) http://httpd.apache.org/ from the download page, and getting involved with the ASF was more accidental than anything else. I've crossed paths with the Guacamole project http://guacamole.apache.org/ several times over the past decade. As a systems administrator/engineer, and one who prefers Linux to some of the commercial alternatives, I'm always happy to see software produced that is truly cross-platform, and, as many current trends are demonstrating, Web browser applications are the pinnacle of cross-platform applications. I used Guacamole in various applications at my place of employment, but always saw opportunities to improve it -- add a feature here or there, make it more administrator- or user-friendly, etc.

After a recent job change, I found myself with a little more free time than I had previously had, and a desire to do something productive with that time. I started thinking about how I could give back to the Open Source community. I've long been a user of many software packages made freely available to the world, and my appreciation for the developers and companies that produce and support these efforts had, for a while, made me want to return the favor and give back to that community. I also needed to challenge myself and fill some of my free time, and growing my programming skills seemed like a good way to accomplish these goals.

When I settled on Guacamole, I found that it had entered the Apache Incubator http://incubator.apache.org/ program in an effort to get the project accepted by The Apache Software Foundation. I thought that was cool, but didn't think much else of it at the time, and I knew little about the organization. The Incubator program helps potential ASF projects learn how to create the kind of culture and community that encourages development and interaction.

This culture is created, in large part, by the Apache Way, a set of guiding principles and behaviors for projects within the ASF. One of the biggest keys to my success, thus far, in contributing to the Guacamole project is the concept of mentoring -- not a behavior or principle officially outlined in Apache Way documents, but rather a byproduct of those principles. It seems very human to be dismissive of people who don't measure up to our standards in some way or another, and my programming skills are, by far, the weakest of any of the current contributors to the Guacamole project. However, instead of ridicule, dismissal, or discouragement, the other developers within the project have been accepting and helpful, and have provided guidance.

And, as with any good educational opportunity, they don't do this by giving me the answers or telling me how to do something; they do it by providing examples, references, and pointers that help me think through the why and make my way to the how of writing better code. The result? I still wouldn't rely on my programming skills for my day job, but I've come a long way in the 18 months that I've been a part of the project, and the code I write today is better than when I started.

Finally, this involvement actually makes me better at my day job. Not only does it give me a stronger appreciation for the effort that goes into writing the software that I use on a regular basis, but, more practically, it gives me a stronger set of skills for debugging problems and tracking down bugs that occur. I'm better able to locate the actual cause of problems, provide useful descriptions of those problems, and interact with the software engineers and developers in various places responsible for writing, improving, and supporting those applications.

At this point, my involvement with The Apache Software Foundation is limited to the Guacamole project, and will probably stay that way for the foreseeable future. Still, it's great to be involved with an organization that has such a diverse community of developers and projects, and to know that, should I choose to add another challenge to my life, there are other projects out there that would welcome my involvement and provide similarly positive experiences, helping me grow in my ability to give back to the Open Source community. If you're itching to dust off or learn some programming skills, I encourage you to look at the many Apache Software Foundation projects available and jump into one of the communities. You'll almost certainly want to join one of the project's mailing lists, and your involvement can grow from there.

Nick Couchman is a Senior Linux Systems Engineer and Technical Team Lead for a major cosmetics conglomerate, and spends his days trying to convince everyone that they should run more Linux and less...other stuff.  He spends his evenings with his family and increasingly small amounts of free time contributing to the Apache Guacamole project, learning how to write C, Java, and JavaScript.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works" https://blogs.apache.org/foundation/category/SuccessAtApache

# # # 

Tuesday April 10, 2018

Success at Apache: Am I there yet? A n00b's perspective

by Charles Givre

Let me start out by saying that I am not a developer. I do have a technical background, but I hadn't coded in Java for at least 10 years before I got involved in the Apache Drill project. One has to wonder how, as a non-developer, I ended up as a committer for the Drill project. In this blog post, I'd like to share with you how I came to be involved with the Drill project.

But first, why Drill?

I first heard about Drill at an industry conference several years ago. I was speaking with Dr. Ellen Friedman about some data issues we were having, and she casually asked whether I had tried Drill. I had not heard of it at that point, so I did some research, and it seemed as if Drill could solve a lot of problems my clients were having. But then I tried using it and kept getting stuck.

If you aren't familiar with Apache Drill, it is an SQL engine that allows you to query any kind of self-describing data. After experimenting with Drill for a while, I was impressed enough to think that the tool had major potential in security. One of the biggest problems Drill solves is the need to Extract, Transform, Load (ETL) data into an analytic tool before actually doing analysis of that data. This ETL process adds no real value, and it costs large enterprises literally millions of dollars while adding unnecessary delays between the time data is ingested and when it is actually available for analysis. In security applications, this delay directly translates into risk: the longer it takes to make your data available, the longer it will take to find malicious activity, and hence the more risk. Therefore, if you're able to query the data without any kind of ETL or ingestion, you are lowering your risk as well as potentially saving millions of dollars.

Getting Involved

Unfortunately, when I started using Drill, I saw this potential but couldn't get it to work. My next step was to try to get assistance at my company. I pitched the idea to my company's leadership, but it proved very difficult to get them to pull Java developers from revenue-generating projects to work on this "pie-in-the-sky", unproven project. After spending several months on this, I got really frustrated and decided that I was going to try to do it myself. However, I really had no idea what I was doing: I hadn't coded in Java for at least 10 years at the time, and had zero experience with modern Java development tools such as Maven and Git. What I did have was persistence, so I started asking for help and decided to dive right in and start adding the functionality that I felt Drill needed to be useful in security applications. I started working on something that someone else had started -- the HTTPD format plugin for Drill. Most of the coding was done, but there was still enough left for me to get my hands dirty and start figuring things out.

What I learned

I still would not consider myself a developer, but after getting that particular item committed to the codebase, I learned a lot about how Open Source projects actually work, as well as about writing production-quality code. Since then, I've tried to add at least one bit of new functionality to each Drill release. I would encourage anyone who is interested in contributing to an Open Source project at the Apache Software Foundation to dive right in and start. I still have a lot of ideas for Drill, and I hope to find the time to see them through to implementation.

In conclusion, I'm fairly certain that my involvement with Drill and the Apache Software Foundation is really just beginning. I'm currently working on the O'Reilly book about Apache Drill with a fellow Drill committer. It is my hope that the book will spark additional interest in Apache Drill. Open Source software is at the heart of the ongoing data revolution which is dramatically expanding what is possible with data. I firmly believe that Apache Drill will have a role to play in this data revolution and I'm honored to have the opportunity to play a small role in developing Drill.

Charles Givre CISSP is a Lead Data Scientist at Deutsche Bank where he works in the Chief Information Security Office (CISO). Mr. Givre is an active data science instructor and regularly teaches classes about data science and security at various industry conferences, such as BlackHat. Mr. Givre is a committer for the Apache Drill project and together with Mr. Paul Rogers, is working on the forthcoming O’Reilly book about Apache Drill. He can be reached at cgivre(at)apache(dot)org.  

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works" https://blogs.apache.org/foundation/category/SuccessAtApache

# # #

Thursday March 22, 2018

Announcing New ASF Board of Directors

At The Apache Software Foundation (ASF) Members’ Meeting held this week, the following individuals were elected to the ASF Board of Directors:

 - Rich Bowen
 - Shane Curcuru
 - Bertrand Delacretaz
 - Isabel Drost-Fromm
 - Ted Dunning
 - Brett Porter
 - Roman Shaposhnik
 - Phil Steitz
 - Mark Thomas

The ASF thanks Chris Mattmann and Jim Jagielski, who chose not to stand for re-election this year, for their service, and lauds Jagielski's 19 years as a member of the ASF Board.

An overview of the ASF's governance, along with the complete list of ASF Board of Directors, Executive Officers, and Project/Committee Vice Presidents, can be found at http://apache.org/foundation/

For more information on the Foundation's operations and structure, see http://apache.org/foundation/how-it-works.html#structure

# # #

Monday March 05, 2018

Success at Apache: Open Innovation from a Non-native English Country

by Von Gosling

When I saw the "Success at Apache" series, I thought about writing something about my Open Source experience of these past few years as someone from a non-native English-speaking country. Last year, RocketMQ graduated from the Apache Incubator and became an Apache Top-Level Project. As one of the original co-founders of RocketMQ, I was proud to see the Apache RocketMQ community grow ever more diverse. The Apache Software Foundation (ASF), one of the most famous and respected technology brands, produces projects on which thousands of companies base their software infrastructure, as the worldwide download-mirror activity in the ASF statistics shows. As an early implementer and pioneer of Open Source in China, Apache HTTP Server, Apache Tomcat, Apache Struts 1.x, and Apache Maven were my favorite software stacks when I worked on building distributed, high-performance websites.

Last year, I wrote an article about the road to becoming an Apache TLP, which was published on China's InfoQ. Some people asked me how to be more 'Apache' and how to build a more diverse community. These are questions that many people are concerned about. In this blog post, I will address how to collaborate better around the world, especially from non-native English-speaking countries.

Open Communication

With more and more instant-messaging apps appearing in the Android and iOS world, the younger generation prefers to communicate that way, and this has spread into the daily coding life of the majority of people. But instant messaging is not search-engine friendly, and in most cases it does not support multiple channels for multiple languages. I have been involved in many such local technology groups; together we have discussed what went wrong, explored ideas about how to solve it, and come up with good solutions. This method worked for all my past projects, but when we hoped to be more involved in Open Source around the world, it did not work well. I remember clearly that when RocketMQ began to discuss its proposal, some people complained about what we were doing in the local community. We learned much from this discussion in the community and found an effective solution. Hence, in the Apache RocketMQ community, we encourage users to ask questions on the user mailing list, and to keep the communication effective, we answer each question in the same language in which it was asked. With more and more committers coming from different countries, this helps grow a more diverse community. But, as John Ament said in another "Success at Apache" post https://s.apache.org/x9Be -- open communication isn't for everything. We also allow private communication between users and us, as some questions might not be proper to discuss publicly; but that is not a part of the decision-making process. Likewise, any time we are talking about individuals, in either a positive or negative way, the discussion should be conducted on the project's private list.

Easy ways to be involved in the community
This is another top concern in the Open Source world. Some people may not know that in China there are many local communities around Apache projects such as Apache HTTP Server, Apache Tomcat, Apache Spark, and Apache Hadoop, and that these projects have corresponding Chinese documentation. At the same time, we try our best to improve the English documents. We pay attention to the feedback behind every documentation page: if someone spots a problem with the wording, small or large, they can leave a message or send feedback to our dev or user mailing list. Besides documentation, we also hold programming marathons in the community from time to time to get people more involved. In such campaigns we find more users with strong interests, especially in cross-domain technology. Recently, we offered more tasks through Google Summer of Code, where students develop Open Source software full-time for three months. We provide mentoring and project ideas, and in return we have the chance to get new code developed and --most importantly-- to identify and bring in new committers. It is another way for PMC members to learn how to improve, and for more students to get involved in the community easily.

In China, Internet giants like Alibaba are devoting themselves to Open Source projects, so in my personal experience it made sense to help more excellent Chinese projects come into the Incubator. Right before the Lunar New Year, another famous project from China, Dubbo, started its Apache journey. I am glad to be a local mentor and hope to continue to share what we have learned. Thanks to the ASF, more and more Open Source projects will benefit our daily coding. That holds great appeal across the world's Open Source field.

Von Gosling is a senior technology manager at Alibaba Group. He has extensive industry software development experience, especially in distributed technology, reliable Web architecture, and performance tuning, and holds many patents in distributed systems, recommendation engines, and related areas. He has been a frequent speaker at Open Source and architecture conferences worldwide, including ApacheCon and QCon. He has been the lead for messaging at Alibaba as well as the Tenth and Sixteenth CJK OSS Award recipient. He is an original Apache RocketMQ co-founder and the Linux OpenMessaging standard initiator.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works". 1) Project Independence https://s.apache.org/CE0V 2) All Carrot and No Stick https://s.apache.org/ykoG 3) Asynchronous Decision Making https://s.apache.org/PMvk 4) Rule of the Makers https://s.apache.org/yFgQ 5) JFDI --the unconditional love of contributors https://s.apache.org/4pjM 6) Meritocracy and Me https://s.apache.org/tQQh 7) Learning to Build a Stronger Community https://s.apache.org/x9Be 8) Meritocracy. https://s.apache.org/DiEo 9) Lowering Barriers to Open Innovation https://s.apache.org/dAlg 10) Scratch your own itch. https://s.apache.org/Apah 11) What a Long Strange (and Great) Trip It's Been https://s.apache.org/gVuN 12) A Newbie's Narrative https://s.apache.org/A72H 13) Contributing to Open Source even with a high-pressure job https://s.apache.org/lM9O 14) Open Innovation from a Non-native English Country https://s.apache.org/lh61

# # # 

Monday February 26, 2018

Success at Apache: Contributing to Open Source even with a high-pressure job

by Anthony Shaw

I believe in the mission of the ASF for many reasons, but the first is the reason I got into open-source software: free and open access to knowledge.

Back when I was 12 (in 1998), I started to learn to program in dBase 4. dBase 4 and the Clipper compiler were not cheap, especially on a $5-a-week paper round. The boxed software was unwanted by a local company, and it came with the manuals. We didn't have the internet at home yet, so I was left to go by the manual and what I could find in second-hand stores and office cleanout sales. For the next decade, I learnt to code based on what I could find, borrow, and scavenge, until in 2002 I got a copy of Linux and assembled a couple of machines from unwanted parts from the village computer store.

This is where I discovered free and open-source software and really started to build on my coding skills.

My goals were to learn and to share what I'd learnt, so that others could get to where they needed to go faster. It also helped that software skills were much sought-after in Europe, so it set me off on a career in IT.

20 years after I learnt to code, I've moved out of software engineering and into Learning and Development at Dimension Data, a 29,000-person technology company that operates in 49 countries across the world. My current role involves about 3 months a year of travel (15 countries typically), managing a department of over 30 people spread across 4 countries and 4 timezones, and delivering on large and complex initiatives with high degrees of change and short deadlines.

In 2016, after getting promoted into my current role, I made a choice: I would continue to contribute to the open-source projects I'd worked on for years. But I set myself 3 rules:

1. I would not take away from time with my family
2. I would not interfere with my work commitments
3. I would look after my health

My open-source contributions

For the past 4 years I've made around 1,000-2,000 contributions annually. These have consisted of bug fixes and other submissions to around 50 projects.

The largest contributions I've made have been to Apache Libcloud, a multi-cloud abstraction library written in Python. Initially this was driven by a work commitment to contribute an integration with the cloud API we'd designed, but I soon realised the power of the library. Going back to my original goal of free and open access to knowledge, I'd seen an alarming trend in the computing world: proprietary APIs were driving what is known in the industry as "stickiness" or, to be frank, lock-in.

Cloud lock-in means that anyone without a reliable network, money, or the willingness to sign up to these contracts is pushed out of advances in technology. I know developers who are students, who live in remote areas such as rural Australia, Asia, and Africa, or who simply have little money.

Apache Libcloud's design means that you can build applications that can be deployed to OSS platforms like Apache CloudStack and OpenStack.

After finishing the work driver, I spent around 100 hours developing a container abstraction layer for Apache Libcloud, which meant that developers could write automation for OSS platforms like Kubernetes using the same API as they would with a public cloud provider.

This was all whilst managing family time, work commitment and my health.

These are my 3 tips for maintaining contributions with a high-pressure job:

1. Pick a project that you care about

This is the most important. Something that merely sparks your curiosity is good fun, but long-term interest often dwindles. I've been a victim of "ooh, shiny thing" many times in the past, but as my career has taken off, I've had to develop the discipline to stop myself from writing my own scripting language or building an automated sprinkler system from scratch. I stop and remind myself that I might have the time this second, but what about next week and next month? Stop and prioritise.

Prioritise projects that mean something to you.

The 2 OSS projects I commit the most to are Apache Libcloud and SaltStack. I believe in Apache Libcloud's mission of giving open access to cloud platforms. My SaltStack contributions have focused on cloud abstraction, networking API abstraction, and other fixes and utilities that make things easier for developers and end-users.

The difference between picking something shiny and something you believe in is that, long term, you commit more and you find it easier to jump in and help when you can. But how do you find the time?

2. Choosing your tasks wisely and making time

I get asked this question all the time: "How do you find the time?" When I try to convince people to contribute to OSS, the response is always about time.

Get rid of the things that don't add value

If you can afford to, hire help to give you back time in your week. Not only does open source help with your skills and knowledge, it increases your value to a potential employer. Hiring someone to blow the leaves or help with the chores once a week doesn't need to cost a lot, and if you work out how much value you get back from that time, it often makes sense.

Another thing I've been strict about is limiting binge-watching TV series and gaming. Playing 100 hours of the latest game might be fun, but I find developing more rewarding in the medium-to-long term. Find ways to unwind that don't consume so much time, like meditation, exercise, or reading.

But, if you do need to put your feet up and watch some TV for a few hours, don't feel guilty about it. 

Work smart, not hard

When I do sit down to contribute something, I'll have carefully planned and thought through what I'm going to do, what I'm going to test, and how I'm going to structure it. I try to complete tasks quickly, with foresight and a goal. Once I've completed one module, with tests, I'll submit my contribution. Don't try to refactor the whole project over a weekend. Keep it simple.

But we all know that sometimes the best plans go out the window. You may find yourself going down one of those rabbit holes where you can't get something to compile, or you can't debug one of those zombie bugs we love so much as developers.

Stop yourself.

You can easily sit until 3am banging your head against the wall trying to figure it out. This was my advice when I used to manage development teams: if you get stuck, take a break, ask for help, and if that still doesn't work, move on to something else.

Sometimes I pause working on a task if I can't figure it out. Pause for an hour, a week, or even a whole year. When you have one of those "aha" moments, you go back in and finish the job.

It saves time, it delivers better software and it's a good skill to have as a developer.

Find time

A contribution comes down to 3 things:

1. An idea
2. An understanding
3. A "change", like a fix, feature, test, code-review, documentation etc.

The ideas come to me through reading, listening to users or looking at bug submissions. I do this as and when I have a spare minute. This is normally on my lunch break, when I'm waiting for someone or something. 

I make time for understanding by listening to podcasts and talking to people at conferences. I get a few hours a week in the car, and I spend time doing chores; during that time I always have headphones on, listening to the newest Python podcast or OSS update.

The time to sit down and write, code, or test comes for me on the plane (where I'm writing this blog post!). Last year I did enough miles in the air to fly around the world 8 times, and most of that time was spent coding, relaxing, or sleeping. Aside from that, whenever I'm in an airport lounge, on the train, or waiting for someone, I'll whip out my laptop. On any plane that has Wi-Fi I can push changes; otherwise, the minute we land I'll have the laptop open and running git push.

Weekend-time is off limits unless I'm travelling or I'm alone. That's rule 1 -- do not take away from time with the family.

3. Managing your workload and avoiding burnout

There are 2 components to this: managing your work commitments and managing your contributions. You need to do both to succeed.

It's ok to stop and take a break. There is always a pull request to merge, a bug to inspect, or an email from an end-user. If you need to take a break for a while, talk to the team, ask for help, and be frank. We're all in the same boat, and contribution is optional.

So many times I see people contributing while feeling they have an absolute obligation to test and fix bugs at 2am and then go to work at 8am. This is normally because they care about the project, they care about quality, and they care about their reputation, but sometimes you need to step back.

A strong project community will step up and help. If you know that work is going to be tough for the next few months, tell the team and set yourself a limit. Wind back for a bit until things calm down. 

Managing work commitments is tough, because there are often financial consequences (or at least a perception of them).

After 7 hours, you're not really adding value. I used to have a lounge chair next to my desk, and now that I work from home I have a hammock. After a few hours of solid concentration I'll happily go and sit down and do nothing for an hour. Your brain needs a break; sure, you'll get the odd "working hard" jab from a passer-by, but I'm working smarter, not harder. Once I'm refreshed I'll finish the next task about 30-40% quicker, to a better level of quality and insight. On the occasions I've done 12-14 hour work days, my brain starts shutting down to conserve energy, and critical thinking is the first thing to switch off, followed by logical thinking. This is where you make mistakes and deliver work of lower quality than you'd normally expect.

I live close to the beach, so my time out is going for a swim in the ocean or spending a bit of time with my family. As a manager, I also see a responsibility to make it clear that stepping back to recharge is encouraged. I'll just say in our chat channel that I'll be offline for a couple of hours because I'm going to the beach mid-afternoon. I don't feel guilty about it, and I hope my team does the same.

Learn how to say no and don't feel guilty about it. When I coach people on this I ask: "Who asked you to do this? Was no an option? What value is there in delivering this? What is the consequence of not doing it? Who else could do it?"

Everyone wants to be helpful and indispensable, but your reliability is just as important to your reputation as what you deliver. 


Look after your health, be smart with your time and contribute for a cause.

Anthony Shaw is the Group Director of Innovation and Talent Development at Dimension Data, an NTT company. Anthony is an open-source advocate, member of the Apache Software Foundation and Python Software Foundation and active contributor to over 20 open-source projects including Apache Libcloud and SaltStack. At Dimension Data, Anthony is driving digital transformation for Dimension Data’s global clients across 50 countries and 30,000 employees. Key initiatives are software skills, automation, DevOps and Cloud. Anthony is based in Sydney, Australia and blogs about skills, software and automation to 170,000 readers annually.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works". 1) Project Independence https://s.apache.org/CE0V 2) All Carrot and No Stick https://s.apache.org/ykoG 3) Asynchronous Decision Making https://s.apache.org/PMvk 4) Rule of the Makers https://s.apache.org/yFgQ 5) JFDI --the unconditional love of contributors https://s.apache.org/4pjM 6) Meritocracy and Me https://s.apache.org/tQQh 7) Learning to Build a Stronger Community https://s.apache.org/x9Be 8) Meritocracy. https://s.apache.org/DiEo 9) Lowering Barriers to Open Innovation https://s.apache.org/dAlg 10) Scratch your own itch. https://s.apache.org/Apah 11) What a Long Strange (and Great) Trip It's Been https://s.apache.org/gVuN 12) A Newbie's Narrative https://s.apache.org/A72H 13) Contributing to Open Source even with a high-pressure job https://s.apache.org/lM9O

# # # 

Monday February 05, 2018

Success at Apache: A Newbie’s Narrative

by Kuhu Shukla

As I sit at my desk on a rather frosty morning with my coffee, looking up new JIRAs from the previous day in the Apache Tez project, I feel rather pleased. The latest community release vote is complete, the bug fixes that we so badly needed are in and the new release that we tested out internally on our many thousand strong cluster is looking good. Today I am looking at a new stack trace from a different Apache project process and it is hard to miss how much of the exceptional code I get to look at every day comes from people all around the globe. A contributor leaves a JIRA comment before he goes on to pick up his kid from soccer practice while someone else wakes up to find that her effort on a bug fix for the past two months has finally come to fruition through a binding +1.

Yahoo – which joined AOL, HuffPost, Tumblr, Engadget, and many more brands to form the Verizon subsidiary Oath last year – has been at the frontier of open source adoption and contribution since before I was in high school. So while I have no historical trajectories to share, I do have a story about how I found myself on an epic journey of migrating all of Yahoo's jobs from Apache MapReduce to Apache Tez, a then-new DAG-based execution engine.

Oath's grid infrastructure is driven through and through by Apache technologies, be it storage through HDFS, resource management through YARN, job execution frameworks with Tez, or user interface engines such as Hive, Hue, Pig, Sqoop, Spark, and Storm. Our grid solution is specifically tailored to Oath's business-critical data pipeline needs, using the polymorphic technologies hosted, developed, and maintained by the Apache community.

On the third day of my job at Yahoo in 2015, I received a YouTube link to "An Introduction to Apache Tez". I watched it carefully, trying to keep up with all the questions I had, and recognized a few names from my academic readings of YARN ACM papers. I continued to ramp up on YARN and HDFS, the foundational Apache technologies Oath heavily contributes to even today. For the first few weeks I spent time picking out my favorite (necessary) mailing lists to subscribe to and setting up a pseudo-distributed Hadoop cluster. I continued to find my footing with newbie contributions, being ever more careful with whitespace in my patches. One thing was clear – Tez was the next big thing for us. By the time I could truly call myself a contributor in the Hadoop community, nearly 80-90% of Yahoo's jobs were running on Tez. But just like hiking up the Grand Canyon, the last 20% was where all the pain was. Being part of the solution to this challenge was a happy prospect, and thankfully contributing to Tez became a goal for my next quarter.

The next sprint planning meeting ended with me getting my first major Tez assignment – progress reporting. The progress reporting in Tez was non-existent – "Just needs an API fix,"  I thought. Like almost all bugs in this ecosystem, it was not easy. How do you define progress? How is it different for different kinds of outputs in a graph? The questions were many.

I, however, did not have to go far to get answers. The Tez community actively came to a newbie's rescue, finding answers and posing important questions. I started attending the bi-weekly Tez community sync-up calls and asking existing contributors and committers for course correction. Suddenly the team was much bigger, the goals much more chiseled. This was new to anyone like me who came from the networking industry, where the most open parts of the code are the RFCs and the implementation details are often hidden. These meetings served as a clean room for our coding ideas and experiments. Ideas were shared, down to which data structure we should pick and what a future user of Tez would take from it. In between the usual status updates, extensive knowledge transfers took place. 

Oath uses Apache Pig and Apache Hive extensively and most of the urgent requirements and requests came from Pig and Hive developers and users. Each issue led to a community JIRA and as we started running Tez at Oath scale, new feature ideas and bugs around performance and resource utilization materialized. Every year most of the Hadoop team at Oath travels to the Hadoop Summit where we meet our cohorts from the Apache community and we stand for hours discussing the state of the art and what is next for the project. One such discussion set the course for the next year and a half for me.

We needed an innovative way to shuffle data. Frameworks like MapReduce and Tez have a shuffle phase in their processing life cycle wherein the data from upstream producers is made available to downstream consumers. Even though Apache Tez was designed with a feature set corresponding to optimization requirements in Pig and Hive, the Shuffle Handler Service was retrofitted from MapReduce at the time of the project's inception. With several thousand jobs on our clusters leveraging these features in Tez, the Shuffle Handler Service became a clear performance bottleneck. So as we stood talking about our experience with Tez with our friends from the community, we decided to implement a new Shuffle Handler for Tez. All the conversation points were now tracked through an umbrella JIRA, TEZ-3334, and the to-do list was long. I picked a few JIRAs, and as I started reading through I realized this was all new code I would get to contribute to and review. There might be a better way to put this, but to be honest it was just a lot of fun! All the whiteboards were full; the team took walks post-lunch and discussed how to go about defining the API. Countless hours were spent debugging hangs while fetching data and looking at stack traces and Wireshark captures from our test runs. Six months in and we had the feature on our sandbox clusters. There were moments ranging from sheer frustration to absolute exhilaration with high fives as we continued to address review comments and fix big and small issues with this evolving feature.

As much as owning your code is valued everywhere in the software community, I would never go on to say "I did this!" In fact, "we did!" It is this strong sense of shared ownership and fluid team structure that makes the open source experience at Apache truly rewarding. This is just one example. A lot of the work that was done in Tez was leveraged by the Hive and Pig communities, and cross-project community interaction within Apache made the work ever more interesting and challenging. Triaging and fixing issues with the Tez rollout led us to hit a 100% migration score last year, and we also rolled the Tez Shuffle Handler Service out to our research clusters. As of last year, we have run around 100 million Tez DAGs with a total of 50 billion tasks over almost 38,000 nodes.

In 2018, as I move on to explore Hadoop 3.0 as our future release, I hope that if someone outside the Apache community is reading this, it will inspire and intrigue them to contribute to a project of their choice. As an astronomy aficionado, I'd say going from a newbie Apache contributor to a newbie Apache committer was very much like looking through my telescope: it holds endless possibilities and challenges you to be your best.

Kuhu Shukla is a software engineer at Oath and did her Masters in Computer Science at North Carolina State University. She works on the Big Data Platforms team on Apache Tez, YARN, and HDFS with a lot of talented Apache PMC members and Committers in Champaign, Illinois. A recent Apache Tez Committer herself, she continues to contribute to YARN and HDFS, and spoke at the 2017 Dataworks Hadoop Summit on "Tez Shuffle Handler: Shuffling At Scale With Apache Hadoop". Prior to that she worked on Juniper Networks' router and switch configuration APIs. She likes to participate in open source conferences and women-in-tech events. In her spare time she loves singing Indian classical and jazz, laughing, whale watching, hiking, and peering through her Dobsonian telescope.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works". 1) Project Independence https://s.apache.org/CE0V 2) All Carrot and No Stick https://s.apache.org/ykoG 3) Asynchronous Decision Making https://s.apache.org/PMvk 4) Rule of the Makers https://s.apache.org/yFgQ 5) JFDI --the unconditional love of contributors https://s.apache.org/4pjM 6) Meritocracy and Me https://s.apache.org/tQQh 7) Learning to Build a Stronger Community https://s.apache.org/x9Be 8) Meritocracy. https://s.apache.org/DiEo 9) Lowering Barriers to Open Innovation https://s.apache.org/dAlg 10) Scratch your own itch. https://s.apache.org/Apah 11) What a Long Strange (and Great) Trip It's Been https://s.apache.org/gVuN 12) A Newbie's Narrative https://s.apache.org/A72H

# # #  

Monday January 29, 2018

The Apache Software Foundation 2018 Vision Statement

The Apache Software Foundation describes its Vision Statement, the basis for the 5-year strategic plan in development.

Our mission is to support communities that create and distribute Open Source software at no charge under the Apache License, per our Bylaws. To this end we provide mentoring, virtual collaboration space, and resources for project communities to develop, steward, and incubate Open Source software under our legal umbrella.

We are strongly committed to our projects' independence from any external influences, be they corporate, organizational or otherwise. This allows us to provide a neutral environment for our communities.

As a Foundation we intentionally do not define a technical strategy: that is determined by our projects. The Foundation's goals center on communities, as opposed to technologies.

The Foundation is managed and directed by its Members, who are individual volunteers. Companies or organizations can neither be members of the Foundation, nor take a role in the governance of our projects.

We help our communities understand and adopt the Apache Way, a collection of documented practices for collaboration and project sustainability that we amend on an ongoing basis.

We expect our community members to act as individuals. Their rights and responsibilities are based solely on their merit, defined by what they individually do in project communities, not on any external affiliation, title, or degree they may have, nor on their contributions to external projects or other organizations. We expect all Apache community members to adhere to our established Code of Conduct.

We provide highly reliable and automated core infrastructure services to our projects. We permit projects to use external non-core services based on their specific needs, allowing our own services to remain simple and focused. For durability, all our critical data and services are managed or mirrored on systems that we fully control.

Our marketing and outreach is focused on activities that directly support our mission, educate the public about Apache projects and the Apache Way, and help attract the types of communities for which our Foundation is a suitable home.

Our fundraising-related activities help identify and retain sponsors and financial contributions on which our operations depend. We welcome donations from corporations and individuals who support our mission.

We provide our projects legal and brand management services based on demonstrated needs, and define policies and best practices to help our projects benefit from the strong Apache brand in an appropriate way.

We welcome new projects via our Incubator, where experienced mentors help guide them to operate as Apache community-led projects. As Incubation is where diverse communities are defined, we put a strong emphasis on coaching to promote self-governance and preserve our core values as the Foundation grows.

# # #

Sunday December 31, 2017

Apache in 2017 - By The Digits

What an exciting and productive year for the Apache community at-large! We owe our continued success to the tireless efforts of our Members, Committers, and contributors, the loyalty from countless users worldwide, and the ongoing financial support from our Sponsors and individual donors. Join us for a look back at our achievements:

Apache Projects —
Total number of projects + sub-projects - 318 (not including Apache Labs initiatives)
Top-Level Projects - 193
Podlings in the Apache Incubator - 53

Community/People — 
ASF Members (individuals) - 683 
New Members elected - 64
Apache Committers - 6,504 (6,165 active)

Apache Code —
3,050 Committers changed 60,276,457 lines of code over 188,262 commits.

Top 5 Apache Committers 
  1. Shad Storhaug (2,472 commits; 1,465,542 lines changed)
  2. Claus Ibsen (2,406 commits; 560,595 lines changed)
  3. Jean-Baptiste Onofré (2,142 commits; 1,243,862 lines changed)
  4. Mark Thomas (1,954 commits; 113,266 lines changed)
  5. Colm Ó hÉigeartaigh (1,768 commits; 521,215 lines changed)
Top 5 Apache Project Repositories by Commits
  1. Hadoop
  2. Ambari
  3. Lucene-Solr
  4. Camel
  5. Ignite
Top 5 Apache Project Repositories by Size (Lines of Code)
  1. OpenOffice (6,375,345)
  2. NetBeans (5,536,881)
  3. Flex (whiteboard: 5,164,279; SDK 3,919,006)
  4. Trafodion (3,077,781)
  5. Mynewt (core: 2,748,040)

"If it didn't happen on-list, it didn't happen."

Total number of mailing lists - 1,131
21,772 authors sent 1,617,547 emails on 642,005 topics

Top 10 most active Apache mailing lists (user@ + dev@)
  1. Flex
  2. Lucene
  3. Ignite
  4. Kafka
  5. Geode
  6. Flink
  7. Tomcat
  8. Cassandra
  9. Beam
  10. Sentry

Contributor License Agreements and Software Grants —
We are welcoming nearly 300 new code contributors and 300-400 new people filing issues each month. Individuals who are granted write access to the Apache repositories must submit an Individual Contributor License Agreement (ICLA). Corporations that have assigned employees to work on Apache projects as part of an employment agreement may sign a Corporate CLA (CCLA) for contributing intellectual property via the corporation. Individuals or corporations donating a body of existing software or documentation to one of the Apache projects need to execute a formal Software Grant Agreement (SGA) with the ASF. 

ICLAs signed - 860
CCLAs signed - 27
Software Grants submitted - 18

Sponsorship and Individual Support —
Thank you to our hundreds of individual donors, our Platinum Sponsors: Cloudera, Comcast, Facebook, Google, LeaseWeb, Microsoft, and Yahoo; our Gold Sponsors: ARM, Bloomberg, Hortonworks, Huawei, IBM, ODPi, PhoenixNAP, and Pivotal; our Silver Sponsors: Alibaba Cloud Computing, Budget Direct, Capital One, Cash Store, Cerner, Inspur, iSIGMA, Private Internet Access, Red Hat, Serenata Flowers, Target, Union Investment, and WANdisco; our Bronze Sponsors: 7 Binary Options, Airport Rentals, The Blog Starter, Casino2k, Compare Forex Brokers, HostingAdvice.com, HostPapa Web Hosting, The Linux Foundation, Mobile Slots, Samsung, Spotify, Talend, Travel Ticker Hotels, Web Hosting Secret Revealed, WebsiteSetup, and Wise Buyer; and our Infrastructure Sponsors: Bintray, Freie Universität Berlin, HotWax Systems, No-IP, OSU Open Source Labs, PagerDuty, Quenda, Rackspace, Sonatype, SURFnet, and Symantec.

Collectively, our Members, developers, contributors, users, supporters, and sponsors are the reason Apache Is Open https://s.apache.org/PIRA

Here’s to a great 2018!

# # #

Tuesday December 12, 2017

Success at Apache: What a Long Strange (and Great) Trip It's Been

By Jim Jagielski

It is normally during this time of year that people get awfully retrospective. We look over the last 12 months and come to terms with what kind of year it has been. We congratulate ourselves on the good and (hopefully) learn from the bad. We basically assess the ending year and start planning, even a little bit, for the one to come.

In general, we reminisce.

I am thinking not about 2017, however, but instead of 1995 and the origins of The Apache Software Foundation. And what a long, strange, and great trip it's been. And how incredibly lucky I've been to be a part of it.

A common saying is that success is mostly about being in the right place at the right time, and although I'm not sure about the "success" part, it certainly applies to me. At the time I was working at NASA, starting off a side business as an ISP and Web hoster, and using the old NCSA web server. I had built a small reputation for myself as an "expert" on a flavor of UNIX called A/UX, which was Apple's UNIX offering at the time. In addition to being the editor of the FAQ for A/UX, I also ported a bunch of "free software" to that platform, and that's how I got started with Apache: providing patches to support A/UX, which is what I used as my web hosting platform. It was really no different from what I did for other software projects at the time.

And then something wonderful happened. I got hooked.

I really, really enjoyed the people I was collaborating with. I wasn't an "outsider" providing patches, I was part of the inner circle. I was a full-fledged member of the Apache Group. I started to really understand just how all this could change the world, and how I could maybe be a small part of it.

As a result, Apache changed my life, literally. Instead of doing software development as a way of "getting my job done" (at NASA, I was a power system engineer, so I would code modeling and simulation software for spacecraft solar arrays, batteries, and orbital mechanics), I started doing software development as my job, in addition to my hobby. Apache and Open Source became a huge part of my life, and my career changed to focus almost entirely on Open Source, a change that continues to this day.

During this time I've been fortunate enough to work with, and learn from, extremely talented people. Not only on code, but on legal matters, interpersonal skills, presentation skills, and more. I've had opportunities that I never imagined and met people I never would have expected to otherwise. I've made great friends. I've been mentored by incredibly giving people and have mentored in return. And I have seen my mentees become mentors themselves.

Over the years, I've seen Apache grow from a rag-tag group of people working on a web server to one of the leading Open Source foundations in the world, with more than 300 projects under our belt. I've been blessed to serve on the board of the ASF every single year since we incorporated in 1999, seeing 2nd and now 3rd "generation" Apache Members take up the reins.

The Open Source movement, and especially Apache, have given more to me than I could ever pay back, and that is why I still volunteer and contribute. Of course, to be honest, I still get a kick out of it, and love what I am doing, and continue to enjoy the opportunities and, especially, the people that I get to work with.

But, you see, I'm nothing special. All this is also open and available to you. You too can change the world, and have your world changed in return. We all have talents that can be shared, talents that can be recognized and rewarded. Apache is a family, always looking for new family members. 

So take that first step. Find a project and community you want to be a part of. Jump in. Have fun. Grow. Learn. Teach. Live.

But just be prepared to get hooked, and have your life change.

Jim Jagielski is a well known and acknowledged expert and visionary in Open Source, an accomplished coder, and frequent engaging presenter on all things Open, Web and Cloud related. As a developer, he's made substantial code contributions to just about every core technology behind the Internet and Web; in 2012 he was awarded the O'Reilly Open Source Award and in 2015 received the Innovation Luminary Award from the EU. He is likely best known as one of the developers and co-founders of the Apache Software Foundation, where he has previously served as both Chairman and President and where he's been on the Board of Directors since day one. Currently he is Vice-Chairman. He's served as President of the Outercurve Foundation and was also a director of the Open Source Initiative (OSI). Up until recently, he worked at Capital One as a Sr. Director in the Tech Fellows program. He credits his wife Eileen with keeping him sane.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works". 1) Project Independence https://s.apache.org/CE0V 2) All Carrot and No Stick https://s.apache.org/ykoG 3) Asynchronous Decision Making https://s.apache.org/PMvk 4) Rule of the Makers https://s.apache.org/yFgQ 5) JFDI --the unconditional love of contributors https://s.apache.org/4pjM 6) Meritocracy and Me https://s.apache.org/tQQh 7) Learning to Build a Stronger Community https://s.apache.org/x9Be 8) Meritocracy. https://s.apache.org/DiEo 9) Lowering Barriers to Open Innovation https://s.apache.org/dAlg 10) Scratch your own itch. https://s.apache.org/Apah 11) What a Long Strange (and Great) Trip It's Been https://s.apache.org/gVuN

# # # 

Wednesday October 25, 2017

Success at Apache: Scratch Your Own Itch.

By Ignasi Barrera

Recently I was at an industry conference and was happy to see many people stopping by the Apache booth. I was pleased that they were familiar with the Apache brand, yet puzzled to learn that so many were unfamiliar with The Apache Software Foundation (ASF).

It's important to recognize not just Apache's diverse projects and communities, but also the entity behind their success.

Gone are the days when software, and technology in general, was developed privately for the benefit of the few. As technology evolves, the challenges we face become more complex, and the only way to effectively move forward to create the technology of the future is to collaborate and work together. Open Source is a perfect framework for that, and organizations like the ASF carry out a decisive role in protecting its spirit and principles.

The ASF's mission is to provide software for the public good. We take it one step further by giving all our Open Source software away for free. In keeping with this mission, the Foundation was established back in 1999 as a US 501(c)(3) non-profit charitable organization, and constitutes an independent legal entity to which companies and individuals can donate resources and be assured that those resources will be used for the public benefit. The Foundation's all-volunteer nature and the meritocracy model followed by its communities are the pillars of the neutral, trusted space where Apache software is developed.

We strongly believe that good software is built by strong communities. Successful Open Source projects are the result of the work and collaboration in their communities and the people behind them. It is all about the people. Experience has shown us that helping people work together as peers is key in producing software in a sustainable way, and we have collected the lessons learned all these years in what we call "The Apache Way".

This Apache Way is a set of core behaviors all Apache projects follow that are designed to ensure projects are independent and diverse, and that anyone can participate no matter what gender, culture, time zone, employer, or even expertise they have. One can start collaborating with a project by contributing patches or implementing new features, but merit is not measured by code contributions alone. Helping users, improving documentation, promoting the project, and other non-coding activities are very valuable and recognized as such, and this merit and involvement are rewarded with more privileges in the project: from commit access, to invitations to join the Project Management Committee, to invitations to join the ASF Membership. One of the great differentiators between the ASF and other open source foundations is that the ASF does not dictate the technical direction of its projects: each Apache project is overseen by a self-selected team of active contributors to the project. A Project Management Committee (PMC) guides its project's day-to-day operations, including community development and product releases. Meritocracy drives the growth of the communities, and ensures anyone can contribute to projects that are governed by the people who are involved in and really care about them.

Learning to work this way is not always easy, though. Projects come to the Foundation from very different backgrounds, and whilst some of them already have communities that are used to collaborating in open ways, others find it challenging to embrace these core behaviors. The Apache Incubator is the main entry point for codebases and their communities wishing to officially become part of the Foundation, and is where they learn how to put all these principles into practice. Some will find this way of working a good way to govern a project and will graduate as Apache top-level projects; some may find that the Foundation is not the best option for them and choose to leave. Both options are good outcomes, as the projects will have invested time in thinking about their community model and how they want governance to work, and this always benefits the Open Source world.

This Open Source model not only exists to create sustainable Open Source projects, but also to meet the expectations of the rest of the world. Software developed at Apache comes with a set of guarantees granted by the popular and business-friendly Apache License, but also with others that are the product of this open governance model, such as project independence or a well-defined project lifecycle. The ASF not only defines how projects operate while active, but also what happens when a project reaches its end-of-life, which is also important for adoption but often not considered by Open Source projects.

These guarantees, along with the reputation earned by many years of producing high-quality Open Source software, make the 300+ freely available Apache projects, from Abdera to HTTP Server to Hadoop to ZooKeeper, a trusted choice for individuals and companies looking for Open Source solutions.

The saying "Scratch Your Own Itch" is popular in the tech space, and is an integral principle at the ASF. Apache Committers have a responsibility to the community to help create a product that will outlive the interest of any particular volunteer, as well as for helping to grow and maintain the health of the Apache community.

As an ASF Member, I'm helping with project outreach and mentoring new individuals that make up the greater Apache community.

The Apache Software Foundation provides a safe place for Open Source development, and will keep evolving as technology evolves, welcoming all kinds of projects and communities, and helping people embrace Open Source. Let's see what the future holds for the Open Source world and how we can contribute to making it a better place. Scratch your own itch.

Ignasi Barrera is a long-term Open Source contributor and became involved with the ASF in 2013, when jclouds was first submitted to the Apache Incubator. He is a member of the Apache jclouds Project Management Committee and still actively contributes to the project. Ignasi became an ASF Member in 2015, and helps with community development activities and the promotion of Open Source. 

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works". 1) Project Independence https://s.apache.org/CE0V 2) All Carrot and No Stick https://s.apache.org/ykoG 3) Asynchronous Decision Making https://s.apache.org/PMvk 4) Rule of the Makers https://s.apache.org/yFgQ 5) JFDI --the unconditional love of contributors https://s.apache.org/4pjM 6) Meritocracy and Me https://s.apache.org/tQQh 7) Learning to Build a Stronger Community https://s.apache.org/x9Be 8) Meritocracy. https://s.apache.org/DiEo 9) Lowering Barriers to Open Innovation https://s.apache.org/dAlg 10) Scratch your own itch. https://s.apache.org/Apah

# # #

Monday October 02, 2017

Success at Apache: All My Roads Led to Apache

by Pat Ferrel

I became involved with Apache in 2011, after several years in startups where, as CTO, I felt too removed from building things. Looking for a change, I was keenly aware that the most interesting thing about those startups was our early use of Machine Learning techniques, and I wanted to see if building ML solutions for companies new to the field might be more satisfying. I started by spending nearly a year researching the types of applications we had needed in the startups: Natural Language Processing (NLP), text analysis, clustering, and classification. In those days Apache Mahout http://mahout.apache.org/ had several good solutions that were designed for Big Data and approachable by an individual. These ideas seem fairly commonplace now but were in their early days only six years ago.

Given a great platform to experiment with, I built a web site to advertise expertise in ML but also to showcase many examples from my experiments, including a topic-oriented content site based on clustered and classified text that used NLP to add entities to text. I blogged about things I had learned and techniques that produce results.

Then I got my first contact about a project, and it was from a completely unexpected direction: recommenders. Fortunately, Apache Mahout then had a state-of-the-art OSS suite of recommenders, so I took the consulting job. The company had rolled their own recommender and was selling it as a service, but it was old and they wanted to investigate replacing it.

Welcome to Big Data

The nature of recommenders means you deal with huge amounts of data because you have to track several million people’s actions over years. We had data from a large online retailer and were tasked with using this data to beat the in-house recommender. Specifically they wanted to see if they could improve performance (better results and faster compute times) and get something easier to maintain. 

The first job of a good consultant is to define the problem and outline a path to resolution that fits with the company's competencies. To me this meant looking at the current system and the expertise of the people working on it. We had Data Scientists and Java Software Developers who knew what it was like to deal with Big Data. They had a highly performant method for gathering data and were quite good at running Apache Hadoop-based analytics. This was seldom the case back then, but it happily allowed me to look at less turnkey applications and assume the use of important Apache tools.

We agreed on a plan and the basic building blocks, including a method for comparing results. I did the research and proposed several candidates for the tests, including the Apache Mahout recommenders. It was pretty easy to rank the recommender engines we had and do some exploration of parameter tuning and choices to get our best "challenger" results. The nice thing was that we beat the old threadbare in-house recommender by a significant amount (12%). The winner was the Apache Mahout Cooccurrence Recommender using the Log-Likelihood Ratio as the core cooccurrence metric. This was true even though we had tested against several Matrix Factorization recommenders, including Mahout's.
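For readers curious about that metric: the Log-Likelihood Ratio scores how anomalous two items' cooccurrence is, given how often each occurs on its own. A minimal sketch, following the entropy-based formulation used in Mahout's LogLikelihood class (the counts below are made up for illustration):

```python
import math

def xlogx(x):
    # Convention: 0 * log(0) = 0
    return 0.0 if x == 0 else x * math.log(x)

def entropy(*counts):
    # Unnormalized Shannon entropy over raw counts.
    return xlogx(sum(counts)) - sum(xlogx(c) for c in counts)

def llr(k11, k12, k21, k22):
    """Log-likelihood ratio for a 2x2 contingency table.

    k11: users who interacted with both items
    k12, k21: users who interacted with only one of the two
    k22: users who interacted with neither
    """
    row_entropy = entropy(k11 + k12, k21 + k22)
    col_entropy = entropy(k11 + k21, k12 + k22)
    mat_entropy = entropy(k11, k12, k21, k22)
    return max(0.0, 2.0 * (row_entropy + col_entropy - mat_entropy))

print(llr(5, 5, 5, 5))    # independent items: score near zero
print(llr(10, 0, 0, 10))  # strongly linked items: large score
```

Items whose score clears a threshold are kept as "similar"; everything else is treated as noise, which is what lets the method cope with very skewed popularity distributions.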

We need something new 

Up until this time I had been only a user of Apache projects (discounting a few minor code contributions), but what I found in all the recommenders we studied was a fundamental problem that is still mostly unsolved today. We had data from a retailer that included user "buys" but also 100 times more user "views". None of the recommenders could deal with this multimodal data. I consulted the authors and maintainers of the Mahout recommenders and several others we had targeted. We got some suggestions, added them to our own ideas, and set out to test them. For various reasons that are beyond the scope of this post, none of the easy solutions helped; some actually produced worse results. I had fulfilled the contract, but I left with a feeling of unfinished business.

One of the mentors of Apache Mahout, Ted Dunning, had suggested a new idea during this time. There was something about it that seemed very intriguing. He had proposed a way to use one type of user behavior to predict another. This was an aha moment for me because it codified intuition. I remember the first time he wrote in email on the Mahout user mailing list the equation that crystallized it all. I began to imagine the implications; all sorts of new data that could be useful, not just "views" but contextual data like location, and enrichment data like tag or category preferences. These all seem to obviously have a bearing on recommendations but now we had a beautiful simple equation to test the intuition.
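The flavor of that idea can be sketched in a few lines. Treating each behavior as a user-by-item matrix, cooccurrence compares a behavior with itself, while cross-occurrence compares the primary behavior with a secondary one. The tiny matrices below are invented purely for illustration:

```python
import numpy as np

# Rows are users, columns are items; 1 means the user performed the action.
buys = np.array([[1, 0, 1],
                 [0, 1, 1],
                 [1, 0, 0]])    # the behavior we want to predict
views = np.array([[1, 1, 0],
                  [0, 1, 1],
                  [1, 0, 1]])   # a different, usually far noisier, behavior

cooccurrence = buys.T @ buys        # "bought X and also bought Y"
cross_occurrence = buys.T @ views   # "bought X and also *viewed* Y"

print(cooccurrence)
print(cross_occurrence)
```

In CCO, each raw count is then tested with a statistic such as LLR, and only the anomalously large entries survive as "indicators", which is how weak secondary signals like views, location, or category preferences can be made to predict the primary action without drowning it in noise.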

Becoming a Committer

I set out to hack the Mahout Cooccurrence Recommender into a Correlated Cross-Occurrence (CCO) recommender. But without some way of testing the algorithm and code we couldn't be sure it was worth including in Mahout. The datasets publicly available at the time did not have the kind of data we needed (there had been no direct use for it until then), so I scraped the film review site rottentomatoes.com to collect "fresh" and "rotten" reviews of movies. This gave us two different behaviors with very different meanings. Naively you might think to weight one positively and the other negatively, and so did I, but that produced worse results than ignoring the "dislikes" altogether. However, when I ran cross-validation tests comparing the Mahout Cooccurrence Recommender using likes only to CCO using both user actions, we got some quite interesting results. The question was: do "dislikes" predict "likes"? When I got a 20% lift in predictive precision, we could conclude that they do. Not only was intuition right, but the new algorithm could tease out the data to make use of it.

The hack was accepted into Mahout Examples and I was invited to become a committer. Then the world changed.

Apache Spark and Mahout-Samsara

When I became a committer Mahout was written on Apache Hadoop MapReduce in Java (as was my hack). But it had also become obvious to most Mahout committers that the future was with much more performant engines like Apache Spark. Committers Dmitriy Lyubimov and Sebastian Schelter had been working on a Spark version of Mahout. In an instant of project time virtually all committers saw this as the future of Mahout, if also a major pivot. 

In retrospect I'm not sure I've ever seen an Apache project change so much in so little time. Today Mahout is deprecating lots of old Hadoop MapReduce code as it falls from use, and the new Mahout is truly new. The Mahout subtitle, Samsara, references the cycle of life, death, and rebirth in the Hindu tradition. Mahout started as algorithms written specifically for MapReduce; Mahout-Samsara is now a linear algebra DSL in Scala used to roll your own algorithms, with the most interesting algorithms available as very simple DSL-based implementations. Mahout eventually took this transformation even further to include other compute engines like Apache Flink, and is now running on GPUs. But I get ahead of things...

Those were exciting times and though I helped with the DSL I remained fixed on implementing CCO, which was first included in Mahout 0.10.0 in October 2014.


Now we had the CCO algorithm implemented on modern compute engines, but several other problems remained before we could actually deploy a recommender. This is because CCO creates a model that needs to be deployed on a special type of server that computes similarity in real time. In Machine Learning terms this is a K-Nearest Neighbors engine, known in concrete terms as Lucene, or its scalable server derivatives like Solr and Elasticsearch. A turnkey recommender also requires a highly performant, massively scalable DB, like HBase. Putting these together, we could get a nearly turnkey recommendation server that made use of multimodal real-time user behavior. But I didn't see a candidate for all of this in Apache, and so looked elsewhere. This called for an integration project, unlike Mahout: one that integrated other services rather than providing its own.
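To make the deployment story concrete, here is a hypothetical sketch of how such a system fits together: each item document carries per-behavior indicator fields produced by the model, and recommending for a user becomes an ordinary search query against that user's recent history. The index name, field names, and item names below are all made up; only the generic Elasticsearch bool/terms query shape is real:

```python
# A user's recent history, split by behavior type (invented example data).
user_history = {
    "buy":  ["tablet", "laptop"],
    "view": ["phone-case", "monitor"],
}

# Items whose indicator fields overlap the user's history score highest;
# the search engine's similarity scoring does the K-Nearest-Neighbors work.
query = {
    "query": {
        "bool": {
            "should": [
                {"terms": {"buy_indicators":  user_history["buy"]}},
                {"terms": {"view_indicators": user_history["view"]}},
            ]
        }
    },
    "size": 10,
}

# With an Elasticsearch client this would be sent as something like:
# results = es.search(index="items", body=query)
```

The appeal of this design is that the expensive model computation happens offline in batch, while serving is just a search query, so the recommender inherits the scalability and latency characteristics of the search engine.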

I found a project that included everything I needed and was Apache licensed, but it was run by a small startup called PredictionIO. They had a Machine Learning Server that was a framework for Templates that could implement a wide range of algorithms. The Server also included nice high-level integrations with Elasticsearch (a Lucene server), Spark, and HBase. In May of 2015 I had the first running CCO Server, built on Mahout and a whole list of other Apache projects.

Back to Apache

PredictionIO was in the right place to get swept up in a major move to embrace ML/AI by Salesforce, which bought the company as part of its Einstein initiative. Since PIO was Apache-licensed OSS, it was still available, and so was the Template I was calling the Universal Recommender. But there was now a question about the future of PIO: what would Salesforce do with it? The old team, which I had worked closely with, wanted to see the project move forward in OSS, and Salesforce seemed to agree, but large corporations often have a mixed record in promoting their own OSS projects. In this case Salesforce decided to remove the question by submitting PredictionIO to the Apache Incubator.

The old team was joined by people like me from outside Salesforce to create a project that follows the Apache Way and is free of corporate dominance. I am a committer to PredictionIO, which has three releases under Apache Incubator oversight, and the Universal Recommender is now at v0.6.0, the most popular of the PredictionIO Template Algorithms.

With the 3rd release of PIO from Apache, we are now in the process of graduating to an Apache Top-Level Project, hatched by the Apache Incubator. I fully expect that we'll be celebrating soon.


My journey began with a specific problem to solve. Each step toward producing the solution has led back to Apache in one way or another: through mentors, collaboration, and the use of, and commitment to, several projects. But I now have a mature, scalable, performant, state-of-the-art, nearly turnkey Universal Recommender. Now we can ingest and get improvements from many types of behavior, enrichment data, and context, using it in real time to serve recommendations subject to robust business rules. My small consulting company ActionML actionml.com now has a powerful tool to solve real problems, and we make a living (at least partly) by helping people deploy and tune it for their data.

This is the story of someone single-mindedly following a goal over several years. There are many ways to do this in the Software Development world, but not all OSS projects are open to bringing people in. The Apache Software Foundation most certainly is, and it openly recruits as diverse a group of committers and members as possible. If you want to make a difference and influence the course of an OSS project, Apache is a good place to look. Start by getting involved with a project of interest, make contributions, and get involved in discussions. If the match is good you'll be invited in as a committer and move on from there. I think of Apache as a do-ocracy: if you do something of value, it goes a long way towards being invited in.


Slides describing the CCO Algorithm: https://www.slideshare.net/pferrel/unified-recommender-39986309

IBM DevWorks Post on "Making one thing Predict Another": https://developer.ibm.com/dwblog/2017/mahout-spark-correlated-cross-occurences/

Apache Mahout CCO Implementation: http://mahout.apache.org/users/algorithms/intro-cooccurrence-spark.html

Apache PredictionIO: http://predictionio.incubator.apache.org/

The Universal Recommender Template: http://predictionio.incubator.apache.org/gallery/template-gallery/

Professional Support for the Universal Recommender: http://actionml.com/universal-recommender

# # #

"Success at Apache" focuses on the processes behind why the ASF "just works". 1) Project Independence https://s.apache.org/CE0V 2) All Carrot and No Stick https://s.apache.org/ykoG 3) Asynchronous Decision Making https://s.apache.org/PMvk 4) Rule of the Makers https://s.apache.org/yFgQ 5) JFDI --the unconditional love of contributors https://s.apache.org/4pjM 6) Meritocracy and Me https://s.apache.org/tQQh 7) Learning to Build a Stronger Community https://s.apache.org/x9Be 8) Meritocracy. https://s.apache.org/DiEo 9) Lowering Barriers to Open Innovation https://s.apache.org/dAlg

Tuesday September 05, 2017

Success at Apache: Lowering Barriers to Open Innovation

By Luke Han

For more than a decade I was a Java developer using many Apache projects such as Tomcat, Jakarta, Struts, and Velocity. In 2010 I stepped into the Big Data field, started to actively participate in Apache projects, and became an ASF Member 3 years ago. In addition to being the VP of Apache Kylin, I helped projects such as Apache Eagle and CarbonData move to the ASF, and have been a mentor for Apache Superset, Weex, and RocketMQ. Today, I'm co-founder/CEO of Kyligence (prior to that, I was Big Data Product Lead at eBay, and Chief Consultant of Actuate China).

Apache Kylin, as its name may suggest, originated from China ("Kylin": A powerful yet gentle fire-breathing creature in eastern mythology. Also written as Qilin. "Apache Kylin": OLAP on Hadoop, capable of analyzing petabytes of data within seconds http://kylin.apache.org/ ). I started this project with a few members in early 2015. 

As the pioneer of the first highly recognized Apache project from the Eastern world, I was proud to see that, within two years, Kylin had helped over 500 organizations across the globe to solve their Big Data challenges.

Before Kylin graduated from the Apache Incubator, the Kylin team faced a lot of cultural challenges. Since a great number of projects from China had failed in the past, we too received many questions and doubts from both the Eastern and Western worlds. As our native language is not English, communication with mentors was at times difficult during the coaching process. Fortunately, by fully embracing The Apache Way, Kylin was able to succeed with strong support from Apache community members. Far beyond the Kylin software itself, our team has also worked with those talented people to spread our Chinese voice to the world.

While developing high-quality software, we are helping more Westerners understand Eastern culture. I have had many chances to travel and meet people across the globe since I initiated Kylin. Some of them are Apache directors and mentors; some are developers and contributors. Some are from the US, Australia, Canada, and Chile; some are from Japan and Taiwan. Some are impressed with Kylin; some are curious about Easterners' attitude toward Open Source software. I asked them a lot of questions about The Apache Way, and they all generously coached me and my team with lovely and detailed answers. We were also able to reach consensus after intense, open debate. Kylin received much more encouragement and recognition than I expected.

As the VP of a Top-Level Project, my responsibility grew after Kylin graduated from the Apache Incubator. Kylin has faced more opportunities, as it is bug-fixed quickly and tested frequently, in the nature of Open Source software. In China's famously large market, Apache Kylin has received a great deal of user feedback and evolved fast. We received many suggestions from both the developers' perspective and the product's perspective. Beyond my expectations, many community members are passionately writing tools for Kylin and helping users better understand and use Kylin. Gathering members' ideas, we are also sharing our knowledge as a way to give back to the community.

Thanks to the ASF and everyone involved in the Open Source community, I have the opportunity to work with people I have always admired and to make a difference in the world together with them. I feel that my team and I are deeply connected to such a warm, global, open community.

= = =

"Success at Apache" is a monthly blog series that focuses on the processes behind why the ASF "just works". 1) Project Independence https://s.apache.org/CE0V 2) All Carrot and No Stick https://s.apache.org/ykoG 3) Asynchronous Decision Making https://s.apache.org/PMvk 4) Rule of the Makers https://s.apache.org/yFgQ 5) JFDI --the unconditional love of contributors https://s.apache.org/4pjM 6) Meritocracy and Me https://s.apache.org/tQQh 7) Learning to Build a Stronger Community https://s.apache.org/x9Be 8) Meritocracy. https://s.apache.org/DiEo

# # # 


