Idea: Split Secrets for OAuth

We are talking about ways to establish shared secrets where both the Tool Consumer and Tool Provider contribute key material to an overall shared key used to sign and validate OAuth messages. Often these “secrets” are treated as strings of varying length. Common practice is to choose random numbers with something like PHP’s uniqid() function or Java’s UUID and then hex encode the random bits, producing strings of varying length.

Using the current approach, (a) we cannot assume the serialization of this data and (b) the secrets can be of effectively any length (short or long). By not specifying an encoding that allows us to transmit bit-level randomness, we implicitly shorten key lengths: we have to fall back to strings, and likely strings with a very limited character set.

We have not yet seen situations where secrets include non-Latin1 characters. As we move secrets across web services, serialization becomes increasingly important, and if we get too tricky with character sets we might find ourselves with some interoperability problems.

My proposal is to define the binary bit-length of the two halves of the “split secret” and insist that these are serialized using a known serialization so both sides can de-serialize these pieces to cryptographically strong secrets with a well understood bit length.

So each of the sides contributes 512 cryptographically random bits to the shared secret. When each side communicates its half, it is serialized and transferred as 128 hex characters, using only lowercase letters for a-f. An example of a half-secret is as follows:

941c7f8f929ad915b0a8810a6eedee5e5a5cedbab1bee5e4e2f05df6ed926e8042bca5127a7fac88ab581526e78b193b99fdfe234d40496eca32431447b752af
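A half-secret like this takes only a couple of lines to produce. Here is a minimal sketch in PHP, assuming PHP 7’s random_bytes() is available (older PHP would typically substitute openssl_random_pseudo_bytes()):

<?php
// Generate 512 bits (64 bytes) of cryptographically strong randomness
// and serialize it as 128 lowercase hex characters.
$half = bin2hex(random_bytes(64));

echo strlen($half) . "\n";   // 128
echo $half . "\n";           // e.g. 941c7f8f929ad915...447b752af

Since bin2hex() always emits lowercase a-f, the serialization is deterministic and the receiver can de-serialize the half back to exactly 64 raw bytes (512 bits).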

To form the OAuth consumer secret, the two hex halves are just concatenated as hex strings. Since the OAuth signing simply appends the key to the message and computes a digest, we can make use of all 1024 bits of randomness by using a 256-character hex-encoded key. While this means that the key has a known character set (0-9 and a-f), it makes up for that by being 4 times longer. We also avoid the encoding problems that would arise if we allowed non-Latin1 characters in the OAuth shared secret.
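In code, forming and using the combined secret is straightforward. Here is a hedged sketch of the standard OAuth 1.0 HMAC-SHA1 step in PHP; the variable names are mine and the signature base string is a placeholder, not a real OAuth base string:

<?php
// Each side contributes one 128-character hex half.
$consumer_half = bin2hex(random_bytes(64));
$provider_half = bin2hex(random_bytes(64));

// The 256-character OAuth consumer secret is simply the concatenation.
$consumer_secret = $consumer_half . $provider_half;

// OAuth 1.0 HMAC-SHA1 signing: the key is the consumer secret plus
// an '&' and the token secret (empty here for simplicity).
$base_string = 'GET&http%3A%2F%2Fexample.com%2F&...';  // placeholder only
$key = $consumer_secret . '&';
$signature = base64_encode(hash_hmac('sha1', $base_string, $key, true));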

By specifying the bit length and encoding, both sides can build database models that store secrets in fixed-length fields.

By ensuring there are 1024 bits of cryptographically strong randomness, other uses – like sending data between the sides with two-way encryption approaches such as Blowfish or AES – can derive shorter keys from the known 1024 bits of randomness.
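For example, a 256-bit AES key could be derived by hashing the shared secret. This particular derivation is just my illustration of the idea, not part of the proposal:

<?php
// Derive a 256-bit AES key from the 1024-bit shared secret by hashing it.
$aes_key = hash('sha256', $consumer_secret, true);  // 32 raw bytes

// Use the derived key for two-way encryption between the sides.
$iv = random_bytes(16);
$ciphertext = openssl_encrypt('payload', 'aes-256-cbc', $aes_key, OPENSSL_RAW_DATA, $iv);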

I am just putting this up because I like openness in the design of any security scheme, in case I made any mistakes or incorrect assumptions.

This design is not at all final – comments are very welcome.

Tracing History from the “Imitation Game” to the Modern-Day Internet (#IHTS)

In a sense, Alan Turing’s cryptography, code-breaking, and computer science work at Bletchley Park, featured in the Imitation Game movie, was the kickoff for the modern-day Internet as well as modern-day electronic computing technologies. For the first time in history, communication was essential for survival, and applying computation to understanding communication was critical to success or failure in World War II. Unprecedented funding was poured into research in mathematics, computer science, social science, linguistics, and many other fields. Bletchley Park was one of the world’s first great large-scale cross-disciplinary research labs. The creativity and innovation at Bletchley Park had a tremendous impact on the results of World War II and the shape of our world to the present day.

If you are interested in learning how we got from Bletchley Park to today’s Internet – I would invite you to attend my free self-paced Internet History, Technology, and Security course on Coursera.

IHTS was one of the first 20 pioneering MOOCs as Coursera was rolled out in 2012 (yes two years seems like a long time ago). And now IHTS is one of the first Coursera courses to pioneer a new self-paced format that allows students to start and take courses at any time and at their own pace.

We have initially soft-launched IHTS so students can view all of the lectures and supplementary materials. Over the coming months, we will be adding quizzes and other assessments so that the self-paced offering includes all the features of the previous scheduled cohort-based offerings on Coursera – except with no deadlines :).

The course is a mix of lectures and interviews with Internet innovators. All of the course materials are open and available under a CC-BY Creative Commons License to allow reuse of the lecture materials.

I hope to see you in class.

Riding My Way Back – Veterans Day – And my own Big-Screen Film Debut

Tomorrow is Veterans Day, and I will be attending a film screening of “Riding My Way Back” on Tuesday, November 11 at 7PM at the Celebration Cinema in South Lansing.

Riding My Way Back (http://www.ridingmywayback.com) is a documentary film about a veteran who came back from Iraq and Afghanistan with traumatic brain injury (TBI) and Post-Traumatic Stress Disorder (PTSD) and how his relationship with a horse named “Fred” helped him rebuild his life.

In addition to Riding My Way Back, we will be showing “CHUM Families”, a documentary about parents and children who are part of the C.H.U.M. Therapeutic Riding family. I produced the film, and it is the first time any of my work will be shown in a big-screen cinema.

Here is a preview of the CHUM Families movie on YouTube.

The proceeds for the showing will go to support the programs at C.H.U.M. Therapeutic Riding (www.chumtherapy.net).

I hope to see you there.

How to Achieve Vendor Lock-in with a Legit Open Source License – Affero GPL

Note: In this post I am not speaking for the University of Michigan, IMS, Longsight, or anyone else. I have no inside information on Kuali or Instructure and am basing all of my interpretations and commentary on the public communications from the kuali.org web site and other publicly available materials. The opinions in this post are my own.

Before reading this blog post, please take a quick look at this video about Open Source:



The founding principles of Open Source from the video are as follows:

  1. Access to the source of any given work
  2. Free Remix and Redistribution of Any Given Work
  3. End to Predatory Vendor Lock-In
  4. Higher Degree of Cooperation

A decade ago, efforts like Jasig, Sakai, and Kuali were founded to collaboratively build open source software to meet the needs of higher education and achieve all of the above goals. Recently Kuali announced a pivot toward Professional Open Source. Several years ago the Sakai and Jasig communities decided to form a new shared non-profit organization called Apereo to move away from Community Source and toward pure Apache-style open source. So interestingly, at this time, all the projects that coined the term “Community Source” no longer use the term to describe themselves.

In the August 22 Kuali announcement of the pivot from non-profit open source to for-profit open source, there was a theme of how much things have changed in the past decade since Kuali was founded:

…as we celebrate our innovative 2004 start and the progress of the last decade, we also know that we live in a world of change. Technology evolves. Economics evolve. Institutional needs evolve. We need to go faster. We need a path to a full suite of great products for institutions that want a suite. So it is quite natural that a 10-year-old software organization consolidates its insights and adapts to the opportunities ahead.

There were many elements in the August 22 announcement that merit discussion (i.e. here and here) but I will focus on these particular quotes from the FAQ that accompanied the August 22 announcement:

This plan is still under consideration. The current plan is for the Kuali codebase to be forked and re-licensed under Affero General Public License (AGPL).

The Kuali Foundation (.org) will still exist and will be a co-founder of the company. … The Foundation will provide initial capital investment for the company out of its reserves.

In a follow-up post five days later on August 27 they clarified the wording about licensing and capital:

All software that has been released under the current, Open Source Initiative approved Educational Community License (ECL) will and can continue under that license.

The software license for work done by the new entity and from its own capital will be the Open Source Initiative approved Affero GPL3 license (AGPL3).

While the details and overall intent of the August 22 and August 27 announcements from the Kuali Foundation may seem somewhat different, the AGPL3 license remains the central tenet of the Kuali pivot to professional open source.

The availability of the AGPL3 license and the successful use of AGPL3 to found and fund “open source” companies that can protect their intellectual property and force vendor lock-in *is* the “change” that has happened in the past decade. It underlies both of these announcements, and it makes a pivot away from open source and toward professional open source an investment with the potential for high returns to shareholders.

Before AGPL3

Before the AGPL3 license was created, there were two main approaches to open source licensing – Apache-style and GPL-style. The Apache-like licenses (including BSD, MIT, and ECL) allow commercial companies to participate fully in both the active development of the code base and the internal commercial use of that code base without regard to mixing of their proprietary code with the open source code.

The GNU General Public License (GPL) had a “sticky” copyleft clause that forced any modifications of redistributed code to also be released open source. The GPL license was conceived pre-cloud, so its terms and conditions were all about distribution of software artifacts and not about standing up a cloud service with GPL code that had been modified by a company or mixed with proprietary code.

Many companies chose to keep it simple and avoided making any modifications to GPL software like the Linux kernel. Those companies could participate in Apache projects with gusto, but they kept the GPL projects at arm’s length. Clever companies like IBM that wanted to advance the cause of GPL software like Linux would hire completely separate and isolated staff to work on Linux. They (and their lawyers) felt they could meet the terms of the GPL license by having one team tweak their cloud offerings based on GPL software and a completely separate team work on GPL software, never letting the two teams meet (kind of like matter and anti-matter).

So clever companies could work closely with GPL software and the associated projects if they were very careful. In a sense, because the GPL had this “loophole”, while it was not *easy* for a commercial company to engage in a GPL project when it tweaked the GPL software for its own production use, it was *possible* for a diverse group of commercial companies to engage constructively in GPL projects. The Moodle project is a wonderful example of a great GPL project (well over a decade of success) with a rich multi-vendor ecosystem.

So back in 1997 the GPL and Apache-like licenses appeared far apart; in practice, as the world moved to the cloud, the copyleft clause in the GPL became less and less of a problem. GPL-licensed code could leverage a rich commercial ecosystem almost as well as Apache-licensed code. The copyleft clause in the GPL had become much weaker by 2005 because of the shift to the cloud.

AGPL – Fixing the “loophole” in GPL

The original purpose of the GPL license was to insist that over time all software would be open source and its clause to force redistribution was a core element.

For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.

The fact that these cloud vendors could “have their cake and eat it too” was easily fixed by making the AGPL3 license tighter than the GPL license with the addition of this clause:

The GNU Affero General Public License is designed specifically to ensure that, in such cases, the modified source code becomes available to the community. It requires the operator of a network server to provide the source code of the modified version running there to the users of that server. Therefore, public use of a modified version, on a publicly accessible server, gives the public access to the source code of the modified version.

This seems simple enough. Fix the flaw. The GPL license did not imagine that someday software would not be “distributed” at all and only run in the cloud. The AGPL3 license solves that problem. Job done.

But solving one problem in the GPL ethos causes another in the marketplace. AGPL3 ensures that we can “see” the code that those who remix it and run it on servers develop, but it creates an unfortunate asymmetry that can be exploited to achieve a combination of vendor lock-in and open source.

AGPL3 = Open Source + Vendor Lock-In

The creators of the GPL generally imagined that open source software would have a diverse community around it and that the GPL (and AGPL) licenses were a set of rules for how that community interacted, constraining companies working with GPL software to bring their improvements back to the commons. But just as the GPL founders did not imagine the cloud, the AGPL creators did not imagine that open source software could be created inside a proprietary organization and that the AGPL license would ensure that a diverse community would never form (or would take a really long time to form) around the open source software.

These days in educational technology it is pretty easy to talk to someone on your Caltrain commute and get $60 million in venture capital for an educational technology startup. But your VCs want an exit strategy where they make a lot of money. There are likely no examples of VC-backed companies that used an Apache-like license for their core technology that even got funded, let alone succeeded. That hippie-share-everything crap just does not cut it with VCs. Vendor lock-in is the only way to protect asset value and flip that startup or go public.

Clever company founders figured out how to “have their cake and eat it too”. Here is the strategy. First take VC money and develop some new piece of software. Divide the software into two parts – (a) the part that looks nice but is missing major functionality and (b) the super-awesome add-ons to that software that really rock. You license (a) using the AGPL3 and license (b) as all rights reserved and never release that source code.

You then stand up a cloud instance of the software that combines (a) and (b) and allow no self-hosted versions of the software, which might entail handing your (b) source code to your customers.

Since the (a) portion is incomplete, it poses no threat to the commercial cloud offering. And since the (a) part is AGPL, it is impossible for a multi-vendor commercial ecosystem to emerge. If a small commercial competitor wants to augment the (a) code to compete with the original vendor that has (a)+(b) running in the cloud, it is bound by the AGPL3 license to publish all of its improvements. This means that if the second company comes up with a better idea than the original company, the original company gets it, and any and all competitors of the second company get the improvement for free as well. But if the original company makes an improvement, it keeps the improvement hidden and proprietary, extending its advantage over all other commercial participants in the marketplace.

You can see this theme in the August 22 Kuali FAQ where they talk about “What happens to the Kuali Commercial Affiliates (KCAs)?”:

There will be ample and growing opportunities for the KCAs to engage with Kuali clients. The company would love for KCAs to take on 80% or more of the installation projects. The Kuali platform will continue to become more and more of a platform that KCAs can augment with add-ons and plugins. In addition, KCAs will likely be used to augment the company’s development of core code and for software projects for Kuali customers.

Reading this carefully, the role for companies other than Kuali, Inc. is to install the software developed by the new “Kuali, Inc.” company or perhaps develop plugins. With the source code locked into AGPL3, the greatest role that a community of companies can play is to be “Kuali, Inc.’s little helpers”. The relationship is not a peer relationship.

When a company builds a proprietary product from scratch and releases a portion of it under AGPL3, there never was a commons, and the AGPL3 license is the best open source license the company can use to ensure that there never will be a true commons.

Revisiting – AGPL – Fixing the “bug” in GPL (oops)

Now the AGPL3 advocates actually achieve their goals when the original company goes out of business. Even though we never see the (b) component of the software, the (a) part is open source, so a truly open ecosystem could emerge around the carcass of the company – but by the time the company failed, it is not likely that its “half-eaten code carcass” would be all that useful.

What is far more likely is that the company using the AGPL strategy would get a few rounds of VC, thrive, and sell itself for a billion dollars or go public for a few billion dollars. After the founders pocket the cash, there would no longer be a need to market the company as “open source”, so they would just change the license on (a) from AGPL3 to a proprietary license and stop redistributing the code. Since the (b) code was always proprietary, after a few months of improvements to the (a) code in a non-open-source fashion, and given the deep interdependence of the (a) and (b) code, the open copy of (a) has effectively died on the vine. The resulting company has a wonderfully proprietary and closed source product with no competitors, and the VCs have another half-billion dollars to give to some new person on a Caltrain ride. And the “wheel of life” goes on.

Each time open source loses and VCs and corporations win, I am sure somewhere in the world, about ten Teslas get ordered and a puppy cries while struggling to make it to the next level.

Proprietary Code is a Fine Business Model

By this time (if you have read this far) you probably have tagged this post as #tldr and #opensourcerant – it might indeed warrant #tldr – but it is not an open source rant.

I am a big fan of open source, but I am also a big fan of proprietary software development. Well over 90% of the software in the educational technology market is proprietary. Excellent proprietary offerings come from companies like Blackboard, Coursera, Instructure (part b), Piazza, Microsoft, Google, Edmodo, Flat World Knowledge, Pearson, McGraw Hill, Apple, and many others. Without them, open source efforts like Sakai and Moodle would not exist. I am not so foolish as to believe that purely open source solutions will be sufficient to meet the needs of this market that I care so much about.

The right combination in a marketplace is a combination of healthy and competitive open source and proprietary products. This kind of healthy competition is great because choices make everyone stronger and keep teams motivated and moving forward:

  • Linux and Microsoft Windows
  • Microsoft Office and LibreOffice
  • Sakai and Blackboard
  • Apache HTTPd and Microsoft IIS
  • ….

The wisest of proprietary companies even see fit to invest in their open source competitors because they know it is a great way to make their own products better.

The reason that the “open source uber alles” strategy fails is that proprietary companies can raise capital far more effectively than open source efforts. This statement from an earlier Kuali blog post captures this nicely:

We need to accelerate completion of our full suite of Kuali software applications, and to do so we need access to substantially more capital than we have secured to date to meet this need of colleges and universities.

This problem is also why it is very rare for an open source product to dominate and push out proprietary competitors. Open source functions best as a healthy alternative and reasonably calm competitor.

AGPL3 + Proprietary + Cloud Strategy in Action

To their credit, Instructure has executed the AGPL3 open/closed hybrid strategy perfectly for their Canvas product. They have structured their software into two interlinked components and only released one of the components. They have shaded their marketing the right way so they sound “open source” to those who don’t know how to listen carefully. They let their fans breathlessly re-tell the story of “Instructure Open Source” and Instructure focuses on their core business of providing a successful cloud-hosted partially open product.

The Kuali pivot of the past few weeks to create Kuali, Inc. (actual name TBD) is pretty clearly an attempt to replicate the commercial success of the Instructure AGPL3 strategy, but in the academic business applications area. This particular statement from the August 22 Kuali announcement sums it up nicely:

From where will the founding investment come?

The Foundation will provide initial capital investment for the company out of its reserves. Future investment will come from entities that are aligned with Kuali’s mission and interested in long-term dividends. A first set of investors may be University foundations. There is no plan for an IPO or an acquisition.

Read this carefully. Read it like a lawyer, venture capitalist, or university foundation preparing to invest in Kuali, Inc. would read it. The investors in Kuali, Inc. may be more patient than the average investor, but they are not philanthropic organizations making a grant. The AGPL license strategy is essential to ensuring that an investment in Kuali, Inc. has the potential to repay its investors’ capital as well as a nice profit for those patient investors.

Is there any action that should be taken at this time? If I were involved in Kuali or on the board of directors of the Kuali Foundation, I would be very suspicious of any attempted change to the license of the code currently in the Kuali repository. A change of the kind of license or a change to “who owns” the code would be very significant. The good news is that, per the August 27 Kuali post, it appears that at least for now a board-level wholesale copyright change is off the table.

All software that has been released under the current, Open Source Initiative approved Educational Community License (ECL) will and can continue under that license.

I think that a second issue is more about the individual Kuali projects. There are lots of Kuali projects, and each project is at a different maturity level and has its own community and its own leadership. The approach to Kuali, Inc. might differ across the Kuali Foundation projects. In particular, if a project has a rich and diverse community of academic and commercial participants, it might be in that community’s best interest to ignore Kuali, Inc. and just keep working with the ECL-licensed code base and manage its own community using open source principles.

If you are a member of a diverse community working on and using a Kuali project (Coeus and KFS are probably the best examples of this) you should be careful not to ignore a seemingly innocuous board action to switch to AGPL3 in any code base you are working on or depending on (including Rice). Right now because the code is licensed under the Apache-like Educational Community License, the fact that the Foundation “owns” the code hardly matters. In Apache-like licenses, the owner really has no more right to the code than the contributors. But as soon as the code you are working on or using is switched to AGPL3, it puts all the power in the hands of the copyright owner – not the community.

A worrisome scenario would be to quietly switch the license to AGPL3 and have the community continue to invest in the Kuali Foundation version of the code for a year or so. A year from now, the Kuali Foundation Board could then transfer ownership of the code to someone else, and you would have to scramble and pick through the AGPL3 bits and separate them out if you really wanted to continue as a community. This is usually so painful after a year of development that no one ever does it.

The Winter of AGPL3 Discontent

If we look back at the four principles of open source that I used to start this article, we can quickly see how AGPL3 has allowed clever commercial companies to subvert the goals of Open Source to their own ends:

  • Access to the source of any given work – By encouraging companies to only open source a subset of their overall software, AGPL3 ensures that we will never see the source of the part (b) of their work, and that we will only see the part (a) code until the company sells itself or goes public.
  • Free Remix and Redistribution of Any Given Work – This is true unless the remixing includes enhancing the AGPL work with proprietary value-add. The owner of the AGPL-licensed software is completely free to mix in proprietary goodness – but no other company is allowed to do so.
  • End to Predatory Vendor Lock-In – Properly used, AGPL3 is the perfect tool to enable predatory vendor lock-in. Clueless consumers think they are purchasing an “open source” product with an exit strategy – but they are not.
  • Higher Degree of Cooperation – AGPL3 ensures that the copyright holder has complete and total control of how a cooperative community builds around software that they hold the copyright to. Those who contribute improvements to AGPL3-licensed software line the pockets of the commercial company that owns the copyright on the software.

So AGPL3 is the perfect open source license for a company that thinks open source sounds great but an actual open community is a bad idea. The saddest part is that most of the companies that were using the “loophole” in GPL were doing so precisely so they could participate in and contribute to the open source community.

Conclusion

As I wrote about MySQL back in 2010, a copyright license alone does not protect an open source community:

Why an Open Source Community Should not cede Leadership to a Commercial Entity – MySql/Oracle

Many people think that simply releasing source code under an open license such as Apache or GPL is “good enough” protection to ensure that software will always be open. For me, the license has always been a secondary issue – what matters is the health and vitality of the open community (the richness and depth of the bazaar around the software).

Luckily, the MySQL *community* saw the potential problem and made sure they had a community-owned version of the code, named MariaDB, which they have actively developed from the moment that Oracle bought MySQL. I have not yet used MariaDB, but its existence is a reasonable insurance policy against Oracle “going rogue” with MySQL. So far, now over four years later, Oracle has continued to do a reasonable job of managing MySQL for the common good, so I keep using it and teaching classes on it. But if MariaDB had not happened, by now the game would likely be over and MySQL would be a 100% proprietary product.

While I am sure that the creators of the Affero GPL were well intentioned, the short-term effect of the license is to give commercial cloud providers a wonderful tool to destroy open source communities or at least ensure that any significant participation in an open-source community is subject to the approval and controls of the copyright owner.

I have yet to see a situation where the AGPL3 license made the world a better place. I have only seen situations where it was used craftily to advance the ends of for-profit corporations that don’t really believe in open source.

It never bothers me when corporations try to make money – that is their purpose and I am glad they do it. But it bothers me when someone plays a shell game to suppress or eliminate an open source community. But frankly – even with that – corporations will and should take advantage of every trick in the book – and AGPL3 is the “new trick”.

Instead of hating corporations for being clever and maximizing revenue, we members of open source communities must simply be wary of being led down the wrong path when it comes to software licensing.

Note: The author gratefully acknowledges the insightful comments from the reviewers of this article.

Sakai 11: iFrames are starting to vanish

You have been hearing a bunch about the new responsive Morpheus portal and the removal of the iframes from Sakai 11. Lots of work has been going on. Last night I committed the first of many changes to the portal code in trunk to start Sakai 11 down the path to being iframe-free. The Morpheus effort is already well on the way to making our default portal mobile-friendly and responsive.

If you go to the nightly server as of this morning, you will see that there are no more iframes except for the following tools:

Lessons, Samigo, Preferences, Resources, DropBox, and Home

If you want a fun test, go to:

http://nightly2.sakaiproject.org:8082/portal

Make an account, make a site, add the Gradebook and a few other tools to the site – then click the four buttons across the top of Gradebook and then click the Back button four times, watching the URL change. No iframes, real REST-looking URLs in the location box, and a flawless back button.

At this point we have not done anything that is irreversible; all we did was change two property defaults in trunk. You can restore yesterday’s behavior by setting these back to their old defaults:

portal.inline.experimental=false
portal.pda.iframesuppress=:all:

If you are playing with the morpheus portal, the next time you do an ‘svn update’ the same settings and behaviors will be in effect. The morpheus portal is inlining all but the above tools as well.

None of this will be put into Sakai 10 – it will remain in trunk for Sakai 11. We know there will be lots of little issues as we completely rewrite how the portal works underneath our feet, so we really need the next six or so months to collectively test these major UI improvements.

Over the next few weeks, we will be working on tweaking little markup glitches to make it so all tools can be inlined in both the neo and morpheus portals. Lessons, Samigo and Preferences have small issues of markup conflicts, jQuery versions or local CSS bits that should be easily fixed to allow them to be inlined. DropBox and Home use Bootstrap Javascript and CSS and so they have significant markup conflicts between them and the portal – we may need to wait for morpheus to mature some more to get these two tools working in inline mode. Home has four tools on a single page and there is no way to inline more than one tool on a page other than using portlets so the Home page will take some work – or perhaps we just replace it with the Dashboard :).

As we make these changes, every effort will be made to keep the tools working in all variants of the portal (neo with frames, neo with no frames, and morpheus with and without frames). But at some point we will need to change tool markup in a way that no longer works with the neo portal, or only works in a diminished mode in neo. By that point, morpheus will have matured to the point where it will be the default and only portal that we support for Sakai 11. When that happens there will be plenty of communication, announcements, and opportunities for testing and feedback from the community. So make sure to listen carefully to the developer and user lists in Sakai over the next few months as this bit of “evolution in place” happens.

You can track what is happening at this JIRA:

https://jira.sakaiproject.org/browse/SAK-27774

If you find a problem that appears to be a markup conflict between the portal and tool markup, please file a JIRA and link it to SAK-27774 and we will see what we can do.

Let us know what you think of this on the user and developer lists. One of the benefits of being part of the Sakai community is that we make changes like this in the open and invite broad discussion about them. It is one of the hallmarks of an open community of developers and users guiding a product forward together.

This is the first of many steps to a state-of-the-art responsive and iframe-free user experience – the journey of a thousand commits starts with a single commit.

Sakai 10 Released – The Magic of Open Source

In this post I am not speaking for the University of Michigan, Longsight Inc., or the IMS Global Learning Consortium.

It is always a great feeling for an open source community to finish a release. So much work goes into a release, and so many volunteers are involved and work hard – so it is a proud moment for a lot of us. I tend to be involved in more of the up-front development, working on crazy next-gen stuff. So I am doubly grateful to those who put the finishing touches on these releases and then get them out to the public and into production.

Here is the official release notice for Sakai 10. There is a long list of great stuff in that link that I won’t replicate here.

As I said in The Post-LMS LMS article in Inside Higher Education, the past year has seen a lot of incremental investment in all five of the major LMS systems in the marketplace. In a sense we were all reacting to changes in the market. As we gain experience with larger classes that we hope to run at scale (i.e., inspired by MOOCs), there are a number of MOOC-like features finding their way into products. Sakai 10’s peer assessment and improvements to group-submitted assignments are partially inspired by the MOOCs’ heavy use of peer features. It is not so much that MOOCs were the first to do peer assessment – more that peer assessment has gotten a lot of attention in the past two years.

If Sakai end-users feel strongly enough about a feature to bring funding or resources to the table, the features get built and added to the core product and are part of the next release surprisingly quickly. It is that simple – no product marketing layer or sales people to fight through. You find or hire the necessary resources and have a feature in the core product. There is no other product in the LMS marketplace where end-users personally know the core developers of the product on a first name basis.

Another big trend is making sure that LMS systems can function well in cloud environments (i.e., like Amazon). In the past two years, Amazon’s costs have dropped dramatically and their capabilities have grown significantly. The addition of solid state drives (SSDs) in many of their offerings is a quantum leap forward in the ability to host “normal” applications in the cloud, something that was impractical a while back. Simply put, two years ago you had to be pretty clever to move a large application into the cloud because of the subtle performance tuning that was required – but now Amazon’s cloud resources have very similar performance characteristics to locally-owned hardware, especially if virtualization is used on that hardware.

Just a quick look at Amazon’s EC2 pricing is pretty amazing – especially the one- and three-year fixed contract pricing. An m3.medium instance with about 4G RAM, 4G SSD, and one CPU is $172 for three years – under $5 per month. A bit more capable two-CPU, 8G RAM, 32G SSD m3.large server is $343 for three years, or about $10 per month. Why would I ever run a server under my desk at work with prices like that?

So there is a pull for both self-hosted schools and commercial companies that host LMSs in their own clouds to take advantage of these prices. This is true for all vendors. Based on rumors and bar conversations, I think that Canvas is the only major hosting company that is mostly using Amazon – but all the other vendors are likely eyeing hosting new work and new expansion in the cloud, and as servers get replaced in a company data center there will likely be an urge to use Amazon instead.

But as we move the hosting of these systems into the Amazon cloud, we still want to spend as little money as possible. And if you look at what you are getting in Amazon, the most expensive element of the cost is the RAM, followed by I/O. CPU is almost an insignificant concern for most LMS applications. So not surprisingly, if you want to optimize costs in a cloud version of Sakai, you find a way to trade CPU for RAM and database I/O. The solution, of course, for all applications is a shared cache like memcached or Redis.

So not surprisingly, in the above video you see three Sakai Commercial Affiliates (AsahiNet, Longsight, and Unicon) putting a lot of energy into cloud-tuning Sakai – reducing app server memory footprint and database I/O by adding a shared cache and switching to Elasticsearch.
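The cache-aside pattern is the heart of that shared-cache work: check the cache first, and only fall through to the database on a miss. Here is a minimal sketch of the idea in PHP with the Memcached extension (the host, key, and helper function are illustrative, not actual Sakai code – Sakai itself is Java):

<?php
// Cache-aside: trade cheap CPU for expensive RAM and database I/O.
$cache = new Memcached();
$cache->addServer('cache.example.com', 11211);

$key = 'site:1234:tool-list';
$tools = $cache->get($key);
if ($tools === false) {
    // Cache miss: do the expensive database read once...
    $tools = query_database_for_tools(1234);  // hypothetical helper
    // ...then share the result across all app servers for five minutes.
    $cache->set($key, $tools, 300);
}

Every app server that shares the cache benefits from every other app server’s database reads – which is exactly how a shared cache shrinks both the per-server memory footprint and the database I/O.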

This kind of cloud tuning goes on for all of the LMS systems. For Moodle, Moodlerooms and RemoteLearner separately tune Moodle to scale for the cloud. Of course Instructure tunes Canvas for the cloud all the time, but we never see the source code. Blackboard announced their plans to host Learn on AWS at BBWorld14 – an impressive step. Since I was not in Vegas, I had to settle for screenshots of slides in Twitter DMs.

But the essential difference in the Sakai community is that three competitors saw fit to pool their cloud tuning efforts and put their code into the community release. Even while the code was being built and tested, developers from AsahiNet, Longsight, and Unicon were communicating regularly, checking, testing, and fixing code written by one of their “competitors”. And when it was all done the code ends up in the open source trunk of Sakai. There are no secret repositories with the “magic sauce” – you don’t have to go to the bar and get a drawing on a napkin to find out the clever tuning tricks that are being done to make this possible. Just check out the source code and take a look.

Now, to a proprietary competitor it might seem crazy to give away the “crown jewels”. But like many crazy plans, it is just so crazy that it might work. First, everyone is running the same code. Vendors don’t need a vendor fork for performance tweaks. Self-hosted schools can deploy the same solution as the commercial vendors if they like. If self-hosted schools are a little nervous about switching from the “app server” / “db server” architecture, they can just wait while the commercial Sakai vendors gain experience – but at any time, they have access to the exact same cloud code that the vendors are using in production, whenever they are ready to start saving hardware costs.

The second and more important issue is that cloud performance tuning is a moving target. Amazon will continue to tweak their offerings and performance characteristics. You never really know how something will scale until you are running it at scale. Who knows whether Unicon, Longsight, or AsahiNet will be the first to encounter a little bit of code that needs tweaking as they add the millionth user. But as long as we avoid tweaks in vendor branches and keep the tweaks in the trunk, when the second vendor crosses that million-user barrier the code will be there for them – sitting in trunk and fully tested.

Again, it might seem insane for one vendor to commit code that allows other vendors to match their offerings in the marketplace, or allows self-hosted schools to avoid outsourcing their applications to a vendor because they have 100% access to the same code. But the reality is that it is far less risky to work together than to work separately. There is no single school or commercial vendor in the Sakai ecosystem that can go it alone and ignore everyone else. We are in this together. We sink or swim together.

We will all help each other find our way to the cloud together. That is the power of real open source. That is the magic of real open source.

Even if you run a commercial LMS at your University – you should join us and be part of Apereo. Apereo is not just about Sakai. Apereo is where the next generation of teaching and learning technology will be collectively defined and built. Because what is next will be even more exciting than getting an LMS ready for the cloud.

Current Demographics for My Programming For Everybody Session 2 on Coursera

This blog post is to share some of the demographic data with my students in Programming for Everybody on Coursera.

Demographics PR4E#002 (PDF)

Please contact me if you want to use this in a blog post or some other publication, to make sure I get you the most up-to-date materials.

Dear Google – You Need A Tip Jar So I can Show the Love

Google – Yesterday you saved me $2000 and there is no way to say ‘thank you’. If there were a place to “tip” Google I would certainly give you a nice tip.

Here is my story.

Two days ago, my wife came into the house and wondered why it was so hot even though we had turned the air conditioner on hours earlier. It did seem to be a bit warm so I went out to look at the compressor.

The fan was not spinning, but it was hot and making a low hum – not good. Then I used a stick to nudge the fan to see if it had a bad or sticking bearing – the fan spun freely but did not start. Our house was built in 2001, and many of the homes in the neighborhood have been replacing their 15-year-old air conditioners. And we were in a stretch of very hot days, so I knew it would take forever to get it fixed – groan. I turned off the power and figured I would use Google Search to do some research on how much this would cost me.

First I just tried to find out how much a new condenser would cost installed – so I googled “AC Condenser price” and “installation cost AC condenser”. There was no clear answer, so I figured I would just go get the model number of my existing Carrier condenser and Google it to find the replacement cost of the exact same condenser.

So I started typing ‘carrier 38ckc036340’ and as I was typing – the following screen came up:

I was intrigued by the mention of ‘capacitor’ as I knew that it was pretty common for lots of electronic things to fail because of capacitor failure. So I looked at a few pages and then switched my search to ‘carrier 38ckc036340 fan stopped’ and quickly found this page:

Carrier A/C condenser not working (fan doesn’t come on)

The picture looked pretty easy to interpret so I turned off the power to my AC unit and opened it up. Not only was my capacitor top obviously bulging, I had a stripped wire that I was surprised had not shorted these past 15 years.

A couple of machine screws later, I had the capacitor off. A quick motorcycle ride to the appliance parts store and $35.00 later, I had a new capacitor. I popped it back in and the AC started working immediately.

So here is the upshot. Google’s type-ahead saved me as much as $2K – not only did it save me money, but I was able to complete the repair in about the same amount of time it would have taken a repair company to call me back.

I know who helped me here and want to share the love. But there is no “tip jar” to drop $5 or $20 into to thank “Google”.

I think that you should make this part of AdWords. Put it in the AdWords rectangle as shown below. I know that I need to show the love to (a) the site with the money-saving knowledge and (b) Google for getting me there – so a profit split from the tip jar would put the right incentives in place for all.

Now in the future as search ads become less and less valuable (especially on the non-video internet) – you might find that a tip jar model is a great source of revenue.

Let me know if this works out for you. You could share a bit of the love for me coming up with such a cool idea by clicking on my little Leave Tip button.

Sakai Value Proposition in Light of the Unizin Announcement

Note: In this blog post I am not speaking for anyone other than myself as a faculty member in the School of Information at the University of Michigan, an individual contributor to the Sakai community, and incoming chair of the Sakai Project Management Committee (PMC). I am not in any way involved with the Unizin effort at the University of Michigan. Full disclosure: I do consulting work as the Sakai Chief Strategist for Longsight, Inc. – a leading provider of hosting and development services around Sakai.

Related: Only on Canvas (Unizin) from Inside Higher Education.

There seems to be some confusion as to how the creation of the Unizin effort (www.unizin.org) might affect the Sakai community. Those who have not looked at Sakai in some time might assume that, as the University of Michigan and Indiana University start to invest resources to support the Canvas LMS, Sakai will be left with no resources. Thankfully this is not the case, as over the past ten years Sakai has become a rich and diverse international open source community. The following graph shows the participation levels of the various institutions on the Sakai developer list over the past ten years.

Looking at the graph, you can see how the Sakai project started with about seven schools that contributed the bulk of the initial code and provided essential leadership that helped the community to grow. Building the initial Sakai code base was very labor- and capital-intensive. But now that Sakai is on par with the other LMS systems in the marketplace, the need for large-scale investment from universities is greatly reduced. In a recent survey, Sakai represents a 9% market share in US higher education based on FTE.

Sakai no longer depends on the founding schools to move the project forward. Schools like Michigan, Stanford, Berkeley, Cambridge, Oxford, Cape Town, Columbia, and Indiana got us started in the early years – but over the past decade the Sakai community has become very rich, diverse, and sustainable. The Sakai community now has contributors from over 100 organizations around the world. Schools like Rutgers, NYU, Stanford, Columbia, Cape Town, Oxford, and others continue to provide community leadership and strong representation from higher education. These direct contributions from higher education institutions are increasingly supplemented by significant investments from successful Sakai Commercial Affiliates. The global nature of the community coupled with the open cooperation between higher education resources and commercial resources working on the same code base leads to a very robust ecosystem that will sustain Sakai for many years.

The Sakai product is a stable, performant and compelling learning and collaboration platform with an exciting and innovative development roadmap. The Sakai 10 release (2Q14) focuses on making Sakai cloud-ready and greatly reduces the hardware requirements to run Sakai. The Sakai 11 release scheduled for 2Q15 adds mobile capabilities and a completely redesigned responsive user interface as well as further improvements to Sakai’s scalability and cloud readiness.

Growing worldwide adoption has also translated into significant levels of worldwide contribution. We’re an open global community supporting globally relevant software. Sakai features more languages than any commercial LMS because open source allows those with an interest to invest in a translation to meet their own needs.

Recall that Sakai was created in part to reintroduce competition into an LMS market space dominated by a single commercial provider. We, along with a number of other LMS providers, succeeded in reversing the trend toward single-provider dominance. We welcome the innovations and competitive spirit that Instructure has brought to the market. The goal of Sakai is to make the entire learning ecosystem better rather than simply focusing on Sakai’s market share. Competition between Sakai and Canvas, both in the marketplace and on various campuses, will only make both products better, to the benefit of teachers and learners regardless of the product they use.

I think that we are going to see larger higher education institutions supporting multiple LMS platforms as a steady-state situation for their campuses going forward. A school might already be actively teaching with Sakai, EdX, and Coursera – these systems provide capabilities that are “additive”. With the increasing trend toward outsourcing the hosting and maintenance of these systems (including at schools that use Sakai), it is less important to have a single LMS for the entire campus. Campus IT increasingly maintains a portfolio of services from multiple vendors to meet the needs of faculty, staff, and students at the university.

Supporting more than one LMS gives faculty a choice in a way that a single one-size fits all LMS has never been able to provide. It also means that faculty can experience unhurried migrations from one product to another since there is no rush to “shut down” an open-source LMS that does not have an annual license. And as the multi-LMS campus becomes the norm, standards and interoperability like those from IMS Global come to the fore. And Sakai is ideally positioned to work with Canvas and others to rapidly innovate around data portability and software interoperability.

TSUGI – A Standards-Based Learning Tool Framework

Over the next few weeks I will be writing some blog posts about my new approach to teaching and learning technology. You can see a few of my recent talks about my ideas on SlideShare.

The overall goal of the TSUGI framework is to make it as simple as possible to write IMS Learning Tools Interoperability™ (LTI™) tools supporting LTI 1.x (and soon 2.x) and put them into production. The framework hides all the detail of the IMS standards behind an API. The use of this framework does not automatically imply any type of IMS certification. Tools and products that use this framework must still go through the formal certification process through IMS (www.imsglobal.org).
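To give a flavor of what hiding the detail behind an API means, here is a hypothetical sketch of a TSUGI-style tool in PHP. The include path and function names are illustrative only – see www.tsugi.org for the actual API:

<?php
// Hypothetical TSUGI-style tool. The framework validates the LTI launch
// (OAuth signature, required LTI fields), manages the session, and hands
// the tool a launch context to work with.
require_once 'tsugi/lib.php';        // illustrative include path

$LAUNCH = tsugi_require_launch();    // hypothetical: rejects invalid launches

echo "Hello, " . htmlspecialchars($LAUNCH->user->displayname) . "!";

// Sending a grade back to the LMS becomes one call instead of a
// hand-rolled LTI Basic Outcomes request.
tsugi_send_grade($LAUNCH, 0.95);     // hypothetical helper

The point is that the tool author never touches the OAuth or XML plumbing; the framework owns the standards, so one well-tested implementation can be shared by every tool.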

My overall goal is to create a learning ecosystem that spans all the LMS systems including Sakai, Moodle, Blackboard, Desire2Learn, Canvas, Coursera, EdX, NovoEd, and perhaps even Google Classroom. It is time to move away from the one-off LTI implementations and move towards a shared hosting container for learning tools. With the emergence of IMS standards for Analytics, Grade book, Roster, App Store, and a myriad of other services, we cannot afford to do independent implementations for each of these standards. TSUGI hopes to provide one sharable implementation of all of these standards as they are developed, completed, and approved.

In the long run I expect to develop Java, Ruby, and other variants of TSUGI but I am initially focusing on the PHP version because it allows me to be most agile as we explore architecture choices and engages the widest range of software developers.

In the long run, I hope to make this a formal open source project, but for now it is just my own “Dr. Chuck Labs” effort. Even in its current form it is very reliable and very scalable, but I am not eager to have too many adoptions because I expect the code will see several refactor phases as various communities start to look critically at the code.

If you want to watch this evolve, see www.tsugi.org.

Learning Tools Interoperability™ (LTI™) is a trademark of IMS Global Learning Consortium, Inc. in the United States and/or other countries. (www.imsglobal.org)