Monthly Archives: March 2012

Dr. Chuck SI124 Student Video Projects 2012

This is a set of student projects from my SI124 – Network Thinking course this semester. We have a project in the course where students create a video with the goal of maximizing their views. Everyone studied the book Viral Loop by Adam L. Penenberg as part of the project. They have to write a paper at the end of the semester summarizing what they tried to do and what happened, what went right and what went wrong.

To be fair, you might want to view them all for at least 30 seconds or so and then decide if any of them are interesting enough to forward to folks you know. Viral success is not so much about initial views but about the likelihood of forwarding after viewing, and that is what we are studying in these projects. If you look at all of them you will see a wide diversity of approaches in the attempt to go viral.

Feel free to comment directly on the videos in YouTube if you want to communicate with the video makers. They would love to hear from you.

Student Videos

Sliding Dry Ice

Drunk Sorority Girl Plays with Hummus

Do Nice Guys Finish Last?

Sh*t Babies Do

It’s Cool, I’m in a Frat

Sh*t People Say to Cops

Reflecting on a Week of Sakai, Blackboard, and Open Source

I really appreciate that Dave Ackerman (NYU) asked me some questions publicly about my new situation. My response is not aimed particularly at Dave. At times I try to make clear which parts of my response specifically do not apply to Dave and NYU.

Comments welcome.

You were elected as a UMich employee; would folks have voted for you with the added employee hat?

There are other examples where someone on the board has changed jobs and where there was some concern as to whether their new job would cause a problem.   In one of the previous cases, I was privately quite concerned – but that concern turned out to be completely unwarranted and that person has done an outstanding job on the board functioning as an individual, representing the community rather than their company.  They carefully kept the roles separate as will I.  Ethics demands this.

It is fully my intention to run for the board again at some point in the future after my term ends at the end of this year. You will be relieved to know that I am not on the Apereo founding board. The earliest I would consider running for the board would be at the end of this year, as I would like to focus my energies on getting 2.9 released. So at some point in the future, the question of how people will vote will be answered. When/if I run for the board in the future, you will be able to campaign against me and vote against me, and if enough people vote against me in that election and I lose, my shame and fall from grace will be public and complete.

I would point out that long before I was a Blackboard employee, I had plenty of detractors. If you read my book, you will see that I do not make strategic decisions based on whether they make people happy or not. I was not at all sure that I would be elected to the board when I ran back in 2009. Perhaps it was because the people who voted for me trusted me as a human being and committed community member rather than as University of Michigan employee number 8675309. Perhaps they knew me well enough to know that my commitment to this community is as deep as it can be and is unwavering regardless of what company or university happens to be paying for my travel expenses last year, this year, or next year.

I consider Blackboard a competitor to Sakai. How do I feel about serving on a Board with a competitor? Would Pepsi have a Coke employee on its Board?

Blackboard is not a competitor to Sakai.   Blackboard is not Coke to Sakai’s Pepsi.   Blackboard is Coke, Unicon is Pepsi,  rSmart is Starbucks, and Sakai is Toastmasters where we all get together on a Friday evening and give speeches to each other.

“… Thanks Michael Feldstein, that was indeed an excellent speech about “Amazingly Accurate Advanced Alliteration” (applause). But before you all go, let’s make sure to thank Unicon for providing the snacks for tonight’s Toastmasters meeting (applause). Next week it’s LongSight’s turn to provide snacks, right? See you all next Friday when Nate Angell will be giving a speech on why “Portland is So Damn Awesome” (yes, again) (audience groans)…”

Sakai is an inclusive *open* source project and a non-profit corporation dedicated to supporting, expanding, and advancing an open community around free software that we collectively contribute to.  The non-profit Sakai Foundation was not formed as the “place where commercial market opponents of Blackboard secretly meet with angry former Blackboard customers to  collectively align their attack strategies to achieve maximum damage to Blackboard”.

Sakai is also not a secret club that meets in a tree house named the “Boys Are NOT Allowed Club”.

Check the bylaws to see if there is any reference to “secret strategy plotting sessions” or membership rules regarding boys.

Sakai Foundation ByLaws

Please read the bylaws and come back here and quote the parts that justify treating Blackboard differently from any other company with an intention to support and invest in the Sakai community. I challenge you to find any passage in the bylaws that pertains to Blackboard that does not apply equally to Unicon or rSmart.

If you look at the board members of IMS (a non-profit industry alliance):

IMS Global Learning Consortium Board of Directors

You see lots of competitors on the same board.  It turns out that this is a *great* idea and frankly the makeup of the IMS board that includes a diversity of commercial interests is precisely why IMS works so well.

Blackboard *is* a competitor to rSmart, Unicon, LongSight, Edia, IBM, OpenCollab, Samoo, Seensoft, Sungard, Serensoft, KEL, Embanet, and many others with commercial interest in Sakai.

Sakai Commercial Affiliates

If they can be on the board, then why not Blackboard? If I were on the board of directors of one of those actual competitors while a Blackboard employee, it would be a gigantic issue.

Actually, I was on the board of a company that provides Sakai services and is a competitor to Blackboard and regularly goes against Blackboard in RFQ situations. This Monday at 7:25PM, I offered my resignation to that board when I became a Blackboard employee and the board accepted my resignation with regrets. It was very, very, very sad for me personally because I completely believed in what that company was trying to do and still believe in their mission – but I could not be on their board for the very reasons you cite. I typed the message at 4:45 (10 minutes after the press release went public) but kept it as a draft for several hours because I just could not bring myself to hit the send button. I knew that my fellow board members would not see the press release until I sent the message, so I wanted to delay it as long as I could. The people that I met through that board have become wonderful friends and I already miss them and the conversations we would have at board dinners after the board meetings were over. The next board meeting was to be in Tahoe in a few weeks and I had been looking forward to it for a very long time, as all of our other board meetings had been in San Francisco and LA and after five years, we were finally going to have a board meeting in Tahoe. And now I can’t go and spend time with dear friends. Resigning from that board, and the possibility that I will never see some of those friends again, is the only thing that has brought tears to my eyes this week.

Over the past few years working on IMS standards, I have developed friendships at Jenzabar, Desire2Learn, Instructure, and other companies. While I expect those friendships to continue as friendships, there will immediately be a natural loss of camaraderie, openness, sense of adventure, and shared purpose in those relationships. This loss is because of my new association with Blackboard – not because of my long-standing association with Sakai.

While I was working in IMS, I was open about my involvement in the Sakai community – I actively built reference implementations in Sakai and often used Sakai as the first LMS to run through interoperability tests and in public demonstrations of new IMS capabilities.  I continuously used Sakai to help engineer IMS standards.  All the commercial participants in IMS knew I was the “Sakai guy” in addition to the “IMS guy”.  It was never a problem.  Never.  Desire2Learn may compete with rSmart – but they don’t compete with “Sakai” – we all understood that – Sakai was seen by all as a fair and honest participant in the marketplace.

It is ironic that virtually everyone in the marketplace from the outside (for-profit and open source) sees Sakai as a fair, honest and open place where anyone can come and exchange ideas with the other members of a community committed to broadly advancing teaching and learning in a non-threatening manner. It is literally only a *few* people *inside* the community that see us as “Seal Team 6” carefully plotting the end of Blackboard since 2003. Folks, we don’t even have a helicopter – let alone many special stealthy helicopters. If we had been planning some kind of attack on Blackboard all this time, the Sakai board would have approved the purchase of a tank or a flamethrower at the very minimum.

If Desire2Learn or Instructure decided to offer Sakai services and started to contribute real resources and real value (not just a check for $10K) to the Sakai community – I would make the case that they should be welcomed with open arms too.

Back to my participation on the Sakai board and in the Sakai community: both the Sakai and Moodle communities (and foundations) need to be open to all who can help and contribute. Open source communities like Drupal, OLAT, Joomla, and many others have these commercial-as-evil conversations over-and-over-and-over … … and-over.

The only time there is ever a problem in open communities is when one company becomes so dominant that it employs a significant majority of the committers and creatives who produce an open source product (e.g., Oracle and MySQL) – a point where that company could switch to a closed-source strategy any time it likes and retain enough committers to successfully maintain the code in a proprietary manner.

I wrote about this in a blog post in 2010 when Oracle bought MySQL:

Why an Open Source Community Should not cede Leadership to a Commercial Entity – MySql/Oracle

The blog post is not a perfect analog to the current situation because mostly I am talking about how the GPL is not as much protection as it claims to be and leads communities to a false sense of security where one company holds the copyright or a majority of committers. The post *is* about a community that thought things would be all right, so they let others do all the investing. This quote from my blog post seems to apply in the current situation:

“Successful open source projects need to make sure to feed and take care of their bazaar – their volunteer technical core for the product. Be very wary of the “get resources quick” or “get results quick” schemes where you cede leadership to something or someone in the cathedral.”

Wow – that would seem to be highly critical of me and my recent actions, right?  Oops!  Perhaps I should quickly go back and edit that blog post to be more charitable to evil corporations.  Not going to happen. Wouldn’t be prudent. I stand behind my words.

If you read the entire post, it is a call to action to ensure *diversity* and avoid apathy in an open source community, to make sure that unaffiliated individuals remain involved in the community, and to make sure that the collective grooms *multiple* corporations to be involved in the community. For-profit involvement is frankly the driving engine of progress in open source – without it, Apache and Linux would not function. I am not talking about Red Hat, because I despise proprietary forks that remove talent and resources from the community and then turn around and use revenue from sales of a product they got for free to market against the community product (like Ubuntu). I am talking about the hundreds of other responsible companies in the Linux community, like IBM, that pay employees and let them work freely in the community edition of an open source product as a fraction of their job. Like Google’s 20% time.

In Sakai, universities are great at forming the starting capital to kick off new initiatives like the CLE and OAE. But frankly they are not so good at writing a check year after year to keep something like the CLE properly fed so it can survive. Foundation staff on the CLE was five people in 2005, based mostly on contributions from higher-ed IT. As of about four months ago, the foundation staff working on the CLE is zero, and shortly after the last foundation resource was removed, the CLE progress toward release slowed to a crawl; a few weeks ago, the release was postponed indefinitely. The only remaining *dedicated* release management resources moving the Sakai 2.9 release forward come from LongSight and Unicon.

I get the sense that all this irrational fear is that somehow Blackboard will “take over” Sakai.  That is simply not possible as long as others in the community remain committed to investing in Sakai.   As long as the community remains rich and diverse, Blackboard is just one of many sources of resources to help us all move forward and their resources make it better for everyone – including Blackboard’s commercial competitors.  If the current members of the community continue to withdraw their financial support from the foundation and their staff support of Sakai projects and efforts, you are ceding the community to whomever is left at the end.

There is the saying, “Will the last person leaving the room please turn out the lights.” In open source, it is a little different: “The last organization in the room owns the software.” My plea to this community is: “please stop leaving the room.”

It has been said that I can never quit writing a blog post or email message while I am ahead.   And this time is no different.  :)

I would suggest that those who vehemently oppose Blackboard’s involvement ask themselves the following questions. These do not apply to anyone in particular and certainly do not apply to you, David, as your contributions to Sakai OAE and this community are above reproach.

– Do I really have a logical reason for my opposition? Is there anything in the bylaws or the tenets of open source that declares, or allows one to logically determine, that one commercial source of resources is “evil” and another commercial source of resources is “good”?

– How can you be opposed to increasing the diversity of thinking, ideas, and approaches? How can you be opposed to having one more source of financial and human resources for our community? Tell us all a logical reason for your position without using the word ‘evil’. I seriously doubt you can.

– Are you uncomfortable finding out that there has been an increasing disconnect between public puffery supporting Sakai and private and shameful reductions in contributed resources?  Have you been part of the puffery?   Have you withdrawn resources?  I am sure you have great reasons as to why you withdrew resources – everyone does.  Talk to the hand.

– Does it bother you that about 40 higher education institutions stopped supporting the Sakai Foundation over the past five years? Are you uncomfortable that, so far, higher education seems unable to take care of its own in the long run?

– Are you uncomfortable that for-profit companies already provide all of the long-term committed resources for the Sakai CLE product?  Would you perhaps feel more comfortable if there were three companies providing consistent dedicated resources for the CLE community instead of two?

– Are you uncomfortable knowing that if community members had continued to contribute enough resources to maintain the “Sakai commons”, I might not be working for Blackboard right now?

Welcome to open and community source, folks. Resources matter. The commons matter. There is no such thing as a free lunch.

Please don’t read that last point as me having any second thoughts about my decision. Trust me, this will turn out to be the smartest decision I have ever made. I am not looking back. I am looking forward and breathless with anticipation. Soon, you will see that what we have accomplished so far will turn out to have only been the warm up act.

And, ‘no’ – the Sakai Foundation still does not need a tank or even a Zamboni.

I am happy and honored that David Ackerman of New York University has the confidence in me and respects me enough to challenge me publicly and demand that I explain myself. He and NYU have been giving an amazing amount of resources and talent to this community (remember the icons on tools – those came from NYU – thanks Max – love ya!). And in particular David *continues* to give resources and has always been willing to *increase* his support when something is important. This message is *not* about David Ackerman.

But if you have spent the past five years like an ostrich with your head buried in the sand, hoping not to see how bad the commons of the Sakai community have become, and the press release Monday forced you to pop your head up and say ‘holy shit!’ – I am *not* sorry.  It is about damn time you took notice of how so few people are working so hard on your behalf and being treated pretty damn poorly by those they faithfully serve.

Those of you who are still saying ‘no’ to Blackboard resources are voting to continue a downward death spiral of the CLE.   Pop up a new tab in your browser and listen to this – as you read the rest of this post. The symbolism is not as simple as it seems on the surface. I will give you one hint – in the song, the lyrics are not me talking to the Sakai community. Drop me an email when you figure it out.

Given that the CLE is an essential part of the OAE and that it will be some time before the OAE can truly fully replace the CLE, I would suggest that a premature death of the CLE will lead to failure for the OAE and, as such, for the entire Sakai community. The CLE must continue until the OAE is ready to take its place. This is absolutely not about OAE versus CLE. This is about investing in the CLE to support the OAE.

I took the most pro-community step I could take, putting my reputation, my friendships, my next board election, and everything I hold dear on the line, when it seemed like there was no hope of any source of resources for the continued survival of the Sakai CLE, and asked Blackboard for resources. If you read my book you will see that I have many times over done things that were unpopular or even hurt a few people’s feelings in order to keep this community alive. I assure you that this is one more of those things where, a year from now, people who publicly and privately are cursing me right now will come up and tell me, “I am amazed at how well this worked out… You were right all along.”

When that happens, I will not say, “I told you so.” I will say, “Yeah – isn’t it great to be out of the woods in terms of resources. Oh, and by the way, thanks for the two students you contributed to the CLE 2.9 QA. They were so helpful and brought a lot of excitement and energy. We would love to have them back again to help us with the next release, which should be a lot easier because we finally have decent coverage on written test plans. Are they available for the next OAE bug bash? OAE QA is even more fun because they have better unit tests than the CLE…”

Even with all my curmudgeonly ranting in this post, frankly things are looking pretty optimistic in my opinion.  Blackboard will make some healthy investments in the community in terms of my time, access to Blackboard resources, and direct financial contributions to the Foundation and other Sakai efforts.  CLE will begin to move forward and as OAE matures we will gently move from one product to the other as our organizational needs dictate.

And perhaps the best outcome of all is that more than a few people might wake up and realize that they have been under-contributing to the Sakai Foundation and the Sakai Community (particularly the CLE). Perhaps motivated by genuine altruism, or by fear that Blackboard might take over, those people and organizations will stir from their self-imposed torpor and increase their support to make damn sure that, whatever Blackboard spends on the commons, the rest of the community is spending enough to ensure that Blackboard’s contributions to the commons remain a reasonably small fraction of the overall community investment.

Sorry if I moved your collective cheese this week. That always hurts. But if, in the process, I can cause you to look at the cheese and realize that it is nearly all gone and kind of dried up, then perhaps you will feel sorry for the state of our shared cheese – and bring some more cheese.

Blackboard has brought some cheese for us all to share.  Will you bring some of your cheese back as well or will you wait until no one is looking and just snatch what you see as your fair share of the remaining cheese and ride off on your high horse and curse Blackboard under your breath?  It is your choice. My path forward is clear and publicly stated.


P.S. Keep the questions and challenges coming.  I have nothing to hide.   Ask them all.

P.P.S.  When you see me next, I will be proudly wearing a Blackboard shirt.  But my heart, beating an inch below that logo – has always and will always belong to the greater Sakai community.  My permanent tattoo on my right shoulder includes Sakai, Blackboard, IMS, Desire2Learn, LearningObjects, OLAT, Instructure, Moodle and an empty space tentatively reserved for ANGEL (please hurry – don’t let OpenClass or Fronter get that last slot!).   My Sakai logo was my first and is in the center and the largest of all my tattoos.    The other tattoos – while important – are like a small solar system revolving around Sakai like planets.

Connecting Blackboard, Sakai, and Open Source

Blackboard has announced a major shift in direction from a focus on a single LMS product (Learn) to a multiple LMS approach including long-term support and involvement in Learn, ANGEL, Moodle, Edline, and Sakai. If you have not yet seen today’s Blackboard press releases about this, you should read them carefully and then come back to finish reading this blog post. If you don’t read the press releases first, this post may not make a lot of sense.

Blackboard Press Releases

If we are meeting for the first time, you might want to take a look at my web site, follow @drchuck on twitter, read some of my previous blog posts, read my book titled Sakai: Free as in Freedom (Alpha), or if you are in a hurry read a review of my book from Joshua Kim of Inside Higher Education. I particularly like the part where he says,

Chuck Severance is Crazy: When I say that Chuck Severance is crazy, I mean crazy in the very best possible way. Crazy as in completely honest. Crazy as in willing to say things that might offend other people. Crazy in the sense that he is willing to be critical of his own actions. Crazy in that he believes in the promise of open source software in higher education more than he cares about his own career advancement. Crazy in that he is willing to take risks, to fail, and to learn from his mistakes.

As described in the above Blackboard release, I will be spending a significant amount of my time working for Blackboard in the area of Sakai engagement and open source engagement in general.

[Photo: MC of the 2012 SI Revue modelling a ‘Cool Girls Code’ t-shirt – photo credit Nikki Candelora Roda]

I will continue to be a full-time faculty member at the University of Michigan School of Information and will continue to teach courses like SI124, do research, advise students, arrive late at my office hours when I go to lunch at the Jerk Pit, work with the Sakai team at the University of Michigan, make open educational resources, make new courses, host the annual talent show, and invent things like the iPad Steering Wheel Mount to keep people from having to pick up a phone to text while driving. Just a normal faculty member doing normal faculty member things who also works for Blackboard.

It is an exciting time in the LMS market…

I have been talking to and encouraging Blackboard for a long time to think more broadly about the LMS market. For me it has never been about a single dominant LMS for all kinds of teaching and learning. There is a reason that so many learning management systems exist and need to exist. Looking at Phil Hill’s infographic on Educational Delivery Models, he identifies nine quite distinct approaches and he does not even capture the differences between pre-school, K-12, community college, higher education, professional development, training, certification, and a host of other places where educational technology is needed.

The notion that we will somehow find the “one true LMS” that will solve all problems is simply crazy talk and has been for quite some time. I am happy to be now working with a group of people at Blackboard that embrace the idea of multiple LMS systems aimed at different market segments. We will bring a diverse set of learning products to the market and place the most appropriate product(s) in the hands of teachers and students.

However, once you accept that there will be a lot of different Learning Management Systems, we solve one problem and exacerbate another. A company supporting multiple learning management systems needs to be able to move courses, rosters, content, and tools between those different learning management systems in a friction-free manner. That means we need to get really good at portability and interoperability, and do it in a hurry.

A long journey toward interoperability.

As a teacher and software developer, I have been working to build portable and interoperable technology and content since the 1990s. My recent work has been as the Chief Architect of the Sakai Project (2004-2005), the first Executive Director of the Sakai Foundation (2006-2007), and then more recently with the IMS Global Learning Consortium (2008-present).

If you carefully read the Sakai Project Proposal (2003) you will find this quote:

The education community will benefit greatly from a Tool Portability Profile [TPP] that provides an open, non-proprietary, and fully articulated specification for interoperable software. Any institution or commercial entity can build to this Profile, thus helping all institutions integrate software from multiple sources as their timing may require. The economics of software for the education community are greatly served by a proven set of pre-integrated, modular, open source applications that any institution can adopt incrementally or as an integrated set of tools.

It was personally disappointing to me that we never achieved the goal of enabling truly portable software across commercial and open source LMS systems as part of the Sakai project.

By 2007, Sakai had become a successful LMS and the Sakai community had expanded from the four founding schools to over 120 partners. Over time, the leadership and culture of Sakai moved from our idealistic founding notions of being a positive force across the whole market to a narrower view of competing for market share. So I quit as the Sakai Executive Director and joined the faculty of the University of Michigan School of Information.

Sakai had an important role in founding both the Tools Interoperability and Common Cartridge standards and I believed that these were the future of interoperable tools and data. In 2008, IMS hired me as a consultant / evangelist / developer to help increase the velocity of these standards in the marketplace. This was great because I was paid to work with all the vendors in the marketplace (Blackboard, ANGEL, Moodle, OLAT, Sakai, Desire2Learn, Instructure .. ) trying to get them to align on common approaches to import and export and plugging tools in.

Learning Tools Interoperability started as a small, underground activity with only a few people involved – we did not know if we would be successful – but we were having fun as the adventure unfolded. We tried anything to get people’s attention. Marc Alier made a video predicting a terrible fate for Learning Management Systems if they did not add support for LTI.

But from humble beginnings, by 2011 everywhere you went, people were talking about LTI and Common Cartridge. And IMS Learning Tools Interoperability 1.1 was just completed, adding the ability for an external tool to return grades to an LMS over web services.
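For readers who have never looked under the hood: an LTI 1.1 launch (and the grade-return call) is a signed form POST, secured with OAuth 1.0 HMAC-SHA1 message signing rather than a shared session. The sketch below is illustrative only – the URL, key, secret, and identifier values are made up, and this is not any LMS’s actual code – but it shows the standard OAuth 1.0 signing step over a set of launch parameters using nothing but the Python standard library:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote


def _enc(s):
    # RFC 3986 percent-encoding as required by OAuth 1.0
    # (unreserved characters plus "~" are left alone).
    return quote(str(s), safe="~")


def oauth_hmac_sha1_sign(method, url, params, consumer_secret, token_secret=""):
    """Compute an OAuth 1.0 HMAC-SHA1 signature over request parameters."""
    # Normalize: percent-encode each key/value, sort, join as k=v pairs.
    pairs = sorted((_enc(k), _enc(v)) for k, v in params.items())
    normalized = "&".join(f"{k}={v}" for k, v in pairs)
    # Signature base string: METHOD & encoded-URL & encoded-params.
    base_string = "&".join([method.upper(), _enc(url), _enc(normalized)])
    # Signing key: consumer secret + "&" + token secret (empty for LTI).
    key = f"{_enc(consumer_secret)}&{_enc(token_secret)}"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


# Hypothetical launch parameters; nonce and timestamp are fixed here for
# illustration but must be fresh on every real request.
launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "res-123",
    "user_id": "student-42",
    "oauth_consumer_key": "demo_key",
    "oauth_nonce": "fixed-nonce",
    "oauth_timestamp": "1331000000",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
launch["oauth_signature"] = oauth_hmac_sha1_sign(
    "POST", "https://tool.example.com/launch", launch, "demo_secret")
```

Note that `oauth_signature` is excluded from the string being signed, which is why it is added to the dictionary only after signing; the tool provider repeats the same computation with its copy of the secret and compares signatures.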

I have been on this educational technology interoperability quest for fifteen years now and things have turned out so well that I could sit back and coast for a few years.

I could go to IMS meetings, give workshops around the world on LTI, fly around and give invited talks about Sakai, Open Source, IMS, or some other pithy keynote topic. I could work on my second million frequent flier miles on Delta and maintain my friendship networks around the world. I could work to develop LTI 2.0 and transition IMS from SOAP to REST bindings, and so on.

Actually I am going to do all of that – it sounds pretty good.

But, I will also take a risk and try to take interoperability and portability to the next level for the entire market.

I am fond of telling other people, “Just when you think that you have reached the end of a journey is when you see the beginning of the real journey.”

That next journey is working with Blackboard, ANGEL, Moodle, Sakai, and Edline in a coordinated fashion, embedded inside Blackboard with a “dream team” of LMS architects; helping to bring Blackboard’s significant resources into the open source ecosystem in a responsible manner; and advancing the cause of portability and interoperability across multiple products in the marketplace even more rapidly than before.

I understand that the territory is uncharted because no one has ever tried this. I understand and accept that I might fail or come up a little short. I am a little nervous but mostly eager to get started and see where this road leads.

Let’s get started…

Those of you who know me – know that when I get excited about something I am working on, I get enthusiastic and start to talk loudly and wave my hands. But a motivational speech and good story only go so far and at some point things need to get tangible. So here are a few tangible things about what I am planning going forward:

  • With Blackboard’s support, I will be able to greatly increase the time I spend working directly on Sakai. Blackboard has signed a Corporate Contributor License Agreement (CCLA) to make sure that when Blackboard pays me to work on Sakai, the contributions have proper IP. When other Blackboard employees start making contributions to Sakai, Blackboard will sign CCLA’s for them as well.
  • I will now have access to Blackboard internal resources to bring to bear on improving the community edition of Sakai. The kinds of resources I can tap into include the Blackboard QA team, Performance Testing Team, Documentation Team and others. There are already people who have been testing and working on Sakai within Blackboard. In the next few weeks I will be meeting with them and exploring how we can turn the work they have done into something that is valuable to the community.
  • I am a member of the Sakai CLE Technical Coordination Committee. There are other commercial members of the TCC: Sam Ottenhoff (Longsight), Matt Jones (Longsight), Aaron Zeckoski (Unicon), and John Bush (rSmart). All of these people are part of the TCC not because of where they work, but because of who they are and what they contribute. My initial focus in Sakai will be to spend time and resources increasing the velocity of the Sakai 2.9 release, working closely with and taking guidance from my colleagues on the TCC.
  • I will continue to be on the Sakai board. Sakai board members are individuals and do not represent their organizations. For example, Nate Angell is associated with rSmart, Michael Feldstein is associated with Cengage (formerly associated with Oracle), and Maggie McVay-Lynch (board chair) is associated with Thanos Partners. My three-year Sakai board position expires at the end of this year.
  • I don’t expect to become a developer of closed-source applications. My management at the University of Michigan School of Information (my faculty position) has explicitly stated that this arrangement works best when the actual code I produce is open source. I can advise proprietary product teams in terms of how to best articulate with open source communities or how to become more open in their own approaches to their products. Just for example, with the announcement that ANGEL is no longer end-of-life, I might advise (my good friend and brand new co-worker) David Mills that there is only one remaining opening in my IMS Learning Tools Interoperability “Ring of Compliance” tattoo and he and Kellan Wampler had better get coding.

The reality is that there is no such thing as a rock-solid plan. The tasks in this list are not a “to do” list, nor are they “campaign promises”. They are merely a collection of what I am thinking I might do in the first year or so of this arrangement.

Throughout the process I will be talking and listening to my colleagues in Sakai, IMS, at the University of Michigan as well as my new colleagues in Blackboard to work out the right directions to take to get to win-win results. If you have a question or suggestion – just talk to me. If you have a concern – let me know.

This is a pivotal moment in the evolution of learning management systems. I am excited to be part of a talented Blackboard team that is dedicating itself to moving this industry to the next level.

Connecting IMS Learning Tools Interoperability and SAML

Both IMS Learning Tools Interoperability and Security Assertion Markup Language (SAML) have been developing seemingly independently in the space of broad sharing of user identity across multiple organizations and applications. Both can be seen as solving the problem of “federated identity”. SAML is the generic technique for signing, transporting, and parsing security assertions that is embodied in products like Shibboleth, Microsoft Active Directory, and others.

In this post I compare SAML and LTI technically, point out the similarities and differences between the protocols, and then describe how these two technologies are in fact highly complementary, and how both are necessary to achieve interoperability between an LMS, an organization, and a learning tool.

Comparing SAML and LTI

The problems that IMS LTI and SAML solve are quite different. LTI establishes trust between a single application (i.e. the Learning Management System) and an external tool, while SAML establishes a relationship between an organization (or federation) and an external tool using Single-Sign-On (SSO). LTI transports information in each launch about a user’s identity within the LMS, the course from which the launch is coming, the resource/activity id for the launch, the user’s particular role in the course, settings specific to the particular launch/activity in the course, and user information such as name and email. SAML provides a tool with a user’s enterprise identity, enterprise role, and user information such as name or email. Even though LTI technically passes its data through the user’s browser using OAuth 1.0, architecturally LTI is secure server-to-server communication in its domain of trust.
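To make that concrete, here is a minimal sketch of the kind of data an LTI launch carries. The parameter names below come from the LTI specification; the values are invented for the example.

```python
# An illustrative subset of the POST parameters in an LTI 1.x launch.
# Parameter names follow the LTI specification; values are invented.
launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "week3-quiz",           # the activity being launched
    "context_id": "si124-w12",                  # the course the launch comes from
    "roles": "Learner",                         # the user's role in that course
    "user_id": "292832126",                     # opaque identity within the LMS
    "launch_presentation_locale": "en-US",      # a launch/activity setting
    "lis_person_name_full": "Jane Q. Public",   # released only when permitted
    "lis_person_contact_email_primary": "jane@example.edu",
    "oauth_consumer_key": "lmsng.example.edu",  # names the trust relationship
}
```

Everything a tool needs about the user, the course, and the activity arrives in that one signed POST, which is what makes the LMS-to-tool relationship so lightweight.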

Both LTI and SAML are very concerned about only releasing private information when appropriate. Both protocols insist on tools understanding that they may not always receive a user’s name or email information on every launch. At times the tool may only receive the user_id (LTI) or the handle (SAML); in particular, the tool should never use a person’s email address as its internal key.
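Here is a sketch of what that rule looks like in practice, assuming a simple in-memory store; the helper function and its storage layout are my own invention, not part of either specification.

```python
# Sketch: key local user records on the opaque LTI user_id, never on email.
# The opaque user_id is only unique within one consumer's key space, so the
# composite key includes oauth_consumer_key.
users = {}  # (oauth_consumer_key, user_id) -> profile

def upsert_user(launch):
    key = (launch["oauth_consumer_key"], launch["user_id"])
    profile = users.setdefault(key, {"name": None, "email": None})
    # Name and email are optional and may be withheld on any given launch;
    # only refresh them when the LMS actually released them this time.
    if "lis_person_name_full" in launch:
        profile["name"] = launch["lis_person_name_full"]
    if "lis_person_contact_email_primary" in launch:
        profile["email"] = launch["lis_person_contact_email_primary"]
    return profile
```

A launch with no name or email still maps to the same stored user, which is exactly the behavior both specifications demand of tools.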

SAML is typically deployed in a top-down fashion where an organization converts to a SAML-based SSO at some point in time. It is often a long and drawn out process to convert an organization from an existing SSO to a SAML-based SSO. When an organization is setting up their arrangements with tool providers, there needs to be organization-to-organization interaction exchanging key and other information to properly establish the organization-to-tool relationship. Federations like InCommon reduce but do not eliminate this complexity.

LTI is designed to be deployed in a much more organic, bottom-up / mash-up approach where an organization upgrades their LMS to a version that supports LTI and they immediately have access to all of the LTI enabled tools anywhere on the web. LTI can be used broadly by the organization and enables Web 2.0 style mashups fully under instructor control.

Comparing Technical Complexity and Use Cases

There is quite a difference in terms of the technical complexity of SAML solutions versus LTI solutions.

SAML makes extensive use of XML and PKI libraries in very precise ways, requiring a large library footprint and a highly configured application server environment. The typical SAML deployment is done through an Apache module such as mod_shib and is difficult or impossible to deploy in small hosted environments such as Google’s App Engine, since developers are not given root access on these low-cost or free hosted systems.

While the InCommon federation reduces the complexity for tool providers in SAML in terms of PKI key management, many schools (i.e. nearly all in K12) using SAML are not part of any federation, so each school must separately configure the organization-tool relationship, exchanging and installing PKI keys and configuring identity provider information.

LTI uses the very simple OAuth 1.0a signing protocol developed by Google, Netflix, Twitter, and others to allow simple “arms-length” Web 2.0 identity mash-ups. Because OAuth is aimed at ease of use and broad deployment, its library footprint is very low. The PHP implementation of OAuth is 800 lines long, comments and all, and includes both the code to support the sender use cases and the receiver use cases with no additional libraries needed. There are small and simple implementations for every language/operating system in common use and they work nicely in hosted environments (even the free ones).
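For a sense of just how small that footprint is, here is a hedged sketch of the OAuth 1.0a HMAC-SHA1 signing computation in Python. It assumes the launch URL is already normalized and, as in an LTI launch, that there is no token secret; a production library also handles URL normalization and duplicate parameter names.

```python
import base64
import hashlib
import hmac
import urllib.parse

def _pct(value):
    # RFC 3986 percent-encoding as OAuth 1.0 requires (nothing extra is "safe")
    return urllib.parse.quote(str(value), safe="")

def oauth_hmac_sha1(method, url, params, consumer_secret):
    # 1. Normalize: percent-encode keys and values, sort, join with = and &
    pairs = sorted((_pct(k), _pct(v)) for k, v in params.items())
    param_str = "&".join("%s=%s" % (k, v) for k, v in pairs)
    # 2. Signature base string: METHOD&url&params, each piece encoded again
    base = "&".join([method.upper(), _pct(url), _pct(param_str)])
    # 3. Key is consumer_secret&token_secret; an LTI launch has no token secret
    key = _pct(consumer_secret) + "&"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()
```

The whole signing step fits in a dozen lines of standard-library code, which is the essence of why LTI tools can live on free hosting where a SAML stack cannot.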

The simple technology footprint means that LTI empowers far more independent tool developers than SAML, to the point where I as a teacher can write simple learning tools or games, host them on Google App Engine, plug them into my LMS, and use them in my class all in the same day. This allows small, independent tool developers to participate in the larger tool ecology using modest technology infrastructure.

In terms of key distribution in LTI, the key can be handed to an instructor or administrator and typed into a configuration dialog in the LMS to make the connection between the teacher’s course and the external tool.

LTI is designed to scale to hundreds or thousands of independent applications and to be deployed in LMS systems as those systems upgrade to LTI-compliant versions. LTI only requires the interaction of the LMS administrator or instructor to integrate a tool into a course. SAML-based solutions cannot be used until a campus converts to SAML and establishes a relationship with the tool.

SAML is best suited for situations where organizations are sharing large, identifiable applications with other organizations that also use SAML as their single-sign-on. SAML is the ideal solution for trust between applications running on one SAML-protected university and applications running on another SAML-protected university.

LTI is best suited where a single (or a few) learning applications like an LMS (and maybe a portal) need to interact with a large and highly changing list of externally hosted learning tools written in many different programming languages and hosted in a wide range of technology platforms.

Learning tools have a particularly unique need in that, in a learning context, each user may have a quite different role in each course they are participating in. A person may be a faculty member in their organizational role, but also a student in an Italian class. Learning Management Systems model these extremely fine-grained and often highly dynamic roles, and LTI properly transports and communicates all this role detail to external tools. SAML models a person’s organizational role (i.e. faculty member in School X) but does not model roles down to membership in every roster in every course ever taught (past, present, or future).

In summary, the technical footprints of SAML and LTI are different, and the use cases they solve are very different. But there is also some similarity between LTI and SAML.

Similarity and Overlap

It is not just dumb luck that there is architectural overlap between SAML and LTI. From the moment that Sakai was conceived there were discussions about SAML (Shibboleth) and Sakai. Steve Carmody of Brown University and Scott Cantor of The Ohio State University were frequent visitors to Sakai meetings and advocated strongly for a deep connection between Shibboleth and Sakai. Based on these discussions, I did my own evaluation of the state of Shibboleth technology in 2004 to see if we could deeply integrate it into Sakai. In 2004, Shibboleth and SAML were not very mature (neither was Sakai) and almost no schools were deploying Shibboleth-based SSO, so my architecture decision at that time was not to build Shibboleth into Sakai and force schools to use Shibboleth as their SSO in order to use Sakai.

My technical work in 2004 gave me a deep understanding of the design, architecture, and strengths of the SAML-based approach used by Shibboleth. In particular, I was quite aware of the use of non-identifying handles, selective attribute release, and other essential features of the SAML/Shibboleth architecture.

Much later Shibboleth support for Sakai was developed by the University of Oxford and the University of the Highlands and Islands in Scotland.

IMS Tools Interoperability started in 2004, and in 2005 had a demonstration in Sheffield, UK that featured Sakai, Moodle, WebCT and Blackboard. TI 1.0 never really got off the ground because its SOAP-based technology footprint was just too much of a mountain to climb for an unknown interoperability payoff. Behind the scenes of the Sheffield demonstration was far too much pain and unnecessary complexity added due to the specification’s use of SOAP. Even though TI 1.0 was not widely implemented, it had all the essential architectural elements that would later define IMS Learning Tools Interoperability 1.0 (note subtle name change “TI 1.0” versus “LTI 1.0” – there will be a few more name changes in this blog post so pay attention).

TI 1.0 was a mash-up approach that only required server-to-server trust (i.e. not organization-server trust), but borrowed the SAML/Shibboleth notions of an opaque handle (user_id in LTI terms) and the selective release of private information such as name and E-Mail.

The Emergence of LTI

Thanks to the early leadership of Bruno Van Haetsdaele of Wimba and leadership and funding from Chris Moffat of Microsoft, we started a project called IMS Learning Tools Interoperability LTI 2.0 (note the addition of “L” and the number 2.0) to redo the specification with an extremely simple REST-based protocol to achieve the same results as TI 1.0 but with a far smaller software library footprint. Our working goal was to be at least as easy as the Facebook API or easier. The initial work was based on Wimba’s REST-based API. The Wimba/Microsoft-led specification made good progress through early 2008 to the point where it looked like it was ready to be final, and we were building implementations for Moodle and Sakai to perform engineering tests.

In early 2008 the group was approached by Blackboard, who showed us the (at that time unreleased) Blackboard BB9 Proxy design for launching external tools. After some deliberation, we decided to scrap the Wimba approach and move to one loosely based on the BB9 proxy approach. This set us back to square one in terms of the technology we had developed for Sakai and Moodle – which was discarded. I was in a panic at this point because I had committed to a Google Summer of Code project with three students to build LTI 2.0 implementations and we had just thrown away the specification and prototype code.

Because Google was not flexible in schedules, I quickly assembled an unapproved specification that I called Simple LTI for use by my students over the summer. Simple LTI was a strange combination / approximation of the Wimba provisioning architecture and security model and the BB9 Proxy launch model. It had all the advantages and disadvantages of something developed by a lone panicked person with no reviewers in a week.

Simple LTI was remarkably successful over the summer and Fall of 2008. My students (Jordi Piguillem Poch of the Universitat Politècnica de Catalunya / BarcelonaTech and Katie Edwards of McGill University) built working prototypes of it all, and Microsoft funded a .NET implementation as well. At the 2008 Educause conference in October, I walked around showing our awesome demo to 4-5 people and telling them “this is the future”. A few people like Bill Hughes and Mary Ann Scott of Pearson got it but most of the time I was a nerd with a laptop showing screen shots to people that yawned.

The near completion of the Wimba-led LTI 2.0 effort also attracted the attention of Pearson. Pearson had their own integration strategy for plugging into LMS systems called “Third-Party Interoperability” (TPI) that used a launch and web-service callback approach to solve this problem. It was effective but the web services were (sorry Greg) a little clunky.

With the re-launch of LTI 2.0 to be based on Blackboard’s BB9 proxy, we decided it was time to add the DNA of TPI to the mix as well.

From 2008 onward, the LTI 2.0 leadership was Lance Neumann of Blackboard, Greg McFall of Pearson, and Mark McKell of IMS. There was a lot of work to do in LTI 2.0 and it took from early 2008 to early 2010 to finish and approve Basic LTI 1.0. We decided it would be a 1.0 since TI 1.0 was such a failure. We called it “Basic” because it was a tiny subset of the overall LTI design, only covering the launch. We wanted to get something out while we refined the richer feature set which we called “Full LTI” to contrast it from “Basic LTI”.

With the best DNA of Wimba, Blackboard, and Pearson all mixed together, and with enough time to work out the kinks and think the design through very carefully, Basic LTI 1.0 was a solid technical design. It was reduced to only the most essential features and adopted OAuth as its security model (please don’t look too closely at the Simple LTI security model).

Basic LTI 1.0 (later renamed LTI 1.0) was a great market success. At the release of the Basic LTI 1.0 in May 2010, nearly 100% of the higher-ed LMS market could use the spec, either through released code or through open source plugins (i.e. Moodle Module or Blackboard Building Block). A lot of people contributed to the success of Basic LTI 1.0 including Stephen Vickers of Edinburgh University, George Kroner of Blackboard, Alan Zaitchik of Jenzabar, Matt Teskey of Desire2Learn, Mark O’Neil of Oscelot and Blackboard, and many others.

The rate of Basic LTI uptake was unprecedented for a learning standard because it solved a very important problem and did it in the simplest possible way, with good test harnesses, straightforward certifications, and a lot of open source implementations made available from the very beginning. There was nothing in Basic LTI that was not essential, and yet it worked very well.

The first major commercial vendor to ship support in their native release was Desire2Learn in their 8.4.7 release. This was announced as a surprise at the November 2010 Educause meeting. Blackboard was a *little frustrated* at Educause because they had put several years of effort into developing and supporting the standard and had been a leader in the specification and yet Desire2Learn scooped them in the marketplace. I of course made sure to point that out to Ray Henderson in the Blackboard booth and suggested that he not let that happen again in the future. I was not above playing one vendor against the other to achieve my interoperability objectives on behalf of teachers and learners regardless of the LMS they use.

That night I went to a Desire2Learn reception and shook John Baker’s (D2L founder and CEO) hand and thanked him for firing the shot heard round the world in Learning Tools Interoperability.

Blackboard followed suit and did so in strong fashion early in 2011 when they released Learn 9.1SP4 with LTI support. But what was even more exciting was the announcement of the CourseSites service, where teachers could use BB9.1SP4 for free, including Basic LTI 1.0. This was a very important development for me because it represented the first moment where a teacher could build an application, host it on a free service like Google App Engine, and then teach a course with a commercial LMS that was free and plug their tool into their course with no requirement of any interaction with a university or LMS administrator. While no one other than me used this immediately, for me it was a tipping point in the instructor-mash-up use case.

By the end of 2011, Basic LTI was a roaring success. And once it became widely used, everyone started complaining about what was missing (i.e. like the ability to return a grade). The LTI working group was working on a long-term architecture to enable lots of great services but that was going to take a while so IMS decided to do one more “Basic” release and add simple grade return.

The decision also was made to just re-name all these specifications “LTI” with a version number. Basic LTI is now referred to as LTI 1.0 and the new LTI with grade return is called LTI 1.1 and was formally released yesterday (yes yesterday). The richer LTI (formerly known as “Full LTI”) will now be LTI 2.0.

With LTI 1.0 (formerly known as Basic LTI) already entrenched in the marketplace, the uptake of LTI 1.1 is almost immediate. Virtually all the major vendors in the marketplace are either shipping LTI 1.1 support in their main release or will ship it in the next few months. We don’t have to go through the phase of building and using open source plugins for LTI 1.1 while waiting for vendors to release their official versions. Broad availability will happen very quickly. Instructure’s Canvas and Moodle 2.2 are the champions of the first-to-market race for LTI 1.1. Sakai, Blackboard, and D2L need to play a bit of catch-up this time around but I doubt it will take long.

While LTI has a lot more to bring to the marketplace in later evolution of the specification, at this point it solves a very important subset of the LMS to tool interoperability needs.

I would be remiss if at this point I did not mention Kevin Riley of IMS. Kevin Riley was the staff member for the original IMS Tools Interoperability 1.0 specification, started in 2004 and finished in 2006, with the Sheffield demo in 2005. He was our cat herder and guide throughout the process.

We lost Kevin on 16th November 2011. It was the week of Educause 2011, where for the first time we truly saw the breakout success of LTI 1.0 across the entire marketplace in ways that enabled whole new approaches and new thinking about how we develop and provision learning tools. Kevin was our collective starting point for this revolution.

I encourage you to take a look at Kevin’s memorial page, watch the video, and read the comments from his family, friends, and colleagues.


8th June 1960 – 16th November 2011

We owe a great debt of gratitude to Kevin for getting us started when few believed in the ideas. He kept us moving forward when it seemed like we were on a long journey to nowhere. I wish he could be here to see what the creation he led us to build has wrought. I know that he would chuckle and say ‘Aw shucks’ and then change the subject to talk about how much he likes this current season of Dr. Who.

Increasing SAML Adoption

During the same timeframe of 2004-2011, the SAML community was also making great progress. The implementations were becoming more robust and in far more languages. SAML moved from something that “only came with Shibboleth” to support in commercial identity products from IBM and Microsoft. SAML became a popular Single-Sign-On approach in K12 software because it came built-in with Microsoft Active Directory.

In higher education, the Internet2-sponsored InCommon federation became increasingly successful as more and more campuses added Shibboleth or other SAML-based Single-Sign-On capabilities and joined InCommon.

In the UK and Netherlands there were broad adoptions of SAML-based SSO with funding and involvement from JISC, SURF, and others.

Even the University of Michigan – my school – the university that invented the CoSign SSO technology – adopted and deployed Shibboleth in addition to CoSign (i.e. not as a replacement).

So by 2011, SAML-based solutions had very much turned an important corner in adoption, to the point where schools that don’t support SAML-based SSO and are not members of the InCommon federation are increasingly seen as the “odd school out”.

When Worlds Collide

Both IMS Learning Tools Interoperability (since 2004) and SAML (since 2000) have had a long history and have recently seen growing success and widespread adoption.

It was only a matter of time before they ran into each other.

The first major encounter happened at the University of Wisconsin Division of Information Technology a couple of months ago.

Ironically, the University of Wisconsin took the lead with WebCT to develop the first prototype of the IMS Tools Interoperability specification back in 2004. You can see Dirk Herr-Hoyman of Wisconsin in the TI demonstration in Sheffield, UK. You can also see Anthony Whyte of the University of Michigan and Lydia Li of Stanford University in the video.

An even more ironic detail is that the very first successful exchange of a TI 1.0 SOAP message came while we were at a week-long hack-fest at the University of Wisconsin at Madison. That first message went between WebCT code developed by Chris Vento and Moodle code developed by Dirk Herr-Hoyman.

Enough of the reminiscing, on to the present.

Wisconsin is a Desire2Learn school, supports Shibboleth/SAML, and is one of the leading schools in the InCommon SAML federation. As teachers started to come forward making requests to integrate LTI-based tools into Desire2Learn, the Wisconsin DOIT team raised the completely logical question: “why are we not using SAML for this?”.

I was not surprised that this happened and in a way was waiting and well-prepared for this conversation. After all I (and many others) had designed LTI with full awareness and appreciation of the strengths and weaknesses of SAML-based SSO solutions. I knew that SAML and LTI solved different problems and where there was overlap, I made sure to carefully align them so they would fit together like two puzzle pieces when the time came to bring them together.

This led to a series of telephone conference calls and a scheduled face-to-face meeting in Madison on February 29, 2012.

The Big Meeting

Originally we expected the meeting to be an all-day affair with about 10 people. By the time it was all figured out, the only people in the meeting turned out to be Keith Hazelton, Scott Fullerton, and me. And I got stuck in the morning rush hour traffic coming out of Chicago and arrived two hours late, with only time for a quick lunch and a 90-minute whiteboard meeting.

We had our lunch and were done 45 minutes and one marked-up whiteboard later. The solution quickly became obvious. We never even had to erase the whiteboard. I had a more complex solution in mind, but Keith said that my complex solution was fraught with problems and that it had been tried before in SAML-land and failed miserably. He suggested a much simpler solution that would be sufficient, and within minutes it was obvious how LTI and SAML could easily be made to work together in a completely secure and flexible manner. In fact, it showed how any SSO could be effectively used in concert with an LTI launch.

The solution was simple, effective and secure. It required no changes to SAML and only a tiny change to LTI to add some optional data to the launch.

It also solved a problem with LTI of having no way to easily associate the local LMS key with a more global single sign on credential. It also solved the problem of letting users directly log into the external tool, bypassing the LMS and getting access to their data.
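As a purely hypothetical illustration of what “some optional data” in the launch might look like: the parameter names below are my own invention, not taken from the draft, but they convey the idea of carrying an enterprise SSO handle alongside the opaque LMS user_id so a tool can associate an LTI launch with a SAML login for the same person.

```python
# Hypothetical sketch only -- these "ext_" parameter names are invented
# for illustration and do not appear in any specification.
launch_with_sso = {
    "user_id": "292832126",                    # opaque LMS-scoped id, unchanged
    "ext_sso_idp": "https://idp.example.edu",  # invented: which IdP vouches for the user
    "ext_sso_handle": "jqpublic@example.edu",  # invented: the enterprise handle
}

def sso_identity(launch):
    # Use the optional SSO fields only when both are present; otherwise the
    # tool falls back to the launch-only identity exactly as before.
    if "ext_sso_idp" in launch and "ext_sso_handle" in launch:
        return (launch["ext_sso_idp"], launch["ext_sso_handle"])
    return None
```

Because the fields are optional, existing LTI tools keep working untouched while SSO-aware tools can link the two identities.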

In short, it is the coolest thing since sliced bread. As I expected and hoped, the two specifications were complementary and perfectly filled in each other’s missing bits; it was even better than I had expected. Because both specifications were mature and well adopted, and we had the right architects in the room who understood the real strengths and weaknesses of each technology, we went right to the correct solution.

The following is a SlideShare presentation that captures the essence of the design. It is a draft, so I may need to revise it, and comments are welcome. Like all first drafts there is room for improvement.


This is a great start, and there are many people who have contributed to how well we can now integrate applications into LMS systems and other systems on campuses. We see the combination of LTI and SAML allowing a wide range of integration approaches, from large multi-organization efforts like InCommon down to one faculty member writing a game, running it on App Engine, and plugging it into a free multi-university cloud-hosted LMS like Blackboard’s CourseSites or Instructure’s Canvas.

There is plenty of work to do to build some prototype code and prove this works in real production. We have some SAML-protected applications at Wisconsin that we can work with to prototype a SAML+LTI connection.

Comments welcome.

Wireless Microphone Shootout – Sony WCS-999 + Olympus ME-15 Wins over Azden WMS-Pro

If you look at my original post about my video interview travel kit, I was unsure about which wireless microphone setup to standardize on.

I was trying to decide between the Sony WCS-999 Wireless Microphone and the Azden WMS-PRO Wireless Microphone. I did not like the clip-on microphone supplied with the Sony, so I purchased an Olympus ME-15 Microphone to use with the Sony WCS-999 wireless unit.

I had bought one of each and did a side by side test to compare them. I have two cameras and connected each mic to one of the cameras. When I switch mics, I always switch audio and you can see the active mic in the video as well.

When the test was done, I preferred the Sony WCS-999 Wireless Microphone with the Olympus ME-15 Microphone. It overdrives the audio a little bit and I wish I could turn it down about 10% – but overdriving is better than underdriving, because when I have to increase the volume from the Azden WMS-PRO Wireless Microphone in post-production, I get a lot of background noise.

So from this point forward, I will be packing two Sony WCS-999 units. Anyone want to buy a slightly used Azden WMS-PRO Wireless Microphone setup?

Brent’s New(er) Subaru Forester from Dunning Subaru and Kristin Malik

Brent’s 2001 SunFire lasted about two-and-a-half years, doing a fine job as a teenage driver’s first car. It got him through his driver’s test and driving to LCC, and absorbed a few small dents and scratches along the way. About a month ago, it developed a rod knock. With 136,000 miles and literally everything on the car falling apart, it was time for another car. I took the car to Shroyer’s Auto Parts and sold it for $300.00.

For Brent’s next car, we wanted something with much better visibility and a more upright driving position. We also wanted something with a bit of cargo capability and a hatchback so he could help haul all of his band’s gear from one gig to another. We were looking for something like a Subaru Forester, Toyota Matrix, Ford Focus, Malibu Maxx, HHR, or PT Cruiser.

I spent a long time on Craigslist, poring over newspapers and the little used-car magazines, trying to find something that would fit the bill in the under-five-thousand-dollar range. It was slim pickings – most of the cars had > 150K miles on them and had something wrong in need of fixing.

I just kept skimming all of those brands every day or so and test driving a car every day or two to get a sense of what was really in the marketplace. One Sunday, while looking online, I found a used Subaru Forester at Dunning Subaru in Ann Arbor for about $4500. We had purchased Teresa’s 2006 Subaru B9 Tribeca new from Dunning back in 2006. A Forester would be perfect – lots of visibility, legendary reliability, cute as heck, good cargo space, and as a bonus, full-time all-wheel drive for the winters.

On Sunday I sent a message via the Dunning web site saying I was interested in the car. A few hours later (still on Sunday) I received a mail message from Elena Manalp, their Internet Sales Manager. At 9AM on Monday, I got an email from Kristin Malik asking when I wanted to come in. I was on my way in so I said 10AM. I arrived an hour later and the car was pulled up right in front of the dealership, with the keys in it and a license plate on it, ready for a test drive.

I walked right up to the car and looked inside and my heart sank. It was a standard transmission – that explained why such a nice car was in my price range. Brent needs an automatic because of his handicap. As I was bemoaning the situation, Kristin walked up and I told her that I must have missed the fact that it was a manual transmission and so this car was not suitable – but I wondered if I could take a quick test drive to get a sense of how a Forester rode. So we took a quick test drive. It was the perfect car for Brent except for the standard transmission.

When we got back, I did not want to waste any more of Kristin’s time, so I thanked her and was about ready to be on my way. As I was about to leave, she asked what my price range would be for a Forester with an automatic. I said that $5K was the top of my range, as I did not want to put full coverage on a vehicle driven by a 21-year-old. I said that I was pretty familiar with the market by now and knew that was an impossible price for a Forester with an automatic.

So I went back to Craigslist to keep test driving suitable cars in the price range. I had given myself three weeks to replace the car and my time was running out, so I was either going to compromise and purchase a car that needed some repairs, compromise on the body style and get him a Ford Taurus (relatively good value but decidedly un-cool), or just pay more money to get him back in a car.

After a week went by and I was about to declare my three week search over and take a compromise, Kristin called and said she had a 2003 Forester with an automatic and 161K miles that she could let me have for $5K – it was a lot less than the price they had listed. My hopes again went up as I came in and saw the little car. It was in great shape, with heated seats, heated mirrors, power everything and cosmetically outstanding. After a quick test drive, I was in love with the car. I had to ask Kristin twice if the price was really $5K including tax, title, and delivery and she said, ‘yes’. I had to pinch myself as I had been prepared to settle for a far less attractive car for about the same price. I put a deposit on the car right away so I could bring Teresa down to see the car the following Monday.

As soon as Teresa saw the car and took a test drive she was sure that we were going to own this little Forester. When we got back, she told Kristin that she loved the car. Kristin had also set up two other Foresters for us to drive because I was surprised at how truck-like the Forester ride was. We had owned an Outback and Tribeca and they drove like luxury cars and so I was a little freaked out by the truck-like stiffer ride of the Forester. Kristin assured me that that was just how the Forester was and let me drive two other Foresters of different years to assure myself that the ride in the little 2003 was completely normal. Kristin was right on. Foresters just drive differently than Outbacks.

When we got back from the third test drive, we went in and sat down with Kristin to tell her we wanted the car. To my surprise, she had already got the paperwork set up and we were ready to sign. I asked her how she knew we were going to buy the car. Kristin laughed and said that once the wife says she likes the car, Kristin knows that the deal is going to happen so it is time to start the paperwork. I asked Kristin one more time if the price was really $5K out the door. Kristin again said ‘yes’ and I pinched myself again to make sure I was not dreaming.

We drove the car home that night, and I decided I would drive it back and forth to Ann Arbor for a few days just to make sure there were no problems before we installed the $500 left-foot accelerator for Brent. The car drove great with no problems and no oil leaks, so we took it to Clock Mobility in Lansing to get the left-foot accelerator installed. Danny at Clock Mobility in Lansing is great to work with; he does outstanding work and does it very quickly. I am sure he knows that once people bring him a car, they are quite anxious to get it back.

Once it came back from Clock with the left foot accelerator, it was time for Brent to take his first test drive. The seat could be adjusted perfectly and his visibility was amazing and the car’s small size greatly increased his confidence while driving. It was as perfect as I had imagined it would be.

During the test drive at Dunning there was a problem with the rear hatch sticking. Kristin even broke a nail trying to get it open. Without hesitation, Kristin promised that Dunning would fix the latch at no charge regardless of what it would take. First the mechanics manually un-stuck the latch, but it quickly stuck again, so they ordered the needed replacement part, called me a week later when it was ready, and installed it at absolutely no charge to us.

In the weeks since we got the car, Brent has been driving all over the place – he says he can see so much better than in his previous car and is better able to park and navigate. The car is so cute and he can fill up the back hatch with the drum set, guitars, and amplifiers for the band. It is just perfect – it is like a little dream car.

The only thing that still has me a little worried is that Subarus are just far too much fun to drive in the snow. So far we have not had any snow – but I am a little worried about the first time Brent realizes how agile an all-wheel-drive car can be in the snow. Brent has plenty of experience doing sweeping turns and donuts with his ATV, so he knows how to whip a vehicle around in a large sandy playground. I have warned him several times that an AWD car can accelerate much better than a normal car in the snow or ice – but its brakes are no better than a normal car’s. I hope he listens, or at least I hope I can take him driving after the first big snowstorm and show him the AWD ropes.

I cannot thank Kristin Malik and Dunning Subaru enough for their outstanding customer service and amazing commitment to help me find the best possible car for Brent that was within my price range. Every time I walk by the car in the garage or watch Brent drive off to school, I pinch myself.

IEEE Computing Conversations Preview: A Wonderful Weekend in Zurich

I spent this last weekend in Zurich filming two separate interviews with Niklaus Wirth and Bertrand Meyer for upcoming installments of the Computing Conversations column in IEEE Computer magazine. It will take a few months before they are edited and appear in the magazine, but I figured I would share a bit of a sneak preview to whet your appetite.

The day started at 4:30AM when I woke up at my hotel at London Heathrow in order to make a 7:45AM flight to Zurich. I decided that I would dress in a suit and a tie instead of my usual IEEE Computer Society golf shirt. I had a heavy rolling bag packed with tripods and video gear. But the flight went very well and I arrived right on time. It took three trains and about an hour to get to the small town where Dr. Wirth lives since we had agreed to do the filming at his home.

Niklaus Wirth

We had agreed to meet mid-afternoon in case I had a problem or an unanticipated travel delay. But my travel went perfectly and I arrived about two hours early. I hoped to find a little pub or coffee shop where I could sit and work on my laptop until the right time to go to his house, but the town was so small that there was nowhere to wait. So I started hiking to Dr. Wirth’s house. The sun was bright, the recent snow was melting and running down the roads, and the birds were singing – it was a wonderful feeling. It was about a half-mile hike up and down hills, part of it on a dirt path. I must have looked quite silly in a suit, tie, dress shoes, and backpack, carefully lifting my bulky rolling bag as I walked down a dirt path in rural Switzerland.

When I arrived at his house (2 hours early), he had just finished preparing his lunch. He was surprised to see me so early but warmly invited me in and I had a cup of coffee while he had his lunch. We had a great conversation over lunch, and I showed him the books I had brought as well as my line-printer listing of my own compiler project written in Pascal back in 1980 when I was an undergraduate.

After lunch, I started setting up the lights and video equipment in his living room. Part-way through, he suggested that perhaps we could film in his basement workshop. When he took me down to the workshop, it was clearly the perfect place to film him. His workshop was filled with computing equipment, hardware, software, woodworking tools, and radio controlled helicopters. It was the perfect set.

I was able to use his workshop lights along with my own lights to get a really nice setup for the interview. I even had a backlight.

The interview went wonderfully. Of course we talked about Pascal and its success. But we also talked about how he was uninterested in commercializing his creations, instead always taking the next step in producing new approaches to teaching Computer Science. The Modula language, Lilith computer, Oberon language and operating system, and Ceres computers were simply the evolution of what he saw as improving how we taught computer science. He left any commercial potential for his creations to others. In essence it was Open Source long before the term was coined.

Dr. Wirth has been retired for over ten years. He thinks of being retired as an even better form of tenure. He continues to work and innovate in his own directions and at his own pace and with no need to attend faculty meetings. He is currently working on building a teaching environment that combines his Oberon object-oriented operating system and language with a microprocessor he designed for the Xilinx Field-Programmable Gate Array (FPGA). When he completes this work and accompanying book, it will provide an environment where students can explore a complete working computer from the hardware logic gates through the operating systems and on up to end-user applications written in Oberon. He is concerned that as our consumer electronics hardware becomes increasingly integrated and sophisticated, students lose the ability to understand the low-level elements that make up our personal computing environment. He wants to provide an integrated working environment that is simple enough to be completely understandable and as such worthy of close study.

If you look closely at the above photo, you will see a plane of Extended-Core-Storage (ECS) from a CDC 6000 Supercomputer from the 1970’s hidden in the picture :). Unintentional historical product placement as it were.

After the interview, I went back to Zurich to prepare for dinner.

Dinner with Bertrand, Annie, and Niklaus

My second interview was with Bertrand Meyer the next morning (Saturday). Of course Bertrand was one of the early innovators and communicators as Computer Science transitioned to Object Oriented Programming; he has given endless talks, written many books and articles on the topic, developed the Eiffel programming language, and has taught the introductory computing course using Eiffel at ETH for the past seven years.

On Friday night Bertrand and his wife Annie had invited Niklaus and me to dinner at their apartment in Zurich. This time I figured out my transportation precisely (Zurich bus #33), arrived with a few minutes to spare, and walked up to the Meyers’ apartment right on time. Niklaus arrived a bit later, as he had driven and had to find parking. Dinner was lovely and the conversation was wonderful. We went well into the night with our dinner and some wine, talking of politics, the economy, technology, emerging computer science in Russia, teaching philosophy, and much more. Annie is the CEO of their corporation that now creates the compilers and development environment for the Eiffel language. The company is headquartered in Santa Barbara, and Annie flies back and forth between Zurich and Santa Barbara to manage it.

The dinner and conversation went late into the evening, and as I went back to my hotel, I realized that the interview the next day needed to be about both Bertrand and Annie Meyer. Theirs is an amazing lifelong personal and professional collaboration that has produced five children, several grandchildren, many books and seminars, and a vibrant company.

Bertrand and Annie Meyer

I filmed the interviews with Bertrand and Annie Meyer in Bertrand’s office on the top floor of the Computer Science Building at ETH. Bertrand talked about the emergence of Object Oriented thinking in the early 1980s and how he promoted the idea by writing books and papers and giving seminars to a worldwide Computer Science community that just could not get enough of the new approach.

Bertrand’s next area was Software Engineering and Formal Methods. He was one of the co-authors of early work on the Z notation. But even as great research was being done on the theoretical underpinnings of Formal Methods, Bertrand was concerned that the ideas were too complex to have a practical impact and too complex to be understood by beginning computer scientists.

The Eiffel language was intended to make Software Engineering more approachable by providing a system and language that capture things like preconditions and postconditions in their most natural way.

Annie Meyer started helping Bertrand manage his seminars during the 1980s while their children were growing up, taking on small roles in the company. Over time, as their children grew up and the company grew, Annie learned all the sales, marketing, accounting, technical, and other skills to become the CEO of the company.

I am really looking forward to building a story around a couple with such a deep, long term, and multi-faceted collaboration.


To see the final videos when they are completed, you can join the IEEE Computer Society to get IEEE Computer magazine, subscribe to the IEEE Computer Society YouTube channel, or follow me on Twitter at @drchuck.

Movie Review: Arthur (Remake)

I was on a plane coming back from Zurich and watched the new remake of the Arthur movie.

I loved the movie. It is hard to remake a movie that is so beloved and succeed – but this movie worked for me. It was a wise combination of the familiar and the new in a way that will please the viewer regardless of whether they saw the original or not.

Helen Mirren as Hobson was a brilliant idea. I thought that updating the story to include scenes from Alcoholics Anonymous helped the plot nicely and provided a better explanation for the plot resolution. I thought the resolution at the end was more “responsible” and logical in the remake. All the characters were a delight – the perfect combination of capturing the essence of the characters in the original while making the characters all their own.

I kept wondering if my favourite line, “Perry you are a dead man!” would show up. I was initially disappointed when that little sub-plot was done differently, but then realized that one thing a remake needs to do is not try to replicate already perfect scenes that it cannot improve on – so they gently wound the plot right past that line.

The ending was more detailed, so it felt like I was watching it for the first time. Of course there was a moderate need for tissues after the movie ended.